Threading implications of common caching idiom

A common caching idiom is to check for the presence of an item in the cache, retrieve it if present or create it if not.

Does this not create a condition whereby, if a context switch from Thread 1 to Thread 2 occurs at the commented location, the value added to the cache by Thread 2 may be immediately overwritten when control switches back to Thread 1? The downside is that calculateFooBar() has now been invoked twice to calculate the same cached item. Is this just an accepted "minor" consequence of this simple caching implementation? Is a critical section not typically used because it would add overhead to every GetOrCreate call?

Edit: _cache is a reference to a shared data cache (e.g. the ASP.NET data cache).

//not real C#
class FooBarDictionary
{
    ...

    FooBar GetOrCreate(string key)
    {
        FooBar fooBar;

        if (!_cache.TryGetValue(key, out fooBar))
        {
            fooBar = calculateFooBar(); //context switch occurs here
            _cache.Add(key, fooBar);
        }

        return fooBar;
    }
}


If your program works correctly even when two instances of that object get created, and the only concern is the extra allocation time and memory, I wouldn't bother adding locks. In the long run, thread synchronization can be more expensive.

ConcurrentDictionary<> would be a good choice for such a "no-lock" container, as others have already mentioned.


Everything will depend on what this _cache variable is: its type and scope. If it is a static variable and not thread safe, such as a Dictionary<TKey, TValue>, you need to synchronize access to it using a lock. In .NET 4.0 you have the ConcurrentDictionary<TKey, TValue> type, which achieves the same thing in a thread-safe manner. Your code is fine, but without a lock there is no guarantee that the calculation won't occur twice for the same key within a very short span of time.
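
For illustration, a minimal sketch of the lock-based variant, assuming _cache is a plain, non-thread-safe Dictionary<string, FooBar> and reusing the question's names (the _syncRoot lock object is a hypothetical addition):

//a private lock object; only one thread at a time may perform the check-then-add
private readonly object _syncRoot = new object();

FooBar GetOrCreate(string key)
{
    lock (_syncRoot)
    {
        FooBar fooBar;
        if (!_cache.TryGetValue(key, out fooBar))
        {
            fooBar = calculateFooBar(); //runs at most once per key
            _cache.Add(key, fooBar);
        }
        return fooBar;
    }
}

The trade-off is that every call takes the lock, and calculateFooBar() runs while it is held, so lookups for unrelated keys are serialized as well; that is exactly the overhead the question alludes to.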

You may take a look at the following blog post for a nice implementation of this pattern. Also if you are using .NET 4.0, you could use the new System.Runtime.Caching assembly.
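
As a rough sketch of what the System.Runtime.Caching route could look like (the use of MemoryCache.Default and an empty CacheItemPolicy are illustrative choices, not taken from the thread):

//requires a reference to the System.Runtime.Caching assembly (new in .NET 4.0)
using System.Runtime.Caching;

FooBar GetOrCreate(string key)
{
    ObjectCache cache = MemoryCache.Default; //illustrative: a process-wide shared cache
    FooBar fooBar = cache.Get(key) as FooBar;

    if (fooBar == null)
    {
        fooBar = calculateFooBar(); //may still run twice under a race, as in the question
        //AddOrGetExisting returns the already-cached entry, or null if ours was inserted
        var existing = cache.AddOrGetExisting(key, fooBar, new CacheItemPolicy()) as FooBar;
        if (existing != null)
            fooBar = existing;
    }

    return fooBar;
}

MemoryCache is thread-safe, and AddOrGetExisting resolves the race so that only one value ends up cached, although the calculation itself can still happen more than once, just as with GetOrAdd.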


We use some custom caching in several scenarios... the new Concurrent* collections in .NET 4 are extremely well suited for such implementations...

With _cache being a ConcurrentDictionary<string, FooBar>, you could do something like:

return _cache.GetOrAdd(key, k => calculateFooBar());

This one line would replace the whole body of your sample GetOrCreate method.

For more insight see http://geekswithblogs.net/BlackRabbitCoder/archive/2011/02/17/c.net-little-wonders-the-concurrentdictionary.aspx
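
One caveat with GetOrAdd: it does not guarantee that the value factory runs only once per key; two racing threads can both invoke calculateFooBar(), even though only one result ends up stored. If avoiding the duplicate calculation itself is the concern, a common workaround is to cache Lazy<FooBar> values instead; a minimal sketch (the lazyCache name is hypothetical):

//ConcurrentDictionary is in System.Collections.Concurrent, LazyThreadSafetyMode in System.Threading
var lazyCache = new ConcurrentDictionary<string, Lazy<FooBar>>();

FooBar GetOrCreate(string key)
{
    //only the stored Lazy<FooBar> ever has its Value read, so the factory runs once per key
    return lazyCache.GetOrAdd(key,
        k => new Lazy<FooBar>(() => calculateFooBar(),
                              LazyThreadSafetyMode.ExecutionAndPublication)).Value;
}

Even if GetOrAdd creates a losing Lazy<FooBar> during a race, it is discarded without being evaluated, so calculateFooBar() cannot be invoked twice for the same key.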
