
Caching and thread safety

I am caching data in an ASP.NET website through the System.Web.Caching.Cache class, because retrieving the data is very costly and it only changes once in a while, when our content people change data in the backend.

So I create the data in Application_Start and store it in Cache, with an expiration time of 1 day.
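For reference, the startup code looks roughly like this (a minimal sketch; the Global.asax event handler and the use of HttpRuntime.Cache are assumptions on my part, the cache key and builder method are the ones used below):

// Global.asax.cs -- seed the cache once at application startup (sketch).
// HttpRuntime.Cache is used because HttpContext.Current may not be available
// in Application_Start under the IIS integrated pipeline.
protected void Application_Start(object sender, EventArgs e)
{
    List<Kategorie> katList = DataTools.BuildKategorienTitelListe();
    HttpRuntime.Cache.Insert(
        CachedData.NaviDataKey,
        katList,
        null,                          // no cache dependency
        DateTime.Now.AddDays(1d),      // absolute expiration: one day
        System.Web.Caching.Cache.NoSlidingExpiration);
}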

When accessing the data (which happens on many pages of the website), I now have something like this in a static CachedData class:

public static List<Kategorie> GetKategorieTitelListe(Cache appCache)
{
    // get Data out of Cache
    List<Kategorie> katList = appCache[CachedData.NaviDataKey] as List<Kategorie>;
    // Cache expired, retrieve and store again
    if (katList == null)
    {
        katList = DataTools.BuildKategorienTitelListe();
        appCache.Insert(CachedData.NaviDataKey, katList, null, DateTime.Now.AddDays(1d), Cache.NoSlidingExpiration);
    }
    return katList;
}

The problem I see with this code is that it's not thread-safe. If two users open two of these pages at the same time and the cache has just run out, there is a risk that the data will be retrieved multiple times.

But if I lock the method body, I will run into performance problems, because only one user at a time can get the data list.

Is there an easy way to prevent this? What's best practice for a case like this?


You are right; your code is not thread-safe.

// this must be a class-level variable!!!
private static readonly object locker = new object();

public static List<Kategorie> GetKategorieTitelListe(Cache appCache)
{
    // get Data out of Cache
    List<Kategorie> katList = appCache[CachedData.NaviDataKey] as List<Kategorie>;

    // Cache expired, retrieve and store again
    if (katList == null)
    {
        lock (locker)
        {
            // check again inside the lock: another thread may have refilled
            // the cache while this one was waiting for the lock
            katList = appCache[CachedData.NaviDataKey] as List<Kategorie>;

            if (katList == null)  // make sure the waiting thread does not build the list a second time
            {
                katList = DataTools.BuildKategorienTitelListe();
                appCache.Insert(CachedData.NaviDataKey, katList, null, DateTime.Now.AddDays(1d), Cache.NoSlidingExpiration);
            }
        }
    }
    return katList;
}


The MSDN documentation states that the ASP.NET Cache class is thread-safe, meaning that its contents are freely accessible by any thread in the AppDomain (an individual read or write is atomic, for example).

Just keep in mind that as the size of the cache grows, so does the cost of synchronization. You might want to take a look at this post.
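A minimal sketch of one way to keep that cost down (this is an alternative approach, not the code above, and reuses the identifiers from the question): cache a Lazy<List<Kategorie>> instead of the list itself, so the Lazy wrapper guarantees the expensive build runs at most once without holding a shared lock around the whole method:

public static List<Kategorie> GetKategorieTitelListe(Cache appCache)
{
    // The Lazy<T> guarantees BuildKategorienTitelListe runs at most once,
    // even if several threads race to insert the same key.
    var lazy = new Lazy<List<Kategorie>>(
        DataTools.BuildKategorienTitelListe,
        System.Threading.LazyThreadSafetyMode.ExecutionAndPublication);

    // Cache.Add only stores the value if the key is absent; it returns the
    // entry that was already there, or null if our value was stored.
    var existing = (Lazy<List<Kategorie>>)appCache.Add(
        CachedData.NaviDataKey, lazy, null,
        DateTime.Now.AddDays(1d), Cache.NoSlidingExpiration,
        CacheItemPriority.Default, null);

    return (existing ?? lazy).Value;
}

Whichever thread loses the insert race simply reuses the winner's Lazy<List<Kategorie>>, so the costly build still happens only once per expiry.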

By adding a private object to lock on, you should be able to run your method safely so that other threads do not interfere.

private static readonly object myLockObject = new object();

public static List<Kategorie> GetKategorieTitelListe(Cache appCache)
{
    // get Data out of Cache
    List<Kategorie> katList = appCache[CachedData.NaviDataKey] as List<Kategorie>;

    // Cache expired, retrieve and store again
    if (katList == null)
    {
        lock (myLockObject)
        {
            // re-read inside the lock, otherwise two threads that both saw null
            // before the lock would each rebuild the list
            katList = appCache[CachedData.NaviDataKey] as List<Kategorie>;

            if (katList == null)
            {
                katList = DataTools.BuildKategorienTitelListe();
                appCache.Insert(CachedData.NaviDataKey, katList, null, DateTime.Now.AddDays(1d), Cache.NoSlidingExpiration);
            }
        }
    }
    return katList;
}


I don't see any other solution than locking.

private static readonly object _locker = new object ();

public static List<Kategorie> GetKategorieTitelListe(Cache appCache)
{
    List<Kategorie> katList;

    lock (_locker)
    {
        // get Data out of Cache
        katList = appCache[CachedData.NaviDataKey] as List<Kategorie>;
        // Cache expired, retrieve and store again
        if (katList == null)
        {
            katList = DataTools.BuildKategorienTitelListe();
            appCache.Insert(CachedData.NaviDataKey, katList, null, DateTime.Now.AddDays(1d), Cache.NoSlidingExpiration);
        }
    }
    return katList;
}

Once the data is in the cache, concurrent threads will only wait for as long as it takes to read the data back out, i.e. this line of code:

katList = appCache[CachedData.NaviDataKey] as List<Kategorie>;

So the performance cost will not be dramatic.
