Currently I have a HashMap implemented as
private static Map<String, Item> cached = new HashMap<String, Item>();
where Item is an object with the properties Date expirationTime and byte[] data.
This map is used when multiple threads concurrently hit the cache. The checks I do are:
1.
public static final byte[] getCachedData(HttpServletRequest request) throws ServletException
{
String url = getFullURL(request);
Map<String, Item> cache = getCache(request);
Item item = null;
synchronized (cache)
{
item = cache.get(url);
if (null == item)
return null;
// Make sure that it is not over an hour old.
if (item.expirationTime.getTime() < System.currentTimeMillis())
{
cache.remove(url);
item = null;
}
}
if (null == item)
{
log.info("Expiring Item: " + url);
return null;
}
return item.data;
}
2. If null is returned, we create the data and cache it in the HashMap:
public static void cacheDataX(HttpServletRequest request, byte[] data, Integer minutes) throws ServletException
{
Item item = new Item(data);
String url = getFullURL(request);
Map<String, Item> cache = getCache(request);
log.info("Caching Item: " + url + " - Bytes: " + data.length);
synchronized (cache)
{
Calendar cal = Calendar.getInstance();
cal.add(Calendar.MINUTE, minutes);
item.expirationTime = cal.getTime();
cache.put(url, item);
}
}
It seems that if multiple threads access the same key (the url in this case), the data gets added to the cache more than once at the same key, because getCachedData returns null to multiple threads before the first thread has finished writing its data.
Any suggestions on how to solve this issue?
In cacheDataX, add a check for the existence of the item before you add it (inside the synchronized block):
synchronized (cache)
{
if (cache.get(url) == null) {
Calendar cal = Calendar.getInstance();
cal.add(Calendar.MINUTE, minutes);
item.expirationTime = cal.getTime();
cache.put(url, item);
}
}
This ensures that multiple threads that have already done a lookup and received null cannot all add the same data to the cache: one will add it, and the other threads will silently skip the put, since the cache has already been updated.
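As an alternative to the hand-rolled check, a ConcurrentHashMap can make the lookup-and-insert atomic via computeIfAbsent, so the mapping function runs at most once per key even under contention. This is a sketch, not the question's exact code; the Item constructor and expiry arithmetic are simplified from the snippets above:

```java
import java.util.Date;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class Cache {
    static class Item {
        final byte[] data;
        final Date expirationTime;
        Item(byte[] data, Date expirationTime) {
            this.data = data;
            this.expirationTime = expirationTime;
        }
    }

    private static final ConcurrentMap<String, Item> cache = new ConcurrentHashMap<>();

    // Atomically fetch the cached item, or build and insert it exactly once.
    // If two threads race on the same url, only one Item is created and stored.
    public static byte[] getOrCache(String url, byte[] freshData, int minutes) {
        Item item = cache.computeIfAbsent(url, k ->
            new Item(freshData, new Date(System.currentTimeMillis() + minutes * 60_000L)));
        return item.data;
    }
}
```

Note that computeIfAbsent does not handle expiry by itself; expired entries would still need to be removed (for example with a conditional remove) before recomputing.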
You need one synchronized block to cover both getting something from the cache and inserting into the cache. As the code stands, you have a race condition: multiple threads can execute step 1 before any of them executes step 2.
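One way to apply this advice is a single synchronized method that does the lookup, the expiry check, and the insert under one lock. This is a minimal sketch under assumptions not in the original: the data is rebuilt by a caller-supplied Supplier (a hypothetical loader), and the servlet-specific plumbing is omitted:

```java
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public class LockedCache {
    static class Item {
        final byte[] data;
        final Date expirationTime;
        Item(byte[] data, Date expirationTime) {
            this.data = data;
            this.expirationTime = expirationTime;
        }
    }

    private static final Map<String, Item> cache = new HashMap<>();

    // Lookup, expiry check, and insert all happen under one lock, so no two
    // threads can both observe a miss and both create the data for the same url.
    public static synchronized byte[] getData(String url, Supplier<byte[]> loader, int minutes) {
        Item item = cache.get(url);
        if (item == null || item.expirationTime.getTime() < System.currentTimeMillis()) {
            byte[] data = loader.get();
            item = new Item(data, new Date(System.currentTimeMillis() + minutes * 60_000L));
            cache.put(url, item);
        }
        return item.data;
    }
}
```

The trade-off is that the loader runs while the lock is held, so a slow rebuild for one url blocks lookups for every other url as well.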