Django Caching - How to generate custom key names?

https://www.devze.com 2023-02-07 13:24 Source: Web
Right now, I am retrieving information from an API, and I would like to cache what I get back so that I do not constantly hit their server and use up my maximum number of API calls. A user can search for a particular keyword, like "grapes", and I would like to cache the retrieved string by calling `cache.set(search_result, info_retrieved, 600)`, where `search_result` is the user's search term, in this case "grapes". I want the key to be the user's search term, but I cannot do this directly since the cache requires the key to be a valid string. How can I get around this? I cannot use a database because the information updates too often.

I could use a database, but I would be writing information to it and then deleting it after a few minutes, which seems impractical. So I just want to cache it temporarily.


As Shawn Chin mentioned, you should already have a string "version" of your search query, which would work just fine as a cache key.

One limitation with memcached (not sure about other backends) is that certain characters (notably, spaces and control characters) are not allowed in keys, and keys are limited to 250 bytes. The easiest way to get around this is to hash your string key into a hex digest and use that as the key:

from hashlib import sha1
key = sha1(b'grapes').hexdigest()  # '35c4cdb50a9a6b4475da4a66d955ef2a9e1acc39'

If you might have different results for different users (or based on whatever criteria), you can tag/salt/flavor the key with a string representation of that information:

from hashlib import sha1
key = sha1(('%s:%s:%s' % (user.id, session.session_key, 'grapes')).encode('utf-8')).hexdigest()
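Putting the pieces together, here is a minimal sketch of the resulting cache-aside lookup. The names `make_key`, `get_search_result`, and `fetch_from_api` are illustrative, not part of any library; `cache` can be anything that exposes Django's `get`/`set` interface, such as `django.core.cache.cache`:

```python
from hashlib import sha1

def make_key(*parts):
    """Join the parts with ':' and hash them, so the resulting key is a
    fixed-length hex string that is always memcached-safe (no spaces,
    well under the 250-byte limit)."""
    raw = ':'.join(str(p) for p in parts)
    return sha1(raw.encode('utf-8')).hexdigest()

def get_search_result(query, fetch_from_api, cache, timeout=600):
    """Cache-aside lookup: return the cached value if present,
    otherwise fetch from the API and cache it for `timeout` seconds."""
    key = make_key('search', query)
    info = cache.get(key)
    if info is None:
        info = fetch_from_api(query)  # only hit the API on a cache miss
        cache.set(key, info, timeout)
    return info
```

In a view you would pass `django.core.cache.cache` as the `cache` argument; the raw query string never has to be a valid memcached key, because only its digest is stored.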

You could also use django-newcache:

Newcache is an improved memcached cache backend for Django. It provides four major advantages over Django's built-in cache backend:

  • It supports pylibmc.
  • It allows for a function to be run on each key before it's sent to memcached.
  • It supports setting cache keys with infinite timeouts.
  • It mitigates the thundering herd problem.

It also has some pretty nice defaults: by default, the function that's run on each key is one that hashes, versions, and flavors the key.
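Django's built-in cache backends offer a similar hook through the `KEY_FUNCTION` setting, so you can get hash-every-key behaviour without switching backends. A sketch, assuming a hypothetical project module `myapp/cache_keys.py`:

```python
# myapp/cache_keys.py (hypothetical module path)
from hashlib import sha1

def hashed_key_func(key, key_prefix, version):
    """Custom key function for Django's KEY_FUNCTION hook: combine the
    prefix, version, and raw key, then hash the result so the final key
    is always memcached-safe regardless of what the raw key contains."""
    raw = '%s:%s:%s' % (key_prefix, version, key)
    return sha1(raw.encode('utf-8')).hexdigest()
```

It is wired up in `settings.py` by pointing the cache configuration at the dotted path, e.g. `'KEY_FUNCTION': 'myapp.cache_keys.hashed_key_func'` inside the backend's entry in `CACHES`. Because the version is hashed in, bumping a key's version still produces a distinct cache entry.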
