Cache SHA1 digest result?

I'm storing several versions of a file based on a digest of the original filename and its version, like this:

$filename = sha1($original . ':' . $version);

Would it be worth it to cache the digest ($filename) in memcache as a key/value pair (the key being the original filename plus version, and the value the sha1 hash), or is generating the digest quick enough for a high-traffic PHP web app?
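
The cached variant I'm considering would look roughly like this (just a sketch: I'm assuming the PECL Memcached extension here, and the server address, key prefix, and helper name are only for illustration):

$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

function versioned_filename(Memcached $memcached, $original, $version)
{
    $key = 'sha1:' . $original . ':' . $version;    // cache key: original + version
    $filename = $memcached->get($key);
    if ($filename === false) {                      // cache miss: compute and store
        $filename = sha1($original . ':' . $version);
        $memcached->set($key, $filename);
    }
    return $filename;
}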

Thanks,

Johnathan


You're much better off not caching the hashes. Computing 100,000 hashes on short filenames takes around 1/2 a second on my laptop (a reasonably fast Core 2 Duo):

    using System;
    using System.Diagnostics;
    using System.Linq;
    using System.Security.Cryptography;
    using System.Text;

    // Build 100 random short file names as UTF-16 byte arrays.
    byte[][] fileNames = Enumerable.Range(0, 100)
        .Select(i => new UnicodeEncoding().GetBytes(System.IO.Path.GetRandomFileName()))
        .ToArray();
    Stopwatch stopWatch = new Stopwatch();

    using (SHA1CryptoServiceProvider sha1 = new SHA1CryptoServiceProvider())
    {
        stopWatch.Start();
        // Hash each of the 100 names 1,000 times: 100,000 hashes in total.
        for (int j = 0; j < 1000; j++)
        {
            for (int i = 0; i < 100; i++)
            {
                sha1.ComputeHash(fileNames[i]);
            }
        }
        stopWatch.Stop();
        Console.WriteLine("Total: {0}", stopWatch.Elapsed);
        Console.WriteLine("Time per hash: {0}", new TimeSpan(stopWatch.ElapsedTicks / 100000));
    }

Total: 00:00:00.5186110
Time per hash: 00:00:00.0000014


Hashes are extremely fast, especially for small inputs (such as the name and version of a file).
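
If you want a rough sanity check in PHP itself (since that's where the hash will actually run), a small loop like the sketch below, using an arbitrary sample string, should show something on the order of a microsecond or less per hash on typical hardware:

    // Time 100,000 sha1() calls on short "name:version" strings.
    $start = microtime(true);
    for ($i = 0; $i < 100000; $i++) {
        sha1('original-filename.txt:' . ($i % 100));
    }
    $elapsed = microtime(true) - $start;
    printf("Total: %.4f s, per hash: %.2f µs\n", $elapsed, $elapsed / 100000 * 1e6);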

Now, if you were hashing the files themselves, and they were very large, that would be a different story, simply because it would take so long to read the entire file from disk.
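
For comparison, hashing file contents in PHP would look something like the sketch below (the path is a placeholder); for a big file the time goes into reading the data, not into the SHA1 computation:

    // One-shot: sha1_file() streams the file internally.
    $digest = sha1_file('/path/to/large-file.bin');

    // Streaming equivalent with the hash extension, reading 8 KB chunks.
    $ctx = hash_init('sha1');
    $fp  = fopen('/path/to/large-file.bin', 'rb');
    while (!feof($fp)) {
        $chunk = fread($fp, 8192);
        if ($chunk === false) {
            break;                 // read error: stop hashing
        }
        hash_update($ctx, $chunk);
    }
    fclose($fp);
    $digest = hash_final($ctx);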
