Came across this today in an app we are deploying across many servers. I was hashing some strings to store in a shared key/value store, and the .hash method of String returns different integers depending on the server. Any ideas why? Note that I am interested in why, not in possible workarounds.
Example:
server1 $ ruby -v
ruby 1.9.2p180 (2011-02-18 revision 30909) [x86_64-linux]
server1 $ irb
irb(main):001:0> "test".hash
=> 4146582576695053125
server2 $ ruby -v
ruby 1.9.2p180 (2011-02-18 revision 30909) [x86_64-linux]
server2 $ irb
irb(main):001:0> "test".hash
=> 3479379392688537032
These machines are EC2 instances with the same specs and build.
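The behavior is easy to reproduce on a single machine: within one Ruby process the hash of equal strings is stable, but each new process gets its own seed. A minimal sketch (shelling out to the `ruby` binary, which is assumed to be on the PATH):

```ruby
# Within one process, equal strings always hash alike:
a = "test".hash
b = "test".dup.hash
puts a == b   # true — the seed is fixed for the life of the process

# Across processes, the per-session seed makes the values differ
# (Ruby 1.9+; each subprocess picks its own seed at startup):
h1 = `ruby -e 'print "test".hash'`
h2 = `ruby -e 'print "test".hash'`
puts h1
puts h2
puts h1 == h2   # almost certainly false
```

This is the same effect as the two-server transcript above, just with two local processes instead of two EC2 instances.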
From a Ruby dev in the Ruby forum:
It is intended. Ruby 1.9 explicitly uses a session-local random seed to calculate the hash for strings (and some other objects).
This is also because the implementation of Object#hash differs between versions (like 1.9.1 and 1.9.2) and between implementations (like JRuby, Rubinius, IronRuby, and so on). We want people to write portable code around Object#hash, so we made the value deliberately unstable.
You should use Digest::SHA256 or some other digest routines when you want some hash value (message digest).
And follow-up from another dev:
Also, it helps to avoid some denial-of-service attacks, such as registering hundreds of thousands of users with usernames that all have the same hash code.
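Following the advice in the first reply, a cryptographic digest from the standard library gives a value that is stable across processes, machines, and Ruby versions, which is what a shared key/value store needs:

```ruby
require 'digest'

# Digest::SHA256 is deterministic: the same input always yields the
# same hex string, regardless of process, host, or Ruby version.
Digest::SHA256.hexdigest("test")
# => "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
```

If 64 hex characters is too long for a key, truncating the digest (e.g. the first 16 characters) is a common compromise, at the cost of a higher collision probability.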