Optimal bcrypt work factor

What would be an ideal bcrypt work factor for password hashing?

If I use a factor of 10, it takes approximately 0.1 s to hash a password on my laptop. If we end up with a very busy site, that turns into a good deal of work just checking people's passwords.

Perhaps it would be better to use a work factor of 7, reducing the total password-hashing work to about 0.01 s per login?

How do you decide the tradeoff between brute-force safety and operational cost?


Remember that the cost is stored in the password hash itself: $2a$(2-char cost)$(22-char salt)(31-char hash). It is not a fixed, system-wide value.
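
For illustration, here's a minimal sketch (with a hypothetical helper name) that reads the work factor back out of a stored hash using the layout above:

//Sketch: extract the work factor from a stored bcrypt hash.
//"$2a$10$<22-char salt><31-char hash>" splits on '$' into
//["", "2a", "10", "<salt + hash>"], so the cost is the third field.
Int32 GetCostFromHash(String storedHash)
{
    String[] parts = storedHash.Split('$');
    return Int32.Parse(parts[2]);
}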

If you find the load is too high, just make it so that the next time they log in, you re-hash to something faster to compute. Similarly, as time goes on and you get better servers, if load isn't an issue, you can upgrade the strength of their hash when they log in.
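
A minimal sketch of that rehash-on-login flow, assuming a BCrypt.Verify(password, hash) API alongside the BCrypt.HashPassword call used elsewhere on this page, plus the GetCostFromHash helper sketched above:

//Sketch: upgrade (or downgrade) the stored hash at login, the only
//moment the plaintext password is available to re-hash.
Boolean CheckPasswordAndRehash(String password, ref String storedHash, Int32 targetCost)
{
    if (!BCrypt.Verify(password, storedHash))
        return false; //wrong password; leave the stored hash untouched

    if (GetCostFromHash(storedHash) != targetCost)
        storedHash = BCrypt.HashPassword(password, targetCost);

    return true;
}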

The trick is to keep verification taking roughly the same amount of time forever into the future, tracking Moore's Law. The number is a base-2 logarithm, so every time computers double in speed, add 1 to the default cost.
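
In code, the relationship is simply (a trivial sketch):

//The cost is log2 of the round count, so +1 cost doubles the work.
Int64 Rounds(Int32 cost)
{
    return 1L << cost; //cost 10 -> 1,024 rounds; cost 11 -> 2,048
}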

Decide how long you want it to take to brute-force a user's password. For a common dictionary word, for instance, your account-creation flow probably already warned the user that their password was weak. If it's one of 1,000 common words, say, and it takes an attacker 0.1 s to test each, that buys them 100 s (well, some words are more common than others...). If a user chose a common dictionary word plus 2 digits, that's over two hours. If your password database is compromised, and the attacker can only get a few hundred passwords a day, you've bought most of your users hours or days to safely change their passwords. It's a matter of buying them time.
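
As a sanity check, here is that back-of-the-envelope arithmetic spelled out (the 0.1 s per guess and the 1,000-word dictionary are the assumptions from the paragraph above):

//Sketch: estimated time to exhaust each search space at 0.1 s/guess.
Double secondsPerGuess = 0.1;
Double dictionaryOnly = 1000 * secondsPerGuess;           //100 s
Double wordPlusTwoDigits = 1000 * 100 * secondsPerGuess;  //10,000 s
Console.WriteLine("{0} s; {1:F1} h", dictionaryOnly, wordPlusTwoDigits / 3600); //100 s; 2.8 h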

http://www.postgresql.org/docs/8.3/static/pgcrypto.html has some cracking-time figures for you to consider. Of course, the passwords listed there are random letters; dictionary words fall far faster. And practically speaking, you can't save the guy whose password is 12345.


Short Version

The cost factor that makes a single hash take at least 250 ms to compute

Long Version

When BCrypt was first published, in 1999, the authors listed their implementation's default cost factors:

  • normal user: 6
  • super user: 8

A bcrypt cost of 6 means 64 rounds (2^6 = 64).

They also note:

Of course, whatever cost people choose should be reevaluated from time to time

  • At the time of deployment in 1976, crypt could hash fewer than 4 passwords per second (250 ms per password).
  • In 1977, on a VAX-11/780, crypt could be evaluated about 3.6 times per second (277 ms per password).

That gives you a flavor of the kind of delays that the original implementers were considering when they wrote it:

  • ~250 ms for normal users
  • ~1 second for super users.

But, of course, the longer you can stand to wait, the better. Every BCrypt implementation I've seen uses 10 as the default cost, and my implementation used that too. I believe it is time for me to increase the default cost to 12.

We've decided we want to target no less than 250ms per hash.

My desktop PC is an Intel Core i7-2700K CPU @ 3.50 GHz. I originally benchmarked a BCrypt implementation on 1/23/2014:

| Cost | Iterations        |    Duration |
|------|-------------------|-------------|
|  8   |    256 iterations |     38.2 ms | <-- minimum allowed by this implementation
|  9   |    512 iterations |     74.8 ms |
| 10   |  1,024 iterations |    152.4 ms | <-- current default (BCRYPT_COST=10)
| 11   |  2,048 iterations |    296.6 ms |
| 12   |  4,096 iterations |    594.3 ms |
| 13   |  8,192 iterations |  1,169.5 ms |
| 14   | 16,384 iterations |  2,338.8 ms |
| 15   | 32,768 iterations |  4,656.0 ms |
| 16   | 65,536 iterations |  9,302.2 ms |
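
If you want to reproduce this table on your own hardware, a sketch along these lines works (it assumes the same BCrypt.HashPassword(password, cost) API used later in this answer; absolute timings will differ):

//Sketch: time one hash at each cost factor and print the duration.
var sw = new System.Diagnostics.Stopwatch();
for (Int32 cost = 8; cost <= 16; cost++)
{
    sw.Restart();
    BCrypt.HashPassword("benchmark", cost);
    sw.Stop();
    Console.WriteLine("Cost {0,2}: {1,9:F1} ms", cost, sw.Elapsed.TotalMilliseconds);
}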

Future Proofing

Rather than having a fixed constant, it should be a fixed minimum.

Rather than having your password hash function be:

String HashPassword(String password)
{
   return BCrypt.HashPassword(password, BCRYPT_DEFAULT_COST);
}

it should be something like:

String HashPassword(String password)
{  
   /*
     Rather than using a fixed default cost, run a micro-benchmark
     to figure out how fast the CPU is.
     Use that to make sure that it takes **at least** 250ms to calculate
     the hash
   */
   Int32 costFactor = this.CalculateIdealCost();
   //Never use a cost lower than the default hard-coded cost
   if (costFactor < BCRYPT_DEFAULT_COST) 
      costFactor = BCRYPT_DEFAULT_COST;

   return BCrypt.HashPassword(password, costFactor);
}

Int32 CalculateIdealCost()
{
    //Benchmark using a cost of 5 (the second-lowest allowed)
    Int32 cost = 5;

    var sw = new Stopwatch();
    sw.Start();
    BCrypt.HashPassword("microbenchmark", cost);
    sw.Stop();

    Double durationMS = sw.Elapsed.TotalMilliseconds;

    //Increasing cost by 1 would double the run time.
    //Keep increasing cost until the estimated duration is over 250 ms
    while (durationMS < 250)
    {
       cost += 1;
       durationMS *= 2;
    }

    return cost;
}

And ideally this would be part of everyone's BCrypt library, so rather than relying on users of the library to periodically increase the cost, the cost periodically increases itself.


The question asked about the optimal, practical determination of the cost factor for bcrypt password hashes.

On a system where you're calculating user password hashes for a service whose user population you expect to grow over time, why not make the length of time a user has held an account the determining factor, perhaps with their login frequency folded into that determination?

Bcrypt cost factor = 6 + (number of years of membership, or some factor thereof), with an optional ceiling on the total cost, perhaps further modified by that user's login frequency.
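
A sketch of that scheme (the names and the ceiling value are illustrative, not prescriptive):

//Sketch: cost grows with account age, clamped to a ceiling so the
//scheme itself can't drive the cost up without bound (see the
//denial-of-service caveat below).
const Int32 BaseCost = 6;
const Int32 CostCeiling = 14; //illustrative ceiling

Int32 CostForUser(DateTime accountCreatedUtc)
{
    Int32 years = (Int32)((DateTime.UtcNow - accountCreatedUtc).TotalDays / 365);
    return Math.Min(BaseCost + years, CostCeiling);
}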

Keep in mind, though, that with such a system, or ANY system for determining the cost factor, you must consider that the cost of calculating the hash can be turned into a denial-of-service attack against the server itself: an attacker who can exploit whatever drives the cost factor up can amplify their attack.
