**Bits of Entropy:**

In cryptography, key size, measured in bits, determines the number of keys (2^n) that must be tested to break an encrypted message by brute force. But encryption algorithms are generally “lossy” because, being deterministic, they have some degree of predictability and/or collisions where different keys produce the same output values. Some algorithms are “lossier” than others, and there are many statements in the literature to the effect that a 1024-bit key using one algorithm is just as secure as an 80-bit key using another. That means one algorithm is 2^944, or about 1.49×10^284, times more secure than the other. Despite common misuse in advertising claims, bits are the "standard" unit of entropy. In this writing, they represent a mathematically derived number of equally likely combinations, expressed as a power of 2, not the computer space used to store a value generated with a PRNG.
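To make the arithmetic concrete, here is a short Python sketch (standard library only) that computes entropy as the base-2 logarithm of the number of equally likely combinations:

```python
import math

# Entropy in bits = log2(number of equally likely combinations).
def entropy_bits(combinations):
    return math.log2(combinations)

# The gap between a 1024-bit and an 80-bit key space: 2^944 possibilities.
ratio = 2.0 ** (1024 - 80)
print(f"2^944 ~= {ratio:.3g}")              # about 1.49e+284

# Entropy of a truly random 5-character password drawn from 95 characters:
print(f"{entropy_bits(95 ** 5):.1f} bits")  # about 32.8 bits
```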

**Dirty Secrets of Pseudo-Random Number Generators:**

Given the same initial input, a PRNG will not only produce the same output value, but will produce the same *sequence* of values every time, and that sequence is generally your automatically generated cryptographic key or password! The "moving parts" inside most PRNGs are 64-bit and 32-bit registers, and this limits their entropy. But PRNGs are often more limited by the precision of their input, or seed. Since the few somewhat-random processes in the computer tend to produce clusters of values, they are unsuitable as random-number seeds. The only part of the computer that produces an even dispersion of values is the system clock, which measures milliseconds.
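A minimal demonstration of that determinism, using Python's standard `random` module (a Mersenne Twister; the principle applies to any PRNG):

```python
import random

# Two generators seeded with the same value produce the identical sequence.
a = random.Random(12345)
b = random.Random(12345)

seq_a = [a.randrange(95) for _ in range(8)]
seq_b = [b.randrange(95) for _ in range(8)]
assert seq_a == seq_b  # same seed, same "random" password characters

# A different seed yields a different sequence.
c = random.Random(12346)
seq_c = [c.randrange(95) for _ in range(8)]
assert seq_c != seq_a
```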

There are only 8.64×10^7 milliseconds in a day. That's about 26 bits of entropy. Using the 95 characters on a US-keyboard, a 5-character truly-random password is more secure! So why not use a longer time frame? There are 3.1×10^11 milliseconds in 10 years, which is only 38 bits of entropy – and who has a password older than 10 years? Most people generate their keys at work, between the hours of 9-5 M-F, which is only one fourth of the hours in the week, so you might be losing about 2 bits of entropy there.
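The figures above can be checked with a few lines of Python:

```python
import math

MS_PER_DAY = 24 * 60 * 60 * 1000       # 86,400,000 milliseconds
MS_PER_10_YEARS = MS_PER_DAY * 3653    # roughly 10 years, including leap days

print(f"one day:      {math.log2(MS_PER_DAY):.1f} bits")        # ~26.4 bits
print(f"ten years:    {math.log2(MS_PER_10_YEARS):.1f} bits")   # ~38.2 bits
print(f"5 of 95 chars: {math.log2(95 ** 5):.1f} bits")          # ~32.8 bits
```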

So even if you have a 64-bit random number generator, unless you can get a seed with more than 36 bits of entropy, you only get 36 bits of entropy in your key. That's roughly the entropy of a truly random 6-character key (95^6). How many bits of randomness does any computer-generated encryption key contain? No more than the number of possible seed values times the number of PRNG algorithms that could have been used.
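A sketch of why this matters, using a hypothetical `make_password` function (an assumption for illustration, not any real product's code) that seeds the PRNG with a millisecond timestamp: an attacker who knows roughly when the password was generated only needs to search the surrounding time window, not the full key space.

```python
import random

CHARS = [chr(c) for c in range(32, 127)]  # the 95 printable US-keyboard characters

def make_password(seed_ms, length=8):
    """Hypothetical generator that seeds a PRNG with a millisecond timestamp."""
    rng = random.Random(seed_ms)
    return "".join(rng.choice(CHARS) for _ in range(length))

# Victim generates a password at some unknown millisecond within a known minute.
secret_ms = 1_700_000_000_123
password = make_password(secret_ms)

# An attacker who knows the minute has only 60,000 seeds to try: about 16 bits.
window_start = 1_700_000_000_000
found = None
for ms in range(window_start, window_start + 60_000):
    if make_password(ms) == password:
        found = ms
        print(f"cracked: seed {ms}")
        break

assert found is not None and make_password(found) == password
```

An 8-character password looks like 52 bits of entropy, but here the real search space is the seed window, not the character space.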

**Links:**

- How We Learned to Cheat at Online Poker: A Study in Software Security - If you liked my article here, you will probably like this article even more. One of my favorite reads on the web!
- The Memorability and Security of Passwords - Some Empirical Results - Using initial letters and punctuation from pass-phrases is as easy to remember as the most commonly hacked English words, but is as hard to crack as truly random characters... until cracking tools learn what to look for. In the meantime, this is the best advice I've seen on telling an end-user how to construct a secure password.
- Entropy Gathering Daemon - A great way to seed a random-number generator. The only complaint I've heard is that EGD can block, waiting for more entropy if you use it heavily in software such as a web server.
- KeePass password store - has an entropy-gatherer built-in for generating random passwords... Nice!
- RANDU - Wikipedia article about an infamous random number generator.
- Humor from xkcd: Random Number, Code Talkers