Glossary: Entropy

Entropy (in information theory) is the amount of information or uncertainty in a piece of data, usually measured in bits. In general, a value chosen uniformly at random from N possibilities has log2(N) bits of entropy. For example, a 3-digit PIN or combination lock has 1,000 possible values and so has almost 10 bits of entropy, since 10 bits can represent 1,024 distinct values. The entropy of a password or encryption key can be used directly to estimate how long it might take to crack by brute force, that is, by trying every possible value.
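As a rough illustration, here is a minimal Python sketch of that calculation. The function names (bits_of_entropy, brute_force_seconds) and the assumed guessing rate are hypothetical, chosen only to show how an entropy figure maps to an exhaustive-search estimate.

    import math

    def bits_of_entropy(num_possible_values: int) -> float:
        """Entropy in bits of a value chosen uniformly from num_possible_values."""
        return math.log2(num_possible_values)

    def brute_force_seconds(bits: float, guesses_per_second: float) -> float:
        """Worst-case time to try every possible value at the given guessing rate."""
        return (2 ** bits) / guesses_per_second

    # A 3-digit PIN: 1,000 possible values -> just under 10 bits.
    print(f"3-digit PIN: {bits_of_entropy(1_000):.2f} bits")

    # A 128-bit key, at an assumed rate of one billion guesses per second.
    print(f"128-bit key: {brute_force_seconds(128, 1e9):.3e} seconds to exhaust")

Doubling the guessing rate only halves the estimate, while each extra bit of entropy doubles it, which is why entropy is the more useful figure when comparing keys or passwords.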