Glossary:Entropy

Entropy (in information theory) is the amount of information or uncertainty in a piece of data, usually measured in bits. For a secret with N equally likely values, the entropy is log2(N) bits. For example, a 3-digit PIN or combination lock has 1,000 possible values and therefore almost 10 bits of entropy, since 10 bits can distinguish 1,024 values (2^10 = 1,024), while log2(1,000) is about 9.97. The entropy of a password or encryption key can be used directly to estimate how long it would take to crack by trying every possible value.
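As a rough sketch (not part of the original entry), the calculation can be written in a few lines of Python: entropy is log2 of the number of equally likely values, and 2^entropy bounds the number of guesses an attacker who tries everything would need. The guess rate below is an assumed figure purely for illustration.

```python
import math

def entropy_bits(possible_values: int) -> float:
    # Entropy in bits of a uniformly random secret with this many values.
    return math.log2(possible_values)

# A 3-digit PIN: 10^3 = 1,000 possibilities -> about 9.97 bits.
pin_entropy = entropy_bits(10 ** 3)
print(f"3-digit PIN: {pin_entropy:.2f} bits of entropy")

# Worst-case crack-time estimate: 2^entropy guesses at an assumed rate.
guesses_per_second = 1_000  # hypothetical attacker speed, not from the entry
worst_case_seconds = 2 ** pin_entropy / guesses_per_second
print(f"Worst case at {guesses_per_second} guesses/s: {worst_case_seconds:.1f} s")
```

The same estimate applies to any key or password whose entropy is known: doubling the entropy by one bit doubles the worst-case number of guesses.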