Glossary:Entropy: Revision history


    19 February 2024

    • curprev 21:39, 19 February 2024 Philip talk contribs 438 bytes +438 Created page with "Entropy (in information theory) is the amount of information or uncertainty in a piece of data, usually measured in bits. For example, a 3-digit PIN or combination lock has 1,000 possible values and so has almost 10 bits of entropy, since 10 bits can count up to 1,024 values. The entropy of a password or encryption key can be used directly to estimate how long it might take to crack by trying all possible values."
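The relationship described in the definition — entropy in bits as the base-2 logarithm of the number of equally likely values — can be sketched as follows (a minimal illustration; the function name is ours, not from the glossary):

```python
import math

def entropy_bits(num_possible_values: int) -> float:
    """Entropy in bits of a uniformly random choice among
    num_possible_values equally likely values: log2(N)."""
    return math.log2(num_possible_values)

# A 3-digit PIN has 10^3 = 1,000 possible values: just under 10 bits,
# since 2^10 = 1,024.
print(round(entropy_bits(10 ** 3), 2))  # 9.97
print(entropy_bits(1024))               # 10.0
```

A brute-force attacker trying all values of an n-bit secret needs at most 2^n guesses, which is why entropy translates directly into an estimate of cracking effort.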