Information entropy (a.k.a. Shannon entropy) describes the disorder of a system. Consider a coin-flipping system, where there are two possible outcomes: heads or tails. We can represent this system with 1 bit of information (e.g. “0” for heads and “1” for tails), and each outcome has a 50% chance. But what if the coin’s density were not uniform? What if the coin were biased? Then we could no longer talk about a 50% chance for each side, because one side would be more likely to come up. The bias decreases the disorder of the system, since the favored side becomes more predictable. As a result, the entropy of the biased-coin system decreases (e.g. the entropy of a coin that always lands tails is zero).
Mathematical Representation of Shannon Entropy
Self-Information of an Event:
I(x) = -log P(x) = log(1 / P(x))
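To make this concrete, here is a minimal Python sketch of self-information (the function name `self_information` is my own choice, not from the article), using base-2 logarithms so the result is in bits:

```python
import math

def self_information(p):
    """I(x) = -log2 P(x): the 'surprise', in bits, of an event with probability p."""
    return -math.log2(p)

print(self_information(0.5))     # a fair coin flip carries 1 bit
print(self_information(1 / 12))  # picking one specific month: ~3.58 bits
```

The rarer the event, the larger its self-information: an event with probability 1 carries 0 bits of surprise.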
Shannon Entropy General Formula:
H(X) = -Σ P(x) log P(x)
Binary (“0” and “1”):
H(p) = -p log2(p) - (1 - p) log2(1 - p)
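As a quick sanity check on the coin example, here is a minimal Python sketch of the general formula (the function name `entropy` is my own choice, not from the article):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    # Terms with p == 0 contribute nothing (lim p*log p -> 0), so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1 bit, maximum disorder
print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, less disorder
print(entropy([1.0]))       # a coin that always lands tails: zero entropy
```

The numbers match the intuition above: the more biased the coin, the lower the entropy.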
To make the definition more precise in computer-science terms, entropy is the minimum average number of bits needed to encode a message. Let’s take a look at an example:
How many bits do we need to encode the twelve months?
log2 12 ≈ 3.58. The fractional 0.58 cannot be represented on its own (computers operate on discrete bits), so we round up and need 4 bits for the encoding.
January (0000), February (0001), March (0010), April (0011), May (0100), June (0101), July (0110), August (0111), September (1000), October (1001), November (1010), December (1011).
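The month table above can be generated and verified programmatically; a small sketch (the generation code is my own, not from the article):

```python
import math

# Encoding 12 equally likely months with a fixed-length binary code
# needs ceil(log2(12)) = 4 bits per month.
print(math.log2(12))             # ~3.58 bits of entropy
print(math.ceil(math.log2(12)))  # 4 bits in practice

months = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]
# Map each month to its index written as a 4-bit binary string.
codes = {m: format(i, "04b") for i, m in enumerate(months)}
print(codes["January"], codes["December"])  # 0000 1011
```

The gap between 3.58 and 4 is the overhead of a fixed-length code; variable-length codes (e.g. Huffman coding) can get closer to the entropy on average.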
Entropy & Uncertainty
Entropy is often used together with the term “uncertainty” in data security. Uncertainty measures how much an attacker does not know about a message: how many bits of information would have to be revealed before the message is exposed.
Let’s store female (str) and male (str) in our database, and let our encrypted message be “kwét”. If the attacker learns a single piece of the plaintext, the whole secret is exposed: e.g. if the attacker finds out that the first character is “m”, then the secret message must be “male”. Since there are only two possible messages, the uncertainty of this situation is 1 bit, and revealing that one bit brings the secret message into the open.
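The scenario above can be sketched in a few lines of Python, assuming (as the article implies) that both plaintexts are equally likely:

```python
import math

# Two equally likely plaintexts carry exactly 1 bit of uncertainty.
messages = ["male", "female"]
p = 1 / len(messages)
uncertainty = -sum(p * math.log2(p) for _ in messages)
print(uncertainty)  # 1.0 bit

# Learning the first letter resolves that single bit completely:
first_letter = "m"
remaining = [m for m in messages if m.startswith(first_letter)]
print(remaining)  # only 'male' is left, no uncertainty remains
```

This is why the article says “any known 1 bit can bring the secret message into the open”: with only two candidates, one bit of leakage is all the uncertainty there is.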
Resources:
- https://towardsdatascience.com/the-intuition-behind-shannons-entropy-e74820fe9800
- https://medium.com/activating-robotic-minds/demystifying-entropy-f2c3221e2550
- http://bilgisayarkavramlari.sadievrenseker.com/2008/12/17/entropi-entropy/