Information Entropy (Simplified)

Information entropy (a.k.a. Shannon entropy) describes the disorder of a system. Consider a coin-flipping system, where there are two possible outcomes: heads or tails. Hence, we may describe the system with 1 bit of information (e.g. "0" for heads and "1" for tails), and there is a 50% chance for each outcome.
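The fair-coin case can be checked with a small computation of Shannon entropy, H = -Σ p·log2(p). This is a minimal sketch; the function name and probability list are illustrative choices, not from the original post:

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), skipping zero-probability outcomes
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: two outcomes, each with probability 0.5
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit, matching the 1-bit description

# A biased coin carries less entropy (less disorder):
print(shannon_entropy([0.9, 0.1]) < 1.0)  # True
```

As expected, the fair coin yields exactly 1 bit: each flip resolves one binary question.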