Shannon Entropy
H(X) = -Σ p(x) log₂ p(x), where the sum runs over all outcomes x. Shannon entropy measures the average information content, or uncertainty, of a random variable in bits (using the base-2 logarithm). It is maximized by the uniform distribution: for n equally likely outcomes, H(X) = log₂ n, so a fair coin flip carries exactly 1 bit of entropy. By the source coding theorem, Shannon entropy is the theoretical lower bound on the average number of bits per symbol achievable by any lossless compression scheme.
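
A minimal sketch in Python of this formula, assuming the distribution is supplied as a list of probabilities (the function name and example values are illustrative, not from the original text):

```python
import math

def shannon_entropy(probs):
    """Return H(X) = -sum(p * log2(p)) in bits for a probability distribution."""
    # Terms with p == 0 contribute nothing, since p * log2(p) -> 0 as p -> 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin is less uncertain
print(shannon_entropy([0.25] * 4))   # 2.0 bits: uniform over 4 outcomes (log2 4)
```

Note how the uniform distribution over four outcomes attains the maximum log₂ 4 = 2 bits, while skewing the probabilities lowers the entropy.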