Information Content

Back to Entropy

The information content (self-information) of a single event x with probability p(x) is I(x) = -log2(p(x)) bits. Rare events carry more information than common ones: a fair coin flip conveys 1 bit, while an event with probability 1/8 conveys 3 bits. Shannon entropy is the expected value of information content across all events, H(X) = E[I(X)] = -Σ p(x) log2 p(x). This quantification of “surprise” is the foundation of information theory.
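A minimal sketch in Python (the language and helper names are my choice, not from the note), assuming a distribution is given as a dict mapping outcomes to probabilities:

```python
import math

def information_content(p: float) -> float:
    """Self-information I(x) = -log2(p(x)) in bits; rarer events score higher."""
    return -math.log2(p)

def shannon_entropy(dist: dict) -> float:
    """Entropy H(X) = E[I(X)] = -sum p(x) log2 p(x), skipping zero-probability events."""
    return sum(p * information_content(p) for p in dist.values() if p > 0)

# A fair coin: each outcome carries 1 bit of surprise, so H = 1 bit.
fair_coin = {"heads": 0.5, "tails": 0.5}
print(information_content(fair_coin["heads"]))  # 1.0
print(shannon_entropy(fair_coin))               # 1.0

# A biased coin: the rare outcome is very surprising (~3.32 bits),
# but the expected surprise (entropy) is lower than the fair coin's.
biased_coin = {"heads": 0.9, "tails": 0.1}
print(information_content(biased_coin["tails"]))  # ~3.32 bits
print(shannon_entropy(biased_coin))               # ~0.469 bits
```

The biased coin illustrates the distinction: a single rare event carries more information than any fair-coin flip, yet the distribution as a whole is more predictable, so its entropy is lower.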

mathematics-for-cs information-theory entropy information-content