Software Engineering KB


Entropy

Feb 10, 2026 · 1 min read

  • mathematics-for-cs
  • information-theory
  • entropy


Entropy is a measure of the uncertainty, or average information content, of a random variable. Shannon entropy, H(X) = −Σ p(x) log₂ p(x), quantifies the minimum average number of bits needed to encode messages from a source, forming the foundation of data compression.
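The definition above can be sketched directly from the formula. The function name and the example distributions below are illustrative, not part of the original note:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)) over nonzero p.
    Terms with p == 0 contribute nothing, by the convention 0*log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty per flip.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so its entropy is lower than 1 bit.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # → 0.0
```

Entropy peaks for the uniform distribution and drops to zero when one outcome is certain, which matches the intuition that predictable sources compress well.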

Key Properties

  • Shannon Entropy
  • Information Content


