Retrieving "Random Variable" from the archives
Cross-reference notes under review
While the archivists retrieve your requested volume, browse these clippings from nearby entries.
-
Information Theory
Linked via "random variable"
Entropy
The central concept in Information Theory is entropy ($H$), which quantifies the uncertainty or randomness associated with a random variable. For a discrete random variable $X$ with possible outcomes $\{x_1, x_2, \ldots, x_n\}$ and associated probabilities $P(x_i)$, the Shannon entropy is defined as:
$$
H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i)
$$
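As a minimal sketch, the definition above translates directly into code: sum $-P(x_i)\log_2 P(x_i)$ over the outcomes, skipping zero-probability terms since $\lim_{p \to 0} -p \log_2 p = 0$. The function name `shannon_entropy` is illustrative, not from any particular library.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) in bits for a discrete distribution."""
    # Zero-probability outcomes contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A certain outcome carries no uncertainty: 0 bits.
print(shannon_entropy([1.0]))        # 0.0
```

Using $\log_2$ gives entropy in bits; swapping in the natural logarithm would give nats instead.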