A Set of Shannon Entropies
Shannon Entropy

For a discrete random variable $X$ with events $\{ x_1, \dots, x_n \}$ and probability mass function $P(X)$, we define the Shannon entropy $H(X)$ as $$H(X) = E[-\log_b P(X)] = - \sum_{i = 1}^{n} P(x_i) \log_b P(x_i)$$ where $b$ is the base of the logarithm. The unit of Shannon entropy is the bit for $b = 2$ and the nat for $b = e$.

The Perspective of the Venn Diagram

We can illustrate the relations between joint entropy, conditional entropy, and mutual information as in the following figure.

[Figure: Venn diagram relating $H(X)$, $H(Y)$, $H(X, Y)$, $H(X \mid Y)$, $H(Y \mid X)$, and $I(X; Y)$.]
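Since the figure itself is not reproduced here, the relations it depicts can be stated directly. These are the standard identities that the Venn diagram encodes, with $I(X; Y)$ denoting the mutual information: the two circles are $H(X)$ and $H(Y)$, their union is the joint entropy $H(X, Y)$, their overlap is $I(X; Y)$, and the non-overlapping parts are the conditional entropies.
$$H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y)$$
$$I(X; Y) = H(X) + H(Y) - H(X, Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)$$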
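To make the definitions concrete, here is a minimal numerical sketch. It is not from the original note: the helper `entropy`, the joint pmf `p_xy`, and the 2x2 alphabet are illustrative choices, assuming NumPy is available. It computes $H(X)$, $H(Y)$, and $H(X, Y)$ from a small joint distribution and then recovers $I(X; Y)$ and $H(X \mid Y)$ via the identities above.

```python
import numpy as np

def entropy(p, b=2.0):
    """Shannon entropy -sum p_i log_b p_i; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p)) / np.log(b)

# Illustrative joint pmf P(X, Y) over a 2x2 alphabet (rows: x, columns: y).
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

H_xy = entropy(p_xy.ravel())       # joint entropy H(X, Y)
H_x = entropy(p_xy.sum(axis=1))    # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))    # marginal entropy H(Y)

I_xy = H_x + H_y - H_xy            # mutual information I(X; Y)
H_x_given_y = H_xy - H_y           # conditional entropy H(X | Y)

print(f"H(X) = {H_x:.4f} bits, H(Y) = {H_y:.4f} bits")
print(f"H(X, Y) = {H_xy:.4f} bits, H(X | Y) = {H_x_given_y:.4f} bits")
print(f"I(X; Y) = {I_xy:.4f} bits")
```

With $b = 2$ the results are in bits; passing `b=np.e` would give nats, matching the unit convention stated above.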