# Chapter 4 Entropy

## 4.1 Definition

Let $$X$$ be a discrete random variable and $$P_X(x)$$ its probability mass function (pmf). The entropy $$H(X)$$ can be interpreted as a measure of the uncertainty of $$X$$ and is defined in the discrete case as follows: $$H(X) = -\sum_{x \in X}{P_X(x)\log{P_X(x)}}. \label{eq:H}$$

If the $$\log$$ is taken to base two, then the unit of $$H$$ is the bit (binary digit). We employ the natural logarithm, which means the unit is the nat (natural unit of information).
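As an illustrative sketch, the definition above can be evaluated directly in Python, whose `math.log` is the natural logarithm, so the result is in nats (the function name `entropy` and the example distributions are our own choices, not part of the text). Terms with $$P_X(x) = 0$$ are skipped, following the usual convention $$0 \log 0 = 0$$.

```python
import math

def entropy(pmf):
    """Entropy H(X) = -sum_x P_X(x) ln P_X(x), in nats.

    `pmf` is a sequence of probabilities summing to one; zero-probability
    outcomes are skipped (convention: 0 * log 0 = 0).
    """
    return -sum(p * math.log(p) for p in pmf if p > 0)

# A fair coin: H = ln 2 nats (i.e., 1 bit).
print(entropy([0.5, 0.5]))

# A deterministic outcome carries no uncertainty: H = 0.
print(entropy([1.0, 0.0]))
```

Dividing the result by $$\log 2$$ (i.e., using `math.log(p, 2)`) converts from nats to bits.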