Chapter 4 Entropy
4.1 Definition
Let $X$ be a random variable with probability distribution $P_X(x)$. The entropy $H(X)$ can be interpreted as a measure of the uncertainty of $X$ and is defined in the discrete case as follows:
$$H(X) = -\sum_{x \in \mathcal{X}} P_X(x) \log P_X(x).$$
If the logarithm is taken to base two, then the unit of $H$ is the bit (binary digit). We employ the natural logarithm, which implies the unit is the nat (natural unit of information).
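To make the definition concrete, the following minimal sketch (not part of the original text) computes the entropy of a discrete distribution given as a probability vector; the function name `entropy` and the `base` parameter are illustrative choices, with the natural logarithm as the default so that the result is in nats.

```python
import numpy as np

def entropy(p, base=np.e):
    """Entropy of a discrete distribution p (probabilities summing to 1).

    base=np.e gives nats (natural logarithm); base=2 gives bits.
    Terms with P_X(x) = 0 contribute 0, by the convention 0 log 0 = 0.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability outcomes
    return -np.sum(p * np.log(p)) / np.log(base)

# Example: a fair coin has entropy log 2 ≈ 0.693 nats, i.e. exactly 1 bit.
print(entropy([0.5, 0.5]))           # ≈ 0.6931 nats
print(entropy([0.5, 0.5], base=2))   # 1.0 bit
```

The fair-coin example illustrates the unit conventions above: the same uncertainty is $\log 2 \approx 0.693$ nats or exactly $1$ bit, differing only by the base of the logarithm.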