Chapter 3 Entropy

3.1 Definition

Let \(X\) be a discrete random variable taking values in an alphabet \(\mathcal{X}\), and let \(P_X(x)\) be its probability mass function (pmf). The entropy \(H(X)\) is a measure of the uncertainty of \(X\) and is defined in the discrete case as follows: \[\begin{equation} H(X) = -\sum_{x \in \mathcal{X}}{P_X(x)\log{P_X(x)}}. \label{eq:H} \end{equation}\]

If the \(\log\) is taken to base two, the unit of \(H\) is the bit (binary digit). We employ the natural logarithm, so the unit is the nat (natural unit of information).
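As an illustrative sketch (not part of the original text), the discrete entropy in \eqref{eq:H} can be computed directly from a probability vector; the function name `entropy` and the Python implementation here are our own choices, using the natural logarithm so the result is in nats:

```python
import math

def entropy(probs):
    """Shannon entropy of a discrete distribution, in nats.

    Terms with zero probability are skipped, following the
    convention 0 * log(0) = 0.
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# A fair coin has entropy ln(2) ≈ 0.6931 nats (i.e. exactly 1 bit).
print(entropy([0.5, 0.5]))
```

Dividing the result by \(\log 2\) converts nats to bits, since \(\log_2 p = \log p / \log 2\).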

3.2 Efficiency and Bubbles: A Case Study in the Crypto and Equity Markets