Chapter 4 Entropy

4.1 Definition

Let \(X\) be a discrete random variable and \(P_X(x)\) its probability mass function (pmf). The entropy \(H(X)\) can be interpreted as a measure of the uncertainty of \(X\) and is defined in the discrete case as follows: \[\begin{equation} H(X) = -\sum_{x \in X}{P_X(x)\log{P_X(x)}}. \label{eq:H} \end{equation}\]

If the \(\log\) is taken to base two, then the unit of \(H\) is the bit (binary digit). We employ the natural logarithm, which implies that the unit is the nat (natural unit of information).
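As a minimal sketch of the definition above, the following Python snippet computes \(H(X)\) in nats for a discrete distribution given as a list of probabilities (the function name `entropy` is illustrative, not part of the text):

```python
import math

def entropy(probs):
    """Shannon entropy in nats (natural logarithm) of a discrete pmf.

    Terms with zero probability are skipped, following the convention
    0 * log(0) = 0.
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over four outcomes: H = ln(4) nats
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 1.3862943611198906
```

Using base-two logarithms instead (`math.log(p, 2)`) would return the same quantity in bits; the uniform example would then give \(H = 2\) bits.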

4.2 Nonlinear Coupling

4.2.1 Simulated Systems

4.2.2 Equity-Commodities Relationship

4.3 Efficiency and Bubbles: A Case Study in the Crypto and Equity Markets