Chapter 4 Entropy

4.1 Definition

Let X be a random variable and P_X(x) be its probability density function (pdf). The entropy H(X) can be interpreted as a measure of the uncertainty of X and is defined in the discrete case as follows:

H(X) = -\sum_{x \in \mathcal{X}} P_X(x) \log P_X(x).

If the log is taken to base two, then the unit of H is the bit (binary digit). We employ the natural logarithm, which implies the unit is the nat (natural unit of information).
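The discrete entropy above can be sketched in a few lines of Python; this is an illustrative helper (the function name `entropy` and the example distribution are not from the text), using the natural logarithm so the result is in nats, and adopting the standard convention that terms with P_X(x) = 0 contribute zero:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log p(x), in nats."""
    # Zero-probability outcomes contribute 0 by the convention lim p->0 p log p = 0.
    return -sum(p * math.log(p) for p in probs if p > 0)

# A fair coin is maximally uncertain among two-outcome variables:
# H = log 2 ≈ 0.693 nats (equivalently 1 bit).
print(entropy([0.5, 0.5]))
```

Dividing the result by log 2 converts nats to bits, so the same routine covers both unit conventions.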

4.2 Nonlinear Coupling

4.2.1 Simulated Systems

4.2.2 Equity-Commodities Relationship

4.3 Efficiency and Bubbles: A Case Study in the Crypto and Equity Markets