Chaos, Lyapunov, and entropy increase

(Sethna, "Entropy, Order Parameters, and Complexity", ex. 5.9)

© 2017, James Sethna, all rights reserved. This exercise was developed in collaboration with Christopher Myers.

Chaotic dynamical systems have sensitive dependence on initial conditions. This is commonly described as the 'butterfly effect' (a phrase due to Lorenz, of the Lorenz attractor): the effects of the flap of a butterfly's wings in Brazil build up with time until, months later, they could launch a tornado in Texas. In this exercise, we will see this sensitive dependence for a particular system (the logistic map) and measure the sensitivity by defining the Lyapunov exponents.

Import packages

In [ ]:
%pylab inline
from scipy import *

The logistic map takes the interval $(0,1)$ into itself: \begin{equation} f(x) = 4 \mu x (1-x), \end{equation} where the time evolution is given by iterating the map: \begin{equation} x_0, x_1, x_2, \ldots = x_0, f(x_0), f(f(x_0)), \ldots . \end{equation} In particular, for $\mu=1$ it precisely folds the unit interval in half, and stretches it (non-uniformly) to cover the original domain.

In [ ]:
def f(x, mu):
    """Logistic map f(x) = 4 mu x (1-x), which folds the unit interval (0,1)
    into itself."""
    return 4 * mu * x * (1 - x)

The mathematics community lumps together continuous dynamical evolution laws and discrete mappings as both being dynamical systems. (The Poincaré section, as described in the exercise 'Jupiter', takes a continuous, recirculating dynamical system and replaces it with a once-return map, providing the standard motivation for treating maps and continuous evolution laws together. This motivation does not directly apply here: the logistic map is not invertible, so it cannot arise as the Poincaré section of a smooth differential equation. Remember the existence and uniqueness theorems from math class? Invertibility follows from uniqueness.) The general stretching and folding exhibited by our map is often seen in driven physical systems without conservation laws.

In this exercise, we will focus on values of $\mu$ near one, where the motion is mostly chaotic. Chaos is sometimes defined as motion where the final position depends sensitively on the initial conditions. Two trajectories, starting a distance $\epsilon$ apart, will typically drift apart in time as $\epsilon e^{\lambda t}$, where $\lambda$ is the Lyapunov exponent for the chaotic dynamics.

Start with $\mu = 0.9$ and two nearby points $x_0$ and $y_0=x_0+\epsilon$ somewhere between zero and one. Investigate the two trajectories $x_0, f(x_0), f(f(x_0)), ..., f^{[n]}(x_0)$ and $y_0, f(y_0), \dots$. How fast do they separate? Why do they stop separating? Estimate the Lyapunov exponent. (Hint: $\epsilon$ can be a few times the precision of the machine (around $10^{-17}$ for double-precision arithmetic), so long as you are not near the maximum value of $f$ at $x_0 = 0.5$.)

In [ ]:
mu = 0.9
x = 0.4
eps = 3.e-17
y = x + eps
fiter1 = [x]
fiter2 = [y]
for i in range(200):
    x = f(x, mu)
    y = f(y, mu)
    fiter1.append(x)
    fiter2.append(y)
fiter1 = array(fiter1)
fiter2 = array(fiter2)
diff = fiter1 - fiter2
diffPlot = semilogy(fabs(diff))
In [ ]:
ts = arange(200)
lyapunov = 0.2   # Initial guess; vary to give a good fit
diffPlot = semilogy(fabs(diff))
semilogy(ts, eps*exp(lyapunov*ts), 'r-');
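The fit above can be cross-checked by a direct estimate (a sketch, not part of the original exercise; the helper `lyapunov_exponent` and its parameters are mine): for a one-dimensional map, the Lyapunov exponent is the trajectory average of $\ln|f'(x)|$, and here $f'(x) = 4\mu(1-2x)$.

```python
import numpy as np

def f(x, mu):
    """Logistic map f(x) = 4 mu x (1-x)."""
    return 4 * mu * x * (1 - x)

def lyapunov_exponent(mu, x0=0.4, n_transient=1000, n=20000):
    """Estimate lambda = <ln |f'(x)|> along a trajectory, with f'(x) = 4 mu (1 - 2x)."""
    x = x0
    for _ in range(n_transient):   # discard the transient before averaging
        x = f(x, mu)
    total = 0.0
    for _ in range(n):
        total += np.log(abs(4 * mu * (1 - 2 * x)))
        x = f(x, mu)
    return total / n

print(lyapunov_exponent(1.0))   # should be close to ln 2 ~ 0.693
print(lyapunov_exponent(0.9))   # smaller but positive: compare with the fitted slope
```

At $\mu=1$ the exact answer is $\ln 2$, which makes a useful check that the average has converged.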

Many Hamiltonian systems are also chaotic. Two configurations of classical atoms or billiard balls, with initial positions and velocities that are almost identical, will rapidly diverge as the collisions magnify small initial deviations in angle and velocity into large ones. It is this chaotic stretching, folding, and kneading of phase space that is at the root of our explanation that entropy increases.