# Replicator Dynamics

Today, I had the pleasure of attending John Carlos Baez’s very interesting talk at Queen Mary University of London. He showed how information theory can be used to describe replicator dynamics in the context of natural selection, evolutionary algorithms, game theory, and more. You should probably check out his own writing on this topic, but I found the following particularly interesting:

Suppose we have $n$ types of replicators with population sizes $P_1(t), P_2(t), \ldots, P_n(t)$. Suppose also that these populations satisfy the following set of differential equations:

$$
\frac{\mathrm{d} P_i}{\mathrm{d} t} = f_i(P_1, P_2, \ldots, P_n) P_i
$$
where $f_i$ is the fitness function of the $i^{\text{th}}$ replicator.
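As a minimal numerical sketch of these equations (with a made-up fitness function, not one from the talk: two types whose per-capita growth declines as the total population grows), a forward Euler integration might look like:

```python
import numpy as np

def fitness(P, r=np.array([1.0, 1.2])):
    """Hypothetical fitness f_i(P) = r_i - (P_1 + P_2)/100:
    per-capita growth shrinks as the total population grows."""
    return r - P.sum() / 100.0

P = np.array([10.0, 10.0])       # initial population sizes P_i
dt = 0.01
for _ in range(10_000):          # integrate dP_i/dt = f_i(P) * P_i up to t = 100
    P = P + dt * fitness(P) * P  # forward Euler step

print(P)  # the type with the higher intrinsic rate comes to dominate
```

With these parameters, type 2 settles near the carrying level where its fitness vanishes, while type 1 is driven toward extinction.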

Next, define $p_i = P_i / \sum_j P_j$ to be the fraction of the population that is of type $i$. A short computation gives

$$ \frac{\mathrm{d} p_i}{\mathrm{d} t} = \left( f_i - \langle f \rangle \right) p_i, $$

where $\langle f \rangle = \sum_i f_i p_i$ is the mean fitness. Measuring the speed of the distribution $p(t) = (p_1(t), p_2(t), \ldots, p_n(t))$ in the Fisher information metric then yields

$$ \left\| \frac{\mathrm{d} p}{\mathrm{d} t} \right\|^2 = \sum_i (f_i - \langle f \rangle)^2 p_i. $$

The left-hand side is the square of what Baez calls the “Fisher speed”, or the rate of learning, i.e. the “speed” of the changing probability distribution $p(t)$. The right-hand side is the variance of the fitness.
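The identity is easy to check numerically. Using the fraction-form replicator equation $\dot p_i = (f_i - \langle f \rangle) p_i$ and the Fisher-metric norm $\sum_i \dot p_i^2 / p_i$, both sides agree for an arbitrary (made-up) fitness vector and distribution:

```python
import numpy as np

f = np.array([1.0, 2.5, 0.5])   # arbitrary fitness values f_i
p = np.array([0.2, 0.3, 0.5])   # population fractions p_i, summing to 1

mean_f = np.dot(f, p)           # mean fitness <f> = sum_i f_i p_i
dp_dt = (f - mean_f) * p        # replicator equation for the fractions

# Fisher speed squared: ||dp/dt||^2 in the Fisher metric = sum_i (dp_i/dt)^2 / p_i
fisher_speed_sq = np.sum(dp_dt**2 / p)

# Variance of the fitness: sum_i (f_i - <f>)^2 p_i
variance = np.sum((f - mean_f) ** 2 * p)

print(np.isclose(fisher_speed_sq, variance))  # prints True
```

The agreement is exact, not approximate: each term $\dot p_i^2 / p_i$ equals $(f_i - \langle f \rangle)^2 p_i$ term by term.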

Baez interprets this as a revised version of Fisher’s fundamental theorem of natural selection. In English, he states it in the following way:

> As a population changes with time, the rate at which information is updated equals the variance of fitness.