This is my talk for the workshop ‘Biological Complexity: Can It Be Quantified?’:
• John Baez, Biology as information dynamics, 2 February 2017.
Abstract. If biology is the study of self-replicating entities, and we want to understand the role of information, it makes sense to see how information theory is connected to the ‘replicator equation’—a simple model of population dynamics for self-replicating entities. The relevant concept of information turns out to be the information of one probability distribution relative to another, also known as the Kullback–Leibler divergence. Using this we can get a new outlook on free energy, see evolution as a learning process, and give a clean general formulation of Fisher’s fundamental theorem of natural selection.
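To make the connection concrete, here is a minimal numerical sketch (not from the talk itself) of the replicator equation dx_i/dt = x_i(f_i − f̄) with constant fitnesses. The fitness values, step size, and step count are illustrative assumptions; the point is that the Kullback–Leibler divergence of the limiting distribution relative to the current population decreases as the population evolves, which is the sense in which evolution looks like a learning process:

```python
import math

# Toy replicator dynamics with constant fitnesses (illustrative values).
f = [1.0, 2.0, 3.0]        # fitness of each replicator type
x = [1/3, 1/3, 1/3]        # initial population fractions
q = [0.0, 0.0, 1.0]        # limiting distribution: all mass on the fittest type

def kl(p, r):
    """Kullback-Leibler divergence D(p || r), skipping zero-probability terms."""
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)

dt = 0.01
divs = []
for step in range(2000):
    fbar = sum(xi * fi for xi, fi in zip(x, f))            # mean fitness
    x = [xi + dt * xi * (fi - fbar) for xi, fi in zip(x, f)]  # Euler step
    s = sum(x)
    x = [xi / s for xi in x]                               # renormalize
    divs.append(kl(q, x))

# D(q || x(t)) = -log x_3(t) shrinks toward 0 as the fittest type takes over.
assert all(a >= b for a, b in zip(divs, divs[1:]))
```

Here D(q‖x) reduces to −log x₃, and since the fittest type’s share grows at every step, the divergence decreases monotonically, a special case of the general results proved in the references below.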
For more, read:
• Marc Harper, The replicator equation as an inference dynamic.
• Marc Harper, Information geometry and evolutionary game theory.
• Barry Sinervo and Curt M. Lively, The rock-paper-scissors game and the evolution of alternative male strategies, Nature 380 (1996), 240–243.
• John Baez, Diversity, entropy and thermodynamics.
• John Baez, Information geometry.
The last reference contains proofs of the equations shown in red in my slides.
In particular, Part 16 contains a proof of my updated version of Fisher’s fundamental theorem.