Information theory and (non)equilibrium statistical mechanics. Part II
Information theory provides a set of quantitative measures of seemingly nebulous concepts like "uncertainty" and "surprise". One of the applications of this theory is the maximum entropy method that uniquely determines the probability distribution that maximises the uncertainty in the microscopic state of the system whilst remaining compatible with known macroscopic constraints. This leads very straightforwardly to the equilibrium statistical ensembles (microcanonical, canonical, grand-canonical, ...) and it is of interest to extend this approach to nonequilibrium settings. After setting out some key results from information theory, I will briefly review an attempt by Dewar [JPA (2003) 36 631] to use the maximum entropy method to predict the celebrated fluctuation theorem for out-of-equilibrium systems, and to justify the (distinct!) principle of maximum entropy production that has had some success in predicting temperature profiles on planetary surfaces.
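As a reminder of the method's flavour (a standard sketch, not part of the talk itself): maximising the Gibbs–Shannon entropy subject to normalisation and a fixed mean energy recovers the canonical ensemble.

```latex
\text{Maximise } S = -\sum_i p_i \ln p_i
\quad \text{subject to} \quad
\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle .
```

Introducing Lagrange multipliers for the two constraints and setting the variation to zero yields the Boltzmann distribution,

```latex
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
```

where $\beta$ is the multiplier conjugate to the energy constraint, identified with the inverse temperature.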
This is a roughly weekly series of didactic blackboard talks focussing on some theoretical aspect of Condensed Matter, Biological, and Statistical Physics.