From Thermodynamics to Neural Dynamics: Uncovering General Principles in Physical and Biological Systems
This talk explores the deep analogies and potential unifying mechanisms between thermodynamic systems and neural networks. For example, the Boltzmann distribution—a foundational concept in statistical mechanics describing the probability of a system occupying a given energy state—also emerges in models of neural activity and learning, such as Boltzmann machines and attractor networks. Synaptic strength changes, too, can be understood through energy minimization and entropy maximization, principles central to thermodynamics.
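The Boltzmann distribution referenced above assigns each state a probability proportional to exp(-E/T). A minimal numerical sketch of this idea, assuming discrete energy levels and units where the Boltzmann constant is 1 (the function name `boltzmann_probabilities` is illustrative, not from the talk):

```python
import math

def boltzmann_probabilities(energies, temperature=1.0):
    """Return p_i = exp(-E_i / T) / Z for a discrete set of energy states.

    Subtracting the minimum energy before exponentiating avoids overflow
    without changing the normalized probabilities.
    """
    e_min = min(energies)
    weights = [math.exp(-(e - e_min) / temperature) for e in energies]
    z = sum(weights)  # partition function (up to a constant factor)
    return [w / z for w in weights]

# Lower-energy states are exponentially favored; raising the temperature
# flattens the distribution toward uniform.
probs_cold = boltzmann_probabilities([0.0, 1.0, 2.0], temperature=0.5)
probs_hot = boltzmann_probabilities([0.0, 1.0, 2.0], temperature=10.0)
```

The same exponential weighting underlies sampling in Boltzmann machines, where the "energy" is a function of network state rather than of molecular configuration.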
By examining how energy landscapes shape both the organization of molecules and the flow of neural information, we highlight how biological systems may exploit these universal laws for efficient computation, adaptation, and memory formation. The interplay between stochastic fluctuations and stability—seen in both thermal noise and neuronal variability—points to a broader principle: complex, adaptive behavior may arise from systems poised near criticality, where order and randomness coexist.
Bridging the gap between thermodynamic and neural dynamics offers not only a deeper understanding of the brain but also a pathway to developing physical models of cognition grounded in universal physical laws.
Symposium: PSDK XV: Phase Stability and Diffusion Kinetics