Information as the Logic of Science: A Unifying Theme for Celebrating the Legacy of Edwin T. Jaynes
Edwin Thompson Jaynes was an independent thinker and a genuine universalist whose work touched many areas of science and engineering. After his time at Princeton, where he worked with Eugene Wigner, Jaynes's journey took him to Stanford and then, in 1960, to Washington University in St. Louis.
In reframing the links between probability, statistical physics, and observation, Jaynes followed a mathematical physicist of gigantic stature, the Marquis de Laplace, who in his 1795 lecture on probabilities related probability to the incompleteness of knowledge (or ignorance). While Laplace distinguished between the frequentist and what is now called the Bayesian formalism, Jaynes expanded on this distinction by emphasizing the critical importance of incorporating correct prior information in any problem of statistical inference. In his seminal 1957 paper, he introduced the maximum entropy principle for assigning prior probability distributions based on incomplete knowledge, demonstrating that the maximum-entropy distribution is the unique, least biased distribution consistent with what is known and maximally noncommittal with regard to the missing information. This insight became the cornerstone of his reinterpretation of statistical mechanics, which he framed as a special case of maximum-entropy inference, thereby freeing it from its apparent dependence on physical hypotheses.
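Jaynes's favorite illustration of the principle was a loaded die: given only that the average of many rolls is, say, 4.5 rather than the fair value 3.5, maximum entropy assigns face probabilities proportional to exp(-λk), with the Lagrange multiplier λ fixed by the mean constraint. A minimal numerical sketch follows; the function name and the bisection search for λ are illustrative choices, not taken from the text above.

```python
import math

def maxent_die(mean_target, faces=6, tol=1e-12):
    """Maximum-entropy distribution over die faces 1..faces with a
    prescribed mean: p_k proportional to exp(-lam * k), with the
    Lagrange multiplier lam found by bisection on the mean constraint."""
    ks = range(1, faces + 1)

    def mean_for(lam):
        # Mean of the exponential-family distribution; decreasing in lam.
        weights = [math.exp(-lam * k) for k in ks]
        Z = sum(weights)
        return sum(k * w for k, w in zip(ks, weights)) / Z

    lo, hi = -50.0, 50.0  # bracket: mean_for(lo) ~ faces, mean_for(hi) ~ 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > mean_target:
            lo = mid  # mean still too high: need larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(-lam * k) for k in ks]
    Z = sum(weights)
    return [w / Z for w in weights]

# Jaynes's example: observed mean 4.5 instead of the fair 3.5.
p = maxent_die(4.5)
```

The resulting probabilities increase smoothly from face 1 to face 6, tilting toward the high faces just enough to honor the observed mean while remaining as uniform (as noncommittal) as the constraint allows.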
As indicated by Gamow and others, Jaynes's formulation led to unification, simplification, and clarification, connecting information theory with statistical physics and reinterpreting probability theory as an extension of logic, in which the classical probability rules gain deeper meaning as the unique consistent rules for conducting inference of any kind. This, in turn, has inspired extensions of the formalism from classical to quantum mechanics, and from theory to new forms of computing, pointing toward more algorithmically efficient methods of inference.
Jaynes's contributions span engineering (radar and masers), physics (fermionic-bosonic coupling of radiation), information theory, and the Bayesian formalism, unifying multiple fields. His work on the Jaynes-Cummings model (1963), a fermion-boson model of a single quantized mode of the radiation field interacting with a two-level atom, can be viewed as the inception of cavity quantum electrodynamics. His insistence on using experiments to augment theoretical analysis in quantum electrodynamics, which was then limited by the problem of infinities, indicated a sense of premonition about renormalization theory, which was yet to be fully developed.
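In its now-standard form (not spelled out above), the Jaynes-Cummings Hamiltonian under the rotating-wave approximation reads

\[
H = \hbar\omega_c\, a^\dagger a \;+\; \frac{\hbar\omega_a}{2}\,\sigma_z \;+\; \hbar g\,\bigl(a\,\sigma_+ + a^\dagger\,\sigma_-\bigr),
\]

where \(a^\dagger, a\) are the creation and annihilation operators of the cavity mode of frequency \(\omega_c\), \(\sigma_z, \sigma_\pm\) are the Pauli operators of the two-level atom with transition frequency \(\omega_a\), and \(g\) is the atom-field coupling strength. The bosonic mode and the two-level (fermion-like) atom exchange single quanta of excitation, which is the coupling referred to in the text.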
As a researcher and educator who was very accessible to his colleagues and students, Jaynes is widely recognized for his independent thinking and originality. Over the years, his theories and scientific contributions have continued to make a lasting impact in areas as diverse as quantum technologies, artificial intelligence, neuroscience, and medical imaging. In this 2024 meeting to celebrate Jaynes's legacy, the intent is not only to look back at his scientific contributions, but also to look forward to new areas of science and engineering, and to theories and applications yet to be discovered. A scientifically grounded logic of inference can provide new insights into current machine learning and artificial intelligence methods, whose implementations are energy-intensive. The topics covered in this symposium will be broadly based on information-based formalisms applied to physics and engineering, nature-inspired systems, and quantum information-based systems, along the lines of Jaynes's research as a Bayesian physicist.