Theoretical physicists hold the key to some of artificial intelligence’s toughest problems

If you’ve been following recent high-level appointments in the artificial intelligence and machine learning industry, you will have noticed a marked increase in the number of theoretical physicists in senior positions at companies like Samsung AI R&D and Fetch AI. In fact, the trend of moving from training in theoretical physics and mathematics into machine learning is gaining momentum among postdoctoral researchers, because the two fields overlap considerably.

Physicists excel in ML because its methods are inherently stochastic: they already have a foundation in the mathematical and statistical tools needed to understand complex ML methods. Physicists also specialize in writing high-performance numerical code, another useful skill for ML development.

Renowned AI scientist Yann LeCun once noted that theoretical physicists – especially condensed matter physicists – have a long history of bringing mathematical ideas and methods to ML: to neural networks, probabilistic inference, and SAT problems.

In a public lecture, Roger Melko, associate faculty member at the Perimeter Institute and the University of Waterloo, explained that ML algorithms accelerate discoveries in physics. He mentioned how DeepMind’s victory at Go, achieved with ML, got researchers from different fields thinking about applying ML algorithms to the complexity problem of quantum physics. In a recently published article, Melko and Juan Carrasquilla took a standard neural network of the kind used to recognize handwritten digits and showed that, with minimal adjustments, it could effectively recognize different phases of matter in a quantum system.

According to LeCun, the wave of interest in neural networks in the 1980s and early 1990s was partly driven by the connection between spin glasses and recurrent networks popularized by John Hopfield. Although this turned some physicists into neuroscientists and machine learners, most of them left the field when interest in neural networks waned in the late 1990s. Now, with the resurgence of deep learning and all the theoretical issues surrounding it, physicists are staging a comeback: many young physicists and mathematicians are trying to explain why deep learning works so well.

Rise in demand for physicists in the AI industry

For example, Fetch AI, an artificial intelligence and digital economy company, recently announced the appointment of Marcin Abram as a Machine Learning Scientist. Abram completed his PhD in theoretical physics in 2016; his doctoral research explored coherence and emergent behavior in quantum systems. Another key appointment was that of Dr. Sebastian Seung by Samsung Electronics to strengthen AI R&D and deliver game-changing business impact. A leading computational neuroscientist, Dr. Seung originally studied theoretical physics at Harvard. He worked as a researcher at Bell Labs and as a professor at the Massachusetts Institute of Technology (MIT).

Hopfield’s contribution to AI

One of the greatest contributions of the famous scientist John Hopfield was the formalization of content-addressable (associative) memory in what became known as the Hopfield network. In 1982, Hopfield presented an artificial neural network for storing and retrieving memories, much like the human brain. The Hopfield network is a single-layer, recurrent network whose neurons are fully connected: each neuron is connected to all other neurons.
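The storage-and-retrieval idea can be sketched in a few lines. The following is a minimal, hypothetical illustration (not code from any of the works mentioned): a binary pattern is stored with a Hebbian outer-product rule, and the network then recovers it from a corrupted cue by repeatedly applying the update rule.

```python
import numpy as np

def train(patterns):
    """Build the weight matrix with the Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:              # each pattern uses +1/-1 entries
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=10):
    """Deterministic asynchronous updates until (hopefully) a fixed point."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])         # store a single pattern
noisy = pattern.copy()
noisy[0] = -noisy[0]                # corrupt one bit of the cue
print(recall(W, noisy))             # the stored pattern is recovered
```

The deterministic sign-threshold update is exactly what the next section contrasts with the stochastic units of Boltzmann machines.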

From there, Boltzmann machines were invented to add stochasticity to the network so that it does not get stuck in local minima, since Hopfield networks are deterministic in their standard formulation. Restricted Boltzmann machines are now stacked on top of each other to form deep belief networks, and a greedy layer-wise training algorithm made these networks practically usable, producing very accurate classifiers while also serving as useful generative models. This happened over the past decade, and it is a piece of history that continues to make headlines today.
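The stochastic counterpart can be illustrated with a tiny restricted Boltzmann machine trained by one-step contrastive divergence (CD-1). This is a bare-bones sketch under simplifying assumptions (no bias terms, invented toy data), not Hinton's original implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, lr=0.1):
    """One contrastive-divergence (CD-1) update on a batch of visible vectors."""
    h_prob = sigmoid(v0 @ W)                        # up: hidden probabilities
    h = (rng.random(h_prob.shape) < h_prob) * 1.0   # sample hidden units (stochastic!)
    v_prob = sigmoid(h @ W.T)                       # down: reconstruction
    h1_prob = sigmoid(v_prob @ W)                   # up again
    # Move weights toward the data statistics, away from the model statistics.
    W += lr * (v0.T @ h_prob - v_prob.T @ h1_prob) / len(v0)
    return W

# Toy binary data: two correlated "feature groups" among 4 visible units.
data = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1]], dtype=float)
W = rng.normal(scale=0.1, size=(4, 2))              # 4 visible, 2 hidden units
for _ in range(500):
    W = cd1_step(data, W)
```

Stacking trained RBMs, with each layer's hidden activities serving as the next layer's visible data, is the greedy layer-wise recipe behind deep belief networks.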

Both of these groundbreaking advances came from statistical physics and mathematics: the first was Hopfield’s insight connecting neural networks with spin glasses, and the second was the application of simulated annealing to solving these spin-glass systems (shortly after the algorithm was invented), which was Hinton’s insight. Simulated annealing itself evolved from the Metropolis–Hastings algorithm, described by Metropolis, Rosenbluth, and Teller in the 1950s. One of the two independent papers introducing simulated annealing was titled “Thermodynamical approach to the travelling salesman problem”, so the roots in statistical mechanics could not be clearer.
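To make the statistical-mechanics connection concrete, here is a toy simulated-annealing solver for a small travelling salesman instance. The city coordinates, cooling schedule, and 2-opt move are invented for illustration; the acceptance rule is the Metropolis criterion from the algorithm described above.

```python
import math
import random

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(12)]

def tour_length(order):
    """Total length of the closed tour visiting cities in the given order."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def anneal(order, t0=1.0, cooling=0.995, steps=20000):
    best = list(order)
    t = t0
    for _ in range(steps):
        # Propose a 2-opt move: reverse a random segment of the tour.
        i, j = sorted(random.sample(range(len(order)), 2))
        candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        delta = tour_length(candidate) - tour_length(order)
        # Metropolis criterion: always accept improvements; accept worse
        # tours with probability exp(-delta / t), where t plays the role
        # of temperature in a physical annealing process.
        if delta < 0 or random.random() < math.exp(-delta / t):
            order = candidate
        if tour_length(order) < tour_length(best):
            best = list(order)
        t *= cooling                # gradually cool the system
    return best

start = list(range(len(cities)))
best = anneal(start)
print(tour_length(best))            # no longer than the starting tour
```

As the temperature drops, the sampler shifts from free exploration toward greedy descent, which is exactly how annealing escapes the local minima that trap deterministic updates.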

Lately, many math and physics students are pursuing careers in this burgeoning field. Given the huge demand for good talent, physicists can add solid value to AI research.
