10 Innovative AI Concepts by Nobel Laureates

Published On Sat Oct 12 2024

A Physics Nobel for Computer Science?

The Nobel Prize in Physics announcement sent the internet into a meme war. Physicists everywhere were upset that the prize had gone to work in computer science, and many argued that the Turing Award Hinton had already received was recognition enough.

As a practitioner in the AI space, I find this battle amusing. It is ironic given the history of the prize: Alfred Nobel reportedly believed that mathematics was not a practical discipline that directly benefited humanity. Now, more than a century later, the Chemistry, Physics, and even Medicine prizes are regularly awarded to mathematicians and computer scientists.

The Role of Hopfield Networks and Boltzmann Machines

To break from the noise, it’s essential to look back at Hopfield Networks and Boltzmann Machines and their current roles in today’s Artificial Intelligence. The discipline has always relied on fundamental research from other sciences to advance the field.

John Hopfield and His Contributions

John Hopfield began his career in physics. Observing the basic properties of biological neurons, he applied the architecture of neurons in the brain to artificial neural networks. Hopfield was the first to formalize asynchronous updates to the model's memory and the idea of a Content-Addressable Memory system.

Illustration of a Restricted Boltzmann Machine (RBM) bipartite graph

Without delving too deep into complexity, let’s discuss some basic ideas:

  • T_ij is the synaptic strength of the connection between neurons i and j.
  • V_j is the activity (state) of neuron j.
  • I_j is the direct external input to neuron j.
  • The overall input to neuron j is x_j = Σ_i≠j T_ji V_i + I_j.

  • Updating a neuron's state — when a signal passes from neuron i to neuron j, neuron j updates its state based on the total input it receives (for example, switching on when x_j exceeds a threshold).

  • Modeling the network — each asynchronous update never increases the energy of the overall network, so the state gradually settles into a stored pattern.
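The update rule and energy argument above can be sketched in a few lines of NumPy. This is an illustrative toy (bipolar neurons, Hebbian weight storage), not Hopfield's original formulation, and all function names here are my own:

```python
import numpy as np

# Toy Hopfield network: bipolar neurons (+1/-1), Hebbian weight storage,
# asynchronous updates. Names and parameters are illustrative.

def train(patterns):
    """Build the symmetric weight matrix T from the stored patterns."""
    n = patterns.shape[1]
    T = np.zeros((n, n))
    for p in patterns:
        T += np.outer(p, p)       # Hebbian rule: strengthen co-active pairs
    np.fill_diagonal(T, 0)        # no self-connections (T_jj = 0)
    return T / n

def energy(T, V):
    """Network energy; each asynchronous update never increases it."""
    return -0.5 * V @ T @ V

def recall(T, V, steps=100, seed=0):
    """Asynchronously update one random neuron at a time:
    compute x_j = sum_i T_ji V_i, then threshold at zero."""
    rng = np.random.default_rng(seed)
    V = V.copy()
    for _ in range(steps):
        j = rng.integers(len(V))
        V[j] = 1 if T[j] @ V >= 0 else -1
    return V

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
T = train(pattern[None, :])

noisy = pattern.copy()
noisy[0] = -noisy[0]          # corrupt one "bit" of the stored memory
restored = recall(T, noisy)   # content-addressable recall from a partial cue
```

Because each update can only lower (or preserve) the energy, the corrupted state rolls downhill to the nearest stored memory — the content-addressable behavior described above.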

Hopfield devised a type of recurrent neural network that could recognize patterns in data and store them as retrievable memories. This idea of associative memory recurs in modern systems; recent research has even drawn formal connections between Hopfield networks and the attention mechanism underlying models like BERT and ChatGPT.

Geoffrey Hinton and Boltzmann Machines

Geoffrey Hinton's critical contributions to machine learning include co-authoring the influential 1986 paper that popularized backpropagation and co-inventing the Boltzmann machine.

An illustration of different types of Boltzmann Machines (BM)

With backpropagation, learning is achieved by propagating the output error backward through the network, allowing the hidden layers to self-organize useful internal representations.
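As a rough sketch of that idea, here is a tiny network trained on XOR with hand-written backpropagation. Every detail (layer sizes, learning rate, activation) is an illustrative choice of mine, not the setup from the 1986 paper:

```python
import numpy as np

# Hand-rolled backpropagation on XOR: one hidden layer of 4 sigmoid units.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(5000):
    # Forward pass: the hidden layer learns its own feature detectors.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule, layer by layer (error flows backward).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```

The loss falls over training because each backward pass nudges both layers' weights down the error gradient — the hidden layer's representations are never specified by hand.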

One of Hinton’s significant inventions, the Boltzmann machine, applies the Boltzmann distribution from statistical physics to learn and recognize patterns in data.
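A minimal sketch of how a Restricted Boltzmann Machine learns might look like the following, using one-step contrastive divergence (a training procedure Hinton later introduced). The class, data, and hyperparameters are illustrative assumptions, not code from any of his papers:

```python
import numpy as np

# Toy Restricted Boltzmann Machine trained with CD-1 (contrastive divergence).
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
        self.a = np.zeros(n_visible)  # visible biases
        self.b = np.zeros(n_hidden)   # hidden biases

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.a)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1(self, v0, lr=0.1):
        """One contrastive-divergence step: data phase minus model phase."""
        ph0, h0 = self.sample_h(v0)
        pv1, v1 = self.sample_v(h0)   # one step of Gibbs sampling
        ph1, _ = self.sample_h(v1)
        self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        self.a += lr * (v0 - v1).mean(axis=0)
        self.b += lr * (ph0 - ph1).mean(axis=0)
        return float(np.mean((v0 - pv1) ** 2))  # reconstruction error

data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 8, dtype=float)
rbm = RBM(n_visible=4, n_hidden=2)
errors = [rbm.cd1(data) for _ in range(500)]
```

The bipartite structure (no visible–visible or hidden–hidden connections) is what makes the "restricted" machine tractable: all hidden units can be sampled in parallel given the visible layer, and vice versa.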

While the path is not straightforward, fundamental research from biology and physics has steadily informed the discipline, eventually leading to generative AI chatbots like the ones offered by Kommunicate.

This article traces a line from the two Nobel laureates to the latest OpenAI models, showing how basic scientific principles have advanced the field of Artificial Intelligence.