lpetrich
Contributor
All Nobel Prizes 2024 - NobelPrize.org - I am sadly slow on this. 
Press release: The Nobel Prize in Physics 2024 - NobelPrize.org
The two winners are John J. Hopfield and Geoffrey Hinton, "for foundational discoveries and inventions that enable machine learning with artificial neural networks".
Seems to me more like algorithm design, though they used some theoretical-physics models in their work. From the press release:

"This year's two Nobel Laureates in Physics have used tools from physics to develop methods that are the foundation of today's powerful machine learning. John Hopfield created an associative memory that can store and reconstruct images and other types of patterns in data. Geoffrey Hinton invented a method that can autonomously find properties in data, and so perform tasks such as identifying specific elements in pictures."
For JH's work, it was using "spin glass theory": simplified models of magnetic materials that involve the interaction of each atom's magnetic field with its neighbors' magnetic fields. Simplified like the Ising model, where each atom has field value +1 or -1 and each neighboring pair's interaction energy is proportional to the product of field values.
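A toy sketch of that idea in code (sizes and patterns here are my own illustration, not from the prize work): a Hopfield-style network stores a pattern as Ising-like pairwise couplings, and recalls it by repeatedly flipping each unit to agree with its local field, which can only lower the Ising-style energy.

```python
# Hopfield-style associative memory over +1/-1 units (illustrative toy example).

def train(patterns):
    """Hebbian weights: w[i][j] = sum over patterns of p[i]*p[j], no self-coupling."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def energy(w, s):
    """Ising-style energy: -1/2 * sum over i,j of w[i][j] * s[i] * s[j]."""
    n = len(s)
    return -0.5 * sum(w[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

def recall(w, s, sweeps=5):
    """Asynchronous updates: set each unit to the sign of its local field."""
    s = list(s)
    for _ in range(sweeps):
        for i in range(len(s)):
            field = sum(w[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if field >= 0 else -1
    return s

stored = [1, 1, -1, -1, 1, -1, 1, -1]
w = train([stored])
noisy = list(stored)
noisy[0] = -noisy[0]               # corrupt one unit
print(recall(w, noisy) == stored)  # → True: the stored pattern is recovered
```

The corrupted state has higher energy than the stored pattern, so the update dynamics slide back down into the stored pattern's energy minimum.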
GH's work seems to involve "simulated annealing", much like hill climbing, but where one rejects wrong-way moves only some of the time. This enables the system to wander away from local minima until it finds a deep one.
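That accept-some-wrong-way-moves rule can be sketched as follows; the bumpy function, step size, and cooling schedule are all illustrative choices of mine, not anything from the laureates' work.

```python
# Simulated-annealing sketch: downhill moves are always accepted, uphill
# ("wrong-way") moves are accepted with probability exp(-delta/T), so at
# high temperature the search can climb out of a local minimum.
import math
import random

def f(x):
    # Bumpy 1-D function (illustrative): a local minimum near x = 1.8
    # and a deeper global minimum near x = -1.9.
    return x**4 - 7 * x**2 + 2 * x

def anneal(x, steps=20000, t_start=5.0, t_end=1e-3):
    best_x, best_e = x, f(x)
    e = best_e
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling
        x_new = x + random.uniform(-0.5, 0.5)
        e_new = f(x_new)
        delta = e_new - e
        # Wrong-way moves are rejected only some of the time:
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

random.seed(0)
x_best, e_best = anneal(x=1.7)  # start in the basin of the *local* minimum
print(round(x_best, 2), round(e_best, 2))
```

Plain hill climbing started at x = 1.7 would get stuck in the shallow minimum near x = 1.8; the occasional uphill acceptances let the search cross the barrier and settle into the deeper one.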
Neural network (machine learning) - inspired by how biological neurons work. They use cascaded and thresholded linear classifiers, thus getting around the limitations of linearity.

Limitations of linearity? There are many problems that cannot be solved with linear classifiers, even thresholded ones, including some very simple ones, like the exclusive-or problem:
0 0: 0
0 1: 1
1 0: 1
1 1: 0
Plain OR is easy to solve with a linear classifier, however: add the inputs, then threshold the sum, x = min(x1 + x2, 1).
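To make the cascading point concrete, here is XOR built from two layers of thresholded linear units (the weights are hand-picked for illustration, not learned): no single thresholded linear unit can compute XOR, but an AND of an OR unit and a NAND unit can.

```python
# XOR from cascaded thresholded linear classifiers (hand-picked weights).
def step(z):
    """Threshold: fire (1) if the weighted sum is non-negative, else 0."""
    return 1 if z >= 0 else 0

def xor(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # OR:   fires if at least one input is 1
    h2 = step(-x1 - x2 + 1.5)   # NAND: fires unless both inputs are 1
    return step(h1 + h2 - 1.5)  # AND of the two hidden units

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))  # reproduces the truth table above
```

Each unit by itself is just a thresholded linear classifier (like the OR example), but cascading them composes the three linearly separable functions OR, NAND, and AND into the non-separable XOR.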