
Thoughtforms Life: Associative Memory in Hopfield Networks Designed to Solve Propositional Satisfiability Problems
Mar 5, 2024
Tesh Natalia, a researcher at OIST with a unique blend of theoretical chemistry and cognitive science, dives into Hopfield networks and their capacity to tackle propositional satisfiability problems. She explains the dynamics of these networks and how Hebbian learning of visited states lets them optimize their own problem-solving. Tesh also discusses their potential to connect agent beliefs and preferences to behavioral control, and explores the implications for creativity, learning, and decision-making in simulated environments. Fascinating insights into artificial intelligence and cognition unfold throughout!
AI Snips
Attractors Solve Problems And Store Memories
- Hopfield networks settle into attractors that can represent solutions to combinatorial problems or stored memories (a minimal retrieval sketch follows this list).
- Imprinting visited states via Hebbian learning lets the network escape local minima and reach global optima over repeated resets.
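A minimal NumPy sketch of the "attractors as stored memories" side of this: a few random patterns are imprinted with the outer-product Hebbian rule, and a corrupted cue settles back into the stored pattern it started nearest to. Network size, pattern count, and noise level are arbitrary illustrative choices, not values from the episode.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hebbian storage: imprint a few random +/-1 patterns as attractors.
patterns = rng.choice([-1, 1], size=(3, 64))       # 3 memories over 64 units
W = (patterns.T @ patterns) / patterns.shape[1]    # outer-product (Hebbian) rule
np.fill_diagonal(W, 0.0)

def relax(W, s, sweeps=20):
    # Asynchronous updates: each unit aligns with its local field until the state settles.
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# A corrupted cue falls back into the attractor of the nearest stored memory.
cue = patterns[0] * rng.choice([1, -1], size=64, p=[0.9, 0.1])   # flip ~10% of bits
recalled = relax(W, cue)
print(int((recalled == patterns[0]).sum()), "of 64 bits match memory 0")
```

The same settling dynamics is what turns the network into an optimizer: when the weights encode constraints instead of memories, the attractor the state falls into is a (locally) good solution.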
Mapping SAT To Neural Energy Landscapes
- Propositional SAT problems map to Hopfield energy functions, so the network weights encode the logical constraints (a toy two-literal mapping is sketched after this list).
- This mapping makes the network dynamics perform SAT solving by converging to low-energy satisfying assignments.
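To show the flavor of such an encoding, here is a toy mapping that assumes every clause has exactly two literals, so each constraint reduces to pairwise couplings and biases; up to a constant, the resulting Hopfield energy counts violated clauses. Longer clauses would need higher-order couplings, so this is an illustration of the idea rather than the episode's exact construction.

```python
import numpy as np

def sat_to_hopfield(clauses, n):
    """Turn two-literal clauses into Hopfield weights and biases.
    Clauses are pairs of signed, 1-indexed literals, e.g. (1, -3) means (x1 OR NOT x3).
    Up to a constant, the network energy then counts violated clauses."""
    W = np.zeros((n, n))
    b = np.zeros(n)
    for li, lj in clauses:
        i, ai = abs(li) - 1, np.sign(li)
        j, aj = abs(lj) - 1, np.sign(lj)
        W[i, j] -= ai * aj / 4.0     # penalise the "both literals false" corner
        W[j, i] -= ai * aj / 4.0
        b[i] += ai / 4.0
        b[j] += aj / 4.0
    return W, b

def energy(W, b, s):
    # Standard Hopfield energy with biases; lower energy = fewer violated clauses.
    return -0.5 * s @ W @ s - b @ s

# (x1 OR x2) AND (NOT x1 OR x2) AND (x1 OR NOT x2): only x1 = x2 = True satisfies all.
W, b = sat_to_hopfield([(1, 2), (-1, 2), (1, -2)], n=2)
sat, unsat = np.array([1, 1]), np.array([-1, -1])
print(energy(W, b, sat), "<", energy(W, b, unsat))   # satisfying assignment has lower energy
```

Minimizing this energy, for instance with the asynchronous updates from the previous sketch using W[i] @ s + b[i] as the local field, then amounts to searching for a satisfying assignment.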
Liar Problem Demonstrates Self‑Optimization
- In a 50-person liar problem, turning on Hebbian learning drove the Hopfield system to a global solution repeatedly.
- After learning, the network stayed on the global solution even when learning was turned off (a generic relax-and-imprint loop in this spirit is sketched below).
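Below is a generic relax-and-imprint loop in the spirit of this result. The constraint weights are a random frustrated matrix standing in for the liar problem (whose actual encoding is not reproduced here), and the learning rate and reset counts are illustrative guesses; the point is the mechanism: Hebbian imprinting of each visited attractor reshapes the landscape so that, once learning is switched off, random restarts keep falling into the same low-energy configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def relax(W, s, sweeps=30):
    # Asynchronous updates until the state settles into an attractor of W.
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s - W[i, i] * s[i] >= 0 else -1   # ignore self-coupling
    return s

def energy(W, s):
    return -0.5 * s @ W @ s

n = 50
# Stand-in constraint network (NOT the paper's liar-problem encoding):
# a random symmetric, frustrated coupling matrix with zero diagonal.
W_problem = rng.normal(size=(n, n))
W_problem = (W_problem + W_problem.T) / 2
np.fill_diagonal(W_problem, 0.0)

# Phase 1: learning ON.  Relax from a random state, then imprint the visited
# attractor into a separate learned-weight matrix via a Hebbian outer product.
W_learned = np.zeros_like(W_problem)
alpha = 1e-3                                   # illustrative learning rate
for _ in range(1000):                          # illustrative number of resets
    s = relax(W_problem + W_learned, rng.choice([-1, 1], size=n))
    W_learned += alpha * np.outer(s, s)
    np.fill_diagonal(W_learned, 0.0)

# Phase 2: learning OFF.  Compare attractor energies (measured on the original
# problem weights) for random restarts with and without the learned weights.
baseline = [energy(W_problem, relax(W_problem, rng.choice([-1, 1], size=n))) for _ in range(10)]
after = [energy(W_problem, relax(W_problem + W_learned, rng.choice([-1, 1], size=n))) for _ in range(10)]
print("without learned weights:", np.round(baseline, 1))
print("with learned weights:   ", np.round(after, 1))
```

Whether the imprinted attractor is the true global optimum depends on the problem and the learning schedule; for the 50-person liar problem, the episode reports that it reliably was.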
