Why the Brain Still Wins (For Now)
Modern computers rely on the von Neumann architecture: the processor and memory live in separate locations, and data constantly shuttles between them. That traffic creates a bottleneck, the von Neumann bottleneck, that limits both speed and energy efficiency.
The brain works differently. Each neuron both stores and processes information. Communication happens through electrochemical pulses (spikes) transmitted asynchronously across a network of roughly 100 trillion synapses. There is no central clock and no data bus, only neurons that fire when needed. This efficiency lets some 86 billion neurons run on about 20 watts.
What Neuromorphic Chips Are
The concept dates to the 1980s, when Carver Mead at Caltech built the first silicon neural circuits — an artificial retina and cochlea etched in silicon. His vision: instead of running neural networks on conventional hardware, build hardware that is a neural network.
Modern neuromorphic chips employ spiking neural networks (SNNs). Unlike conventional artificial neural networks that process continuous values, SNNs operate with pulses: an artificial neuron accumulates charge, and when it crosses a threshold, it fires a spike. When it is not needed, it stays idle, and an idle neuron draws essentially no power.
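The accumulate-and-fire behavior described above can be sketched as a leaky integrate-and-fire (LIF) neuron, the textbook building block of an SNN. This is a minimal illustration, not the model used by any particular chip; the threshold and leak constants are arbitrary.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron.
# Constants are illustrative, not taken from any specific hardware.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Accumulate input current each timestep; emit a spike (1)
    when the membrane potential crosses the threshold, then reset."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire a spike
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # stay silent: no spike, no work
    return spikes

# A neuron receiving no input stays idle and never spikes.
print(simulate_lif([0.0] * 5))        # [0, 0, 0, 0, 0]
# Sub-threshold inputs must accumulate before a spike fires.
print(simulate_lif([0.6, 0.6, 0.6]))  # [0, 1, 0]
```

The event-driven nature of the model is visible in the first call: with no input, nothing happens, which is exactly why idle neurons cost nearly nothing.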
Intel Loihi & Hala Point: The Largest Neuromorphic System
Intel leads the neuromorphic field. Loihi 2, the second-generation neuromorphic processor, delivers up to 10x faster processing than its predecessor. It ships with Lava, an open-source framework supporting multiple AI methods and giving developers access to neuromorphic hardware.
In April 2024, Intel unveiled Hala Point — the world's largest neuromorphic system, packing 1.15 billion neurons. Built on 1,152 Loihi 2 chips, it achieves over 10x the neuron capacity and up to 12x the performance of the first-generation research system. Ericsson is already testing Intel's neuromorphic technology to optimize telecommunications networks.
Kapoho Point: Neuromorphic Power in Compact Form
The Kapoho Point platform with 8 Loihi 2 chips can stack for large-scale workloads. It supports AI models with up to one billion parameters or optimization problems with 8 million variables — in a form factor smaller than a book.
IBM TrueNorth & NorthPole: A Different School
IBM takes a different architectural path. TrueNorth (2014) was one of the first large-scale neuromorphic chips, with one million neurons on a single die. NorthPole, the next generation, co-locates memory and computation in each core, eliminating the costly shuttling of data between separate memory and processor. IBM argues this architecture hits two targets at once: energy efficiency and low latency.
In academia, SpiNNaker (Manchester/Human Brain Project) runs in real time on digital multi-core chips, while BrainScaleS emulates analog electronic neuron models in accelerated mode. Stanford's Neurogrid can simulate one million neurons with billions of synaptic connections in real time. And IMEC created a self-learning neuromorphic chip that composes music.
Where It Matters
Neuromorphic technology does not replace GPUs in model training. It shines in edge AI scenarios — where you need substantial intelligence on minimal energy. Autonomous vehicles that must react in milliseconds. Drones running on batteries. IoT sensors in remote locations. Warehouse robots learning new routes on the fly.
Pattern recognition is another strong suit: natural language, speech, medical imaging, EEG analysis. In cybersecurity, real-time anomaly detection in network traffic can happen without massive infrastructure. Gartner lists neuromorphic computing as a key emerging technology, though PwC warns the field remains too immature for widespread use.
Obstacles Ahead
Converting deep neural networks to SNNs comes with accuracy loss. Memristors — nonvolatile memory elements used in some architectures — show cycle-to-cycle variations. There are still no unified benchmarks, standardized APIs, or programming languages specifically designed for neuromorphic systems. And the field demands knowledge spanning biology, neuroscience, electronic engineering, and mathematics.
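One source of that conversion loss is easy to see in code. A common approach, rate coding, replaces a network's continuous activation with the firing rate of a stochastic spike train, so the real value is only approximated by a spike count over a finite time window. The sketch below is illustrative; function names and constants are assumptions, not part of any conversion toolchain.

```python
import random

# Hedged sketch of rate coding: map a continuous activation in
# [0, 1] onto a spike train whose firing rate approximates it.
# Accuracy loss comes from the finite number of timesteps.

def rate_encode(activation, timesteps, rng):
    """Emit a Bernoulli spike train with firing rate ~ activation."""
    return [1 if rng.random() < activation else 0
            for _ in range(timesteps)]

def decode(spikes):
    """Estimate the original activation from the spike rate."""
    return sum(spikes) / len(spikes)

rng = random.Random(0)
# With few timesteps the estimate is coarse; with more, it converges.
for timesteps in (10, 100, 10_000):
    estimate = decode(rate_encode(0.37, timesteps, rng))
    print(timesteps, round(estimate, 3))
```

Short windows give cheap, fast inference but noisy estimates; long windows recover accuracy at the cost of latency and energy, which is exactly the trade-off the conversion literature wrestles with.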
Still, Intel is already exploring the combination of neuromorphic computing with quantum computers. If that combination delivers, neuromorphic chips could become the second pillar — alongside quantum — of a computing era unlike anything we have known.
