Cambridge Engineers Build Brain-Like Chip That Could Slash AI Energy Use by 70%
University of Cambridge researchers have engineered a hafnium oxide memristor that mimics how neurons process and store information simultaneously, operating at switching currents roughly one-millionth those of conventional memristive chips.
Engineers at the University of Cambridge have developed a brain-inspired memory device that could reduce artificial intelligence energy consumption by up to 70%, potentially addressing one of the most pressing sustainability challenges facing the AI industry. The research, published in Science Advances, introduces a hafnium oxide-based memristor with a novel architecture that mimics how biological neurons simultaneously process and store information.
The device was engineered by Dr. Babak Bakhit's team at Cambridge's Department of Materials Science and Metallurgy. Rather than using traditional conductive filaments — which tend to behave unpredictably — the researchers created a "p-n junction" design by layering strontium and titanium oxides through a two-step growth process. This configuration switches resistance by adjusting energy barriers at the layer interfaces, rather than forming and breaking physical filaments. The result is dramatically more consistent behavior: "Filamentary devices suffer from random behavior," Dr. Bakhit explained. "But because our devices switch at the interface, they show outstanding uniformity from cycle to cycle and from device to device."
The performance metrics are striking. The device operates at switching currents roughly one million times lower than conventional oxide-based memristors, achieves hundreds of stable conductance levels required for analog computing, and remained stable through tens of thousands of switching cycles in testing. The chips also demonstrated spike-timing dependent plasticity — a biological learning mechanism where synaptic connections strengthen or weaken based on the relative timing of neural firing — enabling the devices to adapt and learn rather than passively store data.
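To make the learning mechanism concrete, the standard textbook form of spike-timing dependent plasticity can be sketched in a few lines. This is an illustrative model only: the parameter names and values below are conventional defaults from the STDP literature, not figures from the Cambridge paper.

```python
import math

# Illustrative STDP rule: a synapse strengthens when the presynaptic
# neuron fires shortly before the postsynaptic one (causal timing),
# and weakens when the order is reversed. Constants are hypothetical.
A_PLUS = 0.05    # maximum potentiation step
A_MINUS = 0.055  # maximum depression step
TAU_MS = 20.0    # time constant of the exponential learning window (ms)

def stdp_delta(dt_ms: float) -> float:
    """Weight change for spike-time difference dt = t_post - t_pre (ms)."""
    if dt_ms > 0:    # pre fired before post: potentiate
        return A_PLUS * math.exp(-dt_ms / TAU_MS)
    if dt_ms < 0:    # post fired before pre: depress
        return -A_MINUS * math.exp(dt_ms / TAU_MS)
    return 0.0

# A synapse that repeatedly sees pre-before-post timing drifts stronger:
w = 0.5
for _ in range(10):
    w = min(1.0, max(0.0, w + stdp_delta(5.0)))  # post fires 5 ms after pre
```

In a memristive implementation, the weight `w` would correspond to one of the device's stable conductance levels, updated in place by voltage pulses rather than in software.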
Current AI accelerators are notoriously power-hungry, with large-scale training runs consuming megawatt-hours of electricity and inference at scale presenting its own mounting energy bill. Neuromorphic chips like this one sidestep the von Neumann bottleneck by co-locating memory and processing on the same device, eliminating the constant data shuttling between processor and RAM that currently dominates AI hardware energy budgets.
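The energy win from co-locating memory and compute is easiest to see in the crossbar arrangement commonly used with memristors: weights are stored as conductances, and applying input voltages produces output currents that physically perform the multiply-accumulate via Ohm's and Kirchhoff's laws. The sketch below simulates that behavior in software; it is a generic crossbar illustration, not the architecture described in the paper.

```python
# Simulated memristor crossbar: each device stores a weight as a
# conductance G (siemens). Driving row voltages V produces column
# currents I[j] = sum_i G[i][j] * V[i] -- the matrix-vector product
# happens where the weights live, with no data movement to a CPU.

def crossbar_mvm(conductances, voltages):
    """Column output currents of a crossbar: I = G^T @ V."""
    n_rows = len(voltages)
    n_cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(n_rows))
            for j in range(n_cols)]

G = [[1e-6, 2e-6],
     [3e-6, 4e-6]]        # analog conductance states (hypothetical values)
V = [0.1, 0.2]            # input voltages applied to the rows
I = crossbar_mvm(G, V)    # output currents read at the columns (amperes)
```

In a digital accelerator the same multiply-accumulate would require fetching every weight from memory on every pass; in the crossbar, the read itself is the computation.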
One manufacturing challenge remains: the fabrication process requires temperatures around 700 degrees Celsius, which exceeds the thermal limits of standard semiconductor production lines. The Cambridge team is actively working to bring this threshold down to industry-compatible levels. If successful, the approach could integrate with existing chip manufacturing pipelines and accelerate the commercial deployment of energy-efficient AI hardware at a time when the industry's electricity demands are drawing increasing regulatory and environmental scrutiny.