Brain-inspired networks boost AI performance and cut energy use
A study shows that mimicking the brain’s structure can make AI systems more powerful and sustainable, improving performance with fewer connections.
Researchers at the University of Surrey have developed a new method to enhance AI by imitating how the human brain connects information. The approach, called Topographical Sparse Mapping, links each artificial neuron only to nearby or related ones, replicating the brain’s efficient organisation.
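The core idea — connecting each neuron only to topographically nearby ones — can be illustrated with a distance-based connectivity mask. This is a minimal sketch, not the paper's actual implementation: the function name, the 1-D neuron coordinates, and the `radius` parameter are all illustrative assumptions.

```python
import numpy as np

def topographic_mask(n_in, n_out, radius=0.05):
    """Illustrative sketch: connect each output neuron only to inputs
    that lie nearby on a shared 1-D topographic map. The real method's
    layout and neighbourhood rule may differ."""
    in_pos = np.linspace(0.0, 1.0, n_in)    # input neuron coordinates
    out_pos = np.linspace(0.0, 1.0, n_out)  # output neuron coordinates
    # mask[i, j] is True when input j lies within `radius` of output i
    return np.abs(out_pos[:, None] - in_pos[None, :]) <= radius

rng = np.random.default_rng(0)
mask = topographic_mask(100, 50)
weights = rng.normal(size=mask.shape) * mask  # zero out distant links
print(f"connection density: {mask.mean():.2f}")
```

Because most entries of the mask are zero, the layer stores and updates far fewer weights than a dense layer of the same size, which is where the efficiency gain comes from.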
According to findings published in Neurocomputing, the structure reduces redundant connections and improves performance without compromising accuracy. Senior lecturer Dr Roman Bauer said intelligent systems can now be designed to consume far less energy without sacrificing capability.
Training large models today often requires over a million kilowatt-hours of electricity, a trend he described as unsustainable.
An advanced version, Enhanced Topographical Sparse Mapping, introduces a biologically inspired pruning process that refines neural connections during training, similar to how the brain learns.
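The pruning idea — gradually removing weak connections as training proceeds — can be sketched with generic magnitude-based pruning. The study's actual, biologically inspired rule is not reproduced here; the `prune_step` function and the 10% pruning fraction are stand-in assumptions for illustration only.

```python
import numpy as np

def prune_step(weights, mask, fraction=0.1):
    """Stand-in for biologically inspired pruning: permanently remove
    the weakest `fraction` of the surviving connections. The paper's
    actual pruning criterion may differ."""
    alive = np.abs(weights[mask])
    cutoff = np.quantile(alive, fraction)  # threshold for weakest links
    mask &= np.abs(weights) > cutoff       # drop them permanently
    weights *= mask                        # zero out pruned weights
    return weights, mask

rng = np.random.default_rng(1)
weights = rng.normal(size=(50, 100))
mask = np.ones_like(weights, dtype=bool)

for epoch in range(5):
    # ... a real training loop would update `weights` here ...
    weights, mask = prune_step(weights, mask)

print(f"surviving connections: {mask.mean():.2f}")
```

Each round removes roughly a tenth of the remaining links, so sparsity compounds over training — loosely analogous to how the brain prunes unused synapses as it learns.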
Researchers believe that the system could contribute to more realistic neuromorphic computers, which simulate brain functions to process data more efficiently.
The Surrey team said the approach could advance generative AI systems and pave the way for sustainable large-scale model training. Their work highlights how lessons from biology can shape the next generation of energy-efficient computing.
