
Energy-Efficient Artificial Intelligence Inspired by Human Brain Functionality

Artificial Intelligence (AI) is making significant strides in performing complex calculations and analysing data more swiftly than humans. Yet these capabilities demand substantial energy, in sharp contrast to the human brain, a marvel of efficiency that executes highly complex computational tasks while consuming minimal energy. As technology firms race to build ever-larger models, researchers at Texas A&M University, including Dr. Suin Yi from the College of Engineering, are pioneering a transformative approach. Their "Super-Turing AI" mimics the human brain's efficiency by integrating processes that conventional systems handle separately, eliminating much of the extensive data transfer that current AI systems require.

Today's AI systems, such as those developed by OpenAI, require significant computational resources and operate within vast data centres that consume enormous amounts of electricity. Dr. Yi points out the stark energy disparity: such a centre may consume up to a billion watts, whereas the human brain uses a mere 20 watts. The environmental impact and high operational costs of these data centres make a more sustainable approach to AI imperative as the technology becomes ever more integrated into our lives.

Dr. Yi and his team believe that mimicking the brain’s neural processes is the key to overcoming these challenges. Learning and memory functions are intertwined in the human brain, facilitated by synapses that enable neurons to transmit signals. These synaptic connections are modified through a process known as synaptic plasticity, which is essential for forming and altering neural circuits to store and retrieve information. This integrated approach contrasts sharply with conventional computing systems, where training and memory storage occur separately, creating inefficiencies.

Super-Turing AI is groundbreaking because it eliminates the need to shuttle large amounts of data between separate hardware components. The model also avoids the computationally intensive backpropagation method, which is effective but biologically implausible. Instead, it utilises mechanisms such as Hebbian learning and spike-timing-dependent plasticity, which more closely reflect the brain's learning processes and could substantially reduce the computational power required.
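The Hebbian principle mentioned here is often summarised as "neurons that fire together wire together": a connection strengthens whenever the two neurons it links are active at the same time. A minimal sketch of such a local update rule, written in Python as an illustration (this is an assumption for clarity, not the HfZrO synaptic resistor circuit described in the paper), might look like this:

```python
import numpy as np

def hebbian_update(weights, pre, post, lr=0.01):
    """Strengthen each weight in proportion to the product of the
    activities of the neurons it connects (a basic Hebbian rule)."""
    return weights + lr * np.outer(post, pre)

# Toy network: two input (pre-synaptic) neurons, one output neuron.
w = np.zeros((1, 2))
pre = np.array([1.0, 0.0])   # only the first input neuron is active
post = np.array([1.0])       # the output neuron fires

for _ in range(5):
    w = hebbian_update(w, pre, post)

# Only the connection whose neurons were co-active has strengthened.
print(w)  # → [[0.05 0.  ]]
```

Because each weight changes using only the local activity of the two neurons it connects, no global error signal has to be propagated backwards through the network, which is part of what makes such rules attractive for low-power hardware.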

The practical applications of this research are already demonstrating significant benefits. For instance, a drone equipped with a circuit based on these principles adeptly navigated a complex environment autonomously, learning and adapting in real time. The approach proved faster, more efficient, and less energy-intensive than traditional AI methods, showcasing the potential of Super-Turing AI to enhance operational efficiency across various fields.

The implications of Dr. Yi's research are profound for the future of AI. As industries push to develop more capable models, the limitations imposed by current hardware and energy constraints become increasingly apparent; some new AI applications may even require the construction of additional data centres, amplifying environmental and economic concerns. Looking forward, Super-Turing AI represents a critical step towards sustainable AI development. By redesigning AI architectures to emulate the human brain's efficiency, researchers can address both economic and environmental challenges, paving the way for a new generation of smarter, more efficient, and environmentally responsible AI that benefits people and the planet alike.

More information: Suin Yi et al., "HfZrO-based synaptic resistor circuit for a Super-Turing intelligent system," Science Advances (2025). DOI: 10.1126/sciadv.adr2082

Journal information: Science Advances

Provided by Texas A&M University