The processing power required to train and run AI systems can consume vast amounts of energy.
In 2019, researchers at the University of Massachusetts Amherst found that the carbon footprint of training an AI system for natural language processing (NLP) could be up to five times as high as the lifetime emissions of an average American car.
Scientists are working to reduce the energy demands of AI by creating more efficient software (algorithms) and hardware (chips).
US manufacturer Groq claims that its tensor streaming processor chip is 50% more energy efficient than its nearest rival.
AI also has potential for increasing energy efficiency across the tech industries and beyond.
Since 2016, a DeepMind AI system has been used to reduce the energy used in cooling Google’s data centres by 30%.
Similar systems could be used to cut energy use across industry and help tackle climate change.
Nicole Junkermann, NJF Capital founder, presents an A-Z of Artificial Intelligence – a series of short videos focused on key areas of interest in this hugely topical and consequential field.