The rise of AI and machine learning promises a revolution in how we live and work. Expert reasoning and mundane tasks alike will be completed for us in the cloud. But the cloud is not ethereal or abstract: it is a globe-spanning mass of physical infrastructure. Enabling this transformation will demand a huge expansion in data centre construction.
Data centres house the processing and computing power that the world relies on, and investors have pledged trillions for their construction. But their costs are environmental as well as financial. From energy to water to materials, data centres require vast resources to build and operate. The International Energy Agency (IEA) estimates that by 2030 data centres worldwide will consume 1,000 terawatt-hours of electricity. Today, an average data centre uses 300,000 litres of water a day.
However, sustainability in the industry is improving. In this episode, Josh Parker, head of sustainability at Nvidia, explains how improvements in chip design and accelerated computing have led to massive gains in efficiency over the last ten years: running the same AI workload now uses 100,000 times less power. Sustainability gains go beyond operational energy use. As Professor Deborah Andrews highlights, e-waste and water usage are key issues the industry must address. And, as Damien Dumestier explains, the scale of this new sector may push innovation all the way into space.
In this episode we explore what the industry is doing to become more sustainable, from improving energy efficiency, to building data centres in novel and remote locations, to using AI itself to improve sustainability across industries.
Guests
Professor Deborah Andrews, professor of Design for Sustainability and Circularity, London South Bank University
Josh Parker, head of sustainability, Nvidia
Damien Dumestier, head of the ASCEND feasibility study, Thales Alenia Space
Resources
IEA 2025 Energy and AI report