The Hidden Climate Costs of AI and Digital Infrastructure

Technology has advanced in leaps and bounds over the past decade, with research and innovation accelerating the application of artificial intelligence and digital infrastructure to everyday tasks. This progress comes at a cost, however: research shows that training a single AI model can emit as much carbon as five cars do over their lifetimes.

AI is increasingly being adopted everywhere from healthcare diagnostics to social media algorithms, while cloud computing has become the cornerstone of digital storage and processing, enabling businesses to scale operations seamlessly. Meanwhile, decentralized finance and blockchain innovation have driven the adoption of cryptocurrencies like Bitcoin and Ethereum. This digital expansion requires enormous amounts of electricity and water to sustain data centers, and researchers are warning of the significant environmental cost.

According to Capgemini, a digital carbon footprint comprises the CO2 emissions resulting from the production, use, and data transfer of digital devices and infrastructure.

A 2019 study by researchers at the University of Massachusetts Amherst analyzed the carbon footprint of training large Natural Language Processing (NLP) models. The study found that training one large model, a Transformer tuned with neural architecture search, emitted about 284,000 kg of CO2.

Moreover, AI models, particularly generative AI models like GPT-4, are becoming exponentially larger, which means that more data center energy is being used to train them and to process data.

Already, data centers account for 1% to 2% of overall global energy demand, similar to what experts estimate for the airline industry, Gadepally said. That figure is poised to skyrocket, given rising AI demands, potentially hitting 21% by 2030, when costs related to delivering AI to consumers are factored in.

“Water needed for cooling is another factor in data center sustainability calculations. As more data center equipment is squeezed into tighter physical quarters, it increases the requirement for aggressive cooling technologies, many of which draw from already stressed watershed areas,” says Vijay Gadepally, a senior scientist and principal investigator at MIT Lincoln Laboratory, where he leads the Supercomputing Center’s research initiatives.

Consequently, to make technological advancement sustainable, researchers are assembling a playbook for reducing emissions. They estimate these steps could shave between 10% and 20% off global data center electricity demand, and say that achieving this wouldn't require huge investments.

“You can employ some of these techniques and cut your operating expenses,” says Gadepally.

This can be done in many ways, including capping the power available to hardware and opting for more efficient hardware whenever possible. Just as organizations can replace traditional light bulbs with more efficient LED bulbs, they can make more energy-efficient choices about data center gear, according to scientists.
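The intuition behind power capping can be sketched with some simple arithmetic. The numbers below are hypothetical, not from the article: the point is that because energy equals average power times runtime, a modest power cap that only slightly lengthens a job can still cut its total energy use.

```python
def job_energy_kwh(power_watts: float, runtime_hours: float) -> float:
    """Energy consumed by one accelerator running one job, in kWh."""
    return power_watts * runtime_hours / 1000.0

# Uncapped GPU: 300 W for a 10-hour training job.
uncapped = job_energy_kwh(300, 10.0)        # 3.0 kWh

# Capped to 225 W; assume (hypothetically) the job runs 15% longer.
capped = job_energy_kwh(225, 10.0 * 1.15)   # ~2.59 kWh

savings = 1 - capped / uncapped             # ~14% less energy
print(f"Energy saved by capping: {savings:.0%}")
```

Under these assumed figures, trading a 15% slowdown for a 25% power reduction nets roughly a 14% energy saving per job, which compounds across thousands of data center workloads.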

Moreover, scientists are encouraging tech companies to rethink model training and to lean on cheaper, less robust AI models where they suffice. For example, Gadepally’s team decided to forgo the usual routine of training thousands of models to completion for a drug discovery application, given that most of those models would ultimately never be used. Instead, the team built a training speed estimation tool that tracks each model’s loss curve, allowing them to predict end-state accuracy after only 20% of the computation is complete.

“That allows us to quickly shave off about 80% of the compute, with no impact to the end model,” Gadepally said.
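A minimal sketch of the underlying idea (not Gadepally's actual tool): training loss curves often roughly follow a power law, so one can fit the early portion of the curve in log-log space and extrapolate to the end of the run, pruning models whose predicted final loss looks unpromising.

```python
import math

def fit_power_law(steps, losses):
    """Fit loss ~ a * step**(-b) via linear regression in log-log space."""
    xs = [math.log(s) for s in steps]
    ys = [math.log(l) for l in losses]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(intercept), -slope  # a, b

def predict_final_loss(steps, losses, total_steps):
    """Extrapolate the fitted curve to the full training budget."""
    a, b = fit_power_law(steps, losses)
    return a * total_steps ** (-b)

# Synthetic loss curve from the first 20% of a 10,000-step run,
# generated here to follow loss = 2.0 * step**(-0.3) exactly.
partial_steps = list(range(100, 2001, 100))
partial_losses = [2.0 * s ** -0.3 for s in partial_steps]

est = predict_final_loss(partial_steps, partial_losses, total_steps=10_000)
print(f"predicted final loss: {est:.3f}")  # ~0.126 for this synthetic curve
```

Real loss curves are noisier than this synthetic one, so a production tool would need a more robust fit, but the payoff is the same: runs that are clearly headed for poor accuracy can be abandoned early, saving the remaining 80% of the compute.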

Researchers also encourage designing software to be carbon-aware, automatically adjusting for variations in the grid’s carbon intensity throughout the day. By building intelligent energy-reduction strategies into scheduling systems, AI workloads that aren’t time-sensitive can be shifted automatically to run at different times or in different geographic zones, avoiding peak energy periods and achieving optimal savings.
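The scheduling side of this can be sketched as a small search: given a carbon-intensity forecast, a deferrable job is placed in the window with the lowest total emissions before its deadline. The forecast values below are hypothetical, standing in for data a real scheduler would pull from a grid-intensity service.

```python
def best_start_hour(forecast_gco2_per_kwh, job_hours, deadline_hour):
    """Pick the start hour that minimizes total grid carbon intensity
    for a job of job_hours that must finish by deadline_hour."""
    best_hour, best_cost = None, float("inf")
    for start in range(0, deadline_hour - job_hours + 1):
        cost = sum(forecast_gco2_per_kwh[start:start + job_hours])
        if cost < best_cost:
            best_hour, best_cost = start, cost
    return best_hour, best_cost

# Hypothetical 12-hour forecast (gCO2/kWh), dipping midday as solar peaks.
forecast = [450, 430, 400, 350, 300, 220, 180, 190, 260, 340, 420, 460]
start, cost = best_start_hour(forecast, job_hours=3, deadline_hour=12)
print(start)  # 5: the 3-hour window [220, 180, 190] is the cleanest
```

Shifting across geographic zones works the same way, with one forecast per candidate region and the job routed to whichever zone-and-window combination scores lowest.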

To accelerate the adoption of energy-efficient AI initiatives, there’s work to be done, including collaborating within the tech community to collect data, build benchmarks, and rethink the current mindsets that cast bigger AI models and data as better.

Big Tech must prioritize green innovations, investing in energy-efficient AI models, running data centers on 100% renewable energy, and adopting sustainable cooling technologies. Moreover, policymakers should enforce stricter energy regulations, promote carbon taxes for high-emission digital operations, and incentivize greener technologies.

The future of technology does not have to come at the cost of the planet.

“We need to think more about how we can get to the same answer but add a bit of intelligence to make AI processing more energy efficient,” says Gadepally.
