“Artificial Intelligence will create more millionaires in the next five years than the internet did in twenty years… The continent must urgently shift from being just a source of raw materials to being a creator of intellectual property and innovation,” said Prof. Arthur Mutambara.
The masterclass on AI, presented by the Director of the Institute for the Future of Knowledge at the University of Johannesburg, highlights how technological innovation can be leveraged to drive growth, address local challenges (for example, by using data centers to support climate modeling), and position the continent for the future.
AI’s environmental burden
As the everyday use of AI has exploded in recent years, so have the energy demands of the computing infrastructure that supports it. For all their forward-looking promise, these large data centers carry a significant environmental toll, drawing gigawatts of power and requiring vast amounts of water for cooling.
In 2025, according to Cornell University researchers, AI-driven data centers guzzled up to 765 billion liters of water, equivalent to the world’s bottled water consumption. This is especially problematic in arid regions; about two-thirds of new data centers built since 2022 are in water-stressed areas.
Moreover, the AI boom has pushed global data center energy use past 1% of worldwide electricity demand in 2025, a share expected to triple by 2030; nearly 9,000 facilities operate today, according to the Environmental and Energy Study Institute.
A single medium-sized data center can use around 110 million kilowatt-hours of electricity per year, comparable to powering 10,000 homes.
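The homes-powered comparison above can be sanity-checked with back-of-envelope arithmetic. The household figure below is an assumption on my part (roughly the average annual U.S. residential electricity use, about 11,000 kWh); the data center figure is the one reported above.

```python
# Rough check of the "110 million kWh ~= 10,000 homes" comparison.
DATA_CENTER_KWH_PER_YEAR = 110_000_000  # reported medium-sized facility
HOUSEHOLD_KWH_PER_YEAR = 11_000         # assumed avg. U.S. home per year

homes_powered = DATA_CENTER_KWH_PER_YEAR / HOUSEHOLD_KWH_PER_YEAR
print(f"Equivalent households: {homes_powered:,.0f}")  # prints 10,000
```

The comparison holds almost exactly under that assumed household average, which is why the round "10,000 homes" figure is widely quoted.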
Additionally, these high energy needs often fall back on fossil fuels during peak times, straining local infrastructure and contributing to broader environmental degradation. Research shows that data centers contributed about 0.5% of global CO2 emissions in 2025, with AI systems alone emitting up to 80 million tons.
The cumulative environmental effects of rapidly growing data center demand have brought global disparities in sustainability under scrutiny. In 2025, for instance, U.S. data centers alone consumed enough water (731 to 1,125 million cubic meters) to meet the annual household needs of 6 to 10 million Americans.
In contrast, Africa’s data center scene remains small and emerging compared to Europe or the US. Kenya, for instance, is pursuing a sustainability-first approach that has so far sidestepped the intense conflicts seen elsewhere.
Major projects, like the Microsoft-G42 geothermal-powered data center in Olkaria (announced in 2024 and advancing into 2025) and EcoCloud’s similar facility, deliberately use abundant local geothermal energy and incorporate “state-of-the-art water conservation technology.” This shows how strategic siting and renewables can avoid the pitfalls faced by arid regions and developing countries.
However, broader African studies flag risks from indirect water use (via power generation) in water-stressed countries, a concern for Kenya, where recurring droughts already hit farmers hard.
Innovation outrunning regulation
The policy lag in regulating AI and data centers’ environmental impacts arises because existing frameworks were designed for traditional industries, not the exponential growth of computing power or AI’s unique resource demands.
Golestan Sally Radwan, UNEP’s Chief Digital Officer, warned, “Governments are racing to develop national AI strategies but rarely do they take the environment and sustainability into account. The lack of environmental guardrails is no less dangerous than the lack of other AI-related safeguards.”
Regulations from decades ago, like the U.S. National Environmental Policy Act, impose lengthy permitting processes that delay infrastructure upgrades, while AI training demands double every few months, driving massive energy and water consumption that outstrips clean energy scaling.
Experts explain that this mismatch leaves AI-specific intensities, such as workload-driven power surges and evaporative cooling needs, largely unaddressed, as traditional metrics fail to capture the energy-water-climate nexus of modern data centers.
Compounding the gap is the absence of mandatory environmental reporting for AI systems and global sustainability standards for data centers, resulting in opaque industry practices and inconsistent oversight.
While the rapid growth of AI and data centers presents real environmental challenges, innovative solutions are emerging that significantly reduce impacts without sacrificing progress.
Industry leaders are prioritizing energy-efficient model design, with breakthroughs like specialized hardware and optimized architectures cutting power use dramatically.
Google’s latest TPUs, for example, deliver 30x greater energy efficiency for AI inference compared to earlier generations. Techniques such as model pruning and distillation are trimming unnecessary computations, while new chips from companies like IBM and NVIDIA enable trillion-parameter models to run at fractions of previous energy costs.
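The pruning technique named above works by zeroing out a network's smallest-magnitude weights so the remaining computation is sparser and cheaper. A minimal sketch with NumPy, not tied to any specific framework or vendor tooling, might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))  # stand-in for a trained layer's weights

def magnitude_prune(w, sparsity=0.5):
    """Zero out the smallest-magnitude weights (a common pruning heuristic)."""
    k = int(w.size * sparsity)                        # how many weights to drop
    threshold = np.sort(np.abs(w), axis=None)[k - 1]  # k-th smallest magnitude
    return np.where(np.abs(w) <= threshold, 0.0, w)

pruned = magnitude_prune(weights, sparsity=0.5)
print(f"Zeroed fraction: {np.mean(pruned == 0):.0%}")  # prints 50%
```

In practice, production systems pair pruning like this with fine-tuning to recover accuracy, and with hardware that can actually skip the zeroed computations; the sketch only shows the core idea of discarding low-magnitude weights.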
Meanwhile, a promising shift is toward smaller, task-specific AI models, which can slash energy consumption by up to 90% for targeted applications like translation or summarization while often matching or exceeding the performance of massive general-purpose models.
