
AI: The Next Big Power Guzzler?

AI may be the coolest kid on the fancy tech-town block, but it is not literally ‘cool’. Is that something to worry about? And is now the time?

DQINDIA Online

Where is the world’s largest refrigerator? Well, it is not in some mega grocery barn or parked at an ice-cream fairyland. It is whirring away busily at some huge nuclear fusion lab, juicing up a quantum lab, or cooling down the cylinders of a supercomputer.


So it was no surprise when someone first asked: hey, how much power does AI slurp if it has to run all those large models and churn out that many answers, that fast? The question is not boring. Whether it is too soon or too serious – now that calls for some asking around.

AI – Not an Ermine, For Sure

Let’s start with Indranil Bandyopadhyay, principal analyst at Forrester (who packs special domain expertise in areas like data science and AI). “AI’s power consumption angle is a valid question to ask – especially after the heightened interest seen with Gen AI. But we cannot single it out – it is contributing to the carbon footprint that has been created by other industries already. That said, large models – especially in computer vision areas – and the rise of multi-modality can add more power burden on today’s models.”



IT infrastructure supporting AI models consumes 50-60% of its total power demand when it is not doing work. - Jay Dietrich, Research Director of Sustainability, Uptime Institute

Uptime Intelligence estimates that generative AI annualized energy accounted for around 2.3 percent of the total grid power consumption by data centers in the first quarter of 2024. And this could reach 7.3 percent by the end of 2025.


When fully operational, an AI training system demands a large amount of power – up to two megawatts (MW) – Jay Dietrich, Research Director of Sustainability at Uptime Institute, estimates in a back-of-the-envelope glimpse. “In the context of a 300 to 500 MW cloud data center complex run by large hyperscalers, this is a small portion of the installed power capacity. As more and larger training systems are deployed, they will more quickly grow the power demand at a facility in a smaller footprint compared to traditional compute.”


Here’s how and why. As Dietrich explains further, the big LLMs are likely utilizing brute force computation in their model with significant idle time due to unoptimized software and workload management within their operation. “Available data and informal, anecdotal information suggest that IT infrastructure supporting AI models consumes 50-60 percent of its total power demand when it is not doing work. The average utilization of the compute capability is around 50 percent during the model run. These conditions indicate that firstly, there is a lot of wasted power; secondly, an opportunity exists to construct and better utilize the available compute time. Uptime sees opportunities to significantly increase the work delivered per megawatt hour of power consumed through better software design.”
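The numbers Dietrich cites can be turned into a rough back-of-envelope calculation. The sketch below uses only the article’s own figures (a 2 MW training system, a 50-60 percent idle draw, roughly 50 percent utilization) plus a simple linear power model that is our assumption, not Uptime’s.

```python
# Back-of-envelope check of the idle-power figures quoted above. All inputs
# come from the article; the linear power model itself is an assumption.

PEAK_KW = 2000          # a 2 MW training system, as cited above
IDLE_FRACTION = 0.55    # midpoint of the quoted 50-60% idle draw
UTILIZATION = 0.50      # average compute utilization during a model run

# Assumed model: power = idle floor + utilization-proportional draw.
avg_power_kw = PEAK_KW * (IDLE_FRACTION + (1 - IDLE_FRACTION) * UTILIZATION)
useful_power_kw = PEAK_KW * (1 - IDLE_FRACTION) * UTILIZATION
wasted_share = 1 - useful_power_kw / avg_power_kw

print(f"average draw: {avg_power_kw:.0f} kW")
print(f"share of energy not doing useful work: {wasted_share:.0%}")
```

Under these assumptions, well over half the energy consumed during a run does no useful work – which is exactly the efficiency headroom Dietrich points to.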


It’s not just data but the chips and bricks around it, too, that are pushing AI toward the carbon side. Anushree Verma, Director Analyst, Gartner, explains how GenAI models require training with massive amounts of data, which uses much electricity and cooling water. “This creates sustainability issues and a carbon footprint problem. For example, training GPT-3.5 required a supercomputer with 285,000 processor cores and 10,000 graphics processing unit chips, which consumed more than 200MWh of power. However, there are other unknown factors (like the exact number of runs or the kind of infrastructure built to deploy the model) that will have amplified the power consumption even further.”


When cars came, they were not so focused on pollution. But with time, everything changed. The same progression can happen with AI. - Indranil Bandyopadhyay, Principal Analyst, Forrester


Raghavendra Rengaswamy, EY Global Delivery Services (GDS) Consulting Data and Analytics Leader, avers that the significant computational appetite of AI technologies has driven up demand – and valuations – for those who provide the picks and shovels to run the AI machinery.


A carbon tax on AI companies and data centers is a viable strategy to address the environmental costs of high-powered AI operations. - Saurabh Rai, CEO, Arahas


Ramprakash Ramamoorthy, Director of AI Research, ManageEngine, Zoho Corp, points to the numbers. “A study by the University of Massachusetts Amherst highlighted that training a single AI model can emit as much carbon as five cars over their lifetimes. This stark comparison underscores the importance of considering the environmental implications of AI development.” He adds that with the emergence of large language models (LLMs) that are GPU-hungry, the need for sustainable computing practices has become more crucial than ever.


The clarion call is clear: AI is not the harbinger of a carbon apocalypse but a potent ally in our quest for sustainability. - Raghavendra Rengaswamy, EY Global Delivery Services (GDS)


All this is only going to get more challenging.

Dhirender Mishra, Associate Vice President, Growth Advisory, Aranca (a research firm) observes that as AI models advance, their energy requirements are expected to grow dramatically; achieving a tenfold improvement in AI model efficiency could result in a power demand surge of up to 10,000 times. For example, training GPT-4 required over 50 gigawatt-hours of electricity, nearly 50 times the amount consumed to train GPT-3.

AI’s escalating carbon footprint cannot be overlooked, particularly as AI systems grow in complexity and energy demand, Saurabh Rai, CEO, Arahas (a player in Geospatial, AI, and Sustainability Technology services) warns. While tech companies are making strides in optimising energy efficiency and developing less power-intensive AI algorithms, the fundamental issue remains significant.

AI – Not an Energy Hog, Either

But – in all fairness – we cannot forget that AI is also the reason clean energy efforts can now run in a smarter, bigger, and faster way. And Big Tech is – at least, claiming to be – investing heavily in fusion energy, clean energy sources, and offsets to support its AI ambitions.

We may be tempted to think that one-upmanship is another reason that makes AI an energy culprit, especially when all the top players are vying for the ‘I did it first’ spot.

Bandyopadhyay, however, dismisses that angle. “The focus of most players is on how well the models perform – and not so much on environmental angles.” He debunks the myth that AI is out there to eat energy. “AI evolution is just like any technology advancement seen earlier – like automobiles and aviation. It’s at a nascent stage now. But just like what happened with cars, we will see a progression towards better products which are high on efficiency and less of power guzzlers.”

Rengaswamy reminds how, paradoxically, AI is a beacon of hope for slashing carbon emissions. “It is not just about replacing surveys with drones, thereby curtailing transportation emissions; it is about the overarching narrative of efficiency. AI is the maestro of optimisation, from bolstering the efficacy of renewable energy sources like wind and solar to being the cerebral cortex of electric vehicles. These are not mere incremental changes; they are quantum leaps toward a greener future. Data centers, once power-hungry behemoths, are on the cusp of a renaissance as AI-driven efficiency becomes the norm.”

Data center operators had significant energy consumption growth projections prior to the emergence of AI. AI accelerates both the timing of data center buildout projections and the expected energy capacity requirements of new construction. However, a range of factors suggests that current growth projections are not achievable, Dietrich wipes away some fog here. Why? Well, the industry has not identified the ‘killer’ AI application(s) that will fuel the revenue streams needed to support the levels of investment projected by hyperscalers and colocation data center operators. “In addition, investment projections beyond two years are notoriously unreliable – changes in revenue projections, the state of the economy, competitive position, and other factors can quickly turn a bullish investment outlook into a bearish one,” Dietrich argues.


Specialized hardware can significantly reduce the energy needed for AI computations, making AI operations sustainable. - Ramprakash Ramamoorthy, Zoho Corp.

Saying Hi to the Grey Rhino – in Time

Whichever side AI leans towards, we may, nonetheless, start thinking about making AI greener rather than greedier.

Solutions will happen in multi-pronged ways, Bandyopadhyay augurs. “For example: use of edge computing. Use of smaller models. Use of transfer learning. Getting rid of parameters that are not needed and investing in fine-tuning models. The supply side will play a big role here too – like the use of renewables in sourcing energy.” And, as Bandyopadhyay reminds about a practical side, it will not be because some company takes the high moral ground. “Energy is directly related to costs. So we will see the advent of AI products that give value at lower costs – which will have a corresponding low energy angle. Also, there is a high probability that the next S-curve in AI – when it comes – will be about less costly solutions that consume less compute muscle.”


Small models can be an immediate answer. As Dietrich seconds, there is an undercurrent of information noting significant development invested in small, issue-specific models. “These models are task-focused, reputed to be more efficient, and require much less energy to train and query. Deployment of these models will likely bend the projected power growth curve.”


By 2028, 30% of GenAI implementations will be optimized using energy-conserving computational methods, driven by sustainability initiatives. - Anushree Verma, Director Analyst, Gartner

Rai stresses that we should not only adopt renewable energy and innovate in energy-efficient technologies, but also consider financial measures such as taxing data centers that fail to meet environmental standards. “This would promote accountability and incentivise greener practices. A carbon tax would encourage companies to innovate to reduce their carbon emissions, rather than simply passing the cost on to consumers. This tax must be structured to support sustainability without unduly stifling innovation. Funds collected could be reinvested into sustainable technology grants and renewable energy projects, enhancing the sector’s overall sustainability.”


Extend model lifespans and apply compression techniques to decrease energy use associated with frequent retraining. - Anay Pathak, Dell Technologies

There’s also some hope in areas like AI inference and new model architectures. Dietrich opines that AI inference has a different, less intense power profile than AI training. “It has been noted, however, that a Google search consumes 0.3 watt-hours while a ChatGPT request uses 2.9 watt-hours. Google search has spent over a decade improving the energy efficiency of a single search. ChatGPT searches will likely ride an energy efficiency curve over the next several years, significantly reducing the energy demand per search.” But there are many uncertainties about the projected power growth curve. Uptime sees constraints driven by lead times to permit and design facilities and to procure and install equipment for data centers and electricity generation. Uptime also perceives significant efficiencies to be harvested in AI training and inference queries by focusing on software improvements and workload placement management to increase the work delivered per MWh consumed.
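The per-query gap Dietrich mentions is easy to quantify. The figures below are the article’s (0.3 Wh per Google search, 2.9 Wh per ChatGPT request); the query volume is purely hypothetical, chosen only to illustrate scale.

```python
# Quick arithmetic on the per-query energy figures quoted above.
# The query volume is hypothetical, used only to show the scale of the gap.

WH_GOOGLE = 0.3    # watt-hours per Google search (article's figure)
WH_CHATGPT = 2.9   # watt-hours per ChatGPT request (article's figure)

ratio = WH_CHATGPT / WH_GOOGLE
queries = 1_000_000                                   # hypothetical daily volume
extra_kwh = (WH_CHATGPT - WH_GOOGLE) * queries / 1000 # Wh -> kWh

print(f"a ChatGPT request uses ~{ratio:.1f}x the energy of a search")
print(f"per million queries, that is ~{extra_kwh:.0f} kWh extra per day")
```

Roughly a ten-to-one ratio today – which is why an efficiency curve like the one Google search rode for a decade matters so much here.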

Techniques such as model pruning, quantization, and knowledge distillation aim to reduce the size and power requirements of AI models without compromising performance. These methods not only decrease energy consumption but also make AI more accessible and affordable.
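Of the techniques above, quantization is the simplest to see in miniature. The sketch below is a toy illustration with made-up weight values, not any production quantizer: it maps float weights to 8-bit integers (roughly a 4x storage saving over float32), with reconstruction error bounded by half a quantization step.

```python
# Minimal sketch of symmetric post-training weight quantization.
# The weight values are illustrative; real models have millions of them.

def quantize_int8(weights):
    """Map a list of floats onto the int8 range [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the quantized integers."""
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.05, 0.98, -0.61]   # toy "model" weights
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))

assert all(-128 <= v <= 127 for v in q)      # fits in one byte per weight
assert max_err <= scale / 2 + 1e-9           # error <= half a quantization step
```

Storing one byte per weight instead of four cuts both memory traffic and the energy per inference, which is exactly the lever these methods pull at scale.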


Another promising approach is the development of specialised AI hardware, Ramamoorthy shines the light on chips here. “Chips designed specifically for AI tasks, like FPGAs, Google’s Tensor Processing Units (TPUs), and NVIDIA’s newer generation GPUs, offer higher performance per watt compared to traditional CPUs.”

Rai also puts chip and algorithm design at the centre of the table here. “Advances here can significantly decrease the energy consumption of AI technologies. While concerns exist about potential trade-offs in speed and competitiveness, the reality is that efficiency can coexist with performance. By investing in next-generation technologies like neuromorphic computing and optimising algorithmic efficiency, we can reduce environmental impacts without sacrificing capabilities.”


Anay Pathak, Global Business Director, Data Protection & Cyber Resilience (Alliances), Dell Technologies offers more suggestions. “Design AI models with energy efficiency in mind by optimising neural networks and using power-efficient hardware like GPUs and TPUs. Prune unnecessary model components and quantize weights to reduce computation and memory requirements. Use data augmentation and generate synthetic data to reduce reliance on large datasets.”

There is a growing trend towards federated learning, which allows AI models to be trained across multiple devices using decentralised data, Ramamoorthy recommends. “This method reduces the dependence on bigger data centers, thereby cutting down on energy consumption. By distributing the training process, federated learning minimises the environmental impact while enhancing data privacy and security.”
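The idea Ramamoorthy describes can be sketched in a few lines. This is a hedged toy version of federated averaging (the FedAvg scheme): each "device" fits a one-parameter model on its private data, and only the parameters – never the raw data – are averaged centrally. The data, learning rate, and round counts are all illustrative.

```python
# Toy sketch of federated averaging: devices train locally on private data,
# a server averages only the model parameters between rounds.

def local_update(weight, data, lr=0.01, steps=20):
    """Fit y = w*x on one device's private data via gradient descent."""
    for _ in range(steps):
        grad = sum(2 * (weight * x - y) * x for x, y in data) / len(data)
        weight -= lr * grad
    return weight

def federated_average(weights):
    """Server step: average the locally trained parameters."""
    return sum(weights) / len(weights)

# Two devices hold disjoint samples of the same relation y = 3x.
device_a = [(1.0, 3.0), (2.0, 6.0)]
device_b = [(3.0, 9.0), (4.0, 12.0)]

global_w = 0.0
for _ in range(5):  # communication rounds
    local_ws = [local_update(global_w, d) for d in (device_a, device_b)]
    global_w = federated_average(local_ws)

assert abs(global_w - 3.0) < 0.01  # converges to the shared slope
```

Only one float per device crosses the network each round, while the training samples stay put – the privacy and decentralisation property the paragraph above describes; the energy saving comes from not hauling raw data into a central data center.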

Whether these steps will work or not is another question altogether. Verma gives a pragmatic prism. “Technology players are aware of it, but it’s hard to make AI modeling more efficient, especially as most organizations that use turnkey solutions use third-party modeling services. Users can control only the environment in which they model — such as using a high-performance, high-efficiency chip (that is, custom silicon) designed for AI modeling, or using renewable energy or low-carbon electricity during training. For AI infrastructure, there are some core areas to look at optimising, such as semiconductors, software, and cooling technologies. Enterprises will adopt different approaches to optimise GenAI implementations using energy-conserving computational techniques, driven by their organisational priorities and the importance their organisation places on sustainable business practices.”

Major tech firms must lead by example, demonstrating that innovation can align with ecological responsibility, Rai zooms out on the bigger picture. “This involves not only adopting sustainable practices but also being transparent about energy consumption and carbon emissions. India’s journey towards integrating AI with sustainability is fraught with challenges including high energy demands, significant socio-economic disparities, and environmental concerns. This could involve stricter regulations on AI-driven operations, incentives for clean energy usage, and heavy penalties for non-compliance. Furthermore, developing local AI solutions that cater to India’s unique environmental and social landscape will be crucial.”

Right now, everything is new, Bandyopadhyay gives a reality-check. “As long as someone is funding it, players will not mind the energy angle too much. But as things settle down, we will see a clearer picture.”

“It is about harnessing AI’s transformative power to not just reimagine but rebuild our world, where increased efficiency is synonymous with reduced emissions. This is the future we must invest in, a future where AI and the environment are not at odds, but in harmony,” Rengaswamy hopes.

There is still time to find light at the end of this tunnel. But there is always light when we open a fridge. The problem is that most of the time we open it not because we are hungry but because we want to eat something before someone else does. Sadly, that ‘sibling logic’ applies to all refrigerators. Whether in our homes. Or in the big ones – housed in a tech backyard.

 
