
Training large AI models consumes an astonishing amount of power. Who will be the next generation of AI "power manager"?

Last updated: 2026-03-24


Gartner's analysis of the top ten strategic technology trends for 2026 reveals a fundamental shift in the technology battlefield and sends a clear signal to all technology decision-makers: we are entering a highly interconnected world completely reshaped by artificial intelligence (AI). In this world, computing power is the new oil, while power supply and cooling systems are the critical lifelines ensuring that the oil field does not run dry or spontaneously combust.





According to Stanford's 2023 AI Index Report, a single training run of the GPT-3 large language model consumed roughly 1,287 megawatt-hours of electricity, equivalent (per the report's comparison) to the total consumption of 3,000 electric vehicles each driving 200,000 miles, and emitted about 552 tons of carbon dioxide in the process. Energy consumption on this scale burdens enterprises with high electricity bills and potential carbon-tax exposure, and it compounds carbon-emission pressure at the societal level. The key to breaking the deadlock between energy efficiency and cost lies precisely in an often-overlooked area: innovation and architectural upgrades in power supply technology.
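The figures above can be turned into a quick back-of-envelope calculation. This sketch uses only the numbers quoted from the report (1,287 MWh, 552 t CO2); the $0.10/kWh electricity price is an illustrative assumption, not a figure from the article.

```python
# Back-of-envelope cost and carbon arithmetic for one GPT-3 training run.
# ENERGY_MWH and CO2_TONNES come from the Stanford figures quoted above;
# PRICE_PER_KWH is an assumed industrial electricity rate for illustration.
ENERGY_MWH = 1287
CO2_TONNES = 552
PRICE_PER_KWH = 0.10  # USD, assumption

energy_kwh = ENERGY_MWH * 1000
electricity_cost = energy_kwh * PRICE_PER_KWH
carbon_intensity = (CO2_TONNES * 1000) / energy_kwh  # kg CO2 per kWh

print(f"Electricity cost per run: ${electricity_cost:,.0f}")
print(f"Implied grid carbon intensity: {carbon_intensity:.2f} kg CO2/kWh")
```

At the assumed rate, a single run costs on the order of $130,000 in electricity alone, before any cooling overhead or carbon cost is counted.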


The "death cross" of performance and power consumption:

Why has power supply become the Achilles' heel of AI?


The computing-power race among AI chips has entered a fierce stage, but this push along the edge of Moore's Law traces an unsettling trajectory: every leap in performance is accompanied by an even steeper rise in power consumption.


We are approaching the "power wall"—simply stacking transistors no longer yields proportional benefits, but instead turns data centers into "power hogs." This is not only a cost issue, but also a physical bottleneck for scaling.


The shift of liquid cooling from "optional" to "mandatory" is a direct symptom of this silent crisis. Yet a more fundamental and pressing challenge than heat dissipation is power delivery itself. An inefficient or unstable power supply unit (PSU) can throttle the performance of an AI computing cluster worth millions, or even bring it down in an instant.


Decoding the Next Generation Data Center:

A paradigm shift from "power supply" to "smart electricity"


In future AI data centers, power systems will no longer be relegated to a minor role, but rather become core intelligent infrastructure. This requires achieving three major leaps:


From centralized to distributed: Traditional centralized power supply is like "flood irrigation," with high line losses, making it difficult to meet the needs of refined rack-level management. The future trend is to evolve towards distributed and modular power supply at the rack level and even the chip level, allowing power to reach computing units more accurately and efficiently.
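The "line loss" point above follows directly from Ohm's law: conduction loss scales with the square of the current, so distributing the same power at a higher bus voltage, or over a shorter distance, wastes far less energy. The 12 kW load and 2 mΩ busbar resistance below are illustrative assumptions, not figures from the article.

```python
# Conduction loss for delivering a fixed power over a resistive bus:
# P_loss = I^2 * R, where I = P / V. Halving the current quarters the loss,
# which is why rack-level (higher-voltage, shorter-run) distribution wins.
def conduction_loss(power_w: float, bus_voltage_v: float, resistance_ohm: float) -> float:
    """I^2 * R loss in the bus when delivering `power_w` at `bus_voltage_v`."""
    current = power_w / bus_voltage_v
    return current ** 2 * resistance_ohm

for voltage in (12, 48):
    loss = conduction_loss(12_000, voltage, 0.002)
    print(f"{voltage:>2} V bus: {loss:7.1f} W lost in the busbar")
```

Under these assumed numbers, moving from a 12 V to a 48 V bus cuts busbar loss by a factor of sixteen for the same delivered power.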


From general-purpose to special-purpose: AI workloads have an extremely wide dynamic range; the jump from standby to full load can occur within milliseconds. Asking a general-purpose PSU to cope is like asking a family sedan to tow a heavy trailer: it simply cannot handle it. A dedicated AI PSU must combine very high peak-load capacity with ultra-fast dynamic response, like fitting the data center with a racing engine.
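To put a number on "milliseconds," consider the slew rate implied by a step from standby to full load on a 12 kW unit. The 10% standby level and 1 ms step time below are illustrative assumptions.

```python
# Power slew rate implied by a hypothetical load step on a 12 kW PSU:
# standby (10% of rated) to full load within 1 ms. All inputs are
# illustrative assumptions, not vendor specifications.
RATED_W = 12_000
STANDBY_FRACTION = 0.10   # assumed standby level
STEP_TIME_S = 0.001       # assumed 1 ms transition

step_w = RATED_W * (1 - STANDBY_FRACTION)   # size of the load step, W
slew_w_per_s = step_w / STEP_TIME_S         # required power slew rate, W/s

print(f"Load step: {step_w:,.0f} W in {STEP_TIME_S * 1000:.0f} ms")
print(f"Required power slew rate: {slew_w_per_s / 1e6:.1f} MW/s")
```

Even under these modest assumptions the PSU must ramp at over 10 MW/s without letting the output rail sag, which is exactly the regime where general-purpose designs fall over.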


From passive to proactive: Smart power supplies will deeply integrate AI algorithms to achieve predictive power management. By analyzing historical workload data, power resources can be scheduled and allocated in advance, transforming "passive response" into proactive optimization, maximizing energy efficiency while ensuring performance.
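The predictive idea above can be sketched in a few lines: forecast the next interval's draw from recent history and provision headroom before the load arrives, instead of reacting after a sag. This is a toy moving-average predictor under assumed parameters (4-sample window, 20% safety margin), not Avnet's or anyone's actual algorithm.

```python
# Toy sketch of predictive power management: forecast the next interval's
# rack draw as a moving average of recent samples, then pre-allocate that
# forecast plus a safety margin. Window and margin are assumptions.
from collections import deque

def provision(history, window: int = 4, margin: float = 1.2) -> float:
    """Return the power (W) to pre-allocate for the next interval."""
    recent = list(history)[-window:]
    forecast = sum(recent) / len(recent)   # moving-average prediction
    return forecast * margin               # headroom on top of the forecast

samples = deque([6000, 7000, 9000, 10000])  # recent rack draw in watts
print(f"Pre-allocated for next interval: {provision(samples):,.0f} W")
```

A production system would replace the moving average with a learned model of workload patterns, but the control structure, predict then pre-provision, is the same.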


The Power Density War:

How to redefine the ceiling of a 12kW power unit


In AI data centers, space is money, and efficiency is life. Improving power density—that is, providing greater and more stable power output in a smaller volume—has become the core of technological breakthroughs.


Behind this lies a comprehensive competition across materials science, semiconductor technology, and topology design. For example, the 12kW AI Cloud PSU reference design from Avnet's partner ON Semiconductor represents a significant breakthrough.


This solution uses advanced semiconductor materials such as silicon carbide (SiC), which can withstand higher voltages, temperatures and switching frequencies, pushing power conversion efficiency to new heights and significantly reducing energy waste during the conversion process.



Its 12 kW rated power is roughly 50% higher than mainstream solutions currently on the market, meaning a single rack can support a denser computing deployment. More importantly, it achieves this higher power density while maintaining excellent conversion efficiency (typically above 96%), directly cutting electricity costs and cooling load.
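What the efficiency figure means in practice: at a given output power, every percentage point of efficiency shows up as waste heat the cooling system must then remove. The comparison point of 92% and the assumption of 8,760 hours of continuous operation per year are illustrative; only the 12 kW / 96% figures come from the text above.

```python
# Waste heat inside the PSU for a given output power and efficiency:
# P_heat = P_out * (1/eta - 1). The 92% comparison and year-round
# operation are assumptions for illustration.
def waste_heat_w(output_w: float, efficiency: float) -> float:
    """Heat dissipated in the PSU itself at the given output power."""
    return output_w * (1 / efficiency - 1)

for eff in (0.96, 0.92):
    loss = waste_heat_w(12_000, eff)
    annual_kwh = loss * 8760 / 1000   # assumed continuous operation
    print(f"{eff:.0%} efficient: {loss:6.0f} W of heat, {annual_kwh:,.0f} kWh/yr wasted")
```

Under these assumptions, the four-point efficiency gap roughly doubles the PSU's own heat output, and that penalty is paid twice: once as wasted electricity and again as cooling load.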


This is not just a product iteration, but also sets a new performance benchmark for the power supply architecture of future ultra-large-scale AI data centers.


From single components to ecological empowerment


To address the extreme power challenges posed by AI computing power, single-point hardware innovation is insufficient; the key lies in building end-to-end capabilities "from the power grid to the chip." This requires technology solution providers to transform from component suppliers to ecosystem enablers. As a leading global technology distributor and solutions provider, Avnet not only helps customers significantly shorten product launch cycles with its highly mature reference designs, but also provides supply assurance, visibility, and agility through a comprehensive portfolio of services and solutions, precisely meeting customers' differentiated needs. This allows customers to focus more on AI business and algorithm innovation, without having to repeatedly develop underlying power engineering.


Technology leaders in 2026 must understand this: the smartest future will be rooted in the most solid energy foundation.


END


Interactive topics

This week's interactive topic is: From passive power supply to active intelligent power supply, what disruptive changes do you think AI algorithms can bring to power management? Feel free to leave a comment and participate in the interaction. We will randomly select 2 users to each receive an Avnet notebook + a set of bookmarks.

Interactive Gifts for This Issue

Avnet notebook and Avnet bookmark set







About Avnet

Avnet is a leading global technology distributor and solutions provider, committed for over a century to meeting the evolving needs of its customers. Through its global network of specialized and regional operations, Avnet supports customers and suppliers at every stage of the product lifecycle. Avnet helps companies of all types adapt to changing market environments, accelerating design and delivery during product development. Avnet's central position and unique perspective across the entire technology supply chain make it a trusted partner, helping customers solve complex design and supply chain challenges to achieve revenue faster. For more information about Avnet, please visit www.avnet.com.



