If you own a computer that’s not mobile, it’s almost certain that it receives its power in some form from a mains wall outlet. Whether it’s 230 V at 50 Hz or 120 V at 60 Hz, where once there might have been a transformer and a rectifier there’s now a switch-mode power supply that delivers low-voltage DC to your machine. It’s a system that’s efficient and works well on the desktop, but in the data center even that efficiency is no longer enough. IEEE Spectrum takes a look at newer data centers that are moving towards DC power distribution, raising some interesting points which bear a closer look.
A traditional data center has many computers which in power terms aren’t much different from your machine at home. They get their mains power at distribution voltage — probably 33 kV AC where this is being written — they bring it down to a more normal mains voltage with a transformer just like the one on your street, and then they feed a battery-backed uninterruptible power supply (UPS) that converts from AC to DC, and then back again to AC. The AC then snakes around the data center from rack to rack, and inside each computer there’s another rectifier and switch-mode power supply to make the low-voltage DC the computer uses.
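The cost of that chain is easy to see if you multiply the stages together. The per-stage efficiencies below are illustrative guesses, not figures from the Spectrum piece, but they show how even individually decent conversion steps compound into real waste:

```python
# Illustrative (not measured) per-stage efficiencies for the
# traditional AC distribution chain described above.
stages = {
    "street transformer (33 kV -> mains)": 0.98,
    "UPS rectifier (AC -> DC)": 0.95,
    "UPS inverter (DC -> AC)": 0.95,
    "server PSU (AC -> low-voltage DC)": 0.92,
}

overall = 1.0
for name, efficiency in stages.items():
    overall *= efficiency

# Four stages at 92-98% each leave only about 81% end to end.
print(f"overall efficiency: {overall:.1%}")
```

With these numbers, nearly a fifth of the power drawn from the grid is lost as heat before a single transistor switches — heat the cooling plant then has to remove, costing yet more power.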
The increasing demands of data centers full of GPUs for AI processing have raised power consumption to the extent that all these conversion steps now cost a significant amount of wasted power. The new idea is to convert once to DC (at a rather scary 800 volts) and distribute it directly to the cabinet, where the computer uses a more efficient switch-mode converter to reach the voltages it needs.
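The scary-sounding 800 V isn’t arbitrary: for a fixed power, doubling the distribution voltage halves the current and so quarters the resistive loss in the cabling. A quick sketch, with a hypothetical 100 kW rack feed and a made-up cable resistance:

```python
def i2r_loss(power_w: float, volts: float, resistance_ohm: float) -> float:
    """Resistive (I^2 R) loss in a cable run carrying power_w at volts."""
    current = power_w / volts
    return current ** 2 * resistance_ohm

# Hypothetical numbers: a 100 kW rack feed over a run with 10 milliohms
# of total resistance. Not taken from the article.
P, R = 100_000, 0.01
loss_400 = i2r_loss(P, 400, R)  # 250 A -> 625 W lost in the cable
loss_800 = i2r_loss(P, 800, R)  # 125 A -> 156.25 W, a quarter of the loss
```

Higher voltage also means thinner, cheaper busbars for the same loss budget — the same trade-off that decided the original AC-versus-DC wars, now playing out inside the building.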
It’s an attractive idea not just for the data center. We’ve mused on similar ideas in the past and even celebrated a solution at the local level. But given the potential ecological impact of these data centers, it’s a little hard to get excited about the idea in this context. The fourth of our rules for the responsible use of a new technology comes into play. Fortunately we think that both an inevitable cooling of the current AI hype and a Moore’s-Law-driven move towards locally-run LLMs may go some way towards solving that problem on its own.
Header image: Christopher Bowns, CC BY-SA 2.0.

Nicky Tesla is rolling in his grave.
I love how the solution to the data center “problem” which both does and does not exist is going back to early 1900s electricity provisioning. Like they are so close to getting it.
Sure the answer is DC. No the answer is not filling all the office spaces in the world with trillions of dollars of commodity hardware. Maybe wait until enough time has gone on that you actually have software and hardware worth running. By then you won’t need to have rooms ready to burn down the second a mouse chews a wire.
How people can’t see the current AI paradigm as a billion-dollar company commoditizing its complement so people buy into its expensive inefficiencies in an act of complete dependence blows my mind. It really is the AC vs DC wars yet again. There is no formal reasoning that requires or suggests that the infrastructure trillions of dollars are being blundered into is even required, let alone that we should try to scale it with corner cutting. People just can’t wait 5 years for the inevitable outcome of them already missing the right trajectory.
Brain worm???
I have several. One’s name is Janet, the other is Charles. They haven’t introduced me to their friends yet though
That’s what blows my mind as well. Surely people are not out there thinking email writing or web page summary generation or funny cat photo or video generation are such economically valuable tasks that AI companies are scrambling to get more compute and memory for them? Will most people who use LLMs to generate a “todo travel list” or similar jobs pay for the service? I doubt it.
The only overwhelmingly productive activity that LLMs do is writing code. And make no mistake, the only reason why it’s overwhelmingly productive is because people are paid more than they should be paid to write code, and that has been going on for decades. So in the end LLMs look like an okay alternative.
And I’ll be honest here, anything that’s very reasoning-heavy or doesn’t have much training data available, LLMs fail to do. Mechanical engineering, electrical engineering and design etc. — LLMs absolutely fail. Only text-heavy tasks do they do well
A DC DC.
HA!!