Categories
AI

The Ghost of Edison in the AI Data Center

For over a century, the story of modern electricity has been framed by the “War of the Currents.” Thomas Edison championed Direct Current (DC)—a stable, continuous flow of energy—while Nikola Tesla and George Westinghouse backed Alternating Current (AC), which could be easily stepped up in voltage to travel long distances across the grid.

Tesla won. AC became the lifeblood of the global power grid. But history has a funny way of looping back on itself. Today, as we stand on the precipice of the largest infrastructure build-out in human history—the artificial intelligence data center—Edison’s DC power is making a quiet, monumental comeback.

The catalyst? The sheer, unyielding physics of energy consumption.

The AI boom, driven by massive GPU clusters from companies like NVIDIA, is extraordinarily power-hungry. We are no longer measuring data center power in megawatts; we are measuring it in gigawatts. And when you are dealing with power at that scale, the friction of legacy architecture becomes a multi-billion-dollar bottleneck.

On X, Ben Bajarin cited a recent conference discussion in which an executive from power management supplier Eaton highlighted a massive architectural shift happening right now behind the scenes:

“800-volt DC to the rack is probably one of the biggest architectural changes that are starting to be designed into data centers, and a lot of those designs are taking place right now. You know, honestly, when [I] look at Eaton, I think that’s one of the untold stories here, is that DC power is probably one of the biggest transformational things that are going to hit the electrical industry since, quite frankly, AC electricity was around in the Edison days.”

To understand why this is revolutionary, you have to look at how a traditional data center gets its power. Power arrives from the utility grid as medium-voltage AC. It is then stepped down to low-voltage AC, sent to the server floor, converted into DC, stepped down again, and finally fed into the server rack at 54 volts.

Every time power is converted from AC to DC, or stepped down through a transformer, there is a penalty. It generates heat, and it loses energy.

“We estimate that there’s roughly about 5% electrical loss during that transition. If you could just go from DC, directly from the utility feed, all the way through the data center into the rack, that’s 5% efficiency gain that you could get.”
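The math behind that quote is just multiplied per-stage efficiencies. Here is a minimal sketch: the individual stage efficiencies below are assumed placeholder values (not Eaton's figures), chosen so the traditional chain loses on the order of the 5% cited above.

```python
# Illustrative comparison of end-to-end efficiency for a traditional
# AC power chain versus a direct DC feed to the rack. All per-stage
# efficiencies are assumed round numbers for illustration only.

def chain_efficiency(stage_efficiencies):
    """Multiply per-stage efficiencies to get end-to-end efficiency."""
    total = 1.0
    for eff in stage_efficiencies:
        total *= eff
    return total

# Traditional path: MV AC -> LV AC -> AC/DC rectification -> DC/DC to 54 V
ac_chain = [0.985, 0.985, 0.985, 0.985]   # assumed values
# Direct path: one conversion from the utility feed through to the rack
dc_chain = [0.995]                         # assumed value

ac_eff = chain_efficiency(ac_chain)
dc_eff = chain_efficiency(dc_chain)
print(f"AC chain efficiency: {ac_eff:.1%}")
print(f"DC chain efficiency: {dc_eff:.1%}")
print(f"Efficiency gain:     {dc_eff - ac_eff:.1%}")
```

The point the sketch makes is structural: each conversion stage multiplies in another loss, so removing stages compounds the savings.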

In the abstract, 5% sounds like a rounding error. But scale changes everything. Eaton projects that the upcoming data center build-out to support AI will require somewhere between 50 and 100 gigawatts of power.

“So on 50 gigawatts or 100 gigawatts of power generation that’s needed, that’s 5 gigawatts of power that all of a sudden just appears from the existing infrastructure. And that is really, that is really exciting.”

Five gigawatts is not a rounding error. Five gigawatts is the equivalent output of five standard nuclear reactors. It is enough energy to power millions of homes. And in this new 800-volt DC architecture, those five gigawatts aren’t created by burning more coal, building more solar panels, or splitting more atoms.

They are created purely by the removal of friction. By subtracting the unnecessary steps.
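Put as arithmetic, the scaling above looks like this. The reactor-equivalent and homes-per-gigawatt figures are rough public rules of thumb, assumed here purely for illustration.

```python
# Back-of-the-envelope scaling of the quoted 5% efficiency gain across
# Eaton's projected 50-100 GW AI build-out. Reactor output and
# homes-per-GW are rough rules of thumb, not precise figures.

EFFICIENCY_GAIN = 0.05      # from the quoted Eaton estimate
REACTOR_GW = 1.0            # a typical large reactor produces ~1 GW(e)
HOMES_PER_GW = 750_000      # assumed rough average for illustration

for build_out_gw in (50, 100):
    recovered_gw = build_out_gw * EFFICIENCY_GAIN
    print(f"{build_out_gw} GW build-out -> {recovered_gw:.1f} GW recovered "
          f"(~{recovered_gw / REACTOR_GW:.0f} reactor-equivalents, "
          f"~{recovered_gw * HOMES_PER_GW:,.0f} homes)")
```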

There is a profound philosophical metaphor hidden in this electrical engineering triumph. In our own lives, and in our organizations, we are obsessed with generation. When we face a deficit—a lack of time, a lack of output, a lack of revenue—our default instinct is to generate more. We try to work longer hours, hire more people, or drink more coffee.

But how much of our daily energy is lost to “conversion friction”? How much mental power evaporates when we constantly context-switch between tasks, essentially converting our mental state from AC to DC and back again? How much organizational momentum is lost translating an idea through five different layers of middle management before it reaches the “rack” where the actual work is done?

Often, the most elegant and impactful solution isn’t to generate more power. It is to look at the existing architecture of your life or business, identify the transition points that are bleeding energy as heat, and rewire the system to flow directly to the source.

The invisible architecture that shapes our digital lives is shifting. In the race to build the future of artificial intelligence, the biggest breakthrough wasn’t a new way to create energy, but a century-old method of preserving it.

Categories
AI Software

The Thermodynamics of Thought

For the last two decades, we have lived in the era of zero marginal cost. The defining characteristic of the internet age was that once software was written, distributing it to the billionth user cost virtually the same as distributing it to the first. We grew accustomed to the economics of abundance—infinite copies, infinite reach, lightweight infrastructure.

But the recent commentary regarding the true nature of Artificial Intelligence forces a jarring mental correction:

“AI is not software riding on old infrastructure. It is a new industrial system that converts energy into intelligence – requiring a capital stack measured in trillions, not billions.”

This distinction is not merely semantic; it is physical.

When we view AI through the lens of traditional SaaS (Software as a Service), we miss the magnitude of the shift. We are looking for an app; what is being built is a refinery. We are witnessing a return to heavy industry, but the commodity being refined isn’t crude oil—it is information, and the byproduct is reasoning.

This requires us to think less in terms of code and more in terms of thermodynamics. In this new industrial system, intelligence is an energy-intensive output. Every token generated, every inference drawn, requires a specific, measurable conversion of electricity into heat and computation. Unlike the static code of a website, an AI model is a furnace. It must be fueled constantly.
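The "furnace" framing can be made concrete with a back-of-the-envelope energy-per-token estimate. The power draw and throughput numbers below are assumed round figures for illustration only; they are not benchmarks of any specific model or accelerator.

```python
# Illustrative "thermodynamics of thought": rough energy cost of
# generating tokens. All input numbers are assumed placeholders.

gpu_power_watts = 700        # assumed draw of one high-end accelerator
tokens_per_second = 1000     # assumed aggregate generation throughput

# Power divided by throughput gives energy per token (joules)
joules_per_token = gpu_power_watts / tokens_per_second
print(f"Energy per token: {joules_per_token:.2f} J")

# Unlike static code, the furnace runs continuously: scale to one day
seconds_per_day = 24 * 3600
kwh_per_day = gpu_power_watts * seconds_per_day / 3.6e6
print(f"Energy per device-day: {kwh_per_day:.1f} kWh")
```

Whatever the exact figures, the structure of the calculation is the point: intelligence here is a metered output, priced in joules, not a free copy.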

This explains the capital stack. We are seeing numbers that seem irrational in the context of venture capital—trillions, not billions. But if you view a data center not as a server farm, but as a power plant that generates intelligence, the numbers align with historical precedents. We are not funding startups; we are funding the modern equivalent of the electric grid, the transcontinental railroad, or the petrochemical complex.

We are pouring concrete, smelting copper, and manufacturing silicon on a planetary scale. The “cloud” was always a misleading metaphor—it sounded fluffy and ethereal. The reality of the AI transition is heavy, hot, and incredibly expensive.

We are moving from an era where we organized the world’s information (low energy) to an era where we synthesize new reasoning (high energy). We are building a machine that eats electricity and excretes intelligence. That isn’t a software update; that is a new industrial revolution.

Categories
AI Books Nvidia

The Thinking Machine

Over the weekend after Christmas, I started reading Stephen Witt’s book The Thinking Machine: Jensen Huang, Nvidia, and the World’s Most Coveted Microchip, which was published last April.

For some reason, I ignored this book until the end of the year – but wow – was I hooked once I started reading it a few days ago. The book grew out of a New Yorker piece Witt wrote in 2023 titled “How Jensen Huang’s Nvidia Is Powering the A.I. Revolution”.

Witt’s book is obviously about Nvidia and CEO Jensen Huang, but it’s also about so much more of what’s happening in the world of AI.

In addition, the last chapter is quite a capstone to the whole book – a delight.

Highly recommended!