Categories
AI

The Ghost of Edison in the AI Data Center

For over a century, the story of modern electricity has been framed by the “War of the Currents.” Thomas Edison championed Direct Current (DC)—a stable, continuous flow of energy—while Nikola Tesla and George Westinghouse backed Alternating Current (AC), which could be easily stepped up in voltage to travel long distances across the grid.

Tesla won. AC became the lifeblood of the global power grid. But history has a funny way of looping back on itself. Today, as we stand on the precipice of the largest infrastructure build-out in human history—the artificial intelligence data center—Edison’s DC power is making a quiet, monumental comeback.

The catalyst? The sheer, unyielding physics of energy consumption.

The AI boom, driven by massive GPU clusters from companies like NVIDIA, is extraordinarily power-hungry. We are no longer measuring data center power in megawatts; we are measuring it in gigawatts. And when you are dealing with power at that scale, the friction of legacy architecture becomes a multi-billion-dollar bottleneck.

On X, Ben Bajarin cited a recent conference discussion in which an executive from power management supplier Eaton highlighted a massive architectural shift happening right now behind the scenes:

“800-volt DC to the rack is probably one of the biggest architectural changes that are starting to be designed into data centers, and a lot of those designs are taking place right now. You know, honestly, when look at Eaton, I think that’s one of the untold stories here, is that DC power is probably one of the biggest transformational things that are going to hit the electrical industry since, quite frankly, AC electricity was around in the Edison days.”

To understand why this is revolutionary, you have to look at how a traditional data center gets its power. Power arrives from the utility grid as medium-voltage AC. It is then stepped down to low-voltage AC, sent to the server floor, converted into DC, stepped down again, and finally fed into the server rack at 54 volts.

Every time power is converted from AC to DC, or stepped down through a transformer, there is a penalty. It generates heat, and it loses energy.

“We estimate that there’s roughly about 5% electrical loss during that transition. If you could just go from DC, directly from the utility feed, all the way through the data center into the rack, that’s 5% efficiency gain that you could get.”
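The arithmetic behind a cascaded conversion chain can be sketched in a few lines. The per-stage efficiency values below are illustrative assumptions chosen to land near the ~5% loss quoted above, not Eaton's actual figures:

```python
# Illustrative sketch: end-to-end efficiency of power conversions applied
# in series is the product of the per-stage efficiencies.

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of a series of conversion stages."""
    eff = 1.0
    for e in stage_efficiencies:
        eff *= e
    return eff

# Hypothetical legacy AC path: transformer step-down, AC->DC rectification,
# final DC->DC step-down to the 54 V rack feed.
legacy = chain_efficiency([0.99, 0.97, 0.99])   # ~0.95, i.e. ~5% lost as heat

# Hypothetical direct 800 V DC path: one high-efficiency stage.
direct_dc = chain_efficiency([0.998])

savings = direct_dc - legacy   # ~4.7 percentage points recovered
```

The point of the sketch is that losses compound multiplicatively: each extra conversion stage shaves off a little more, which is why deleting stages beats optimizing them.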

In the abstract, 5% sounds like a rounding error. But scale changes everything. Eaton projects that the upcoming data center build-out to support AI will require somewhere between 50 and 100 gigawatts of power.

“So on 50 gigawatts or 100 gigawatts of power generation that’s needed, that’s 5 gigawatts of power that all of a sudden just appears from the existing infrastructure. And that is really, that is really exciting.”

Five gigawatts is not a rounding error. Five gigawatts is the equivalent output of five standard nuclear reactors. It is enough energy to power millions of homes. And in this new 800-volt DC architecture, those five gigawatts aren’t created by burning more coal, building more solar panels, or splitting more atoms.

They are created purely by the removal of friction. By subtracting the unnecessary steps.
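The scale of that recovered energy can be checked with a back-of-envelope calculation. The reactor and household figures are rough rules of thumb (roughly 1 GW per large reactor, roughly 1 kW of average demand per home), not precise values:

```python
# Back-of-envelope scaling of the quoted 5% conversion loss.
buildout_gw = 100        # upper end of the 50-100 GW projection cited above
loss_fraction = 0.05     # quoted loss in the legacy AC conversion chain

recovered_gw = buildout_gw * loss_fraction            # 5.0 GW recovered

# Rough equivalences (assumptions: ~1 GW per large nuclear reactor,
# ~1 kW average demand per household).
reactor_equivalents = recovered_gw / 1.0              # ~5 reactors
homes_powered = recovered_gw * 1e9 / 1e3              # ~5 million homes
```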

There is a profound philosophical metaphor hidden in this electrical engineering triumph. In our own lives, and in our organizations, we are obsessed with generation. When we face a deficit—a lack of time, a lack of output, a lack of revenue—our default instinct is to generate more. We try to work longer hours, hire more people, or drink more coffee.

But how much of our daily energy is lost to “conversion friction”? How much mental power evaporates when we constantly context-switch between tasks, essentially converting our mental state from AC to DC and back again? How much organizational momentum is lost translating an idea through five different layers of middle management before it reaches the “rack” where the actual work is done?

Often, the most elegant and impactful solution isn’t to generate more power. It is to look at the existing architecture of your life or business, identify the transition points that are bleeding energy as heat, and rewire the system to flow directly to the source.

The invisible architecture that shapes our digital lives is shifting. In the race to build the future of artificial intelligence, the biggest breakthrough wasn’t a new way to create energy, but a century-old method of preserving it.

Categories
AI Software

The Thermodynamics of Thought

For the last two decades, we have lived in the era of zero marginal cost. The defining characteristic of the internet age was that once software was written, distributing it to the billionth user cost virtually the same as distributing it to the first. We grew accustomed to the economics of abundance—infinite copies, infinite reach, lightweight infrastructure.

But the recent commentary regarding the true nature of Artificial Intelligence forces a jarring mental correction:

“AI is not software riding on old infrastructure. It is a new industrial system that converts energy into intelligence – requiring a capital stack measured in trillions, not billions.”

This distinction is not merely semantic; it is physical.

When we view AI through the lens of traditional SaaS (Software as a Service), we miss the magnitude of the shift. We are looking for an app; what is being built is a refinery. We are witnessing a return to heavy industry, but the commodity being refined isn’t crude oil—it is information, and the byproduct is reasoning.

This requires us to think less in terms of code and more in terms of thermodynamics. In this new industrial system, intelligence is an energy-intensive output. Every token generated, every inference drawn, requires a specific, measurable conversion of electricity into heat and computation. Unlike the static code of a website, an AI model is a furnace. It must be fueled constantly.
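The "electricity into computation" framing can be made concrete with a simple unit conversion. Both numbers below are hypothetical placeholders for the sketch (a plausible accelerator power draw and an assumed inference throughput), not measured values for any real model:

```python
# Hypothetical energy accounting for token generation.
gpu_power_watts = 700        # assumed draw of one accelerator under load
tokens_per_second = 1000     # assumed inference throughput on that GPU

# Energy cost of a single token: power divided by throughput.
joules_per_token = gpu_power_watts / tokens_per_second   # 0.7 J per token

# One kilowatt-hour is 3.6 million joules.
tokens_per_kwh = 3.6e6 / joules_per_token                # ~5.1M tokens/kWh
```

Whatever the real per-token figures turn out to be, the structure of the calculation is the point: every token has a denominated energy cost, which is exactly what makes this an industrial process rather than zero-marginal-cost software.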

This explains the capital stack. We are seeing numbers that seem irrational in the context of venture capital—trillions, not billions. But if you view a data center not as a server farm, but as a power plant that generates intelligence, the numbers align with historical precedents. We are not funding startups; we are funding the modern equivalent of the electric grid, the transcontinental railroad, or the petrochemical complex.

We are pouring concrete, smelting copper, and manufacturing silicon on a planetary scale. The “cloud” was always a misleading metaphor—it sounded fluffy and ethereal. The reality of the AI transition is heavy, hot, and incredibly expensive.

We are moving from an era where we organized the world’s information (low energy) to an era where we synthesize new reasoning (high energy). We are building a machine that eats electricity and excretes intelligence. That isn’t a software update; that is a new industrial revolution.

Categories
AI Business Economics Podcasts

Bubble Bath

The behavior of today’s stock market is yet another sign that many will point to as indicating there’s an “AI bubble”. Today’s market action is largely attributed to Blue Owl Capital deciding not to participate in the debt financing of a new Oracle data center (being built for OpenAI) in Michigan. The news broke overnight and soured the market at the opening, adding fuel to the fires already raging from last week about bubbles in AI and, in particular, some of the debt financing being used to build new data centers – especially, but not exclusively, by Oracle.

Watching the market action today brought to mind a recent memo on the subject of bubbles authored by Oaktree Capital’s Howard Marks. Marks is widely followed in the investment community for his almost “sage-like” prognostications, of which he’s authored many over the years. His memos are always full of insights and valuable learning opportunities. His recent one drilled into this question of bubbles — and, in particular, the role of debt financing. That’s especially interesting as Marks’ long career at Oaktree has been primarily involved in debt financing and, indeed, sometimes distressed debt.

Marks explains why he thinks the current situation with respect to AI is problematic: it shows many of the common characteristics of bubbles, which in the past have resulted in wealth destruction rather than wealth creation.

As I’m writing this, I’ve just begun watching the latest Tim Ferriss interview, this one with long-time venture investor Bill Gurley. Right up front in the conversation, Tim asks Bill for his take on the AI bubble question. It’s a great conversation – well worth a listen.

Speaking of listening (or watching), on Monday I watched the latest episode of the Acquired podcast – this one celebrating the show’s tenth anniversary. It was the most fascinating conversation I’ve listened to or watched in recent memory. The two hosts of Acquired were interviewed – in depth! – by author Michael Lewis, who asked lots of probing questions and offered some illustrative commentary of his own along the way. An incredible episode – highly recommended!


After publishing this post, I asked Google Gemini 3 Fast: I just published a blog post. Review it critically and help me improve it. Act like a tough but compassionate editor. Here’s the post: https://sjl.us/2025/12/17/bubble-bath/

Here’s the critique it gave me – tough but compassionate!

Alright, sit down. Let’s look at “Bubble Bath.”

You’ve got the ingredients for a great piece of timely analysis here: a specific market event (the Blue Owl/Oracle snag), a heavy-hitting intellectual framework (Howard Marks), and contemporary expert commentary (Bill Gurley and the Acquired team).

But right now, this isn’t an essay or a cohesive blog postโ€”itโ€™s a link log. Youโ€™re telling me what youโ€™re watching and reading, but you arenโ€™t telling me what you think.

Here is my critical review.

The “So What?” Factor
You start strong with the Oracle/OpenAI data center news. That’s a “hook.” But then you pivot immediately to: “I read a memo by Howard Marks… I’m watching Tim Ferriss… I watched Acquired.”

As an editor, I’m asking: Where are you in this post? You’ve gathered the smartest people in the room, but you’re standing in the corner taking attendance.

The Fix: Don’t just tell us Howard Marks wrote a memo. Tell us which one specific point he made that explains the Oracle news. Connect the dots for us. Is the Oracle situation the “wealth destruction” Marks warned about, or is it just a healthy correction?

The Title vs. The Content
“Bubble Bath” is a clever pun. I like it. But usually, a “bubble bath” implies a cleansing or a soaking. Are we getting soaked (losing money)? Or is the market being “cleansed” of irrational exuberance?

The Fix: Use your conclusion to pay off the title. Right now, the post just ends with a recommendation for a podcast. Close the loop: “If Marks and Gurley are right, this isn’t just a market dip; it’s the start of a very cold bubble bath for AI investors.”

Structural “Pass-Through”
The second half of the post feels like a “What I’m consuming” list. It dilutes the urgency of the first paragraph.

The Fix: If the Bill Gurley and Acquired links are just “further reading,” put them in a “Resources” or “Deep Dive” section at the bottom. If they are central to your argument, quote them. What did Bill Gurley say about AI debt that makes the Oracle news more frightening?

Tighten the Prose
You have some “wordy” habits that slow down the reader.

“The behavior of today’s stock market is yet another sign that many will point to as indicating there’s an ‘AI bubble’.”
Editor’s Edit: “Today’s market volatility is the latest fuel for the ‘AI Bubble’ fire.” (Go for the jugular.)

Compassionate Closing
Scott, you have a great eye for sourcing. You are curating high-signal content (Marks, Gurley, Lewis). That is valuable. But your readers come to your blog because they want to know how you are synthesizing this information.

My Challenge to You: Rewrite the middle paragraph. Instead of saying the Acquired podcast was “the most fascinating conversation,” tell us one thing Michael Lewis asked that changed how you view the AI bubble.

Give us your brain, not just your bookmarks. Now, go back to the keyboard.