The Race We Already Lost
How a Temporary Scarcity Story Builds Permanent Power
The Surface
I recently watched a video from Gamers Nexus, Steve Burke’s consumer advocacy channel. He was documenting real consumer harm caused by AI companies racing to secure computer chips.
Data centers are hoovering up RAM. OpenAI alone has reportedly locked up roughly 40% of global DRAM supply for a data center project that won’t turn a profit until 2030. DDR5 kits that cost $120 six months ago now run over $400. Micron figured out it can price consumer parts at data-center margins. Why compete when you can just raise the floor?
GPUs are getting stockpiled by companies that can’t even use them yet. Data centers are sitting dark across the country. Fully built, fully equipped, waiting for power that won’t arrive for years. Microsoft’s CEO admitted it on camera. Chips have stopped being the choke point. Electricity is. They’ve got inventory they can’t plug in.
Burke frames this as consumer harm, and he’s right. You’re paying more for components because corporations are stockpiling silicon they won’t power for years. Your electricity bill is climbing to feed facilities that serve chatbots.
It’s good journalism. Hardware accountability. The kind of thing Burke built his reputation on.
It’s also the shallow end of the pool.
The question Burke doesn’t ask: why would anyone buy chips that depreciate in three years to sit in buildings that won’t have power for four?
Either these companies are stupid, or we’re looking at the wrong asset.
The Tell
GPUs depreciate fast. Three years, maybe four, and you’re a generation behind. The chips that drove the first wave of hoarding are already getting eclipsed. Newer architectures ship every year. By the time these data centers get power, they’ll be running yesterday’s hardware.
And getting power from the grid runs on a different timescale. It takes permits. Environmental reviews. Studies to prove the grid can handle the load. Upgrades to the infrastructure that delivers the electricity. Some of these data centers won’t see utility power until 2028.
So the math doesn’t work.
They’re buying chips in 2025 that’ll be two generations old by the time they can plug them in. Because the chips were never the point.
The right to draw power from the grid is the asset they’re acquiring. Paperwork that says they get megawatts when the grid finally has them to spare. That right doesn’t become obsolete when a new chip architecture ships.
The chips give them urgency. Something physical to point at when they say this is about American competitiveness.
Then the chips sit idle in a warehouse while the thing these companies actually wanted, the grid connection, works its way through the queue.
The AI story is real enough. The investment thesis is something else.
The Play
So what are they actually buying?
Land. Massive tracts of it, parked near substations and transmission corridors. The kind of real estate that holds its value long after the chips have become obsolete.
Permits to build. Environmental approvals that normally take years. NEPA reviews. Air quality certifications. Water use agreements. Once they get them, they keep them.
A spot in the power queue. Utilities are already oversubscribed. Some of these waitlists stretch to 2030. Getting in line now is worth more than the hardware they claim to need the power for.
Their own power plants. Some are building gas turbines on site. Others are cutting deals with nuclear operators. Either way, the electricity bypasses the grid entirely and belongs to whoever built it.
None of this is easy to get. All of it gets easier when they’re waving the flag and talking about the race against China.
The race for AI dominance is real. But the national security language is also doing double duty. It clears the path for infrastructure that will outlast whatever AI advantage it was supposed to secure.
Because the lasting value isn’t in the silicon. It’s in the infrastructure that got permitted while everyone was watching the AI show.
The Capture
Washington isn’t confused. The lobbyists know what they’re buying. The companies paying them know.
In December of 2025, the White House issued an executive order on AI. The stated goal: “sustain and enhance the United States’ global AI dominance through a minimally burdensome national policy framework.”
Minimally burdensome. That’s the tell.
The order bars state laws that conflict with federal policy. It directs the Attorney General to stand up an “AI Litigation Task Force” within 30 days. The task force has “sole responsibility” for challenging state regulations deemed inconsistent with the administration’s approach.
A patchwork of state AI regulations would create a compliance nightmare for everyone. But preemption also clears the path for infrastructure acquisition that has nothing to do with innovation.
Colorado passed a law banning algorithmic discrimination. The White House cited it as an example of regulation that might “force AI models to produce false results.” Requiring models to avoid discrimination gets framed as forcing them to lie.
States that don’t fall in line risk losing federal funding. The order specifically mentions BEAD broadband money as leverage. Other discretionary grants are put on notice too.
Federal preemption so states can’t interfere. Litigation to strike down the ones that try. Funding pressure to make the rest behave.
This is about power, and that power has costs.
The Cost
That December order wasn’t the first. Five months earlier, in July, a separate executive order took aim at environmental law. The accompanying AI Action Plan stated the goal plainly: “reducing regulations promulgated under the Clean Air Act, the Clean Water Act, [and] the Comprehensive Environmental Response, Compensation, and Liability Act.”
That matters because of what’s getting built.
Private power bypasses the grid, and a lot of the oversight that comes with it. Carbon at scale, in service of servers that won’t see utility power for years.
Your electricity bill is going up to pay for grid upgrades you didn’t ask for. In some states, data centers have already added $15-18 a month to residential bills. And the companies driving the demand? They’re getting tax breaks. You’re subsidizing the buildout and paying the markup.
For what?
OpenAI says they’ve got 800 million weekly users. Some fraction of those people get real value. Medical insights that matter. Scientific work that moves the needle. Protein folding. Drug discovery. Climate modeling, ironically enough.
The rest is chatbots. Autocomplete. Homework. Anime profile pics.
The infrastructure is scaled to the 800 million users. The breakthroughs justify it. The engagement metrics pay for it.
Washington is loosening the Clean Air Act so people can generate more content.
That’s a lot to sacrifice for a bet. Especially when the bet depends on one thing: Western hardware dominance holding.
The Collapse
Right now, a single Dutch company makes the machines that print advanced chips. A single Taiwanese company runs the factories that produce them. Export controls keep China a step behind. That’s the advantage.
The problem is this advantage has a clock.
Export controls can restrict shipments and slow progress, but they can’t freeze knowledge. Engineers move where the money is. Techniques spread. Workarounds get built. The chokepoints that make the “race” story feel urgent are under constant pressure. They don’t need to collapse entirely. They just need to slip enough that scarcity pricing stops working.
And scarcity pricing depends on capability staying scarce.
China has reportedly built a prototype EUV lithography machine in a high-security lab in Shenzhen. It hasn’t produced chips yet, but the target is 2028. The Dutch chokepoint isn’t permanent.
DeepSeek. Kimi K2. Qwen. Open weights under permissive licenses. Technical reports detailed enough that anyone with hardware can run them, fine-tune them, build on them. Open weights travel. No API to revoke. No terms of service to enforce. No kill switch.
The hardware edge is slipping. Capability is flooding the market from China under licenses that let it spread. Two vectors, same destination: the scarcity story stops holding.
The manufacturing advantage is temporary. The infrastructure they’re locking in isn’t. A short-lived scarcity story is being used to acquire long-lived assets.
If the advantage erodes faster than expected, the public is left holding the externalities.
The Asymmetry
The “race against China” framing assumes both sides start equal. They don’t.
In August 2025, Fortune reported on American AI experts returning from tours of China’s AI hubs. What stunned them wasn’t the models or the talent. It was the grid.
“Everywhere we went, people treated energy availability as a given,” one observer wrote. In China, electricity for data centers reads like a solved problem.
That’s the asymmetry hiding in plain sight. Over there, capacity is treated like a national baseline. China maintains reserve margins of 80-100%, at least double what it needs. The U.S. runs regional grids at around 15%, where a hot week in Texas or a crunch in California turns into public warnings.
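To make that gap concrete, here’s a back-of-envelope sketch. The capacity and demand figures are illustrative assumptions chosen only to reproduce the margins cited above; they’re not real grid statistics.

```python
# Back-of-envelope reserve-margin comparison.
# reserve_margin = (installed_capacity - peak_demand) / peak_demand
# All figures below are illustrative assumptions, not official data.

def reserve_margin(capacity_gw: float, peak_gw: float) -> float:
    """Spare capacity as a fraction of peak demand."""
    return (capacity_gw - peak_gw) / peak_gw

# Hypothetical grids sized to match the margins cited above:
us_regional = reserve_margin(capacity_gw=115, peak_gw=100)  # 0.15 -> 15%
china_style = reserve_margin(capacity_gw=190, peak_gw=100)  # 0.90 -> 90%

# Headroom left after a 10 GW wave of data centers lands on each:
print(reserve_margin(115, 110))  # ~0.045 -> margin nearly gone
print(reserve_margin(190, 110))  # ~0.73  -> barely a dent
```

Same new load, wildly different consequences: one grid starts issuing conservation alerts, the other barely notices.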
One energy expert told Fortune: if you can’t build energy infrastructure, you can’t win an energy-hungry race. You can talk about chips and models all day. You still have to plug them in.
The U.S. is trying to close a gap that took decades to open.
The race framing makes it sound neck and neck. It’s not. China built the grid. America is still arguing about whether it can.
The Table
There’s a poker table forming around AI.
Nobody has to be evil for a bubble to inflate. You just need a table where everyone’s holding paper that only stays valuable as long as the game keeps going.
Microsoft antes in. OpenAI buys compute. Nvidia sells the chips that make the whole thing feel inevitable. Money moves around the table in ways that look like growth from a distance.
The problem is the table rewards the same behavior whether the downstream value shows up or not.
Then reality shows up like the dealer, calm as stone, and flips the one card nobody can negotiate with.
Power.
You can bluff a roadmap. You can bluff margins. You can’t bluff megawatts.
The Landing
The technology is real, and it’s here to stay.
LLMs are better now than they were a year ago. A year from now, the models we’re using today will look like quaint artifacts. The gains are compounding. The capability is genuine.
AI will still be here after the bubble bursts, the way the internet was still here after the dotcom crash. Pets.com died. Amazon survived. The technology wasn’t the problem.
The argument is what’s being done in its name.
The bubble is real too. Balance sheets that look better than the underlying timeline. A scarcity story that depends on a hardware edge that’s already slipping.
I don’t have a solution. I’m not pretending to.
What I can do is see clearly, and help others see. That’s what this is for.
I don’t have a way to win this. The average person doesn’t either.
That’s the point. The decisions get made upstream. The costs get pushed downstream. The infrastructure gets locked in first, and the justification gets written later.
The grid is the game now. And the game is already decided. What’s on the other side won’t be liberation. It’ll be a new configuration of power that nobody voted for and nobody can steer.
Better to watch with eyes open than pretend someone’s driving.
Enjoyed this piece?
I do all this writing for free. If you found it helpful, thought-provoking, or just want to toss a coin to your internet philosopher, consider clicking the button below and donating $1 to support my work.