The AI spending spree just keeps getting weirder
The binge has outgrown its data-center-only era, dragging turbines, pipelines, private credit, and anti-LLM moonshots into AI’s widening orbit

AI is everywhere. So Silicon Valley set out to buy more chips and build more server farms. But now, it has started redrawing whole sections of the economy around AI's appetite. The technology's shopping list has reached an aisle stocked with gas turbines, sovereign wealth funds, jet engines, and billion-dollar arguments over whether the reigning model-building gospel has been headed down the wrong runway all along.
Neighboring industries keep waking up to discover they've been drafted into the AI economy. A supersonic-jet startup has a $1.25 billion order to power AI data centers. Google $GOOGL has bought an energy developer. Meta $META has helped turn a Louisiana data center into a $27.3 billion private-debt spectacle. And Yann LeCun, after years of insisting that Silicon Valley's favorite AI assumptions have limits, has raised $1.03 billion to chase "world models" instead.
AI capital is still buying the familiar hardware. But it's also wandering into stranger territory with the swagger of a rich industry that's getting even richer. The industry has stopped asking whether something belongs in the AI story and started asking whether it can be bent to that story fast enough to secure more power, or more time. Silicon Valley isn't choosing between "scale harder" and "the LLM paradigm is intellectually busted." The same ecosystem is writing checks to the lab buying gigawatt-scale compute and to the lab arguing that next-token prediction will not get you to broadly capable agents.
The anti-casino is still being financed with casino money.
For a while, AI capex was easy to picture. More GPUs. Bigger clusters. Another desert or cornfield marked for a data center that’s the size of a small municipality. Sure, that picture still exists. Last week, Mira Murati’s Thinking Machines locked up at least one gigawatt of Nvidia $NVDA’s next-generation Vera Rubin systems — enough computing power that industry executives pegged its value at around $50 billion. Fei-Fei Li’s World Labs raised another $1 billion for “spatial intelligence.” Safe Superintelligence (SSI), Ilya Sutskever’s still-murky venture, raised $2 billion at a reported $32 billion valuation — despite having no public product. Even the dissent has a burn rate. The heretics have capital stacks.
The checks are getting larger, sure. But the money is getting restless. It’s now landing on rival theories, exotic structures, adjacent industries, and any corporate costume that gets more electricity, more land, more leverage, or more optionality into the system. The AI spending spree isn’t tidy anymore. It has lost the neat corporate silhouette it had a year ago and is now behaving like an industry with too much money, too many bottlenecks — and an increasingly improvised sense of what exactly now counts as AI infrastructure.
Mature spending sprees do this. They spread. They improvise. They scavenge. They colonize the nearby economy. AI has now gotten to that part of its story.
Weirder ways to spend
Silicon Valley used to spend on AI the old-fashioned way: buy the chips, build the cluster, pour the concrete, tell Wall Street that the future needed a bigger server hall. Today’s spending spree looks like it has slipped its leash. Everyone knows the numbers are cartoonish. But the money is arriving in formats that would have sounded more than a little ridiculous a year ago.
Murati's Thinking Machines has lined up industrial-scale compute before the outside world has much public evidence of what product will justify the footprint. LeCun's AMI is chasing world models and calling the LLM-heavy route to human-level intelligence "complete nonsense," raising a billion dollars to fund a theoretical alternative to the dominant, already very real architecture. World Labs is building around spatial intelligence and 3D world models. SSI has turned founder premium into a capital event on a scale that would've looked surreal for an ordinary startup — and then gone right back to saying almost nothing in public.
These alternatives aren’t arriving as a lean, skeptical correction to excess. They have billionaire-scale appetites of their own. AMI’s two main cost centers are compute and talent. World Labs’ investor list includes AMD and Nvidia. Same boom, same ocean of money, very different guesses about what the money should buy.
One side of that argument says the AI race needs industrial-scale compute. The other says the race may need a different map entirely. Investors, naturally, have decided to fund both. An already-rich industry is funding increasingly strange ways to stay close to whatever intelligence actually turns out to be. When the checks start landing on every plausible route — giant clusters, world models, spatial intelligence, founder-premium moonshots — the spree starts sounding as anxious as it is ambitious. Bank of America $BAC's Savita Subramanian has said investors are "buying the dream."
S&P Global $SPGI caught the recursive side of all of this, writing that “circular infrastructure deals reign.” The cloud giants and chip suppliers put money into startups, the startups spend that money back on cloud and compute, and the whole system starts looking like a very expensive feedback loop with better branding. AI companies are becoming “asset-heavy,” S&P wrote, behaving more like infrastructure operators than classic software firms. That’s a species change.
Weirder ways to build
If the bets have gotten stranger, the physical and financial machinery under them has gotten stranger still. Moody’s flagged recently that Amazon $AMZN, Meta, Alphabet, Microsoft $MSFT, and Oracle $ORCL have accumulated $662 billion of future data-center lease commitments that haven’t yet hit the balance sheet. The industry has started treating footnotes as staging grounds for the next few years of expansion. The server hall is doubling as a capital-structure problem.
Sightline Climate says it is tracking 190 gigawatts across 777 large data centers and AI factories announced since 2024. Of the 16 gigawatts slated to come online in 2026, only about five gigawatts are actually under construction; around 11 gigawatts remain in the announced stage with no visible construction progress. As much as half of the world’s data-center projects due this year could face delays. Development still surges. So does schedule fiction. The pipeline has started carrying a decent amount of wishful thinking.
Meta’s Hyperion campus shows how odd the buildout can look when it fully matures. The company formed a $27 billion financing arrangement (a bond sale tied to the venture; the largest private-debt offering ever) with Blue Owl Capital for the Louisiana site. Meta kept a 20% stake, Blue Owl took 80%, and the structure pushed the project off Meta’s balance sheet even as the company secured the campus it wanted. That’s private credit, joint-venture architecture, and infrastructure finance dressed in a hoodie and trying to pass as a platform story.
AI companies keep drifting upstream, because waiting politely for the utility has become an intolerably slow way to run an arms race.
Alphabet agreed to buy Intersect for $4.75 billion in cash plus debt, pulling an energy-and-data-center developer into Google's orbit. OpenAI and SoftBank each invested $500 million into SB Energy; OpenAI had originally signed a 1.2-gigawatt lease with SB Energy for the first Stargate buildout, a lease that has since been canceled. A normal tech budget ends at "secure capacity." This budget — one that's far from normal — keeps marching until it reaches the people who can secure land, generation, power, and campuses in the first place.
The site plans have started sounding strange, too. Data Center Frontier reported that on-site generation is being treated as primary infrastructure in parts of the market. Crusoe’s longer-range vision includes smaller inference-focused data centers around the country, some operating entirely on their own power grid with solar and batteries. Grid connection, once a utility problem, keeps reappearing as product design. AI capacity now comes with side quests in whatever form of electricity can show up on time.
A lot of this buildout now lives in giant leases, structured equipment deals, private-credit arrangements, and capital stacks that would’ve sounded much more at home around airports, pipelines, and merchant power plants than around model training. The spreadsheet and the substation have started sharing a desk. AI doesn’t just need hardware anymore. It needs financing clever enough to keep feeding hardware into a market with too much demand and too many choke points.
Weirder things to become
Then the surrounding economy starts changing species. Boom Supersonic's identity was built around romance, speed (and lots and lots of it), and a fantasy of making air travel glamorous again. Now its jet-engine-derived turbines are being sold as data-center power: Crusoe ordered 29 of them as part of a 1.21-gigawatt on-site power strategy for AI campuses. Baker Hughes $BKR signed on to provide matching generators. AI spending has grown large enough to hand aircraft companies a second act as utility substitutes. The market has a brutal sense of humor.
EV-battery manufacturers are retooling factories to make storage modules for AI data centers as their core market cools. Bitcoin miners are converting server fleets into AI data centers, leaning on power access and facilities left over from crypto's hangover. Williams, the pipeline company, has explored buying gas-producing assets so it can offer hyperscalers a fuller package of fuel, transport, and power; the company already has a 440-megawatt Ohio project tied to Meta and two more Ohio projects costing around $3.1 billion. AI keeps drafting new suppliers faster than the old ones can build.
AI’s capital binge hasn’t just expanded. Expansion is the boring part. Money is now arriving as giant bets on rival theories, as leases and bonds and residual guarantees, as energy-developer acquisitions, as supersonic turbines, as repurposed jet engines, as pipeline logic, as private-power improvisation. A spending spree this large is taking over the neighboring businesses, the neighboring balance sheets, the neighboring politics, the neighboring supply chains.
By the time a boom is recruiting aerospace engineers, battery factories, nuclear pragmatists, world-model evangelists, and sovereign wealth funds into the same sentence, the shape of the thing has become hard to miss. AI capex still buys chips, concrete, and cooling towers. It also keeps turning up in stranger places and stranger forms because the basic appetite has outgrown the neat version of itself. A spree with enough money and enough impatience starts bending the nearby economy until somebody, somewhere, decides it’s now in the AI business.
The chips were just the opening bid. Now, AI’s shopping list is weird. The build is weird. And the company it keeps has gotten weird, too.