OpenAI's ambitions for its ChatGPT artificial intelligence platform are rising out of the Texas brushland in Abilene, where a complex the size of New York's Central Park embodies the firm's plans to spend at least $1 trillion on data centers to meet exploding demand.
But something else is also sprouting: questions about how OpenAI will pay for all this commercial real estate development.
To position itself at the forefront of such a buildout, OpenAI has inked blockbuster deals in recent weeks to build AI networks that increase processing power, or “compute.” In the largest deal, the firm landed a $100 billion investment from chipmaking giant Nvidia. Its eye-popping size sparked questions about whether Nvidia, with the seemingly limitless pockets that come with a stock market value of nearly $4.7 trillion, is spending unprecedented sums merely to prop up purchases of its own products.

“The crux of the concern is whether Nvidia is juicing demand for their own equipment that isn’t real or sustainable,” Neuberger Berman analyst Jamie Zakalik told CoStar News.
But Zakalik was quick to add that the true relationship between the companies is more nuanced. Others close to Nvidia stress that the recently announced deal represents just one piece of the sweeping growth plans for OpenAI, even as its ChatGPT already has 800 million users and counting. “We do not require any of the companies we invest in to use Nvidia technology,” an Nvidia spokesperson said in an email to CoStar News.
The major sums, and the debate the plans have sparked, reflect the potentially huge stakes involved. OpenAI CEO Sam Altman and some other AI executives say artificial intelligence, supported by a growing amount of data center real estate, could outperform the smartest humans in some areas within about three years.
But in the race to reach artificial general intelligence, or AGI, the industry has embarked on perhaps the costliest buildout in world history, one that experts have likened to the construction of the U.S. interstate highway system, with some going so far as to invoke the Industrial Revolution.
In a podcast a few days after announcing the deal, Nvidia CEO Jensen Huang stressed that it was “an opportunity to invest in OpenAI,” which is “very likely going to be the world’s next multitrillion-dollar hyperscale company. And who doesn’t want to be an investor in that?”
OpenAI has also signed deals with other chipmakers, including one inked in the past week with Broadcom to develop and deploy 10 gigawatts of chips over the next four years. That deal came days after OpenAI said it would partner with chipmaker Advanced Micro Devices to collaborate on data centers that run on AMD’s newest processors.
Another technology funding bubble?
The feverish spending spree and the circular nature of the deals have conjured up memories for some of the dot-com bubble of the late 1990s, when some firms that had borrowed money from their vendors went out of business.
Even Altman himself acknowledged in an interview with tech reporters in August that “someone is going to lose a phenomenal amount of money” as a result of the hype around AI. Altman compared the current hype to the tech boom of 25 years ago, but noted the internet did end up being “a really big deal.”
However, the internet didn’t change the world as fast as some pioneers said it would. Statistics from the early internet years claimed traffic was doubling every 100 days, but later research showed that traffic was doubling about every year.
This initial estimate in part led to an overinvestment in fiber-optic lines that left behind a glut of “dark fiber.” By some estimates, 85% to 95% of those cables went unused after the bubble burst.
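To see how far apart those two growth claims were, here is a quick illustrative compounding calculation; the only inputs are the doubling periods cited above, and the short Python sketch is purely for illustration, not part of the original reporting.

```python
# Compare the two growth claims cited above: traffic doubling every 100 days
# versus doubling roughly once a year. Simple compounding, illustrative only.

DAYS_PER_YEAR = 365

def annual_growth_factor(doubling_period_days: float) -> float:
    """How many times traffic multiplies in one year for a given doubling period."""
    return 2 ** (DAYS_PER_YEAR / doubling_period_days)

claimed = annual_growth_factor(100)   # "doubling every 100 days" -> ~12.6x per year
observed = annual_growth_factor(365)  # "doubling about every year" -> 2x per year

print(f"Claimed growth: ~{claimed:.1f}x per year")
print(f"Later-documented growth: ~{observed:.1f}x per year")
print(f"The claim implied roughly {claimed / observed:.0f} times more annual growth")
```

The gap between roughly 12-fold and 2-fold annual growth helps explain why capacity built against the more aggressive claim ended up as unused dark fiber.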
There's a similar "gold rush climate" taking hold these days with the push to be at the forefront of the real estate needed to power artificial intelligence, according to Howard Huang, a market intelligence analyst at Avison Young’s San Francisco office who tracks data centers.
“Everyone is looking at data centers as their next cash cow," Huang said.

In addition to companies like OpenAI and competitor Anthropic that are known for their artificial intelligence-powered language models, technology giants such as Microsoft, Amazon and Meta are spending record sums to build out these properties.
If everything goes right, these networks will power what those companies hope will be cutting-edge, in-demand products.
Some analysts note that if this demand for data centers and artificial intelligence is overblown or doesn't expand as quickly as predicted, these technology companies may face a glut of underutilized real estate that could be hard to fill, due to the specialized nature of the properties.
This wouldn't be the first time that tech firms have been forced to address such an issue. Companies like Google, Apple and Meta shed millions of square feet of office space in the wake of the pandemic, after those firms gobbled up more real estate than they needed, and as they shifted their investment priorities to AI.
However, those companies are now part of the office sector's turnaround across the country, with some of the biggest names in the industry closing some of the largest office purchases since the pandemic and adding momentum to the rebounding U.S. office market.
Similarly, the soaring demand for data centers has helped steer the industry out of the dark days of the pandemic.
Avison Young downplayed concerns of an AI bubble, echoing the view of other big real estate firms. “This is a transformative technology,” Huang said.
Circular concerns
The danger of circular financing — also known as round tripping or vendor financing — was a major lesson of the early rise and fall of the internet economy.
During the dot-com boom at the turn of the 21st century, telecom companies vied with one another to lend money to internet startups to buy their gear. Those deals added to inflated expectations for the industry.
Companies such as Cisco, Lucent and Nortel extended millions, even billions, to finance cabling and switching equipment for firms that later folded when the bubble burst.
Similarly, Nvidia and OpenAI's deal "will clearly fuel ‘circular’ concerns,” Stacy Rasgon, an analyst with Bernstein Research, wrote in an investor note.

Such worries have long followed Nvidia, analysts noted. The company expanded from a little-known Silicon Valley semiconductor outfit to the world’s most valuable publicly traded company at the forefront of the AI boom in part by investing in companies that depend on its chips.
OpenAI and rivals such as Meta, xAI, Google and Microsoft all rely on Nvidia’s chips. According to Zakalik, it was OpenAI’s Altman who approached Nvidia with the recent deal to finance what the company describes as “next-generation AI infrastructure to train and run its next generation of models on the path to deploying superintelligence.”
Nvidia “generates a lot of cash,” she said. And it’s “constantly looking for value-add ways to deploy that capital.”
Just this month, Nvidia was part of an investment group including BlackRock and Microsoft that agreed to buy Aligned Data Centers for $40 billion.
Nvidia in August disclosed in a financial filing that it owned $4.33 billion in holdings across several public companies, all involved in the AI sector.
Inside the math
That backdrop has Nvidia and OpenAI fielding questions about an arrangement that could help pay for real estate development, and the companies have promised to release more details down the line. But in short, Nvidia, the world’s leading chipmaker, plans to invest up to $100 billion in OpenAI, in $10 billion increments, to fund the startup's massive build-out of data centers that run on Nvidia’s hardware.
The data centers are expected to require millions of Nvidia chips and ultimately draw 10 gigawatts of power, enough to supply about 7.5 million homes simultaneously. In Abilene, Texas, OpenAI's first Stargate data center is up and running on Oracle Cloud Infrastructure and Nvidia chips.
For every $10 billion Nvidia invests in OpenAI, the startup will spend $35 billion on Nvidia chips, according to an analysis from New Street Research.
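Those reported figures lend themselves to a quick back-of-the-envelope calculation. The sketch below simply multiplies out the numbers cited above; the per-tranche chip spending is New Street Research's estimate and the homes-per-gigawatt comparison comes from the reporting, so treat both as illustrative assumptions rather than disclosed deal terms.

```python
# Back-of-the-envelope arithmetic using the figures cited in this article.
# Illustrative only: these are analyst estimates, not disclosed deal terms.

NVIDIA_TOTAL_INVESTMENT = 100e9   # up to $100 billion, per the announcement
TRANCHE_SIZE = 10e9               # invested $10 billion at a time
CHIP_SPEND_PER_TRANCHE = 35e9     # New Street Research estimate (assumption)

tranches = NVIDIA_TOTAL_INVESTMENT / TRANCHE_SIZE           # 10 tranches
implied_chip_spending = tranches * CHIP_SPEND_PER_TRANCHE   # ~$350 billion

TOTAL_POWER_GW = 10               # planned data center capacity, in gigawatts
HOMES_SUPPLIED = 7.5e6            # homes the article says 10 GW could supply
kw_per_home = TOTAL_POWER_GW * 1e6 / HOMES_SUPPLIED         # ~1.3 kW per home

print(f"Implied OpenAI spending on Nvidia chips: ${implied_chip_spending / 1e9:,.0f} billion")
print(f"Average power per home implied by the comparison: {kw_per_home:.2f} kW")
```

If the full $100 billion were deployed on those terms, OpenAI's chip purchases would come to roughly three and a half times Nvidia's investment, which is the arithmetic behind the "circular" concerns analysts describe.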
“Nvidia has been a key supplier of OpenAI since the beginning,” said Anshel Sag, an analyst with Moor Insights & Strategy. He added that the deal “ensures OpenAI gets access to the amount of compute it needs from Nvidia," which in return gets “a very large customer constantly building and boosting demand” for its technology.
If the full OpenAI investment is completed, it will represent Nvidia’s largest investment yet.
Last month, Nvidia said it was investing $5 billion in struggling chipmaking rival Intel, as the two Silicon Valley firms collaborate on technologies geared to high-demand data centers. Nvidia also recently announced more than $2.5 billion in investments in AI infrastructure projects in the United Kingdom.
In light of another $10 billion deal the ChatGPT parent recently struck with chipmaker Broadcom to build its own custom chips, some analysts also saw Nvidia’s deal with OpenAI as a way to shore up business from one of its biggest customers.
'Someone is going to lose'
The Nvidia and OpenAI deal has highlighted the gargantuan scale of the resources being thrown at artificial intelligence, a technology envisioned for decades that exploded three years ago with the launch of ChatGPT. The computational power required to run ChatGPT and other so-called large language models is notoriously expensive, and the future of some of the AI startups that have rescued San Francisco’s economy in the past year is murky at best, some analysts say.
Altman has made it clear that OpenAI is not expected to turn a profit anytime soon as it focuses on training its latest ChatGPT models and expanding access to the processing power needed to do so. Following the release of the newest version of ChatGPT in August, he said OpenAI should concentrate on expanding that processing power even if it means delaying profitability.
Last year, OpenAI expected about $5 billion in losses on $3.7 billion in revenue. OpenAI’s annual revenue is on track to jump significantly in 2025, but the company is still losing money. A report from Atlassian notes that while daily AI use has doubled in the past year, 96% of companies "have not seen dramatic improvements in organizational efficiency, innovation, or work quality."
It remains unclear how leading AI firms such as OpenAI and competitor Anthropic, which has its own language model technology, will ultimately generate meaningful revenue to keep pace with AI infrastructure spending. A report by Bain & Co. estimated that the companies funding the AI infrastructure buildout will need $2 trillion in annual revenue by 2030 to cover that spending, more than the combined 2024 revenue of Alphabet, Amazon, Apple, Meta, Microsoft and Nvidia.
Individuals are flocking to AI, but most are using free versions, analysts say, while investors race to pour more money into the chips powering the colossal network of data centers expanding across the country.
Companies such as Meta, Anthropic and Alphabet, Google’s parent, are also investing heavily in superintelligence, driven by the conviction that those who sit out the race will be history’s losers. An investment firm executive recently recounted on a podcast a quote from Google cofounder Larry Page: “I am willing to go bankrupt rather than lose this race.”