OpenAI president Greg Brockman published a lengthy post on X this week arguing that the global economy is shifting to one fundamentally powered by compute. Software engineering, he wrote, is in a "renaissance," and AI is about to do the same thing to every other kind of computer-based work. The timing is not subtle.
Brockman's post landed just as speculation around OpenAI's next model, internally codenamed Spud, was reaching a fever pitch. In an extended interview on the Big Technology Podcast published in early April, Brockman called Spud the product of "two years of research" with what he described as a "big model feel." Sam Altman reportedly told employees the model could "really accelerate the economy." Pre-training wrapped around March 24 at OpenAI's Stargate facility in Abilene, Texas, according to multiple reports.
So when Brockman writes that problem-solving capacity will be "bound by the amount of compute you have access to," he is not musing philosophically. He is selling.
The thesis, and the product pitch hiding inside it
The post itself reads like a polished version of themes Brockman has been workshopping for months. At CES in January, he told AMD CEO Lisa Su that GDP growth would soon track compute availability. "I would love to have a GPU running in the background for every single person in the world," he said at the time, a line that sounds aspirational until you remember OpenAI's $1.4 trillion infrastructure commitment.
His X post extends that argument. Computers, Brockman wrote, are shifting from tools humans must contort themselves to operate into systems that understand intent and execute autonomously. Friction is disappearing. Small teams can now do what large ones used to. People are less focused on managing tools and more on creating, and that, Brockman claims, is bringing "joy" back to work.
It is a nice framing. It is also a framing that conveniently positions OpenAI's product roadmap as the inevitable future of labor.
What Spud actually is (and isn't)
Brockman was more concrete in the Big Technology interview. Spud is a new base model, a fresh pre-training run that consolidates research OpenAI has been doing since roughly 2024. "I think of Spud as a new base, as a new pretrain," he said, distinguishing it from the iterative fine-tuning that produced the GPT-5 series. Whether it ships as GPT-5.5 or GPT-6 depends on how large the performance gap over GPT-5.4 turns out to be. OpenAI hasn't committed to a name.
The company shut down Sora, its video generation tool, on the same day pre-training completed. Reports from multiple outlets described Sora as burning roughly $15 million a day in inference costs against negligible revenue. Those GPUs got redirected. A billion-dollar Disney licensing deal, gone. OpenAI renamed its product division "AGI Deployment," which is either a bold organizational signal or the kind of internal branding that may come to look overconfident in retrospect.
Brockman also claimed in the podcast that AGI is "70-80% there" by his personal definition, which is a number that conveys certainty without actually committing to anything falsifiable. He's said text-based models will "go to AGI" and that OpenAI has "line of sight to much better models coming this year." Confident language, though the company said similar things ahead of GPT-5.
So what does the compute economy actually look like?
Brockman's post gestures at institutional disruption, changing career trajectories, new waves of entrepreneurship. He acknowledges risks and calls for broad distribution of AI's benefits, which is standard-issue language from anyone building the thing that might cause the disruption.
The more interesting question is whether "compute" as the binding constraint on economic output is a useful framework or a self-serving one. OpenAI has committed roughly $1.4 trillion to infrastructure through the Stargate project and related deals. Brockman personally oversees much of this buildout. AMD CEO Lisa Su described his focus on compute as "maniacal." If compute is the new oil, Brockman is both the geologist and the drilling company.
There is a real observation buried in the hype. AI coding tools did change meaningfully around December 2025. Brockman noted in a February post that OpenAI engineers told him their jobs had "fundamentally changed" since then, with Codex writing most of their code. That tracks with what developers outside OpenAI report, too. But the leap from "AI makes coding faster" to "GDP growth will be determined by compute access" is enormous, and Brockman offered no evidence for the latter at CES or in this post.
TechCrunch noted at the time that Brockman provided no examples when making the GDP-compute claim. That hasn't changed.
The timing problem
Brockman's post arrived during a window where OpenAI needs to control the narrative. Competitors have closed the gap. Claude Opus 4.6 and Gemini 3.1 Pro both outperform GPT-5.4 on certain benchmarks. Reports describe an internal "Code Red" posture at OpenAI since December 2025. An IPO process appears imminent. And a fraud trial in Oakland, where Brockman is a named defendant alongside Altman, is set for April 27.
Posting a manifesto about compute-driven economies right before your next big model drops and right before you go to trial is, at minimum, good PR timing. At maximum, it's a framework designed to make OpenAI's spending look like infrastructure investment rather than a bet that needs to pay off before the money runs out. The company reportedly generates about $13 billion a year in revenue against a $1.4 trillion infrastructure commitment. That math requires a lot of faith in the compute economy thesis.
Spud's release, according to Polymarket, has a 78% probability of happening before April 30. If the model delivers, Brockman's manifesto looks prescient. If it doesn't, it's a blog post about how important it is to buy the thing the author is selling.