On April 29, 2026, four tech titans—Amazon, Alphabet, Meta, and Microsoft—prepare to report their first-quarter earnings after the bell. And one name is expected to hang over every analyst question, every financial metric, every attempt to justify ballooning infrastructure costs: OpenAI.
Key Takeaways
- OpenAI’s rapid rise is forcing hyperscalers to defend $40 billion in combined AI-related spending this quarter
- Microsoft’s exclusive cloud partnership with OpenAI now directly influences its Azure growth narrative
- Amazon and Google are accelerating internal AI development while simultaneously investing in rivals to OpenAI
- Meta’s open-source strategy is being tested against commercial models trained on proprietary data
- Investors are questioning whether any hyperscaler can truly own the AI stack—or if they’re just funding the tools that might disrupt them
Microsoft’s $40 Billion Gamble Isn’t Paying Off—Yet
Microsoft’s financials this quarter aren’t just about cloud growth. They’re about whether its $13 billion investment in OpenAI was a masterstroke or a misallocation of capital. Since 2023, Microsoft has funneled billions into OpenAI while pushing Azure as the exclusive infrastructure backbone for OpenAI’s model training. But in Q1 2026, that bet remains unproven at scale.
Azure’s AI revenue still isn’t broken out separately in filings. Instead, Microsoft bundles it under “intelligent cloud,” a category that includes legacy enterprise software and server products. That lack of transparency is starting to grate on investors. Morgan Stanley analyst Erik Woodring, in a note distributed April 28, called the opacity “a red flag” given the magnitude of spending.
What’s clear: Microsoft is spending like it’s in a sprint. Its $40 billion commitment to build AI data centers with OpenAI by 2028 is already showing up in capital expenditures. CapEx for the first three months of 2026 hit $18.2 billion, up 37% year-over-year. That’s not just growth—that’s a full-scale infrastructure war.
But growth doesn’t equal returns. And right now, Microsoft can’t draw a direct line from OpenAI’s models to Azure’s bottom line, at least not in a way that satisfies shareholders. The company says 45% of Fortune 500 companies now use Azure OpenAI services. That sounds impressive until you realize only 12% of those customers pay for premium tiers with fine-tuning or custom deployment.
And here’s the irony: the more OpenAI’s models succeed, the more Microsoft risks becoming a utility provider for someone else’s intelligence. OpenAI controls the model stack. Microsoft owns the metal. But intelligence—pricing power, innovation velocity, customer lock-in—that’s increasingly concentrated in San Francisco, not Redmond.
Google and Amazon Are Betting Twice—And Hedging
While Microsoft doubles down on one partner, Alphabet and Amazon are playing both sides. Google’s internal Gemini team is under renewed pressure. CEO Sundar Pichai has publicly stated that Gemini is “core to our future,” but internally, teams admit they’re playing catch-up. In Q1, Google Cloud’s AI revenue grew just 18%—well below the 32% analysts expected.
Meanwhile, Google’s venture arm quietly increased its stake in Anthropic in January 2026. The investment, confirmed in a regulatory filing but not publicly announced, totals $750 million. Anthropic, OpenAI’s most direct rival, runs entirely on Google Cloud. That’s no accident. It’s a hedge: if Gemini stumbles, Google still profits from the demand OpenAI and its peers are creating.
Amazon’s Dual Path: Infrastructure and Investment
Amazon is in a similar bind. AWS remains the largest cloud provider by revenue, but its AI momentum lags. In Q1, AWS revenue grew 19%—solid, but down from 23% a year ago. And while AWS launched new AI chips and inference optimizations, it lacks a breakout model like GPT or Claude.
Yet Amazon is betting big elsewhere. Its $4 billion investment in Anthropic in 2023 has now been followed by an additional $1.2 billion in early 2026, according to a person familiar with the funding round. That brings Amazon’s total stake to over 20%, giving it board representation and co-development rights on future models.
But here’s the tension: AWS wants every AI company to run on its infrastructure. Yet its biggest investment is in a company that could one day bypass AWS entirely, much as OpenAI has begun loosening its dependence on Microsoft by pursuing data centers of its own. Amazon’s strategy isn’t just risky. It’s self-contradictory.
Meta’s Open-Source Gamble Could Backfire
Meta stands apart. It isn’t chasing exclusive partnerships. It’s betting that open-sourcing AI models will create ecosystem dominance. Since launching Llama in 2023, Meta has released Llama 2, Llama 3, and now Llama 4, each more powerful than the last. In Q1 2026, over 20,000 organizations reported using Llama models for production workloads, according to Meta’s internal tracking.
That’s a lot of adoption. But adoption doesn’t equal revenue. Meta’s ad business remains strong—up 22% in Q1—but its AI investments aren’t monetizing. And while developers love free models, enterprises are increasingly wary of legal risk and support gaps. Salesforce, for example, quietly shifted from Llama to Azure OpenAI in February 2026, citing “compliance and SLA requirements.”
Meta’s counterargument: by controlling the model standard, it can shape the tools, frameworks, and developer habits of the next decade. But that’s a long game. And in the short term, Meta’s R&D spend jumped to $7.8 billion in Q1—up 41% year-over-year—mostly on AI infrastructure and talent.
- Microsoft: $13B invested in OpenAI, Azure AI revenue still unquantified
- Google: 18% AI revenue growth, $750M in Anthropic, Gemini under pressure
- Amazon: $1.2B new Anthropic funding, AWS AI growth slowing
- Meta: 20,000+ Llama adopters, $7.8B R&D spend, no AI revenue
- All four: $40B+ in combined AI spending in Q1 2026
The Real Question: Who Owns the AI Stack?
This earnings season isn’t about revenue growth or user counts. It’s about control. Specifically: who owns the AI stack, and who’s just renting space in it?
OpenAI didn’t just build a model. It built a pricing model, a developer ecosystem, and a brand so strong that enterprises line up to pay premium rates. And it did it without owning data centers at scale. That’s the dream: intelligence as a service, unmoored from infrastructure.
The hyperscalers, meanwhile, are stuck in a paradox. They must fund AI’s growth—because if they don’t, they miss the wave. But the more they fund it, the more they empower independent players who can extract value without sharing it.
It’s not just OpenAI. It’s Anthropic. It’s Mistral. It’s the rise of specialized AI startups that pick and choose cloud providers based on cost, not loyalty. The stack is fragmenting. And the companies that spent 15 years consolidating power through cloud lock-in are now seeing that power erode.
Investors Are Done with Vague Promises
On April 28, 2026, Cathie Wood of ARK Invest tweeted: “If hyperscalers can’t show AI-driven margin expansion by Q3, we’re heading into a reckoning.” That sentiment is spreading.
Analysts aren’t asking for moonshots. They’re asking for line items. For clear KPIs. For proof that AI spending isn’t just a black hole of GPUs and talent acquisitions. When Alphabet reports, they’ll want to know: how much of Google Cloud’s growth came from Gemini? When Amazon speaks, they’ll ask: what revenue does Anthropic generate, and how does AWS benefit?
And Microsoft? They’ll be pressed on one question above all: if OpenAI builds its own data centers, what’s left for Azure?
“The risk isn’t that OpenAI fails. The risk is that it succeeds—and becomes the most valuable AI company in the world without being owned by the companies funding its infrastructure.” — Erik Woodring, Morgan Stanley, April 28, 2026
What This Means For You
If you’re a developer, this earnings cycle should change how you think about platform risk. Building on Azure OpenAI or Google’s Vertex AI gives you speed and support. But it also ties you to a vendor that may not control the model’s roadmap. Meanwhile, open models like Llama offer flexibility—but no guarantees on security, updates, or legal protection. Your choice isn’t just technical. It’s strategic.
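One practical way to manage that platform risk is architectural: keep the model provider behind a thin internal interface, so a later migration (say, from a managed API to a self-hosted open-weights endpoint) becomes a configuration change rather than an application rewrite. Here is a minimal Python sketch; all class names, model names, and endpoints are hypothetical stand-ins, and the vendor calls are stubbed rather than real SDK code.

```python
from dataclasses import dataclass
from typing import Protocol

# Hypothetical internal interface: application code depends only on
# ChatProvider, never on a specific vendor SDK.
class ChatProvider(Protocol):
    def complete(self, prompt: str) -> str: ...

@dataclass
class HostedProvider:
    """Stand-in for a managed API such as Azure OpenAI or Vertex AI."""
    model: str

    def complete(self, prompt: str) -> str:
        # A real implementation would call the vendor SDK here;
        # stubbed so the sketch stays self-contained.
        return f"[{self.model}] {prompt}"

@dataclass
class SelfHostedProvider:
    """Stand-in for an open-weights model (e.g. Llama) behind your own endpoint."""
    endpoint: str

    def complete(self, prompt: str) -> str:
        return f"[self-hosted @ {self.endpoint}] {prompt}"

# Swapping vendors is a one-line config change, not a rewrite.
PROVIDERS: dict[str, ChatProvider] = {
    "managed": HostedProvider(model="gpt-class-model"),
    "open": SelfHostedProvider(endpoint="http://models.internal/llama"),
}

def answer(prompt: str, tier: str = "managed") -> str:
    return PROVIDERS[tier].complete(prompt)
```

The point is not the stubs but the seam: only the two provider classes know any vendor details, so the lock-in surface stays small whichever way the model market shakes out.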
For founders and builders, the message is sharper: the infrastructure layer is becoming commoditized. The real value is in the interface, the data, the user experience. OpenAI proved that you don’t need to own servers to own the stack. If you’re raising money to build another GPU cluster, you’d better explain why you’re not just a cost center for someone else’s intelligence play.
What happens if one of these hyperscalers finally breaks rank and starts charging OpenAI for compute? Not as a partner—but as a customer?
Sources: CNBC Tech, original report


