Stung by reports that Meta is negotiating to replace some of its hardware with Google’s Tensor Processing Units (TPUs), Nvidia has taken the unusual step of responding on social media. Shares fell 2.6% Tuesday as the chipmaker launched a rare, two-front public defense of its market dominance.
Posting directly to X, the company claimed its technology remains a “generation ahead” of custom silicon. Simultaneously, executives circulated a private memo to analysts disputing “accounting fraud” allegations from Michael Burry, the investor made famous by the movie The Big Short.
The Defensive Crouch: Nvidia Breaks Silence
Breaking from its usual aloofness, the chipmaker reacted sharply to a sell-off that saw its stock fall as much as 7.1% intraday before partially recovering.
Investors were rattled by the prospect of a fracture in Nvidia’s >90% market share.
Addressing the competitive threat posed by custom silicon directly, the company issued the defensive post on social media.
Nvidia stated that it “is a generation ahead of the industry” and “the only platform that runs every AI model and does it everywhere computing is done.”
We’re delighted by Google’s success; they’ve made great advances in AI and we continue to supply to Google.
NVIDIA is a generation ahead of the industry; it’s the only platform that runs every AI model and does it everywhere computing is done.
NVIDIA offers greater…
— NVIDIA Newsroom (@nvidianewsroom) November 25, 2025
Such a public rebuttal marks a significant tonal shift for a company that typically ignores competitors. It signals that the “untouchable” narrative is fraying under pressure from vertically integrated hyperscalers.
Analysts noted the unusual timing of this defense. It arrives just as Google’s stock has risen ~16% since late October, driven by the success of the Gemini 3 update.
Such defensive posturing suggests Nvidia is “spooked” by the sudden viability of Google’s vertical stack.
The Hardware Coup: Meta & The TPU Threat
Underpinning this market anxiety is an alleged Meta deal: reports indicate the social media giant is in advanced talks to rent Google’s Tensor Processing Units (TPUs).
According to TrendForce, the arrangement would involve renting compute via Google Cloud starting in 2026.
If finalized, the partnership would reportedly expand to on-premise deployment of TPUs within Meta’s own data centers by 2027.
Validating Google’s “vertical integration” thesis, this move would prove that custom ASICs can replace general-purpose GPUs for top-tier workloads.
Google’s Gemini 3 model, trained entirely on Google TPUs, has achieved state-of-the-art performance. This empirically disproves the long-held assumption that Nvidia hardware is strictly necessary for frontier models.
Attempting to counter this narrative on technical grounds, Nvidia contrasts its general-purpose architecture with specialized chips, arguing that ASICs lack the flexibility required for rapidly evolving model architectures.
Central to this argument is the idea of “fungibility”: the ability to repurpose hardware for different tasks. Nvidia GPUs can switch between training, inference, and graphics rendering without any hardware changes.
However, hyperscalers like Meta and Google run workloads at such massive scale that specialization becomes an economic advantage. If a chip is 30% more efficient for a specific matrix multiplication task, that translates to billions in power savings.
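The economics behind that efficiency claim can be sketched with a back-of-envelope calculation. All of the inputs below (fleet power draw, electricity price) are illustrative assumptions, not reported figures; only the 30% efficiency number comes from the hypothetical in the text:

```python
# Back-of-envelope estimate of annual power-cost savings from a more
# efficient accelerator. All inputs are hypothetical, illustrative values.

def annual_power_savings(fleet_power_mw: float,
                         price_per_mwh: float,
                         efficiency_gain: float) -> float:
    """Return estimated annual savings in dollars.

    fleet_power_mw:  average accelerator fleet draw, in megawatts (assumed)
    price_per_mwh:   electricity price, in dollars per MWh (assumed)
    efficiency_gain: fraction of power saved for the same work (e.g. 0.30)
    """
    hours_per_year = 24 * 365  # 8,760 hours
    annual_energy_mwh = fleet_power_mw * hours_per_year
    annual_cost = annual_energy_mwh * price_per_mwh
    return annual_cost * efficiency_gain

# A hypothetical 5 GW fleet at $80/MWh with the article's 30% figure:
savings = annual_power_savings(fleet_power_mw=5000,
                               price_per_mwh=80,
                               efficiency_gain=0.30)
print(f"${savings / 1e9:.2f}B per year")  # roughly $1B annually
```

At that assumed scale a 30% gain saves on the order of a billion dollars a year in electricity alone, which is why specialization pencils out for hyperscalers even when general-purpose chips are individually more capable.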
Despite these claims of superiority, the market is seeing a divergence. Brian Kersmanc of GQG Partners highlighted this contradiction, noting that “the Nvidia argument is that they’re on all platforms, while arguably the most successful AI company now, which is [Google], didn’t even use GPUs to train their latest model.”
A Google spokesperson confirmed the shift, stating that “we are experiencing accelerating demand for both our custom TPUs and Nvidia GPUs. We are committed to supporting both, as we have for years.”
While diplomatic, the statement confirms that Google is aggressively encroaching on Nvidia’s turf. Meta is reportedly one of the world’s largest purchasers of H100 and Blackwell chips, making any reduction in orders a material financial risk.
The Financial Front: Battling the ‘Big Short’
Beyond the hardware war, Nvidia opened a second front against investor Michael Burry. The Scion Asset Management founder recently compared the company to Cisco during the dot-com bubble.
He argued that Nvidia is supplying the hardware for a build-out that will eventually suffer a severe correction.
Burry’s allegations center on “accounting fraud,” specifically targeting stock-based compensation and depreciation schedules.
In response, Nvidia circulated a seven-page memo to analysts, a move described as “quietly fighting” behind the scenes rather than issuing a public press release.
Clarifying the numbers, the document stated that actual stock buybacks totaled $91 billion, directly disputing Burry’s claim of $112.5 billion.
Addressing the core of the financial allegations, the memo stated:
“Nvidia does not resemble historical accounting frauds because Nvidia’s underlying business is economically sound, our reporting is complete and transparent, and we care about our reputation for integrity.”
In the same memo, Nvidia also denied using Special Purpose Vehicles (SPVs) or vendor financing to inflate revenue figures.
Engaging with a short-seller’s thesis at this level of detail is highly unusual for a company with a $4 trillion market cap. It suggests executives are concerned about the narrative taking hold among institutional investors.
The Broader Shift: ‘Rough Vibes’ in the AI Economy
Both the hardware and financial battles unfold against a backdrop of cooling “AI hype” and shifting enterprise priorities.
OpenAI CEO Sam Altman recently admitted to “rough vibes” and “economic headwinds” in a leaked memo.
Signaling the end of the “easy growth” era, Altman conceded that “Google has been doing excellent work recently in every aspect.”
As enterprise customers focus increasingly on ROI, cheaper, more efficient alternatives like TPUs become more attractive than expensive Nvidia clusters.
Google is clearly capitalizing on this. Its Nano Banana Pro branding for its enterprise AI features is just one example.
While the name is whimsical, the strategy is serious. It targets practical business use cases like “Thinking” mode and 4K image generation, prioritizing user utility over raw theoretical performance benchmarks.
Marc Benioff’s public defection from ChatGPT to Gemini 3 underscores this shift in sentiment among tech leaders.
Praising the new capabilities, the Salesforce CEO stated this week that he was not going back to ChatGPT after switching to Gemini.
Holy shit. I’ve used ChatGPT every day for 3 years. Just spent 2 hours on Gemini 3. I’m not going back. The leap is insane — reasoning, speed, images, video… everything is sharper and faster. It feels like the world just changed, again. ❤️ 🤖 https://t.co/HruXhc16Mq
— Marc Benioff (@Benioff) November 23, 2025
The market seems now to be moving from a “growth at all costs” phase to an “efficiency and integration” phase. This transition favors Google’s full-stack approach over Nvidia’s component-based dominance.

