Google Vs. Nvidia: Inside The AI Hardware Showdown

Source: Forbes

Wall Street began 2025 expecting Alphabet (NASDAQ:GOOG), Google's parent company, to spend about $60 billion on capital expenditures.

So where is all this funding being directed?

Directly into AI infrastructure: servers, storage, power and cooling systems, along with a vast quantity of chips to support Search, Ads, YouTube, Gemini, and Google Cloud.

Regardless of where GOOG stock heads, your portfolio should remain on course.

In Nvidia's Q2 FY26 earnings, anonymous customers accounted for 39% of revenue (23% + 16%); it's a safe bet that's Microsoft, Google, and Amazon in some order.

The top three hyperscalers (Amazon AWS, Microsoft Azure, and Google Cloud) command over 60% of the global cloud market, making them premium customers for Nvidia.

However, there's a concern: GPU expenditures are likely increasing at a much faster pace than Google's cloud revenue.

What does this mean? Google is likely increasing its spending on chips faster than the revenue those chips generate. That can pressure cash flows and erode returns on investment over time, which is why Google wants to regain some control and rebalance the equation.
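To make that dynamic concrete, here is a minimal back-of-the-envelope sketch in Python. The starting amounts and growth rates are assumptions chosen purely for illustration, not Google's actual figures; the point is only that when chip spend compounds faster than the revenue it supports, the revenue generated per dollar of spend shrinks every year.

```python
# Hypothetical illustration: chip capex compounding faster than cloud revenue.
# All figures below are assumptions for illustration only, not Google's actuals.

chip_spend = 60.0      # assumed annual AI chip/infrastructure spend, in $B
cloud_revenue = 45.0   # assumed annual cloud revenue, in $B
spend_growth = 0.60    # assumed 60% annual growth in chip spend
revenue_growth = 0.30  # assumed 30% annual growth in cloud revenue

for year in range(1, 6):
    chip_spend *= 1 + spend_growth
    cloud_revenue *= 1 + revenue_growth
    # Revenue earned per dollar of chip spend: a crude proxy for return on that capex.
    ratio = cloud_revenue / chip_spend
    print(f"Year {year}: spend ${chip_spend:.0f}B, revenue ${cloud_revenue:.0f}B, "
          f"revenue per $1 of spend: ${ratio:.2f}")
```

Under these assumed rates, revenue per dollar of spend falls by roughly two-thirds over five years, which is the kind of erosion in capital efficiency the concern above points at.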

Google is implementing a dual-track strategy: leveraging Nvidia for flexibility while utilizing its own custom Tensor Processing Units (TPUs) for raw efficiency and cost management.

The AI landscape is transitioning from training (building models), which leaned heavily on high-performance GPUs, to inference (running those models billions of times a day). That is the domain of TPUs.

Google Is Well Positioned to Expand TPU Usage Both Internally and Externally

AI isn't merely a product for Google; it's everywhere: Search, Ads, YouTube, Gmail, Maps, Android, Gemini. Billions of near-identical inferences run every day, an ideal workload for less flexible but ultra-efficient TPUs.

TPUs already handle most of Google's internal workloads, and they are now rapidly expanding outward to external customers as well.

Will TPUs Help Lower Compute and Capital Expenditure Costs?

Very likely to do so.

As TPUs take on more workloads (within GOOG and with partners like Anthropic), Google gains real negotiating strength with Nvidia.

The outcome? More leverage and power for Google at the negotiating table.

Every additional workload that shifts to TPUs represents a workload for which Nvidia will not receive payment.

Conclusion

As previously mentioned, Google isn't aiming to eliminate Nvidia; it's essentially optimizing its reliance on the company.

Google's chips are a strategic power play aimed at redefining AI economics, controlling expenses, and maintaining leverage over the industry's premier chipmaker. That could lift Google's valuation further, particularly with the stock already up 50% this year.

When the leading AI product player in the world begins writing smaller checks to Nvidia, the landscape will transform.

The Trefis High Quality (HQ) Portfolio, a collection of 30 stocks, has a track record of comfortably outperforming a benchmark that combines all three indices: the S&P 500, S&P MidCap, and Russell 2000. Why? As a group, HQ Portfolio stocks have delivered better returns with less risk than the benchmark index, producing a steadier ride, as reflected in HQ Portfolio performance metrics.