Nvidia has been on a tear. Just in case you've been living under a nice mossy rock the past few years, the advent of generative AI has ushered in a golden era for the company. Let's talk some truly eye-watering facts and figures: during Nvidia's Q2 2025 earnings call, the company reported a total revenue of $46.7 billion. Okay, so it's more like a diamond-encrusted-with-foie-gras-on-the-side era, but it's been far from smooth sailing.
As such, with the next-generation Rubin platform just off the horizon and data centre demand continuing to grow, the advice offered to buyers during the aforementioned earnings call was blunt: "We advise our partners, our customers to pace themselves."
But, specifically referring to Nvidia's H100 and H200 GPU clusters often used in AI data centres, Nvidia CEO Jensen Huang had this to say: "Right now, the buzz is — I'm sure all of you know about the buzz out there. The buzz is everything sold out."
However, it's worth noting that sold-out hardware is, by definition, hard to get hold of. As such, Huang reassured investors that Nvidia is taking steps to avoid similar scarcity with Rubin later this year.
In his closing remarks, Huang said, "Our next platform, Rubin, is already in fab. We have six new chips that represent the Rubin platform. They have all taped out at TSMC. Rubin will be our third-generation NVLink rack scale AI supercomputer, and so we expect to have a much more mature and fully scaled up supply chain."
Given the constantly shifting US tariff situation and its far-reaching effects on the wider geopolitical stage, it's hard to comment on the projected success of even the best laid plans. And then there are the recent moves of at least one major player to consider: could they herald the shape of things to come for AI's place within big tech?
If Huang suspects as much, he didn't let on, instead saying, "Blackwell and Rubin AI factory platforms will be scaling into the $3 trillion to $4 trillion global AI factory build out through the end of the decade."
