Nvidia has reported record quarterly revenue of US$39.3 billion for its fourth quarter ended January 26, representing a 78% increase from the same period last year and a 12% rise from the previous quarter. The results, which cap a fiscal year that saw the company's revenues more than double to US$130.5 billion, come amid an unprecedented surge in demand for AI compute infrastructure as businesses and cloud providers race to develop and deploy generative AI applications.
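The growth figures above can be cross-checked with a line of arithmetic. The sketch below uses only the numbers in the article (US$39.3 billion, 78% year over year, 12% quarter over quarter) to back out the implied prior-period revenues; it is illustrative arithmetic, not reported data.

```python
# Sanity check of the reported growth rates using only figures from the article.
q4_revenue = 39.3   # US$ billions, fiscal fourth quarter
yoy_growth = 0.78   # 78% increase year over year
qoq_growth = 0.12   # 12% increase quarter over quarter

# Implied revenue in the year-ago quarter and the prior quarter
year_ago_quarter = q4_revenue / (1 + yoy_growth)
prior_quarter = q4_revenue / (1 + qoq_growth)

print(f"Implied year-ago quarter: ~US${year_ago_quarter:.1f}B")  # ~US$22.1B
print(f"Implied prior quarter:    ~US${prior_quarter:.1f}B")     # ~US$35.1B
```

The implied prior-quarter figure (~US$35.1 billion) is consistent with the full-year total of US$130.5 billion the article cites, which averages to well over US$30 billion per quarter.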
The remarkable growth trajectory has transformed Nvidia from a gaming hardware company into the world's most valuable chip maker, with a market capitalization that has surpassed that of long-established tech giants. This transformation has been driven by the company’s strategic pivot towards data centre products that power AI workloads, with its GPUs becoming the industry standard for training and running large language models.
The company’s Blackwell architecture, revealed at GTC last year and launched in the quarter, delivered US$11 billion in revenue in what Colette Kress, Nvidia’s Executive Vice President and CFO, described as “the fastest product ramp in our company’s history.”
“Demand for Blackwell is amazing as reasoning AI adds another scaling law – increasing compute for training makes models smarter and increasing compute for long thinking makes the answer smarter,” said Jensen Huang, Founder and CEO of Nvidia. “We’ve successfully ramped up the massive-scale production of Blackwell AI supercomputers, achieving billions of dollars in sales in its first quarter.”
Earlier in the fiscal year, Huang had described interest in Blackwell as "insane." The chip, expected to be priced at $30,000 and above per unit, is highly sought after by major corporations such as OpenAI, Microsoft, and Meta, who are building AI data centres to support applications like ChatGPT and Copilot.
Nvidia has emerged as a significant winner in the AI surge, with its stock rising roughly 150% over the calendar year. The momentum was visible well before the record fourth quarter: revenue had already climbed to $30.04 billion in the fiscal second quarter, up 122% from the same period a year earlier, and the company at the time projected sales of $32.5 billion for the following quarter.
Benchmark results show Blackwell chips are more than twice as fast as the previous Hopper generation, underlining Nvidia's continued dominance in AI training. The benchmarks focus on AI training — the phase where systems learn from vast datasets — which remains a key competitive frontier despite the market's growing focus on AI inference.
One major finding was that Nvidia and its partners were the only ones to submit data for training a large-scale model like Llama 3.1 405B, an open-source AI system from Meta Platforms with 405 billion parameters. According to the data, Nvidia's new Blackwell chips are more than twice as fast per chip as the previous-generation Hopper chips. In the fastest result, a cluster of 2,496 Blackwell chips completed the training task in just 27 minutes. By contrast, more than three times that number of Hopper chips were needed to match or better that performance.
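A quick back-of-envelope calculation shows why the cluster-level and per-chip figures differ. The 2,496-chip, 27-minute result and the "more than three times" Hopper comparison are from the article; the per-chip speedup factor below is an assumption standing in for "more than twice as fast per chip".

```python
# Back-of-envelope reading of the benchmark comparison.
blackwell_chips = 2496   # reported Blackwell cluster size
blackwell_minutes = 27   # reported training time

per_chip_speedup = 2.2   # assumption: "more than twice as fast per chip"

# Under ideal linear scaling, matching the run would take roughly
# per_chip_speedup times as many Hopper chips:
ideal_hopper_chips = blackwell_chips * per_chip_speedup
print(f"Ideal-scaling Hopper estimate: ~{ideal_hopper_chips:,.0f} chips")

# The article reports that in practice MORE than three times as many
# Hopper chips were needed, i.e. over 3 * 2,496 = 7,488 chips:
reported_floor = 3 * blackwell_chips
print(f"Reported lower bound: > {reported_floor:,} chips")
```

The gap between the ideal-scaling estimate and the reported lower bound reflects that large training clusters scale sub-linearly: communication and synchronisation overheads grow with chip count, so the cluster-level advantage can exceed the per-chip advantage.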
Morgan Stanley analyst Joseph Moore described the surge in interest for Nvidia’s Blackwell architecture as "exceptional demand," with orders already outpacing supply. Moore points to growing demand from hyperscalers and large enterprise buyers, who are investing in infrastructure faster than anyone expected — a surge driven by the need to support both inference and large-model AI workloads.
Nvidia argues that Blackwell's gains in performance and energy efficiency translate directly into higher returns on investment for AI data centre operators.


