The figure signals Huang's confidence that Nvidia can remain the biggest player in the market for AI chips amid growing competition and investor skepticism over whether its strategy for capturing profits across the AI ecosystem is paying off.
Huang did not elaborate on the forecast. But it represents a big step up from the roughly $500 billion revenue opportunity for 2026 that Nvidia reiterated on its last earnings call.
Shares of Nvidia, the world's most valuable listed company with a market value of more than $4.3 trillion, jumped on the news but pared those gains to 1.4% in late trade.
Huang was speaking at an 18,000-plus-capacity hockey arena, at an event that has become one of the largest showcases of AI technology.
The four-day conference is also expected to reveal how the top AI chipmaker plans to adapt to the rapidly changing AI landscape.
He began the keynote by arguing that part of Nvidia’s competitive advantage is its CUDA chip programming software, which some analysts consider its strongest shield.
“The installed base attracts developers who then create new algorithms that achieve breakthrough” technologies, Huang said. “We’re in every cloud. We’re in every computer company. We serve almost every single industry.”
In-demand AI chips
The keynote is also likely to include details of a next-generation AI chip called Feynman, named after the late American physicist Richard Feynman.
Huang is also likely to talk about data centers, digital assistants known as AI agents, and physical AI such as robots.
Another focus is likely to be on Groq, a chip startup from which Nvidia licensed technology for $17 billion in December. Groq specializes in fast and cheap “predictive” computing, in which an AI model takes what it has already learned and uses it to answer a question or make a prediction in real time.
After spending billions of dollars in recent years on chips to train their AI models, companies like OpenAI, Anthropic and Facebook parent Meta Platforms are turning to serving the millions of users who are tapping into those AI systems.
Nvidia faces more competition in the market for chips for predictive-computing tasks than for AI-training chips, and analysts expect the company to strengthen its defenses against rivals seeking to regain market share lost to Nvidia in recent years.
Analysts also expect Nvidia to elaborate on why it invested $2 billion each in Lumentum and Coherent, both of which make lasers to send information between chips in the form of beams of light.
Despite that increased competition, some of it coming from Nvidia's own customers designing their own chips, Nvidia remains central to the global AI ecosystem.
Nations like Saudi Arabia are building custom AI systems for their own populations using its chips, and it is one of the few major US companies that continues to release open-source AI software, an area of competition between the US and China.
