AI chip giant Nvidia’s blistering growth in recent years has sprung from the data center. On Monday night, CEO Jensen Huang made the case that Nvidia’s next wave of expansion will encompass much more — stretching from the warehouse floor to city streets and beyond.

Huang’s slate of announcements and pronouncements on artificial intelligence at the annual CES conference in Las Vegas shores up our conviction that Nvidia is an essential stock to own, even after back-to-back years of monster gains that made it one of the world’s most valuable companies. The stock’s more than 5% pullback in Tuesday’s session strikes us as little more than a “sell the news” reaction. After all, shares closed at a record Monday.

Huang teed up plenty of exciting new products, including a new video game chip that uses AI to improve graphics rendering and one that can improve autonomous driving and robotics systems. However, the night was not just about Nvidia’s product roadmap. It was as much about presenting a broader roadmap for artificial intelligence, as the industry matures from generative AI chatbots — such as ChatGPT, which catalyzed this boom two-plus years ago — and into new frontiers like “agentic AI” and “physical AI.” Agentic AI refers to software applications that can autonomously execute multi-step tasks, while physical AI is defined by self-driving vehicles and general robotics.

Of course, we’re only scratching the surface of agentic AI — consider fellow Club name Salesforce’s buzzy new offering Agentforce — and seeing early glimpses of physical AI through Google-owned Waymo’s robotaxis, Tesla’s advanced driver assistance system and occasional behind-the-scenes looks at humanoid robots made by the likes of Boston Dynamics and other startups.
Still, it’s clear that despite Nvidia’s tremendous growth since the launch of ChatGPT sparked a gold rush for its market-leading data center chips, its best days are ahead of it. Nvidia will realize that potential because it not only provides the hardware needed for agentic and physical AI applications, but also the accompanying software that runs atop its chips.

While these areas are long-term opportunities, Huang also furnished ample reason to be optimistic on the near term. That includes taking an axe to the price of high-quality performance in the video game market, which is where Nvidia began selling its innovative graphics processing units (GPUs) in the 1990s before finding a home in the world of AI. Specifically, Huang announced that Nvidia applied its next-generation data center chip architecture, known as Blackwell, to its latest video game GPU. Nvidia’s new entry-level gaming GPU, the Blackwell-based RTX 5070, will offer comparable performance to its prior generation’s highest-end offering, the RTX 4090, with half the power consumption. That can translate into a roughly 40% improvement in battery life for gaming laptops that run the chips. Utilizing AI-enabled image rendering technology pioneered by Nvidia, the 5070 chip will cost only $549 — the 4090, by contrast, is sold at $1,599.

Nvidia’s stock performance over the past 12 months compared with the S&P 500.

Nvidia is applying that value proposition to more than just the consumer-oriented gaming chip. It’s happening inside the data center, which is on track to account for roughly 88% of companywide revenue in the 12 months ending in January, according to FactSet; gaming is projected to be about 9%. Huang reinforced that Blackwell-based data center chips are now in full production while arguing that the rationale for deep-pocketed customers such as Club holding Microsoft to upgrade to Nvidia’s latest hardware remains fully intact.
That explanation is centered on the concept of scaling laws and has become somewhat of a question mark for investors in recent months. Essentially, scaling laws say that the more compute power provided to AI models, the more effective they become. The way Huang frames it, it would be almost financially irresponsible not to upgrade — in contrast to concerns among some investors that tech giants are overspending on AI infrastructure. “The scaling laws are driving computing so hard that this level of computation — Blackwell versus our last generation — improves the performance per watt by a factor of four [and] performance per dollar by a factor of three. That basically says that, in one generation, we reduced the cost of training these models by a factor of three, or if you increase the size of your model by a factor of three, it’s about the same cost.”

In discussing Nvidia’s next frontiers, Huang focused much of his commentary on self-driving vehicles. “I predict that this will likely be the first multi-trillion dollar robotics industry,” he said. The CEO highlighted Nvidia’s new partnership with Toyota, noting that the world’s largest automaker by sales will use the chipmaker’s autonomous driving platform, dubbed Drive AGX, for its next-generation vehicles. He said Nvidia is expecting around $5 billion in automotive-related revenues in its upcoming fiscal year, up from about $4 billion this year. Keep in mind: Those figures likely include additional sales to the auto industry beyond what is officially reported in the company’s automotive segment, which primarily covers revenue from its self-driving platforms. Most likely, those additional transactions show up within the data center segment.

Huang also showed off a new model called Cosmos that is designed to help train AI systems for autonomous vehicles and robots, which are seeing increased adoption in places like warehouses and plants.
Rather than needing mountains of text data like the system that underpins ChatGPT, systems for physical AI need to be trained on videos of humans walking, hands moving and other things that occur in nature. Nvidia created Cosmos to make that training process more cost effective, Huang said. “The ChatGPT moment for general robotics is just around the corner,” he argued.

The software that Huang discussed also extends to agentic AI with the unveiling of “Nvidia AI Blueprints,” a suite of tools for developers to build custom agents for their own companies. While seemingly incremental right now, this fits into the larger narrative of Nvidia bolstering its high-margin software business to help smooth out some of the inherent lumpiness of hardware sales. In another software nugget, Nvidia tipped its cap to the work of fellow Club holding Meta Platforms, which created the popular open-source Llama large language models, or LLMs. Building on that success, Nvidia announced the Llama Nemotron family of LLMs to help businesses build and implement agentic AI applications.

Bottom line

The selling in Nvidia shares Tuesday does not change our reaction to Huang’s presentation on Monday night. It met, if not exceeded, our expectations. Nvidia’s gaming chips are now a better value than ever, and Blackwell is in full production and ready to drive another year of data center growth as generative AI use cases continue to expand and enterprises begin testing ways to use AI agents. Meanwhile, the foundation for physical AI is being set, and we appear to be on the verge of significant growth in the automotive space. The company has shown a desire to bring AI to all industries. Perhaps more important for the long-term investor, Nvidia is clearly working off of a higher-level roadmap for artificial intelligence that will drive the direction of its offerings in the years to come. Autonomous vehicles show how important that higher-level north star is.
That is an industry where Nvidia’s various hardware and software offerings work together to create a flywheel of continuous learning and improvement for these advanced operating systems — it’s hard to imagine Nvidia’s auto revenues going anywhere but up from here.

Huang’s keynote illustrates why we can confidently reiterate our position that over the long term, investors will be best served by owning a core position in Nvidia, rather than trying to trade in and out of the stock. While there will surely be pullbacks in the stock — just like we’re seeing Tuesday, or saw at various points last year — what’s clear is the world needs significantly more computing power than it currently has. That not only requires more powerful hardware but more capable software as well. As we see with the new generation of RTX gaming chips, the right pairing of the two leads to better performance, efficiency, and value for customers. That matters a great deal considering one of the biggest questions lingering over the AI boom is figuring out a way to power all the data centers being built. We plan to stick with Nvidia while we await the answer.

(Jim Cramer’s Charitable Trust is long NVDA, GOOGL and META. See here for a full list of the stocks.)

As a subscriber to the CNBC Investing Club with Jim Cramer, you will receive a trade alert before Jim makes a trade. Jim waits 45 minutes after sending a trade alert before buying or selling a stock in his charitable trust’s portfolio. If Jim has talked about a stock on CNBC TV, he waits 72 hours after issuing the trade alert before executing the trade. THE ABOVE INVESTING CLUB INFORMATION IS SUBJECT TO OUR TERMS AND CONDITIONS AND PRIVACY POLICY, TOGETHER WITH OUR DISCLAIMER. NO FIDUCIARY OBLIGATION OR DUTY EXISTS, OR IS CREATED, BY VIRTUE OF YOUR RECEIPT OF ANY INFORMATION PROVIDED IN CONNECTION WITH THE INVESTING CLUB. NO SPECIFIC OUTCOME OR PROFIT IS GUARANTEED.
Nvidia CEO Jensen Huang delivers a keynote address at the Consumer Electronics Show (CES) 2025, showcasing the company’s latest innovations in Las Vegas, Nevada, USA, on January 6, 2025.
Artur Widak | Anadolu | Getty Images