In the heart of Silicon Valley, at the San Jose Convention Center in California, a long-awaited technology feast is taking place this week.
On March 18 local time, Nvidia CEO Huang Renxun, dressed in his iconic black leather jacket, took the stage of an arena packed with more than ten thousand attendees, looking every bit the rock superstar. "Welcome to the GTC conference. I hope you realize that this is not a concert, but a developer conference," he joked at the outset. Indeed, the industry has long hailed Nvidia's GPU Technology Conference (GTC) as the "Woodstock of the AI industry," but at this event the lively notes are chips, algorithms, computer architecture, and mathematics.
As a "shovel seller" in the AI era, Nvidia showcased many "black technologies" at the GTC conference. Among them, the new GPU chip based on Blackwell architecture has 4 times the performance of the previous generation and 25 times the energy efficiency improvement, making it the "strongest AI chip". The new AI microservices provide a one-stop service from application software to hardware programming, with a full stack layout on hardware, software, and systems.
More noteworthy, Huang Renxun repeatedly invoked the concept of "AI factories" in his speech, hoping that enterprises will look at data centers and AI tools from a different perspective. He believes the world is at the beginning of a new industrial revolution, one in which data is the raw material that goes in and valuable digital tokens come out, with the data center as the intermediate link that processes and refines the data. By Huang Renxun's estimate, this will be a market worth $250 billion a year.
Highlights abound at the AI event: the "strongest AI chip" Blackwell, humanoid robots, and Vision Pro digital twins
In 2016, Huang Renxun personally delivered the first GPU-powered DGX-1 supercomputer to OpenAI, in front of Musk and other OpenAI founding members. Today, every company competing in large language models and generative AI is scrambling for Nvidia's GPUs. In this wave of generative AI enthusiasm, Nvidia is without doubt the key "shovel seller" behind it.
At this week's GTC conference, Huang Renxun lived up to expectations, showcasing a series of cutting-edge technologies that once again set the AI industry alight.
[Chart by Daily Economic News; data compiled from public information]

At the event, Huang Renxun officially unveiled what has been called an "AI nuclear bomb": a GPU based on the Blackwell architecture. The Blackwell GPU offers 4 times the training performance, 30 times the inference performance, and roughly 25 times the energy efficiency of the previous-generation Hopper GPU. The rapid performance gains of Nvidia GPUs led Nvidia senior scientist Jim Fan to exclaim that a "new Moore's Law" has been born.
For example, training a model with 1.8 trillion parameters (GPT-4 scale) on Hopper-architecture chips might require 8,000 GPUs, 15 megawatts of power, and about 90 days. With Blackwell, only 2,000 GPUs and 4 megawatts of electricity are needed, a dramatic reduction in energy consumption.
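To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python that simply multiplies out the GPU counts, power draws, and the 90-day training window quoted above; the numbers are purely illustrative of the keynote's claim, not a measured benchmark.

```python
# Back-of-the-envelope comparison of the keynote's GPT-4-scale training example.
# The GPU counts, power draws, and 90-day window are the figures quoted above;
# this script only multiplies them out and is illustrative, not a benchmark.

def energy_mwh(power_mw: float, days: float) -> float:
    """Total energy in megawatt-hours for a constant power draw over `days`."""
    return power_mw * days * 24

configs = {
    "Hopper":    {"gpus": 8000, "power_mw": 15, "days": 90},
    "Blackwell": {"gpus": 2000, "power_mw": 4,  "days": 90},
}

for name, cfg in configs.items():
    mwh = energy_mwh(cfg["power_mw"], cfg["days"])
    print(f"{name:9s}: {cfg['gpus']} GPUs, {mwh:>8,.0f} MWh over {cfg['days']} days")

# Hopper   : 8000 GPUs,   32,400 MWh over 90 days
# Blackwell: 2000 GPUs,    8,640 MWh over 90 days  (~73% less energy)
```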
The industry expects the companies adopting the Blackwell series to include Amazon, Google, Meta, Microsoft, OpenAI, Tesla, and xAI. Huang Renxun revealed that the new GPU will be priced between $30,000 and $40,000, which will undoubtedly further drive Nvidia's revenue.
Nvidia also launched NIM (Nvidia Inference Microservices), a one-stop offering from application software down to hardware programming that lets developers easily build and deploy AI applications, accelerating the adoption of AI technology.
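As a rough illustration of what that one-stop model looks like for developers, the sketch below queries a NIM-style microservice through an OpenAI-compatible endpoint, which is how Nvidia documents NIM deployments. The endpoint URL, model identifier, and API key here are placeholders for illustration, not values confirmed by this article.

```python
# A minimal sketch of querying a locally deployed NIM microservice through its
# OpenAI-compatible REST interface. The base_url, model name, and API key are
# illustrative placeholders; consult NVIDIA's NIM documentation for real values.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed local NIM endpoint
    api_key="not-used-for-local-deploys",  # placeholder credential
)

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",        # example model identifier
    messages=[{"role": "user", "content": "In one sentence, what is an AI factory?"}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```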
In addition, Nvidia showcased a series of eye-catching AI application scenarios: ESMFold for protein structure prediction, the GR00T foundation model for humanoid robots and the Jetson Thor humanoid-robot chip, BYD's in-car computing platform, an Apple Vision Pro application on the Omniverse platform, and a digital twin of Earth's climate.
The GTC conference also drew enthusiastic support from Wall Street. Morgan Stanley analyst Harlan Sur believes the Blackwell architecture consolidates Nvidia's dominant position in AI, keeping it "still one to two steps ahead of its competitors." Bank of America analyst Vivek Arya said Blackwell-based GPU products "take a step forward in training performance and a leap in inference performance," maintaining a target price of $1,100 and a "buy" rating.
Goldman Sachs analyst Toshiya Hari believes Nvidia's strong innovation capabilities and extensive customer relationships will drive sustained growth, preserve its competitive advantage, and keep pressure on rivals; accordingly, he raised his price target from $875 to $1,000.
As of press time, Nvidia's shares had risen 7.35% this week, bringing the cumulative gain since the beginning of the year to 95.75%, on top of a 239% gain for the full year 2023.
Not just selling chips: Nvidia eyes a $250 billion market
"What we really sell is not chips. The chips themselves cannot work and need to be built into a system to run." While the super performance of Blackwell chips has sparked heated discussions, Huang Renxun has repeatedly emphasized this during the GTC conference. "Nvidia has built the entire data center for AI, just breaking it down into various parts, some of which are suitable for your company."
In Huang Renxun's view, Nvidia's market opportunity does not lie in GPU chips alone, since many companies are also developing GPUs. Nvidia's real competitive advantage is a data center solution that integrates chips, software, algorithm engines, security technology, chip-to-chip interconnects, and more.
"Data centers are rapidly moving towards accelerated computing (referring to efficient computing processing that utilizes dedicated hardware to surpass CPU conventional computing efficiency), which is a market worth $250 billion annually and growing at a rate of 20% to 25% annually. This is mainly due to the demand for AI, and Nvidia will occupy an important share," said Huang Renxun. He also stated that this is the confidence that Nvidia's market value can rise from $1 trillion to $2 trillion in 9 months.
Huang Renxun attaches such importance to the data center business because he believes data centers will play the central role in the new industrial revolution. That is why he repeatedly urged during GTC that data centers be understood with the mindset of "AI factories."
Huang Renxun drew an analogy to the electrification era to explain "AI factories": in the last industrial revolution, water went in and electricity came out; in this one, data is the raw material that enters the data center, where it is processed and comes out as data tokens. "This type of token is invisible and will be distributed around the world. It is very valuable."
Industry insiders note that what Huang Renxun calls an "AI factory" is essentially what domestic players call an intelligent computing center: a new kind of "power plant" that supplies computing power for large AI models and AI applications, rather than a traditional facility for storing and managing data.
Huang Renxun keeps stressing the "AI factory" concept because he wants enterprises to see data centers as profit-generating units rather than mere equipment investments, and thereby encourage more of them to accelerate deployment of Nvidia's services. "Data centers were once seen as cost centers and capital expenditures; you would think of them as a cost. But factories are another matter; they make money. The new world of generative AI will create a new kind of factory," he added.
With the explosion of generative AI, the data center business has become Nvidia's main growth engine. The latest quarterly report, released in February, showed data center revenue accounting for 83% of the total, up 409% year over year to $18.4 billion. The growth is closely tied to demand for Nvidia's Hopper GPUs for large-model training and inference, and with the introduction of Blackwell GPUs, Nvidia looks set to hold its position as AI's "shovel seller."
During GTC, Dell announced it would work with Nvidia to build Dell's "AI factory" and upgrade its flagship PowerEdge XE9680 server to support Nvidia's latest GPU architecture. Lenovo Group also announced a partnership with Nvidia to launch new hybrid AI solutions that will give developers access to the newly released Nvidia microservices, including NVIDIA NIM and NeMo Retriever. Microsoft CEO Nadella likewise said the GB200 Grace Blackwell processor will be deployed in Microsoft's global data centers to help organizations around the world put AI to work.
As Nvidia's share price soars, the question on Wall Street is how much further it can rise and whether there is a bubble. In the eyes of many, this round of the AI boom has only just begun, and Nvidia will keep growing.
Dan Ives, a well-known analyst at Wedbush Securities, said in an email comment to the Daily Economic News reporter: "The wave of investment set off by Nvidia's 'golden' GPUs has triggered years of spending across the technology industry. In this fourth industrial revolution, as business and consumer use cases spread globally, we expect AI spending to reach $1 trillion over the next decade. Demand for AI is currently indisputable, with Microsoft, Google, Amazon, Oracle, and Meta all posting astonishing capital expenditures on AI transformation, and Nvidia is the leader."
Ives also estimates that AI-related spending will account for 8% to 10% of corporate IT budgets in 2024, up from less than 1% in 2023.
It is also worth mentioning that, beyond Huang Renxun's keynote, the GTC conference featured speakers such as Li Feifei, a member of the National Academy of Engineering and the inaugural Sequoia Capital Professor at Stanford University, and OpenAI Chief Operating Officer Brad Lightcap, who shared insights on breakthrough developments in AI, accelerated computing, and related fields.