
On February 27th, Caixin News Agency reported that growing evidence suggests the tight supply of artificial intelligence chips is easing, and that some companies which bought large quantities of Nvidia H100 80GB processors are now trying to resell them.
The delivery lead time for Nvidia's H100 GPU for artificial intelligence (AI) and high-performance computing (HPC) applications has reportedly shortened significantly, from 8-11 months to 3-4 months.
According to reports, some companies are reselling their H100 GPUs or cutting orders as the chips become less scarce and as the cost of holding unused inventory mounts. This marks a sharp reversal from a year ago, when obtaining Nvidia's Hopper GPUs was a major challenge.
The easing shortage of AI processors is also reflected in the fact that renting Nvidia H100 GPUs from cloud service providers such as AWS, Google Cloud, and Microsoft Azure has become easier. AWS, for example, has launched a new service that lets customers book shorter GPU rentals, addressing earlier availability problems and reducing the wait for AI chips.
Despite improved availability and significantly shorter delivery times, demand for AI chips still far exceeds supply.
Companies that develop and train large language models in-house face supply constraints in particular, largely because of the sheer number of GPUs they require; they may still wait several months for the processors or capacity they need.
As a result, prices for the Nvidia H100 and other processors have not fallen, and Nvidia continues to enjoy high profit margins.
However, as alternatives to Nvidia's processors emerge, such as chips from AMD and AWS, the market may reach a more balanced state. Companies have also become more cautious in their spending on AI processors.
For now, though, market demand for AI chips remains strong, and as large language models keep growing, so does the demand for computing performance.