
American artificial intelligence cloud service provider Lambda announced on Thursday (April 4) that it has secured a loan of up to $500 million. What sets the deal apart is that the loan is collateralized by the Nvidia GPU (Graphics Processing Unit) chips the company owns.
The press release describes this innovative approach as a "special purpose GPU financing tool": the GPUs serve as collateral, and the loan is backed by the cash flows they generate. Lambda wrote that this marks an important milestone in the AI computing market, enabling the company to fund cloud deployments for thousands of users without requiring them to sign long-term contracts.
Lambda stated that the $500 million in funding will support the company's deployment of more Nvidia GPUs, allowing AI developers to train, fine-tune, and run inference on generative AI models in Lambda Cloud. "This pioneering financing tool opens up a new way to fund the deployment of tens of thousands of Nvidia GPUs," the company added.
"With the widespread integration of big language models and generative AI, the demand for computing continues to grow, and we are pleased to provide support to Lambda as they can meet the needs of AI engineers," wrote Don Trent, a senior executive at the globally renowned investment bank Macquarie Group, leading the financing
With the arrival of the AI wave, training general-purpose large models and industry-specific large models, building industry applications on top of general models, and running inference all require massive amounts of AI computing power. Demand for AI chips, AI servers, and cloud computing capacity will keep rising, and Nvidia's H100 is widely regarded as the GPU most needed for training models.
As early as the launch of the H100 chip, Nvidia chose Lambda and CoreWeave as the first companies to use it, and Nvidia also participated as an investor in Lambda's financing rounds. It is worth noting that Lambda will also be among the first cloud service providers to deploy the B200 chip.
One analysis suggests that by selling GPUs to cloud startups, Nvidia is expanding its customer base beyond giants such as Amazon and Microsoft. In addition, by supporting smaller cloud providers such as Lambda Labs, Nvidia can foster competition and further consolidate its own position.
Last month, Nvidia CEO Jensen Huang told the media that the B200 may be priced between $30,000 and $40,000 and is expected to ship later this year. Huang said that global annual spending on data center equipment will reach $250 billion, and that Nvidia will capture a larger share of it than other chip makers.