
Traditional chip giants are betting on AI applications, while the big software and hardware technology companies are taking the final step of their own expansion: building their own chips.
In the early hours of November 16, Microsoft unveiled two self-developed chips, Maia 100 and Azure Cobalt 100, at its annual developer conference Microsoft Ignite, to power the training and inference of large language models. This officially closes the only remaining gap in Microsoft's ecosystem relative to Google and Amazon, both of which already have custom chips of their own.
Maia 100 is Microsoft's first artificial intelligence chip, designed primarily for training large language models. It is an ASIC (application-specific integrated circuit) intended to work with x86 hosts, is manufactured on TSMC's 5nm node, and integrates seamlessly with Microsoft's Azure software stack. The chip will roll out in Azure data centers starting early next year, with the aim of reducing Microsoft's dependence on Nvidia GPUs.
Microsoft's second chip, Azure Cobalt 100, is a 128-core, cloud-native chip based on the Arm architecture. It is designed for general-purpose computing tasks and will compete with Intel processors and Amazon's Graviton series chips. The chip is likewise compatible with the Azure ecosystem, and its main role is to run general-purpose workloads on Microsoft's cloud at lower cost.
In fact, there have long been signs of Microsoft's chipmaking ambitions. As early as 2010, Microsoft hoped to develop its own AI hardware. According to The Information, Microsoft has been developing an AI chip codenamed "Athena" since at least 2019, with the aim of providing an alternative to Nvidia chips for the training and inference of large language models such as ChatGPT. According to Tom's Hardware, Athena uses TSMC's 5nm process and is designed specifically for training large language models.
However, Microsoft has since denied some details of those reports, such as AMD's involvement, and has declined to disclose any further information about Athena.
Now that Maia 100 and Azure Cobalt 100 are both built on TSMC's 5nm process, it is hard not to see a connection between the new products and Athena.
Microsoft also noted that it does not intend to sell these high-end custom chips; instead, they will support its subscription software products as part of the Azure cloud computing service. According to Microsoft, it is already trialing the Cobalt chip with its team collaboration platform, Teams.
Against the backdrop of chip shortages and surging demand for AI applications, Microsoft's push to accelerate its custom chip plans is widely read by outside observers as the best choice available to a cloud computing giant with the resources to make it.
A person familiar with the matter once revealed that during the development of Athena, Microsoft had already ordered up to several hundred thousand GPUs from Nvidia to meet OpenAI's needs. In September, as the ChatGPT craze cooled, market reports repeatedly suggested that Microsoft had begun cutting its orders for Nvidia H100 cards. "Cost reduction" was also a recurring theme on Microsoft's October earnings call.
However, it is questionable whether two high-end chips can free Microsoft from its dependence on Nvidia. Beyond the fact that the true capability of the custom chips remains to be tested, Nvidia still featured in the other updates Microsoft released at the event. For example, Microsoft announced that Azure will offer Nvidia's new generative AI model foundry service to enterprise customers, which means the cooperation between the two will continue to expand, and also suggests that Microsoft does not want to completely break its tacit understanding with Nvidia.