
Technology giant Microsoft (Nasdaq: MSFT) has officially entered the AI chip industry.
On November 15, Microsoft Chairman and CEO Satya Nadella delivered an almost hour-long opening keynote at the Microsoft Ignite global technology conference. As widely predicted, the speech revolved almost entirely around AI. On the hardware side, Microsoft unveiled Azure Cobalt 100, its first in-house CPU (central processing unit), along with Azure Maia 100, its first AI chip designed specifically for cloud training and inference. Both will first be used to power Microsoft's own cloud services.
On the same day, Microsoft's stock was little changed, edging up 0.04% to close at $369.67 and setting another record closing high. The company's total market value now stands at about $2.75 trillion.
According to Nadella, Cobalt 100 is based on the Arm architecture and has 128 cores. Billed as the "fastest CPU among all cloud providers," it is already running some of Microsoft's own workloads and will become available to customers next year. The much-anticipated self-developed AI chip, Azure Maia, also made its debut at the conference. Maia 100 is built on a 5 nm process and packs 105 billion transistors. Nadella said Maia 100 is designed to run large language models and help AI systems process huge volumes of data faster; it will first support Microsoft's own AI applications and later be opened up to partners and customers.
Foreign media analysts noted that Maia 100 and Cobalt 100 will challenge NVIDIA's and Intel's leading positions in AI chips and CPUs, respectively. Facing a shortage of top-end AI chips, Microsoft has finally joined the other internet giants in developing its own silicon. Google has been building its self-developed AI accelerator, the Tensor Processing Unit (TPU), since 2016 and reached its fifth generation in September of this year, while Amazon Web Services (AWS) announced Trainium, its self-developed chip for training AI models, in 2020.
At the conference, Nadella also took the opportunity to showcase Microsoft's close cooperation with the two chip giants NVIDIA and AMD. Azure will soon offer NVIDIA's latest H200 chip, while compute powered by AMD's MI300X AI accelerator became available to some Microsoft customers starting on the 15th.
NVIDIA CEO Jensen Huang also made an appearance at the conference, reviewing the two AI giants' full-stack cooperation from hardware to software in a round of mutual promotion, and announcing that Microsoft users will be able to build and deploy models on Azure using NVIDIA's AI foundry service.
Asked about the outlook for AI innovation, Huang said that generative AI is "the most important platform transition ever in the computer industry." In his view, the first wave of AI came from a group of startups such as OpenAI; the whole industry has now entered the second wave, enterprise-level AI driven by Microsoft Copilot; and the third wave will be the largest, as the world's heavy industries are digitized and come to benefit from generative AI.
Of course, Nadella did not forget to bring up "good friend" OpenAI in his speech.
Microsoft's Azure cloud will give users access to the latest OpenAI products as quickly as possible, including the GPT-4 Turbo and multimodal capabilities announced at last week's OpenAI developer conference, as well as fine-tuning for GPT-4. In addition, OpenAI's custom GPT feature, GPTs, will be integrated into Microsoft's Copilot lineup, supporting the launch of Copilot Studio for building custom AI assistants.
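For readers who want to try these services, the sketch below shows roughly how a GPT-4 Turbo deployment might be called through Azure OpenAI with the official openai Python SDK; the endpoint, API version, and deployment name are placeholder assumptions for illustration, not details from the announcement.

# Minimal sketch: chat completion against an Azure OpenAI GPT-4 Turbo deployment.
# The endpoint, API version, and deployment name below are placeholders, not
# values confirmed by the article; substitute your own Azure OpenAI settings.
import os

from openai import AzureOpenAI  # openai>=1.x Python SDK

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-12-01-preview",  # example version string; check current docs
)

response = client.chat.completions.create(
    model="gpt-4-turbo",  # hypothetical deployment name created in your resource
    messages=[{"role": "user", "content": "Summarize the Azure Maia 100 announcement."}],
)
print(response.choices[0].message.content)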
However, because their businesses overlap, the competitive relationship between Microsoft and OpenAI has also drawn attention. The Verge pointed out that Microsoft used the conference to officially rename Bing Chat as Copilot, perhaps to compete more directly with ChatGPT.
Undoubtedly, Microsoft's huge "circle of friends" has smoothed the way for many of the company's businesses. In another major announcement at the conference, Microsoft declared the general availability of Microsoft Fabric, its "one-stop" data platform. Fabric is designed to consolidate all of an enterprise's data assets and reshape how the enterprise works with data. Its AI assistant also integrates with Microsoft's office ecosystem, including Office and Teams, offering features such as one-click PowerPoint generation and connecting data pipelines with a range of SaaS (software as a service) vendors.
In addition, Microsoft announced a "MaaS" (Models as a Service) offering, under which users can call model APIs (application programming interfaces) directly through Microsoft's services to fine-tune or deploy various open-source large models. Nadella said Microsoft has partnered with major model developers such as Meta to offer well-known models, including Llama 2, as a service.
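As a rough illustration of the MaaS idea, the hypothetical sketch below calls a hosted Llama 2 chat endpoint over HTTPS; the URL, auth header, and payload shape are assumptions made for the example and do not reflect the actual Azure API contract.

# Hypothetical sketch: invoking a Llama 2 "model as a service" endpoint over HTTPS.
# The URL, auth header, and request body below are illustrative assumptions only;
# consult Azure's documentation for the real request format.
import os

import requests

ENDPOINT_URL = "https://<your-llama2-endpoint>.example.com/chat/completions"  # placeholder

resp = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {os.environ['AZURE_MODEL_API_KEY']}"},  # placeholder env var
    json={
        "messages": [{"role": "user", "content": "Explain Models as a Service in one sentence."}],
        "max_tokens": 256,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())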