
After eight years, the Yunqi Conference theme "Computing, for value that cannot be calculated" has returned. Photographed by Daily Economic News reporter Ye Xiaodan

After eight years, the theme of Alibaba Cloud's Yunqi Conference has returned to "Computing, for value that cannot be calculated". This is not just a comeback, but also a reshaping of China's leading cloud service provider amid the wave of artificial intelligence (AI).
"Large models are the hottest topic in every venue at this Yunqi Conference, and everyone is concerned about how large models will be deployed and about the technology gap between countries," an executive of an exhibitor invited to this Yunqi Conference told a reporter from Daily Economic News.
Since 2023, the development of GPT-style technology (deep learning models) has set off a new wave of AI enthusiasm in China. After nearly a year of development, large model technology and applications in China are showing increasingly diverse trends.
The practical deployment of large models and the underlying computing power technology have become a major focus of this conference. A Daily Economic News reporter found at the Yunqi Conference that, as large model technology continues to develop and iterate, it is leveraging more diverse business models, and at this point that may be only the beginning. At the same time, facing an ever-changing international situation, the domestic AI industry is also exploring new paths forward.
On October 31, Alibaba Group Chairman Cai Chongxin disclosed in a speech that 80% of China's technology companies and half of the country's large model companies currently run on Alibaba Cloud. From underlying computing power to AI platforms to model services, Cai Chongxin said Alibaba Cloud is undergoing comprehensive technological upgrades and innovation in the AI era.
Amid this wave of AI, Alibaba Cloud has made frequent moves in the large model field this year. Between advance and retreat, what kind of shift is Alibaba Cloud making?
On November 1, large model applications attracted numerous exhibitors and developers. Photographed by Daily Economic News reporter Ye Xiaodan

Large Models Run Hot, a Hundred Flowers Bloom
In late autumn, Hangzhou sat at 26°C, warm but not dry. The Yunqi Conference attracted tens of thousands of developers from around the world, and the exhibition halls were packed. This year's conference set up three themed pavilions, the Computing Power Pavilion, the Artificial Intelligence+ Pavilion, and the Industrial Innovation Pavilion, bringing together more than 200 cloud computing enterprises from across the industry chain and more than 3,000 technologies, products, and applications on display.
On November 1, at the Artificial Intelligence+ Pavilion, a variety of large model applications drew visitors to stop and watch. In addition to well-known consumer-facing (C-end) large model applications such as Miaoya Camera and Tongyi Tingwu, there were also demonstrations of large model technology for vertical industry applications such as digital humans and intelligent cockpits.
"Hey, Xiaohang, help me generate a piece of jazz-style music."
"Okay, generating music... The music has been generated, please have a listen."
As "Soul of Jazz" played in the car, Huang Yingning, Senior Product Director of Zebra Zhixing, demonstrated to reporters on the cockpit panel how large model technology is applied in the intelligent cockpit.
Huang Yingning said that traditional small models have limited comprehension and must interpret human behavior from a pre-defined corpus, while human creativity and needs are complex and ever-changing, more than such models can handle. An intelligent cockpit equipped with large model technology can reason from what the user says and has far better comprehension.
In April 2023, former Sogou CEO Wang Xiaochuan officially announced the founding of the AI large model company Baichuan Intelligence. At the Yunqi Conference, Wang Xiaochuan said that domestic large models today are "one step behind in ideals, three steps ahead in deployment": although the United States leads in technological innovation and vision, China has stronger application and deployment capabilities.
In April of this year, Alibaba Cloud released Tongyi Qianwen 1.0, and on October 31 it officially released Tongyi Qianwen 2.0, a hundred-billion-parameter-scale model. Across 10 authoritative benchmarks, the overall performance of Tongyi Qianwen 2.0 exceeds GPT-3.5 and is rapidly closing in on GPT-4.
At the same time, a batch of eight industry models trained on the Tongyi foundation model was launched. Alibaba Cloud said the eight industry models target the most popular vertical scenarios and are trained with domain-specific data. Developers can integrate the models' capabilities into their own large model applications and services through web-page embedding, API/SDK calls, and other methods.
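For illustration, the API-call route mentioned above might look like the minimal sketch below, which assumes Alibaba Cloud's DashScope Python SDK (`dashscope`) and an API key from the cloud console; the model name and response fields are assumptions that may vary by account and SDK version.

```python
# Minimal sketch: calling a Tongyi Qianwen model through the DashScope Python SDK.
# Assumes `pip install dashscope`; the model identifier is an assumption.
import os
import dashscope
from dashscope import Generation

dashscope.api_key = os.environ["DASHSCOPE_API_KEY"]  # key issued in the cloud console

response = Generation.call(
    model="qwen-turbo",  # assumed model identifier
    prompt="Introduce the Yunqi Conference in one sentence.",
)

if response.status_code == 200:
    print(response.output)  # generated text payload
else:
    print("Request failed:", response.code, response.message)
```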
The "last mile" of large model deployment is still a pain point
Large model technology is driving terminals to become more agile and intelligent.
At the computing center, staff instructed a robot to "put the cake on the plate". With the help of large model technology, the robot picked up the cake on the table and placed it on the plate.
Wang Xiaodong, Vice President of Alibaba Cloud Intelligence Group and head of its intelligent Internet of Things business, told Daily Economic News that as the technology develops, robots can both plan and execute, and execution is essential: without it, large models remain trapped in the digital world. A large model is just a brain, he said, and if it does not reach into the physical world, it cannot truly have an impact on this world.
In Wang Xiaodong's view, the closed-loop capabilities of robots, including perception, cognition, planning, memory, and execution, will radiate out to all end users in the future.
Of course, how to better deploy large models and use them to empower industries remains the focus of attention.
Liu Fanping, CEO of Rock Core Digital Intelligence (RockAI), told reporters that, viewed as a whole system, current large model work is focused not only on computing power bottlenecks but also on breakthrough applications and deployment of vertical large models, such as the RockAI model the company has launched. "We have landed many customers in finance, chemicals, and manufacturing, but it is not enough. We still need to 'cultivate' more industries and fields. Only then can we know what users really need and truly put large model technology to better use," Liu Fanping said.
Liu Fanping said that customers hope to use the capabilities of large models to solve problems in their operations, but supply and demand understand the technology differently, so as a technology provider "we need to better understand their needs". Liu Fanping believes this is the "last mile" of large model technology and the biggest difficulty encountered during deployment.
"At the Yunqi Conference you can clearly feel that there is still a lot of demand for large models. Some traditional industries in particular care about what role large models can play in helping enterprises digitize, and whether large models can help them solve certain pain points," an exhibitor told the Daily Economic News reporter.
In Liu Fanping's view, vertical large models are indeed a development trend for bringing large model technology into industry. "In fact we are at this turning point, but not every large model can accompany enterprises through it. Because it looks like an opportunity, everyone is moving in this direction. But to truly do this well, it is not enough to have a large model or to fine-tune someone else's open-source model. In the future there may be more demand for self-developed, customized large models."
However, given the huge cost and the computing bottleneck, self-developed large models still face a certain threshold, and building adapted large models on open-source ones remains the choice of many developers.
The large model customized T-shirt experience platform jointly developed by Semir (Senma) and Alibaba Cloud. Photographed by Daily Economic News reporter Ye Xiaodan

Under computing power bottlenecks, demand for open source and flexible computing power is growing
At the Yunqi Conference, A-share-listed Semir (Senma) Clothing worked with Alibaba Cloud to demonstrate how large model technology can generate images and print them on site, producing a customized T-shirt within two minutes.
On-site staff told the reporter that the customized T-shirts are powered by a trained large model, and that any developer can use open-source software to fine-tune a large model of their own.
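As a rough sketch of what fine-tuning on an open-source base model can involve (the article does not detail the exact toolchain used on site), the snippet below attaches LoRA adapters to an open-source causal language model using the Hugging Face transformers and peft libraries; the base model name and hyperparameters are placeholder assumptions.

```python
# Minimal sketch: parameter-efficient fine-tuning of an open-source LLM with LoRA.
# Assumes `pip install transformers peft`; model name and settings are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base = "Qwen/Qwen-7B"  # assumed open-source base model
tokenizer = AutoTokenizer.from_pretrained(base, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(base, trust_remote_code=True)

# Attach low-rank adapters so only a small fraction of weights is trained,
# which is what keeps this route affordable for smaller teams.
lora_cfg = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=16, lora_dropout=0.05)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()
# A standard transformers Trainer run on domain data would follow from here.
```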
The staff said that enterprises can deploy such a large model on Function Compute, where computing power is billed on demand. For example, generating one image with the large model costs a few cents per image, so costs can also be reduced significantly.
A senior cloud computing practitioner told reporters that Function Compute lowers the barrier to entry for AIGC (generative AI). Developers today fall into two groups, traditional developers and AI developers, and many AI developers do not know how to write code but can fine-tune on top of large models.
Behind Function Compute is the idea of Serverless (a cloud computing model). The practitioner explained that developers no longer need to choose a server model, and the pay-as-you-go approach changes the old pattern, in which developers had to rent a server and choose its configuration, bandwidth, and storage capacity. Development projects built on large models may have shorter lifecycles and need more flexible approaches: if developers feel the direction of their design is wrong and needs adjusting, they can adjust in time and stay flexible.
From one perspective, this flexible model avoids waste on purchased equipment and computing power, lowers the threshold for entrepreneurship, reduces wasted computation, and improves operational efficiency.
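To make the pay-as-you-go pattern concrete, here is a minimal sketch assuming Alibaba Cloud Function Compute's Python runtime and its handler(event, context) entry point; the image-generation helper is a hypothetical placeholder rather than a real SDK call.

```python
# Minimal sketch of a serverless entry point in the style of Function Compute's
# Python runtime. Billing accrues only while the handler runs; the image
# generation below is a stubbed, hypothetical helper for illustration.
import json


def generate_image(prompt: str) -> str:
    """Hypothetical helper that would invoke a deployed image-generation model
    and return a URL to the result; stubbed out here."""
    return f"https://example.com/images/{abs(hash(prompt))}.png"


def handler(event, context):
    # The request body carries the user's prompt; no server is provisioned
    # or paid for while this function is idle.
    payload = json.loads(event)
    image_url = generate_image(payload.get("prompt", ""))
    return json.dumps({"image_url": image_url})
```

Because the function is billed per invocation rather than per rented server, a workload such as occasional T-shirt image generation avoids paying for idle capacity.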
Open source has brought prosperity to the entire ecosystem. According to data shared by Cai Chongxin, 80% of China's technology companies and half of the country's large model companies currently run on Alibaba Cloud.
At the same time, using cloud GPUs and shared computing power is also one way China's AI industry is addressing the computing power gap under the chip export ban.
According to an earlier report by the Economic Observer, cloud computing technology expert Liu Shimin has noticed that it is increasingly difficult to buy high-end AI chips through official channels in China, constraining the development of computing power. He has also seen that some cloud vendors can provide GPU computing power, the most basic AI service, through related AI products.
Industry insiders have also suggested that sharing or leasing cloud computing power is often suitable for companies with low-frequency training needs. Based on the same or similar computing chips, cloud GPUs provided by cloud vendors can indeed serve as a substitute.
Wang Xiaodong, Vice President of Alibaba Cloud Intelligence Group and head of its intelligent Internet of Things business, told the Daily Economic News reporter that there are now many mobile devices with low computing power and limited storage. "So we are also exploring how to use application engines to let AI applications run on devices with lower computing power. This needs to be combined with the cloud and integrated end-to-cloud."
Under this trend, Alibaba Cloud is also more firmly moving towards openness and open source.
At last year's Yunqi Conference, Alibaba Cloud released the AI open-source community ModelScope. Over the past year, ModelScope has gathered 2.8 million developers and more than 2,300 high-quality models, with model downloads exceeding 100 million, making it the largest and most active AI developer community in China.
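As a simple illustration of how developers typically pull models from such a community, the following minimal sketch assumes the ModelScope Python library (`modelscope`); the model ID is a placeholder assumption.

```python
# Minimal sketch: downloading a model from the ModelScope community for local use.
# Assumes `pip install modelscope`; the model ID below is a placeholder.
from modelscope.hub.snapshot_download import snapshot_download

model_dir = snapshot_download("qwen/Qwen-7B-Chat")  # assumed community model ID
print("Model files downloaded to:", model_dir)
```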
Cai Chongxin said Alibaba "wants to build the most open cloud of the AI era", hoping to use this "cloud" to make developing and using AI easier and cheaper, and to help all industries, especially small and medium-sized enterprises, turn AI into enormous productivity.
The slogan at the entrance of the Computing Power Pavilion at the Yunqi Conference. Photographed by Daily Economic News reporter Ye Xiaodan

Alibaba Cloud's advance and retreat: return, connection, and ecosystem
A new wave is coming.
Wang Jian, an academician of the Chinese Academy of Engineering and founder of Alibaba Cloud, said in his conference speech that the combination of AI and cloud computing will bring the third wave of cloud computing. It will not be completed in one or two years, but it may give us a decade or even decades to create the innovations that should be invented in the cloud computing era.
Wang Jian believes that even a leading company like Nvidia cannot get around one thing about generative AI: "all of this computing will be delivered in the cloud". He believes this is also the direction of the industry's development: today's electricity may not be the same as Edison's electricity, but electricity, as a public service and infrastructure, has very long-lasting vitality.
The value of cloud computing has been placed in a higher dimension. At this Yunqi Conference, Alibaba Cloud returned, after eight years, to the theme "Computing, for value that cannot be calculated".
Large models are the core technology of this wave of AI, and the quality of foundation models largely determines the prospects for AI industrialization. Training a large model is a systematic engineering effort spanning complex technologies such as the computing power base, networking, storage, big data, AI frameworks, and AI models; only a powerful cloud computing system can train high-quality large models.
The slogan "Data Center is a Computer" was placed in the most prominent position in the exhibition hall of the computing power hall of this Yunqi Conference. This is the slogan proposed by Alibaba Cloud in 2009. Alibaba Cloud CTO Zhou Jingren stated that today, the AI era requires such a technical system even more. As a supercomputer, cloud computing can efficiently connect heterogeneous computing resources, break through the bottleneck of a single performance chip, and collaborate to complete large-scale intelligent computing tasks.
Alibaba Cloud CTO Zhou Jingren delivers a speech at the Yunqi Conference. Photographed by Daily Economic News reporter Ye Xiaodan
It is not difficult to see that, behind large model technology, the value of connectivity is becoming prominent.
"Today we have seen some terminal applications of AIGC technology playing an effective role in specific scenarios. Through end-cloud integration you can see the shadow of intelligence and AI behind them, and terminals are becoming more and more intelligent and agile. But in the end, the most fundamental things are connectivity and security," a cloud service provider executive remarked to the Daily Economic News reporter.
At a more fundamental technological level, Alibaba Cloud's hardware development has not stopped either.
On November 1, Alibaba's chip unit T-Head (Pingtouge) released its first SSD (solid-state drive) controller chip, Zhenyue 510, deeply customized for cloud computing scenarios and achieving ultra-low latency of 4 microseconds. Zhenyue 510 will first be deployed in Alibaba Cloud data centers and can be applied in business scenarios such as AI, online transactions, big data analytics, high-performance databases, and software-defined storage.
Amid the AI wave, Alibaba Cloud continues to iterate its full-stack, software-and-hardware-integrated technology capabilities. As China's largest public cloud service provider, Alibaba Cloud has also been advancing and retreating in its market positioning as circumstances require.
In April of this year, Alibaba Cloud announced price cuts of 15% to 50% on core products, with storage products reduced by as much as 50%. Industry insiders see the combination of exploding new technology and steep price cuts as a way to expand market share through lower prices while also growing and enriching the entire AI ecosystem amid the AI wave.
The latest IDC data show that China's overall public cloud services market (IaaS/PaaS/SaaS) reached US$19.01 billion in the first half of 2023. Explosive growth in AI demand is driving public cloud development: in the first half of 2023, AI-related demand grew rapidly, attention to AIGC and large AI models from society and enterprises rose markedly, and the computing power market and PaaS-layer products grew notably on the back of this heat. Alibaba Cloud ranks among the leaders in both the IaaS and PaaS markets.
The Yunqi Conference venue is crowded with visitors. Photographed by Daily Economic News reporter Ye Xiaodan