Google opens up lightweight large model Gemma: is the era of general-purpose AI arriving?
hughmini
Posted on 2024-2-22 11:24:19
On February 21, Google released Gemma, a new artificial intelligence "open model," meaning external developers can build it into their own products. Google thus becomes another major technology company, after Meta, to pursue open large models and accelerate the arrival of general-purpose AI.
Google stated that Gemma is a family of "lightweight" state-of-the-art open models built with the same research and technology used to create the Gemini models. Developers can use the Gemma open-model series to build artificial intelligence software for free. The company said it is publicly releasing key technical data, including the model weights.
Google CEO Sundar Pichai said, "Gemma demonstrates strong performance and will be available globally starting today, able to run on a laptop or on Google Cloud."
Market analysts suggest that by opening up its large models, Google may attract software engineers to build on its technology stack and encourage use of its newly profitable cloud division. Google said the models have also been optimized for Google Cloud.
However, Gemma is not fully open source, meaning the company can still set terms of use and retain ownership of the model.
Compared with the previously released Gemini models, Gemma is reported to have fewer parameters, with 2-billion- and 7-billion-parameter versions available. Google has not disclosed the parameter count of its largest Gemini model.
Google stated, "Gemini is the largest and most capable AI model in wide use today. The Gemma models share technology and infrastructure components with Gemini and can run directly on a developer's laptop or desktop."
The company also emphasized that Gemma surpasses significantly larger models on key benchmarks while adhering to rigorous standards for safe and responsible outputs.
Previously, Meta's open-source Llama 2 model topped out at 70 billion parameters. By comparison, OpenAI's GPT-3 has 175 billion parameters.
In a technical report, Google compared its 7-billion-parameter Gemma model against several others, including the 7-billion- and 13-billion-parameter versions of Llama 2 and the 7-billion-parameter Mistral model, across multiple dimensions. Gemma outperformed these competitors on benchmarks covering question answering, reasoning, mathematics/science, and code.
At Gemma's launch, Nvidia said it had partnered with Google to ensure the model runs smoothly on its chips, and that it will soon release chatbot software designed to work with Gemma.
Opening up AI models with smaller parameter counts is also a business strategy for Google. Previously, iFlytek likewise chose to open-source its smaller models.
Liu Qingfeng, Chairman of iFlytek, explained to a reporter from First Financial News, "For general-purpose large models, the key is who delivers the best performance, while open-source large models exist to build an ecosystem. So from a technical standpoint, open-source large models are generally somewhat weaker than the flagship general-purpose ones."
"We have also observed that many companies may hold back their largest model, hoping to maintain a moat for commercialization," a researcher working on large AI models told a reporter from First Financial News.
Views on open-source large models currently differ. Some experts worry that open AI models may be abused, while others support the open-source approach, believing it promotes technological development and broadens the range of beneficiaries.
CandyLake.com is an information publishing platform and provides only information storage services.
Disclaimer: The views in this article represent the author alone, do not represent the position of CandyLake.com, and do not constitute advice; please treat them with caution.