
Tech giant Apple has released a preview of its AI system, Apple Intelligence, to developers, and revealed in a technical paper that it used chips designed by Google to train the underlying models.
On July 29 local time, Apple released developer betas of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 that add a subset of Apple Intelligence features, letting users enrolled in the developer program try some of them early.
Apple Intelligence made its debut at the Apple Worldwide Developers Conference (WWDC 24) in June this year. According to Apple, it will be woven throughout the system, providing AI assistance across four areas: language, images, actions, and personal context, including surfacing important messages, writing assistance, and image generation. The Apple Intelligence beta will launch this fall as a built-in feature of iOS 18 and other systems, and will initially support only English (US).
Recently, however, there have been reports that iOS 18 and iPadOS 18, scheduled for release in September, will not include Apple Intelligence at launch, meaning the first batch of iPhone 16 models released in September will ship without it. Apple is expected to bring the AI features to users in an iOS 18 update in October.
The developer beta released this time includes only some Apple Intelligence features and can be used only on the iPhone 15 Pro, iPhone 15 Pro Max, and iPads and Macs with Apple's own chips. It is currently available only to paid developer accounts and restricted to the United States. In the beta, users can try the enhanced Siri, writing tools, photo search, and the new AI transcription, rewriting, and proofreading functions added to apps such as Phone, Messages, Mail, and notifications.
Among these features, the most eye-catching is Siri's brand-new look. When Siri is activated, a soft glow wraps around the edge of the display. Siri can switch between text and voice when communicating with users and can answer thousands of questions about how to use the iPhone, iPad, and Mac. Even if the speaker stumbles or changes the question midway, Siri can keep the context and better understand the instruction.
With the writing tools, users can rewrite, proofread, and edit text almost anywhere a text field is open, including emails, memos, Pages documents, and many third-party apps. The rewrite function can change the tone of the text without altering its content, offering three options: friendly, professional, and concise. Many netizens joked that the English writing assistant Grammarly may really be "finished" this time.
Writing tools can help users proofread text and modify tone. Source: X platform
In addition, Notes and other apps gain audio transcription, while the Photos app gains natural-language search. By typing a description, users can also have relevant photos and songs automatically assembled into a short video. Mail adds a feature that surfaces time-sensitive messages first.
At present, Siri's new look and the writing tools are the two features users are most excited about. Many users have shared videos on social media of themselves using the new Siri, or of the writing tools turning rude text into a friendly style. Well-known Apple leaker Max Weinbach, after testing on a Mac, noted that Apple Intelligence performs especially well at rewriting long text and organizing long audio, running quickly with relatively low power consumption.
Tech blogger Nick Ackerman also posted a video of himself trying Apple Intelligence on an iPhone, concluding that it already integrates well into the whole ecosystem. Although many phone makers are putting AI on their devices, he believes iOS 18, once it ships with the full set of AI features, has the potential to be a "game changer" for Apple's operating system.
Many AI features Apple announced in June are still absent from iOS 18.1, including Genmoji, which generates emoji from text descriptions; Image Playground, which automatically generates small images for selected scenes; and Image Wand, which generates images in Notes. ChatGPT has also not yet been integrated into Apple's operating systems.
In a report on the 29th, Wamsi Mohan, an analyst at Bank of America, wrote: "We expect that as the iPhone's combination of AI features (including software and potentially hardware) improves through 2025, the current wave of enthusiasm for the iPhone will continue for a longer period of time."
In addition, Apple released a technical paper on the 29th titled "Apple Intelligence Foundation Language Models," detailing how AFM, the foundation model behind Apple Intelligence, was trained on clusters of Tensor Processing Units (TPUs) developed by Google. The paper makes no mention of Nvidia's GPU chips.
Google sells access to TPUs through its Google Cloud platform. Reportedly, to build the model that can run on the iPhone and other devices (the on-device AFM), Apple used 2,048 of Google's latest TPU v5p chips, launched last December; the server-side AFM was trained on 8,192 TPU v4 processors.
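Apple's paper does not include training code, but for readers unfamiliar with TPU-based training, the sketch below shows in very rough terms how a framework such as JAX enumerates the accelerator cores in a Cloud TPU slice and runs one data-parallel step across them. This is a minimal illustration assuming JAX on a Cloud TPU VM; the array shapes and the toy forward function are hypothetical, and it is in no way Apple's actual training setup.

```python
import jax
import jax.numpy as jnp

# On a Cloud TPU VM, jax.devices() lists the TPU cores in the current slice
# (on a plain CPU machine it simply returns the CPU device, so this still runs).
devices = jax.devices()
print(f"{len(devices)} accelerator core(s) visible, platform: {devices[0].platform}")

# Build a toy batch whose leading axis equals the number of devices,
# so each core receives one shard of the data.
per_core_batch = 8
x = jnp.ones((len(devices), per_core_batch, 128))

@jax.pmap
def forward(batch):
    # Stand-in for a model forward pass: one dense layer with fixed weights.
    w = jnp.ones((128, 64))
    return jnp.mean(batch @ w)

# pmap runs the function once per core and returns one result per core.
print(forward(x))
```

Real large-model training adds optimizer state, gradient synchronization, and model sharding on top of this pattern, but the basic idea of splitting each batch across thousands of TPU cores is the same.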
On the 29th, Apple (Nasdaq: AAPL) closed at $218.24 per share, up 0.13%, for a total market value of about $3.35 trillion.