On Wednesday morning local time, ahead of Global Accessibility Awareness Day (May 16), Apple announced a series of new assistive features that will arrive later this year with the release of iOS 18.
According to the announcement, the batch includes Eye Tracking, which lets users control iPhone and iPad with their eyes alone; Music Haptics, which uses the Taptic Engine to let users feel music; Vehicle Motion Cues, which help reduce motion sickness; Vocal Shortcuts; and Mandarin support for the Personal Voice feature.
Apple says that once the new features launch, users will be able to operate their iPhone and iPad with nothing but their eyes.
Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, it takes only seconds to set up and calibrate via the front-facing camera, and thanks to on-device machine learning, all data used to set up and control the feature is kept securely on the device and is not shared with Apple. Eye Tracking works across iPadOS and iOS apps and requires no additional hardware or accessories. With it, users can navigate through the elements of an app and use dwell control to activate each one, reaching additional functions such as physical buttons, swipes, and other gestures solely with their eyes.
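Apple has not said how Eye Tracking is implemented, but the dwell behavior described above can be pictured with a small conceptual sketch in Swift. Everything here is hypothetical (the DwellActivator type, the one-second threshold); the point is only the mechanism: an element activates once the gaze point has rested inside its frame for long enough.

```swift
import Foundation
import CoreGraphics

// Conceptual sketch only: Apple's Eye Tracking internals are not public.
// An element "activates" once the gaze point stays inside its frame for
// a fixed hold duration; leaving the element resets the timer.
final class DwellActivator {
    private let dwellDuration: TimeInterval = 1.0  // hypothetical threshold
    private var dwellStart: Date?
    private var currentTarget: CGRect?

    /// Feed gaze samples as they arrive; returns true when a dwell completes.
    func process(gaze: CGPoint, over target: CGRect, at time: Date = .init()) -> Bool {
        guard target.contains(gaze) else {
            dwellStart = nil          // gaze left the element: reset the timer
            currentTarget = nil
            return false
        }
        if currentTarget != target {
            currentTarget = target    // new element: start a fresh dwell
            dwellStart = time
            return false
        }
        guard let start = dwellStart else { return false }
        if time.timeIntervalSince(start) >= dwellDuration {
            dwellStart = nil          // fire once per dwell, then re-arm
            return true
        }
        return false
    }
}
```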
Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations in time with the audio as music plays. Music Haptics works across millions of songs in the Apple Music catalog and will also be offered as an API so developers can bring the experience to music in their own apps.
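The Music Haptics API itself is not yet documented, so as a rough illustration, the sketch below uses the existing Core Haptics framework (CHHapticEngine) to show how an app can already map beat timings to taps on the Taptic Engine. The beat times are a hypothetical input an app might derive from its own audio analysis, not something Apple's API is known to expose.

```swift
import CoreHaptics

// Sketch with Core Haptics, not the forthcoming Music Haptics API:
// plays one transient "tap" per beat time on supported hardware.
final class BeatHaptics {
    private var engine: CHHapticEngine?

    init() throws {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// `beatTimes` are offsets in seconds from "now" (hypothetical input).
    func play(beatTimes: [TimeInterval]) throws {
        let events = beatTimes.map { t in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5),
                ],
                relativeTime: t
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```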
iPhone and iPad users will be able to use Vocal Shortcuts to assign custom utterances that Siri understands, launching shortcuts and completing complex tasks. Another new feature, Listen for Atypical Speech, offers an option to expand the range of speech recognition: it uses on-device machine learning to learn the user's speech patterns. These features are designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke. Building on features introduced in iOS 17, they offer a new level of personalization and control for users who cannot speak or are at risk of losing their ability to speak.
[Screenshots on iPhone 15 Pro: a "Set Up Vocal Shortcut" screen prompts the user to choose an action and record a phrase; a "Say 'Circle' one last time" screen has the user repeat the phrase three times so the iPhone learns to recognize it; finally, a Vocal Shortcuts notification reads "Open Activity Rings".]
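Apple has not published an API for Listen for Atypical Speech, but the on-device recognition it relies on can be sketched with the existing Speech framework, whose requiresOnDeviceRecognition flag keeps audio analysis on the device, as the article describes. The function below is illustrative only.

```swift
import Speech

// Sketch of on-device speech recognition with the Speech framework.
// This is not the Listen for Atypical Speech API, which is unpublished.
func recognizeOnDevice(from url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(),   // current locale
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        request.requiresOnDeviceRecognition = true      // audio never leaves the device

        recognizer.recognitionTask(with: request) { result, error in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```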
Another practical addition is systemwide Live Captions coming to Apple Vision Pro, which transcribe live conversations and audio from apps into captions in real time. Vision Pro will also let users move captions with the window bar while watching immersive video.
Apple also disclosed that updates for vision accessibility will add features such as Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision or want to avoid bright light and frequent flashing.
Apple will also introduce Vehicle Motion Cues, a new feature for iPhone and iPad aimed at reducing motion sickness for passengers in moving vehicles.
According to Apple, motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, so animated dots at the edges of the screen that represent changes in vehicle motion can reduce that conflict without interfering with the main content. The feature recognizes when a user is in a moving vehicle and responds automatically, and it can also be turned on and off in Control Center.
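Apple has not said how the feature detects driving, but iOS already exposes a motion classifier through Core Motion; a sketch like the one below (the VehicleDetector type is hypothetical) could gate a comparable feature using CMMotionActivityManager's automotive classification.

```swift
import CoreMotion

// Hypothetical sketch: uses the public Core Motion activity classifier to
// report whether the device appears to be in a moving vehicle. This is not
// how Apple says Vehicle Motion Cues works, only an analogous public API.
final class VehicleDetector {
    private let manager = CMMotionActivityManager()

    func start(onChange: @escaping (Bool) -> Void) {
        guard CMMotionActivityManager.isActivityAvailable() else { return }
        manager.startActivityUpdates(to: .main) { activity in
            guard let activity else { return }
            // `automotive` is true while motion is classified as driving.
            onChange(activity.automotive && activity.confidence != .low)
        }
    }

    func stop() { manager.stopActivityUpdates() }
}
```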
More features:
For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.
Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.
Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Braille Screen Input adds Japanese support; and multi-line braille input via a braille keyboard is supported, along with the option to choose different input and output tables.
For users with low vision, Hover Typing will show larger text when typing in a text field, displayed in the user's preferred font and color.
For users at risk of losing their ability to speak, Personal Voice will become available in Mandarin. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.
For users with speech disabilities, Live Speech will add categories and simultaneous compatibility with Live Captions.
For users with physical disabilities, Virtual Trackpad for AssistiveTouch lets users control their device by using a small region of the screen as a resizable trackpad.
Switch Control will add the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
Voice Control will offer support for custom vocabularies and complex words.
[Screenshot: the new Reader Mode in Magnifier, shown on iPhone 15 Pro.]
At the close on May 15 local time, Apple (AAPL) stood at $189.72, up 1.22%, for a market capitalization of about $2.9 trillion.