Gamingdeputy reported on May 15th, citing Apple’s official website, that Apple today announced new accessibility features coming later this year, including Eye Tracking, which helps users with disabilities control an iPad or iPhone with their eyes.
In addition, Music Haptics will give deaf and hard-of-hearing users a new way to experience music through the Taptic Engine in iPhone; Vocal Shortcuts will let users perform tasks with custom sounds; Vehicle Motion Cues can help reduce motion sickness when using iPhone or iPad in a moving vehicle; and visionOS will gain more accessibility features.
The new features are expected to arrive in updates such as iOS 18, iPadOS 18, and macOS 15, though Apple has not explicitly confirmed this.
Gamingdeputy attaches Apple’s official introduction below:
Eye tracking comes to iPad and iPhone
Eye Tracking, powered by artificial intelligence, gives users a built-in option to navigate iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking takes only seconds to set up and calibrate via the front-facing camera, and through on-device machine learning, all data used to set up and control the feature is stored securely on the device and is not shared with Apple. Eye Tracking works across iPadOS and iOS apps and requires no additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures with just their eyes.
Music Haptics allows more people to experience music
Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. When this accessibility feature is turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations along with the music. Music Haptics works across millions of songs in Apple Music and will be available as an API for developers, allowing more users to experience music in their apps.
New speech features
iPhone and iPad users can assign custom utterances to Siri through Vocal Shortcuts to launch shortcuts and complete complex tasks. Another new feature, Listen for Atypical Speech, offers an option to expand the range of speech recognition; it uses on-device machine learning to recognize a user’s speech patterns. These features are designed for users whose speech is affected by conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, and they build on the features introduced in iOS 17 to offer a new level of personalization and control for users who cannot speak or are at risk of losing the ability to speak.
Vehicle Motion Cues help reduce motion sickness
Vehicle Motion Cues is a new feature for iPhone and iPad that helps passengers in moving vehicles reduce motion sickness. Research shows that motion sickness is often caused by a sensory conflict between what people see and what they feel, which may prevent some users from comfortably using their iPhone or iPad while riding in a moving vehicle. Vehicle Motion Cues displays animated dots at the edges of the screen that represent changes in vehicle motion, helping to reduce this sensory conflict without interfering with the main content. Using the sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to appear automatically on iPhone, or it can be turned on and off in Control Center.
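Apple has not described how the feature is implemented, but the idea of mapping sensed vehicle motion to on-screen cues can be sketched roughly as follows. All function names, thresholds, and gains here are hypothetical illustrations, not Apple’s actual algorithm or API.

```python
# Hypothetical sketch: detect sustained motion from accelerometer samples
# and map lateral acceleration to an edge-dot offset, the way a Vehicle
# Motion Cues-style overlay might. Names and numbers are illustrative only.

def is_in_vehicle(accel_samples, threshold=0.15, min_hits=5):
    """Guess whether sustained acceleration suggests a moving vehicle:
    count samples whose magnitude exceeds a small threshold."""
    hits = sum(1 for a in accel_samples if abs(a) > threshold)
    return hits >= min_hits

def dot_offsets(lateral_accel, max_offset=20.0, gain=40.0):
    """Translate lateral acceleration (m/s^2) into a horizontal pixel
    offset for the animated dots, clamped to a screen-edge margin."""
    return max(-max_offset, min(max_offset, lateral_accel * gain))

samples = [0.02, 0.3, -0.25, 0.4, 0.31, -0.2, 0.05, 0.28]
print(is_in_vehicle(samples))  # True: enough samples exceed the threshold
print(dot_offsets(0.3))        # 12.0: dots shift right proportionally
print(dot_offsets(2.0))        # 20.0: strong turns are clamped at the margin
```

Moving the dots in the direction the vehicle accelerates gives the visual system a signal consistent with what the inner ear feels, which is the sensory-conflict reduction the paragraph above describes.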
CarPlay voice control and more accessibility updates
CarPlay’s upcoming accessibility features include Voice Control, Color Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps using just their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. Color Filters make the CarPlay interface easier to use for colorblind users, and additional visual aids include Bold Text and Large Text.
Upcoming accessibility features in visionOS
Accessibility features coming to visionOS this year include systemwide Live Captions, helping all users, including those who are deaf or hard of hearing, follow spoken dialogue in live conversations and in audio from apps in real time. With Live Captions for FaceTime in visionOS, more users can easily connect and collaborate using their Personas. Apple Vision Pro will add the ability to move captions using the window bar during Apple Immersive Video, and will also support additional Made for iPhone (MFi) hearing devices and cochlear hearing processors. Updates to visual accessibility features will include Reduce Transparency, Smart Invert, and Dim Flashing Lights for users with low vision or those who want to avoid bright lights and frequent flashing.
visionOS will offer Live Captions, allowing users who are deaf or hard of hearing to follow live conversations and in-app audio. These features join the dozens of accessibility features already available in Apple Vision Pro, which provides a flexible input system and an intuitive interface designed for a broad range of users. Features such as VoiceOver, Zoom, and Color Filters give users who are blind or have low vision access to spatial computing, while features such as Guided Access support users with cognitive disabilities. Users can control Vision Pro with any combination of eyes, hands, and voice, and accessibility features including Switch Control, Sound Actions, and Dwell Control can also help users with physical disabilities.
Other updates
For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.
Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.
Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Braille Screen Input will add support for Japanese; and users will be able to enter multiple lines of braille with a braille keyboard and choose from different input and output methods.
For users with low vision, Hover Typing will show larger text when typing in a text field, displayed in a user’s preferred font and color.
For users at risk of losing the ability to speak, Personal Voice will be available in Mandarin. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.
For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.
For users with physical disabilities, Virtual Trackpad for AssistiveTouch will let users control their device using a small region of the screen as a resizable trackpad.
Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
Voice Control will offer support for custom vocabularies and complex words.