Apple unveils Eye Tracking, Music Haptics, and more accessibility features for iPhone and iPad

Apple recently unveiled a host of new features designed to make its products more accessible and user-friendly. From Eye Tracking to Music Haptics, these updates cater to a diverse range of user needs.

Let’s take a closer look at some of the exciting new features coming soon to iPhones, iPads, and other Apple devices.


“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Eye Tracking

The most prominent new feature is Eye Tracking, which allows users to navigate their iPhones and iPads using only their eye movements. Powered by artificial intelligence and leveraging the front-facing camera, Eye Tracking enables users to perform gestures like tapping and swiping, all with their eyes. With the addition of Dwell Control, users can easily select items on the screen with a simple glance.

Music Haptics

For users who are deaf or hard of hearing, Apple introduces Music Haptics, a feature that uses the Taptic Engine to translate music into a series of taps, textures, and vibrations. This innovative approach allows users to experience the rhythm and emotion of their favorite songs in a whole new way. Music Haptics will be available across millions of tracks in the Apple Music catalog and will also be accessible to developers through a new API.


Vocal Shortcuts and Listen for Atypical Speech

Vocal Shortcuts empower users to assign custom utterances that Siri can understand, allowing them to launch shortcuts and complete complex tasks with a simple spoken phrase.

Listen for Atypical Speech uses on-device machine learning to recognize user speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.


Vehicle Motion Cues

To address motion sickness while using an iPhone or iPad in a moving vehicle, Apple introduces Vehicle Motion Cues. This feature displays animated dots on the screen’s edges, representing changes in vehicle motion and helping to reduce sensory conflict without interfering with the main content. Using sensors built into the device, Vehicle Motion Cues automatically recognizes when the user is in a moving vehicle and adjusts accordingly.

CarPlay accessibility updates

Apple’s in-car software, CarPlay, is also receiving accessibility updates, including Voice Control, Color Filters, and Sound Recognition. Voice Control allows users to navigate CarPlay and interact with apps using only their voice, Color Filters make the CarPlay interface visually easier to use for users who are colorblind, and Sound Recognition notifies users who are deaf or hard of hearing of important sounds like car horns and sirens.


Live Captions for visionOS 

visionOS 2 introduces Live Captions, enabling users who are deaf or hard of hearing to follow along with spoken dialogue in live conversations and in audio from apps. This update extends Apple’s accessibility ecosystem to Apple Vision Pro.


Additional accessibility enhancements

In addition to the aforementioned features, Apple is introducing a range of other accessibility enhancements, including updates to VoiceOver, Magnifier, Braille Screen Input, Hover Typing, Personal Voice, Live Speech, and Virtual Trackpad. These updates are designed to improve usability for users with various disabilities, ensuring that Apple products are accessible to all.

With these new accessibility features, Apple continues to lead the way in creating technology that is inclusive and user-friendly.

“Apple Vision Pro is without a doubt the most accessible technology I’ve ever used,” said Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant, and cofounder of Equal Accessibility LLC. “As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it’s been incredible to see that visionOS just works. It’s a testament to the power and importance of accessible and inclusive design.”

WWDC 2024

Stay tuned for more details on these features at Apple’s upcoming WWDC event on June 10, with the software updates set to be released later this year.

About the Author

Asma is an editor at codecraftedweb with a strong focus on social media, Apple news, streaming services, guides, mobile gaming, app reviews, and more. When not blogging, Asma loves to play with her cat, draw, and binge on Netflix shows.