Apple's next accessibility features let you control your iPhone and iPad with just your eyes

Eye Tracking being demoed on an iPad
(Image credit: Apple)

Ahead of Global Accessibility Awareness Day on May 16, 2024, Apple unveiled a number of new accessibility features for the iPhone, iPad, Mac, and Vision Pro. Eye Tracking leads a long list of new functionality and will let you control your iPhone and iPad just by moving your eyes.

Eye Tracking, Music Haptics, Vocal Shortcuts, and Vehicle Motion Cues will arrive on eligible Apple devices later this year. These new accessibility features will most likely be released with iOS 18, iPadOS 18, visionOS 2, and the next version of macOS.

These new accessibility features have become a yearly drop for Apple. The curtain is normally lifted a few weeks before WWDC, aka the Worldwide Developers Conference, which kicks off on June 10, 2024. That’s the event where we expect Apple to show off the next generation of its major operating systems and its AI chops.

Eye Tracking looks seriously impressive

Eye Tracking demoed on an iPad.
(Image credit: Apple)

Eye Tracking looks seriously impressive and is a key way to make the iPhone and iPad even more accessible. As noted in Apple’s release and captured in a video, you can navigate iPadOS – as well as iOS – open apps, and even control on-screen elements with just your eyes. The feature uses the front-facing camera, artificial intelligence, and on-device machine learning throughout the experience.

You can look around the interface and use “Dwell Control” to engage with a button or element. Gestures will also be handled through eye movement alone. This means you can look at Safari, Phone, or another app, hold your gaze on it, and it will open.

Most critically, all setup and usage data is kept on the device, so you’ll be set with just your iPhone – you won’t need an accessory to use Eye Tracking. It’s designed for people with physical disabilities and builds upon other accessible ways to control an iPhone or iPad.
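Apple hasn’t shared how Eye Tracking works under the hood, but the dwell mechanic it describes is straightforward to picture in code. The Swift sketch below is purely illustrative – the GazeDwellDetector type and its names are invented for this example, not part of any Apple API – and simply treats a gaze held inside a target’s frame for a set duration as a tap.

```swift
import Foundation
import CoreGraphics

// Hypothetical illustration of a dwell mechanic; not Apple's implementation.
// Feed it one estimated gaze point per frame; when the gaze stays inside the
// target's frame for the full dwell duration, it reports an activation.
final class GazeDwellDetector {
    private let dwellDuration: TimeInterval
    private var dwellStart: Date?

    init(dwellDuration: TimeInterval = 1.0) {
        self.dwellDuration = dwellDuration
    }

    /// Returns true when a dwell completes on the target (treat it like a tap).
    func update(gazePoint: CGPoint, targetFrame: CGRect, now: Date = Date()) -> Bool {
        guard targetFrame.contains(gazePoint) else {
            dwellStart = nil              // gaze left the target, reset the timer
            return false
        }
        if let start = dwellStart {
            if now.timeIntervalSince(start) >= dwellDuration {
                dwellStart = nil          // fire once, then wait for the next dwell
                return true
            }
        } else {
            dwellStart = now              // gaze just entered the target
        }
        return false
    }
}
```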

Vocal Shortcuts, Music Haptics, and Live Captions on Vision Pro

Apple's new Vocal Shortcuts for iPhone and iPad.
(Image credit: Apple)

Another new accessibility feature is Vocal Shortcuts, designed for iPad and iPhone users with ALS (amyotrophic lateral sclerosis), cerebral palsy, stroke, or “acquired or progressive conditions that affect speech.” It will let you set up a custom sound that Siri can learn and identify to launch a specific shortcut or complete a task. It lives alongside Listen for Atypical Speech, designed for the same users, which opens up speech recognition to a wider range of speech patterns.

These two features build upon speech accessibility features introduced with iOS 17, so it’s great to see Apple continue to innovate. With Listen for Atypical Speech specifically, Apple is using artificial intelligence to learn and recognize different types of speech.
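Apple hasn’t published a Vocal Shortcuts API, but the existing Speech framework already supports on-device recognition, which gives a rough sense of the general mechanism. The Swift sketch below is an assumption-laden illustration rather than Apple’s implementation: the TriggerPhraseListener class is invented for this example, and microphone and speech-recognition permission prompts are omitted for brevity.

```swift
import Speech
import AVFoundation

// Illustrative only: listen for a trigger phrase with on-device speech
// recognition (via the existing Speech framework) and run a supplied action.
// This is not the Vocal Shortcuts API, which Apple has not made public.
final class TriggerPhraseListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start(trigger: String, action: @escaping () -> Void) throws {
        request.requiresOnDeviceRecognition = true   // keep audio on the device

        // Feed microphone audio into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
            self?.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Fire the action whenever the transcript contains the trigger phrase.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString else { return }
            if text.localizedCaseInsensitiveContains(trigger) {
                action()   // e.g. kick off a Shortcuts automation
            }
        }
    }
}
```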

Music Haptics on the iPhone is designed to let users who are deaf or hard of hearing experience music. The built-in Taptic Engine, which powers the iPhone’s haptics, will play different vibrations, like taps and textures, that match a song’s audio. At launch, it will work across “millions of songs” within Apple Music, and there will be an open API for developers to make music from other sources accessible, too.
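That Music Haptics API isn’t out yet, but the Taptic Engine is already scriptable through Apple’s existing Core Haptics framework, which gives a feel for what “taps and textures” means in practice. This is a minimal sketch using Core Haptics only – a stand-in for, not the actual, Music Haptics API.

```swift
import CoreHaptics

// Minimal Core Haptics sketch: a sharp transient "tap" followed by a softer
// sustained "texture", the kinds of vibrations Music Haptics is described as
// playing alongside a song. Not the Music Haptics API itself.
func playBeatPreview() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // A sharp transient tap, like a drum hit.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
        ],
        relativeTime: 0
    )

    // A softer continuous buzz, like a sustained bass note.
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2),
        ],
        relativeTime: 0.2,
        duration: 0.6
    )

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```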

Additionally, Apple previewed a few other features and updates. Vehicle Motion Cues will be available on iPhone and iPad and aims to reduce motion sickness with animated dots on the screen that shift as vehicle motion is detected. It’s designed to help reduce motion sickness without blocking whatever you’re viewing on the screen.
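Apple hasn’t said how Vehicle Motion Cues detects movement, but the existing Core Motion framework shows the kind of signal such an overlay could react to. The sketch below is an assumption-heavy illustration only: it streams device acceleration and hands it to a hypothetical updateDots callback standing in for the animated dots.

```swift
import CoreMotion

// Illustrative only: read device acceleration with Core Motion, the sort of
// signal an on-screen dot overlay could respond to. Apple has not documented
// how Vehicle Motion Cues actually works; updateDots is a made-up callback.
final class MotionCueMonitor {
    private let manager = CMMotionManager()

    func start(updateDots: @escaping (_ lateral: Double, _ forward: Double) -> Void) {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 30.0   // 30 samples per second
        manager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let accel = motion?.userAcceleration else { return }
            // Map side-to-side and forward/back acceleration to dot movement.
            updateDots(accel.x, accel.z)
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}
```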

A look at Live Captions in visionOS on Apple Vision Pro
(Image credit: Apple)

One major addition arriving for visionOS – aka the software that powers Apple Vision Pro – will be Live Captions across the entire system. This will let you see captions for spoken dialogue in FaceTime conversations and audio from apps right in front of you. Apple’s release notes that it was designed for users who are deaf or hard of hearing, but like all accessibility features, it can be found in Settings.

Since this is Live Captions on an Apple Vision Pro, you can move the window containing the captions around and adjust its size like any other window. Vision accessibility within visionOS will also gain reduced transparency, smart inverting, and dim flashing lights functionality.

Regarding when these will ship, Apple notes in the release that the “new accessibility features [are] coming later this year.” We’ll keep a close eye on this and imagine they will ship with the next generation of operating systems, like iOS 18 and iPadOS 18, meaning folks with a developer account may be able to test these features in forthcoming beta releases.

Considering that a few of these features are powered by on-device machine learning and artificial intelligence, accessibility is clearly one area where Apple believes AI has the potential to make an impact. We’ll likely hear the technology giant share more of its thoughts on AI and consumer-ready features at WWDC 2024.

