We have great expectations of iOS 18, Apple's upcoming operating system update for the iPhone due later this year. iOS has been slow to evolve as an OS meant for the modern smartphone, and with the arrival of generative AI experiences, the update could overhaul the basic iPhone experience. Apple, however, isn't waiting for the big event: it has already revealed a set of new accessibility features a month ahead of the Worldwide Developers Conference (WWDC) 2024.
Although pundits and fans are labelling these as features reserved for the upcoming iOS 18 update, it is safe to say that some of them could reach your iPhone before the stable iOS 18 release, which is expected to arrive in late September.
Regardless of which iOS version gets them, the new accessibility features open up plenty of new possibilities for iPhone users. Eye tracking-based control, in particular, has been in demand for a long time, especially since a couple of Android skins have offered it for years.
Then there's a new tool that aims to counter motion sickness while you browse your phone in a moving car. There's even a new haptic feedback system that lets you feel the beats, textures and taps of your music.
Let’s take a quick look at all the newly announced features heading to your iPhone in the near future, possibly as special iOS 18 accessibility features.
A quick look at the new iOS accessibility features
Eye tracking
As part of iOS' accessibility features, Eye Tracking will let users with physical disabilities navigate the user interface of their iPhone and iPad using only their eyes.
The feature relies on the front-facing camera to calibrate eye movements and, combined with on-device machine learning, lets users highlight or activate an option and navigate deeper into apps. It can even handle swipes and other gestures. Apple says the feature works across all iOS and iPadOS apps.
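Apple hasn't published developer documentation for Eye Tracking yet, so any code can only gesture at the underlying idea. As a rough, speculative sketch, the existing ARKit framework can already estimate gaze from the front camera's face-tracking data; `GazeEstimator` below is an invented name, and this is not Apple's implementation.

```swift
import ARKit

// Speculative sketch: camera-based gaze estimation with ARKit face
// tracking. Apple's Eye Tracking is a system-level accessibility
// feature; this only illustrates the general concept.
final class GazeEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking needs a TrueDepth-capable front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint estimates where the eyes are focused, in
        // face-anchor space. A real implementation would project this
        // into screen coordinates and run dwell-to-tap logic.
        print("Gaze (face space): \(face.lookAtPoint)")
    }
}
```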
Music Haptics
With Music Haptics, Apple wants to let users who are deaf or hard of hearing experience music through the iPhone's vibrations.
When enabled, your iPhone will emulate the textures and beats of an audio track using the Taptic Engine, the component behind the iPhone's precisely controlled vibrations.
For everyone else, it adds to the sense of immersion while grooving to a favourite track. The feature will initially be available to Apple Music users, but Apple will open an API so third-party developers can support it in their own apps.
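Apple hadn't detailed the Music Haptics API at the time of writing, but the existing Core Haptics framework already drives the Taptic Engine directly. Here is a minimal sketch, using only documented Core Haptics calls, of how beats can be rendered as sharp haptic "transients"; the four-tap, roughly 120 BPM pattern is an arbitrary example.

```swift
import CoreHaptics

// Sketch only: the Music Haptics API itself isn't public yet. This
// uses the existing Core Haptics framework to render a simple beat
// as sharp haptic transients on the Taptic Engine.
func playBeatPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    // Four strong taps, one every half second (roughly 120 BPM).
    let events: [CHHapticEvent] = (0..<4).map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
            ],
            relativeTime: TimeInterval(beat) * 0.5
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

A real music-haptics pipeline would derive these events from the audio itself, for instance via beat and onset detection, rather than from a hard-coded pattern.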
Speech features
As part of Apple's speech features, we will be getting Vocal Shortcuts and Listen for Atypical Speech. With Vocal Shortcuts, users can record a custom utterance that Siri will recognise and use it to trigger a shortcut or action. For example, you could assign the word 'WhatsApp' to open the WhatsApp app.
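Apple hasn't said whether Vocal Shortcuts needs any new developer work. Since it triggers shortcut actions, the existing App Intents framework is the likeliest integration point; below is a minimal, hypothetical intent of the kind a custom utterance could invoke. `OpenChatListIntent` is an invented example, not a real WhatsApp or Apple API.

```swift
import AppIntents

// Hypothetical sketch: a shortcut action exposed via the existing
// App Intents framework. Vocal Shortcuts triggers actions like this
// one; the intent itself is invented for illustration.
struct OpenChatListIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Chat List"

    // Bring the app to the foreground when the intent runs.
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // A real app would navigate to its chat list screen here.
        return .result()
    }
}
```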
Listen for Atypical Speech uses on-device machine learning to better recognise the speech patterns of users with conditions that affect speech, such as ALS or cerebral palsy.
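There's no public API for Listen for Atypical Speech either, but for context, the existing Speech framework already supports fully on-device recognition, which the new feature presumably extends with models tuned to a wider range of speech. A sketch using only documented Speech framework calls:

```swift
import Speech

// Context sketch: on-device speech recognition with the existing
// Speech framework. (Authorization via
// SFSpeechRecognizer.requestAuthorization is omitted for brevity.)
func transcribeOnDevice(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    // Keep audio and processing entirely on the device.
    request.requiresOnDeviceRecognition = true

    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```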
Vehicle Motion Cues
Have you ever experienced motion sickness while browsing your phone in a moving car? Apple wants to help you here with its new Vehicle Motion Cues feature.
To make it work, your iPhone shows animated dots on the screen that shift in response to the vehicle's motion. If the vehicle turns left, the dots move right, so that what you see matches what your body feels. This should, in theory, reduce the sensory conflict between sight and motion that commonly triggers motion sickness.
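Apple hasn't described the implementation, but the idea maps naturally onto the existing Core Motion framework. The sketch below is purely conceptual: it reads the device's lateral acceleration and shifts a layer of dots the opposite way; the gain value and sign convention are arbitrary assumptions, not Apple's behaviour.

```swift
import CoreMotion
import UIKit

// Conceptual sketch, not Apple's implementation: shift on-screen dots
// opposite to the vehicle's lateral acceleration so that what you see
// agrees with what you feel.
final class MotionCueController {
    private let motion = CMMotionManager()

    func start(dotLayer: CALayer) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let accel = data?.userAcceleration else { return }
            // A left turn pushes the phone one way; drift the dots the
            // other way. The gain (40 points per g) is arbitrary.
            let offset = CGFloat(-accel.x) * 40.0
            dotLayer.transform = CATransform3DMakeTranslation(offset, 0, 0)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```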
Voice Control in CarPlay
CarPlay is also getting a few new accessibility features as part of the update. Drivers and passengers alike can use Voice Control to manage CarPlay hands-free with spoken commands.
Additionally, for those who have difficulty hearing, Sound Recognition provides alerts for car horns and sirens, thus enhancing road awareness.
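Sound Recognition in CarPlay is a system-level feature, but the existing SoundAnalysis framework already ships a built-in classifier that can identify sounds such as sirens. The sketch below illustrates the same idea at the app level using documented calls; the "siren" identifier match and confidence threshold are illustrative choices.

```swift
import SoundAnalysis

// Illustration with the existing SoundAnalysis framework and Apple's
// built-in sound classifier; CarPlay's Sound Recognition is a system
// feature, so this only demonstrates the underlying concept.
final class SirenObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        if top.identifier.contains("siren"), top.confidence > 0.7 {
            print("Possible siren detected")
        }
    }
}

func detectSirens(in fileURL: URL) throws {
    let analyzer = try SNAudioFileAnalyzer(url: fileURL)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    let observer = SirenObserver()
    try analyzer.add(request, withObserver: observer)
    analyzer.analyze()  // blocks until the whole file is processed
}
```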
Color Filters, meanwhile, make the CarPlay interface easier to read for colourblind users.
Reader Mode in Magnifier
The Magnifier app is also getting a new built-in Reader Mode for those who struggle to read certain text, such as difficult handwriting. All you need to do is point the Magnifier camera at the text and tap Reader Mode. Your iPhone will detect the text and reproduce it digitally for easier reading.
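Reader Mode itself is a system feature, but the Vision framework's documented text recognition shows the kind of detection it likely builds on. A minimal sketch:

```swift
import Vision
import UIKit

// Illustration with the existing Vision framework: recognise text in
// a still image, roughly what a reader mode must do before it can
// re-render the text digitally.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        for observation in observations {
            // topCandidates(1) returns the most confident reading.
            if let candidate = observation.topCandidates(1).first {
                print(candidate.string)
            }
        }
    }
    request.recognitionLevel = .accurate  // slower, but better for handwriting

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```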
(Hero and Featured Image Credits: Courtesy Freestocks via Unsplash)