Apple’s new accessibility features let you control an iPhone or iPad with your eyes
Apple introduces eye tracking and music haptics in iOS 18 for enhanced accessibility.
With the forthcoming release of iOS 18 and iPadOS 18, Apple is setting a new benchmark for accessibility, introducing features that use artificial intelligence to assist users with disabilities. Eye Tracking stands out among them: it lets users with physical disabilities control their devices with their eyes alone, promising greater independence without any additional hardware. Alongside it, Music Haptics offers the deaf and hard of hearing community a new way to experience music, translating audio into tactile sensations through the iPhone's Taptic Engine so the rhythm and nuances of a song can be felt rather than heard.
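Apple has not published how Music Haptics generates its patterns, but the building block it exposes to developers for this kind of feedback is the Core Haptics framework. The sketch below is a minimal illustration of the underlying idea, mapping audio dynamics onto tactile taps; the `playHaptics` function and its precomputed per-beat `loudness` envelope are hypothetical stand-ins, not Apple's implementation.

```swift
import CoreHaptics

// Illustrative sketch only: plays one transient haptic tap per beat,
// with louder moments rendered as stronger, sharper taps. Apple's
// actual Music Haptics pipeline is not public.
@discardableResult
func playHaptics(for loudness: [Float], beatInterval: TimeInterval) throws -> CHHapticEngine {
    let engine = try CHHapticEngine()
    try engine.start()

    // Map each loudness value (0...1, hypothetical) to a tap on the beat grid.
    let events = loudness.enumerated().map { index, level in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: level),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: level * 0.6)
            ],
            relativeTime: Double(index) * beatInterval
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)

    // The caller must keep the engine alive while the pattern plays.
    return engine
}

// Example: four beats of a simple crescendo at 120 BPM.
let engine = try playHaptics(for: [0.2, 0.4, 0.7, 1.0], beatInterval: 0.5)
```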
Beyond these advancements, Apple is tackling motion sickness in moving vehicles with Vehicle Motion Cues, which displays animated dots at the edges of the screen to represent changes in the vehicle's motion. By keeping what the eyes see in step with what the body feels, the feature reduces the sensory conflict that commonly triggers motion sickness. Vocal Shortcuts and Listen for Atypical Speech round out the lineup for users with conditions that affect speech: the former lets users assign custom utterances as voice commands, while the latter improves Siri's recognition of a wider range of speech patterns.
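Vehicle Motion Cues is a system feature rather than a public API, but the sensor data it depends on is available to developers through the Core Motion framework. The sketch below shows the general principle under stated assumptions: drive an on-screen cue from live accelerometer readings so visual input tracks physical motion. The `MotionCueDriver` class, the `cueDotViews` array of edge-positioned dot views, and the scaling factor are all hypothetical.

```swift
import CoreMotion
import UIKit

// Illustrative sketch only: shifts edge dots in step with sensed
// acceleration, so what the eyes see agrees with what the inner ear
// feels. Not Apple's implementation.
final class MotionCueDriver {
    private let motionManager = CMMotionManager()

    func start(updating cueDotViews: [UIView]) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // userAcceleration excludes gravity, leaving only the motion
            // the rider actually feels; the scale factor is arbitrary.
            let dx = CGFloat(motion.userAcceleration.x) * 40
            let dy = CGFloat(motion.userAcceleration.y) * 40
            for dot in cueDotViews {
                dot.transform = CGAffineTransform(translationX: dx, y: dy)
            }
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```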
Expected to debut with the next versions of Apple's operating systems, these features underscore the company's ongoing investment in accessibility, with artificial intelligence and machine learning at their core. Announced ahead of Global Accessibility Awareness Day, they reflect Apple's commitment to building products that serve the needs of a broad spectrum of users.