Initially showcased as part of Apple’s enhanced accessibility features, Eye Tracking uses the front-facing camera and on-device machine learning to accurately track where a user is looking.
Designed primarily for users with physical disabilities who find it challenging to navigate touchscreens, Eye Tracking allows individuals to control their devices simply by looking at the screen.
Setup is straightforward and accessible through the Settings app under Accessibility > Physical and Motor > Eye Tracking.
Users follow a dot moving around the screen to calibrate the system, which takes about a minute to complete.
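Apple has not said publicly how this calibration works internally, but the concept is easy to picture: the system compares where its raw gaze estimate lands against the known position of each dot and fits a correction between the two. The Swift sketch below illustrates that idea with a deliberately simple per-axis linear fit; every type and property name here is hypothetical, not Apple’s API.

```swift
import CoreGraphics

/// One calibration sample: where the raw gaze estimator thought the user
/// was looking, and where the calibration dot actually was on screen.
struct CalibrationSample {
    let rawGaze: CGPoint   // uncalibrated estimate (hypothetical input)
    let target: CGPoint    // known position of the moving dot
}

/// Fits independent scale-and-offset corrections for x and y using
/// ordinary least squares, then applies them to later gaze estimates.
/// A deliberate simplification; a shipping system would fit a richer
/// model and smooth it over time.
struct LinearGazeCalibration {
    let xScale: CGFloat, xOffset: CGFloat
    let yScale: CGFloat, yOffset: CGFloat

    init?(samples: [CalibrationSample]) {
        // Least-squares fit of y = scale * x + offset for one axis.
        func fit(_ pairs: [(CGFloat, CGFloat)]) -> (scale: CGFloat, offset: CGFloat)? {
            let n = CGFloat(pairs.count)
            guard n >= 2 else { return nil }
            let sumX  = pairs.reduce(0) { $0 + $1.0 }
            let sumY  = pairs.reduce(0) { $0 + $1.1 }
            let sumXX = pairs.reduce(0) { $0 + $1.0 * $1.0 }
            let sumXY = pairs.reduce(0) { $0 + $1.0 * $1.1 }
            let denom = n * sumXX - sumX * sumX
            guard abs(denom) > .ulpOfOne else { return nil }
            let scale = (n * sumXY - sumX * sumY) / denom
            return (scale, (sumY - scale * sumX) / n)
        }
        guard
            let x = fit(samples.map { ($0.rawGaze.x, $0.target.x) }),
            let y = fit(samples.map { ($0.rawGaze.y, $0.target.y) })
        else { return nil }
        xScale = x.scale; xOffset = x.offset
        yScale = y.scale; yOffset = y.offset
    }

    /// Corrects a raw gaze estimate into calibrated screen coordinates.
    func apply(to raw: CGPoint) -> CGPoint {
        CGPoint(x: xScale * raw.x + xOffset, y: yScale * raw.y + yOffset)
    }
}
```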
Once enabled, a small black dot appears on the screen, representing where the user is looking. This dot acts as a pointer, replacing traditional touch controls.
Dwell Control, a companion feature enabled alongside Eye Tracking, allows users to select items on the screen by holding their gaze on the desired item for a few seconds.
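The logic behind dwell selection is straightforward to sketch: keep an anchor point, and fire a selection once the gaze has stayed within a small radius of that anchor for the configured dwell time. The Swift illustration below is not Apple’s implementation; its names and default thresholds are invented for the example.

```swift
import CoreGraphics
import Foundation

/// Illustrative dwell detector: fires a selection when successive gaze
/// samples stay within `radius` points of the first one for at least
/// `dwellTime` seconds. Not Apple's implementation; the default values
/// are invented for this example.
final class DwellDetector {
    private let dwellTime: TimeInterval
    private let radius: CGFloat
    private var anchor: CGPoint?
    private var anchorStart: TimeInterval?

    init(dwellTime: TimeInterval = 2.0, radius: CGFloat = 30) {
        self.dwellTime = dwellTime
        self.radius = radius
    }

    /// Feed one gaze sample per frame; returns the dwell point when a
    /// selection should fire, otherwise nil.
    func process(gaze: CGPoint, at time: TimeInterval) -> CGPoint? {
        if let a = anchor, squaredDistance(gaze, a) <= radius * radius {
            // Still dwelling near the anchor; has enough time elapsed?
            if let start = anchorStart, time - start >= dwellTime {
                anchor = nil        // reset so each dwell fires only once
                anchorStart = nil
                return a
            }
        } else {
            // Gaze moved away: restart the dwell at the new location.
            anchor = gaze
            anchorStart = time
        }
        return nil
    }

    private func squaredDistance(_ p: CGPoint, _ q: CGPoint) -> CGFloat {
        let dx = p.x - q.x, dy = p.y - q.y
        return dx * dx + dy * dy
    }
}
```

Feeding such a detector one gaze sample per display frame is enough to reproduce the basic behavior described above.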
Customization options improve usability further. Users can adjust settings like Dwell Control sensitivity and enable Snap to Item, which pulls the pointer to the nearest interface element when the user’s gaze lands near it. This reduces the need for pinpoint precision when making selections.
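Conceptually, Snap to Item is hit-testing with tolerance: instead of requiring the gaze to land exactly inside a control, the pointer jumps to the nearest interactive element within some distance. Here is a hypothetical sketch of that geometry in Swift; the function and its parameters are illustrative, not a real API.

```swift
import CoreGraphics

/// Hypothetical sketch of snap-to-item geometry: given the frames of the
/// interactive elements on screen, return the one whose center is closest
/// to the gaze point, provided it falls within `maxSnapDistance` points.
func snapTarget(for gaze: CGPoint,
                among elementFrames: [CGRect],
                maxSnapDistance: CGFloat = 80) -> CGRect? {
    func squaredDistance(_ p: CGPoint, _ q: CGPoint) -> CGFloat {
        let dx = p.x - q.x, dy = p.y - q.y
        return dx * dx + dy * dy
    }
    let limit = maxSnapDistance * maxSnapDistance
    return elementFrames
        .map { (frame: $0, d: squaredDistance(gaze, CGPoint(x: $0.midX, y: $0.midY))) }
        .filter { $0.d <= limit }
        .min { $0.d < $1.d }?
        .frame
}
```

Because the comparison uses each element’s center, an imprecise gaze still resolves to the control the user is most plausibly aiming at.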
Moreover, Eye Tracking extends to system-level areas such as Notification Center and Control Center, all reachable through eye movements alone.
This comprehensive integration makes iOS 18 and iPadOS 18 more accessible and user-friendly for individuals with disabilities, empowering them to interact with their devices independently and effectively.
With Eye Tracking, Apple continues to push accessibility technology forward, ensuring that users facing physical challenges can operate their iPhones and iPads with ease and underscoring the company’s ongoing commitment to inclusivity.