Tech giant Apple has confirmed the introduction of eye tracking technology for iPhone and iPad users worldwide.
This innovative feature, enabled through artificial intelligence (AI), allows users to operate their devices with just their eyes.
Announced last week, the eye tracking tool is part of Apple’s new accessibility features, designed to enrich the lives of users with disabilities. According to Apple’s chief executive, “We believe deeply in the transformative power of innovation… and have supported inclusive design for almost 40 years.”
The feature uses the front-facing camera to set up and calibrate in seconds, and all data used for it stays securely on the device rather than being shared with Apple. Users can navigate apps, trigger the functions of physical buttons, and perform swipes and other gestures using only their eyes.
While the feature won’t be available until later this year, it has already generated significant interest on social media. Some users have praised the innovation for its potential to assist those with specific disabilities, while others have joked about the potential for increased laziness.
In addition to eye tracking, Apple unveiled several other new features, including Vehicle Motion Cues, designed to reduce motion sickness in moving cars, and Music Haptics, which enables deaf or hard-of-hearing users to experience music through vibrations. The company also announced new speech features for customers with speech-impairing conditions, allowing users to assign custom utterances that the virtual assistant Siri can recognize.
These features reflect Apple’s long-standing commitment to delivering the best possible experience to all users, pushing the boundaries of technology to enrich lives.