Apple reveals Eye Tracking, other new features for iPhone, iPad
Tim Cook's American tech giant aims to provide "best possible" experience to users
On Wednesday, Apple revealed new accessibility features that will arrive later this year, the most prominent of which is Eye Tracking.
Eye Tracking enables people with physical limitations to operate an iPad or iPhone using their eyes.
In addition, more accessibility features will be added to visionOS; Vocal Shortcuts will enable users to complete tasks by creating their own custom sounds; Vehicle Motion Cues will help lessen motion sickness for users of an iPhone or iPad riding in a moving vehicle; and Music Haptics will provide a new way for users who are deaf or hard of hearing to experience music through the iPhone's Taptic Engine.
Drawing on Apple silicon, artificial intelligence, and machine learning, these features combine the power of Apple hardware and software to advance the company's decades-long goal of making its products accessible to all.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO.
“That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”