Apple Reveals New Accessibility Features for Apple Watch, iPad, and iPhone


Apple has announced a host of new accessibility features coming to iOS, watchOS, and iPadOS later this year. These features are designed for users with cognitive, hearing, visual, and mobility disabilities.

New Apple Accessibility Features

Launched today, the first of these services is called SignTime, and it lets customers communicate with AppleCare and Retail Customer Care in sign language. Customers can use American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, and French Sign Language (LSF) in France, and AppleCare representatives will respond to their inquiries.

AssistiveTouch for Apple Watch

Image Courtesy: Apple

Apple is adding AssistiveTouch to watchOS to help users with upper limb differences. The feature detects subtle differences in muscle movement and lets users move a cursor on the screen via hand gestures like squeezing or pinching. This way, users can answer incoming calls, control an on-screen pointer, and more, directly on the Apple Watch.

This is one of the best accessibility features I’ve seen on a portable device. If you want to use these features, check out this guide on how to use the accessibility shortcut on Apple Watch.

Eye-Tracking for iPad

iPadOS will allow users to control their iPad with their eyes, thanks to support for third-party eye-tracking devices. Compatible MFi devices will track the person's eye movements and move the pointer accordingly, while sustained eye contact will perform actions such as a tap.


VoiceOver Update and Audiogram Support

VoiceOver enhancements
Image Courtesy: Apple

Apple is updating its VoiceOver screen reader with improved capabilities. These include more details about people, text, table data, and other objects within images. Users can browse an image by row and column, like a table, and the feature will even describe a person's position relative to other objects in the image.

The Cupertino giant is also bringing audiogram support to Headphone Accommodations. Apple says users can import paper or PDF audiograms containing the latest results of their hearing tests.

Other features coming to Apple devices include new background sounds, sound actions for Switch Control, per-app text and display size settings, and inclusive Memoji customizations representing users with oxygen tubes, cochlear implants, and soft helmets.
