Apple Shares New iOS Accessibility Features with Users

A few weeks ahead of WWDC 2021, the developer conference at which it will present new features for iOS, watchOS and iPadOS, Apple has shared the updates it has made in the field of accessibility.
Apple normally unveils its iOS, watchOS and iPadOS innovations at WWDC (Worldwide Developers Conference), its annual developer conference in the US, but this year it has announced its accessibility improvements a few weeks ahead of the event. Among them are SignTime, a service aimed at deaf and hard-of-hearing users, and a Background Sounds feature that helps users focus by masking distracting noise.

SignTime is the most prominent of the accessibility features Apple has announced ahead of WWDC. It gives deaf and hard-of-hearing users access to on-demand sign language interpretation when communicating with Apple Store and Apple Support staff. The feature launches in the US, the UK and France on May 20 and is expected to reach other countries later; these three come first most likely because the service starts out supporting their sign languages, namely American Sign Language (ASL), British Sign Language (BSL) and French Sign Language (LSF). By providing sign language interpretation on demand, SignTime looks set to be very useful for people with hearing loss.

AssistiveTouch for Apple Watch

Beyond SignTime, Apple is bringing AssistiveTouch, a feature iPhone users already rely on heavily, to the Apple Watch. By combining the device's built-in gyroscope and accelerometer with on-device machine learning, AssistiveTouch can detect subtle differences in muscle movement and tendon activity. That lets users answer incoming calls or open notification settings without touching the watch screen at all.
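
To make the idea concrete, here is a minimal Swift sketch of the kind of pipeline such a feature implies: sampling wrist motion with Core Motion and handing the readings to an on-device classifier. This is not Apple's implementation; the class, the classifier hook and the sampling rate are illustrative assumptions.

```swift
import CoreMotion

// Hypothetical sketch: read wrist motion on watchOS with Core Motion and feed it
// to a placeholder gesture classifier. This is NOT Apple's AssistiveTouch code,
// only an illustration of the sensor-plus-ML approach the feature description implies.
final class PinchGestureDetector {
    private let motionManager = CMMotionManager()

    // Stands in for an on-device ML model (e.g. a Core ML classifier) trained to
    // recognize subtle muscle/tendon movements such as a pinch or a clench.
    var classify: ([Double]) -> String = { _ in "none" }

    func start(onGesture: @escaping (String) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0   // 50 Hz sampling (assumed)
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let m = motion else { return }
            // Combine accelerometer and gyroscope readings into one feature vector.
            let features = [m.userAcceleration.x, m.userAcceleration.y, m.userAcceleration.z,
                            m.rotationRate.x, m.rotationRate.y, m.rotationRate.z]
            let gesture = self.classify(features)
            if gesture != "none" { onGesture(gesture) }   // e.g. answer an incoming call
        }
    }

    func stop() { motionManager.stopDeviceMotionUpdates() }
}
```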

Eye Tracking, Hearing Devices and Background Sounds

Alongside these two headline features, Apple also said that an eye tracking option is coming to iPad, which should please iPad users. With a compatible MFi (Made for iPhone/iPad/iPod) eye-tracking device, the on-screen pointer follows the user's gaze, and holding the gaze on a point for a set amount of time performs a tap. Apple has also updated its MFi hearing devices program to add support for bidirectional hearing aids, which will let users with hearing loss take hands-free FaceTime calls.
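
As an illustration of how this kind of "dwell" selection typically works, the short Swift sketch below fires a click once the gaze pointer has stayed within a small tolerance radius for long enough. The type name, the thresholds and the idea of feeding it gaze points are assumptions made for illustration, not iPadOS or MFi APIs.

```swift
import Foundation
import CoreGraphics

// Hypothetical dwell-click logic: if the gaze stays within `tolerance` points of
// where it settled for at least `dwellDuration` seconds, treat that as a click.
struct DwellClickDetector {
    var dwellDuration: TimeInterval = 1.0   // seconds to hold gaze before clicking (assumed)
    var tolerance: CGFloat = 30             // allowed gaze jitter, in points (assumed)

    private var anchor: CGPoint?
    private var anchorTime: Date?

    // Feed successive gaze positions; returns true when a dwell click should fire.
    mutating func process(gaze point: CGPoint, at time: Date = Date()) -> Bool {
        if let a = anchor, hypot(point.x - a.x, point.y - a.y) <= tolerance {
            if let start = anchorTime, time.timeIntervalSince(start) >= dwellDuration {
                anchor = nil            // reset so a held gaze does not click repeatedly
                anchorTime = nil
                return true
            }
        } else {
            anchor = point              // gaze moved: restart the dwell timer here
            anchorTime = time
        }
        return false
    }
}
```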

Apple also said it will add audiogram support, meaning a chart of a person's hearing sensitivity across standard test frequencies, to Headphone Accommodations, the feature that lets us adjust how audio sounds on the device. It also mentioned Background Sounds, a feature that masks unwanted ambient noise by continuously playing sounds such as rain or ocean waves in the background. With it, Apple aims to minimize the external factors that distract users and break their focus.
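
For a rough sense of what masking ambient noise with a looping track involves, here is a small Swift sketch using AVFoundation. It is not Apple's system-level Background Sounds feature; the class, the bundled file name and the volume level are assumptions for illustration.

```swift
import AVFoundation

// Illustrative sketch only: loop an ambient track (e.g. rain) quietly alongside
// other audio to mask distracting noise, similar in spirit to the described feature.
final class AmbientSoundPlayer {
    private var player: AVAudioPlayer?

    func play(resource name: String = "rain", ofType ext: String = "m4a") throws {
        // Mix with other audio instead of interrupting it.
        try AVAudioSession.sharedInstance().setCategory(.playback, options: [.mixWithOthers])
        try AVAudioSession.sharedInstance().setActive(true)

        guard let url = Bundle.main.url(forResource: name, withExtension: ext) else { return }
        let player = try AVAudioPlayer(contentsOf: url)
        player.numberOfLoops = -1      // loop indefinitely
        player.volume = 0.4            // keep the masking sound subtle (assumed level)
        player.play()
        self.player = player
    }

    func stop() { player?.stop() }
}
```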