Apple’s focus on iPhone accessibility features takes a step forward with iOS 16

When we talk about software and operating system updates for smartphones, most of us rarely look beyond the visual side of things: a bunch of new features, and the inevitable comparisons with the previous generation and with present-day alternatives. Yet dig a bit deeper and the often-overlooked accessibility features prove crucial for many users, particularly those who are differently abled. Their ability to use a smartphone hinges on these functionalities.

As we usher in the annual International Day of Persons with Disabilities on December 3, it is a good time to talk about the accessibility features Apple has included in iOS for the iPhone. The suite currently available on the iPhone (most of it is in iPadOS as well, with the Apple Watch getting extended functionality too) will be further augmented with a “Custom Accessibility Mode”, which is expected to be released with iOS 16.2 and should arrive on iPhones sometime in the coming weeks.

“When accessibility is built into devices, apps, and systems that exist for the entire population, however, it can bring added value to everyone involved,” points out a 2019 study by the Massachusetts Institute of Technology, titled ‘Navigating the 21st century without vision’.

The widening gamut of accessibility

This isn’t a one-off for Apple. The suite has been pieced together over time, with 2021 proving pivotal in expanding helpful features for users with mobility, vision, hearing, and cognitive disabilities. That is when Apple added AssistiveTouch to the Apple Watch, support for third-party eye-tracking devices on the iPad, and expanded on-device object detection assistance for the VoiceOver screen reader.

“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, at the time. In 2022, and heading into 2023 with iOS 16, the functionality is evolving further.

Recognizing distinct sounds (such as a crying baby or the doorbell), using Voice Control to interact with apps and features on the iPhone (voice commands become a direct replacement for touch), and even door detection using the Magnifier app are some of the newer functionalities.

The spectrum now goes beyond the long-available options – larger text, reduced transparency, increased contrast, and the VoiceOver screen reader. On an Apple iPhone, you’ll find most of the options in Settings > Accessibility.

Identifying and placing you in a real environment

Sound Recognition has been around since iOS 14 but has improved with every generational update, right up to the present-day iOS 16 iteration. The iPhone can be configured to listen for, and notify you of, as many as 15 different sounds. These include alarms (fire, siren, and smoke), animals (cats and dogs, for now), household sounds (such as a doorbell, a door knock, running water, or even a car horn nearby), and sounds that people make (such as a baby’s cry or someone coughing).

Mind you, if you have an Apple Watch, the notifications seamlessly arrive there too.
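For the technically curious, Apple exposes a similar on-device sound classification capability to developers through its SoundAnalysis framework. The Swift sketch below shows how an app could listen to the microphone and flag recognized sounds; it is a minimal illustration of the underlying technique, not Apple’s own Sound Recognition implementation, and names such as SoundListener are ours.

```swift
import AVFoundation
import SoundAnalysis

// Minimal sketch: on-device sound classification with Apple's SoundAnalysis
// framework (iOS 15+). This mirrors the idea behind Sound Recognition, i.e.
// listening for sounds such as a doorbell or a dog bark, but it is not how
// Apple's own feature is necessarily implemented. A real app also needs
// microphone permission (NSMicrophoneUsageDescription) before starting.
final class SoundListener: NSObject, SNResultsObserving {
    private let audioEngine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let format = audioEngine.inputNode.outputFormat(forBus: 0)
        let streamAnalyzer = SNAudioStreamAnalyzer(format: format)

        // Use the sound classifier that ships with the operating system.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try streamAnalyzer.add(request, withObserver: self)

        // Feed microphone buffers into the analyzer.
        audioEngine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            streamAnalyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }

        analyzer = streamAnalyzer
        try audioEngine.start()
    }

    // Called whenever the classifier produces a result for the latest audio window.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let top = (result as? SNClassificationResult)?.classifications.first,
              top.confidence > 0.8 else { return }
        // In a real app, this is where a local notification would be posted.
        print("Heard: \(top.identifier)")
    }

    func request(_ request: SNRequest, didFailWithError error: Error) {
        print("Sound analysis failed: \(error)")
    }

    func requestDidComplete(_ request: SNRequest) {
        print("Sound analysis completed")
    }
}
```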

We have noticed that while sound recognition works quite well for the most part, it can struggle to separate a particular sound in a noisier environment. For instance, our iPhone sometimes didn’t detect the doorbell when enough traffic noise was coming through from outdoors.

Apple warns as much, saying the functionality should not be your sole guidance “in high-risk or emergency situations, or for navigation”.

Finding your way, with Magnifier

The current iOS 16 update gave the Magnifier tool more functions to work with. While the core task remains letting users zoom into objects via the iPhone’s camera (to see them clearly), there is also advanced door and people detection to consider.

After opening the Magnifier app, you’ll need to select the detection icon at the bottom right of the tabbed options. Here, the camera interface shows options for door detection and identifying people. If you select the door option, the iPhone camera will attempt to detect whether there is a door in the direction the camera is looking, and approximately how far away it is.

We have noticed that this feature detects most doors seamlessly, but it often takes a while to react to a glass door set within a full glass partition (or wall, if that’s how you want to define it) and needs a clear line of sight to a door knob or handle to give a confirmed response.

The expansion of Magnifier’s functionality eliminates the need for third-party apps that used the iPhone’s camera for detecting and navigating physical environments. Mind you, Magnifier can do wide-range object detection – basically, it will speak out what it sees as the iPhone’s camera is pointed around the vicinity. Unlike Google’s separate Lookout app, the Magnifier app is (rather conveniently) baked into iOS for iPhones.

Using your voice, instead of touch

“Open Weather, swipe left.” That is just one of the things the Voice Control option lets you do on an iPhone.

This is different from Siri voice commands. Enabling Voice Control on the iPhone lets you do exactly what the name suggests: think of it as a direct replacement for interacting with the touchscreen. You quite simply speak your way across iOS 16’s apps – setting alarms, playing music, sending messages, and more.

“Whether or not it was as simple as that for Apple, today, it’s hard to see why it isn’t as simple as that for everyone else. The technology is there, but the mindset is not,” points out the MIT study, further referencing appliances such as microwaves and washing machines, websites that use animations instead of coded buttons, and captcha tests that make it impossible for people with visual disabilities to access features.

New Home Screen: A big update on the horizon

Apple is expected to roll out the “Custom Accessibility Mode” with an upcoming iOS update. That is likely to be iOS 16.2 (earlier this week, iOS 16.1.2 rolled out), sometime in the next few weeks.

The mode is expected to allow a simpler, enlarged home screen for the iPhone, replacing the default home screen layout. Think of something similar to the “easy home screen” mode we have seen on many Android phones, including Samsung Galaxy phones.

What is not clear is the extent and scope of the customization options, but larger-than-usual icons for Phone, Messages, Camera, and Music are expected. That will be alongside much larger navigation elements (such as the back button) within each app available on this easier-to-use home screen.

