Apple is leading the way with accessibility features on its products, including iPhone and iPad, designed to keep the devices usable for people with vision, hearing, mobility and cognitive disabilities.
One in five Australians have a disability – that’s more than 4.4 million people.
And Australians are also ageing into disability, with 16 per cent of the population aged over 65.
According to the Australian Institute of Health and Welfare, at age 65, Australians can expect to live, on average, over half of their remaining years with some level of disability.
With that in mind, it’s a good time to highlight some of the many accessibility features Apple has created to let anybody with a disability enjoy the features of their iPhone and iPad in day-to-day use and be as empowered by the product as any able-bodied customer.
Many of these accessibility features can be used by anyone, with some providing a handy shortcut or a convenient new way to use the product.
And some we never knew existed.
The four main areas are vision, hearing, cognitive, and mobility.
Globally, there are 285 million people who are visually impaired – of these, 39 million are blind.
In the area of vision, there are several accessibility features, including:
– VoiceOver: VoiceOver is a screen reader that tells you exactly what’s happening on your device. In iOS 15, VoiceOver can now describe people, objects, text and graphs in greater detail than ever. Auditory descriptions of elements help you easily navigate your screen through a Bluetooth keyboard or simple gestures on a touchscreen or trackpad. Set up instructions here.
– Speak Screen: With Speak Screen, you can hear the content of your entire screen read aloud to you, while Speak Selection lets you select and hear a specific range of text. You can also control the pace of the reading more precisely during speech playback with Speech Controller. And all this spoken content can come from one of 70 voices in over 35 languages with customisable pronunciations, so your spoken content won’t feel impersonal at all. Set up instructions here.
– Text Size: Increase text legibility and visibility with simple font adjustments. Larger Text allows you to adjust the size using an accessibility slider. Or you can turn on Bold Text to give words weight on your screen. Your preferred settings can be applied to only the apps you choose. Set up instructions here.
– Magnifier: Magnifier works like a digital magnifying glass. It uses the camera on your iPhone, iPad or iPod touch to increase the size of any physical object you point it at, like a menu or sign, so you can see all the details clearly on your screen. Set up instructions here.
– People Detection: The LiDAR Scanner in iPhone 12 Pro, iPhone 12 Pro Max, iPhone 13 Pro and iPhone 13 Pro Max, 12.9‑inch iPad Pro (4th generation or later), and 11‑inch iPad Pro (2nd generation or later) can determine a person’s proximity to you. People Detection uses technology that measures how long it takes light to reflect back from objects, helping you do things like stand in line at a safe distance, better navigate a noisy area or find an empty seat with ease. Set up instructions here.
In 2005, an estimated 3.55 million Australians had some level of hearing loss – the equivalent of about 17 per cent of the population.
Hearing aids are now available that work with the iPhone.
Apple has partnered with top manufacturers to create hearing aids and sound processors designed specifically for iPhone, iPad and iPod touch.
These advanced hearing devices provide outstanding sound quality, offer many helpful features, and are as easy to set up and use as any other Bluetooth device.
Users can also instantly apply their audiologist’s environmental presets as they go outdoors or enter noisy locations, like restaurants, without having to rely on additional remotes. Set up instructions here.
Other hearing accessibility features include:
– Live Listen: Live Listen is an assistive listening feature that helps you have better conversations in loud places. Once activated, you can move your device towards the people you’re talking with. Audio is picked up by the device microphone and sent to your wireless headphones or hearing aid, so you can hear what they’re saying more clearly. Set up instructions here.
– Conversation Boost: Conversation Boost for AirPods Pro helps you better hear conversations in crowded or noisy environments. Through computational audio and beamforming microphones, Conversation Boost focuses AirPods Pro on the voice of the person directly in front of you, making it easier to distinguish speech and follow along in face‑to‑face conversations. Set up instructions here.
– Sound Recognition: Sound Recognition listens for certain sounds and uses on‑device intelligence to notify you when a specific sound, like a doorbell or a crying baby, is detected. Set up instructions here.
In the Shortcuts app, it’s possible to set up personal automations when sound recognition is triggered.
For example, a smart light can turn red when the doorbell is heard.
– Background Sounds: Everyday sounds can be uncomfortable or annoying. Background sounds like distant rain or ocean waves can minimise distractions and help you focus, stay calm or rest. Choose from balanced, bright or dark noise — or ocean, rain or stream sounds — to mask excess environmental or external noise. These sounds can also mix into or duck under other audio and system sounds that are playing through your device. Set up instructions here.
For cognitive accessibility, features include:
– Safari Reader: Sometimes navigating the web can be a sensory overload. Safari Reader is an assistive technology feature that strips away ads, buttons and navigation bars, allowing you to focus on all the content you want — and none of the clutter. Set up instructions here.
– Guided Access: Guided Access helps you stay focused on the task (or app) at hand. You can limit a device to stay on one app at a time by disabling the Home button.
You can even restrict access to the keyboard or touch input on certain areas of the screen so wandering taps and gestures won’t distract from learning.
Whether you’re a parent, a teacher or just trying to help yourself focus, you have all the options you need to customise your experience on Apple products. Set up instructions here.
Some people with physical and mobility disabilities can’t touch, type or hold their device, but there are features that allow them to take control with their voice or with eye tracking.
– Dictation: Dictation lets you talk wherever you would type. With this feature you can compose an email, notes or a web address using just your voice. Tap the microphone button on the onscreen keyboard, say what you want to write, and your device converts your words (and numbers and characters) into text. Set up instructions here.
– Voice Control: You can navigate your device using just your voice. Commands like click, swipe and tap help you easily interact with your favourite apps. You can precisely select, drag and zoom by showing numbers alongside clickable items, or by superimposing a grid on the screen. Voice Control also offers a more efficient way to write and edit. It’s a seamless process to make corrections, format changes, and transition between text dictation and commands. Set up instructions here.
– Switch Control: Switch Control makes it easy and efficient to control your device with a variety of adaptive switch hardware, wireless game controllers or even simple sounds. If you have extensive motor limitations, you can use item, point and manual scanning to navigate sequentially through onscreen keyboards, menus and the Dock.
You can create your own customised panels and keyboards, system-wide or app by app. Platform Switching allows you to use a single device to navigate any other devices you have synced with your iCloud account. That way, you can control your Mac directly from your iPhone or iPad without having to set up your switches on each new device. Set up instructions for Mac here. Also available on iOS devices.
– AssistiveTouch: This feature for iOS and iPadOS lets you adapt your touchscreen to fit your physical needs. If certain gestures, like pinch or tap, don’t work for you, swap them with a gesture that does or create a touch that’s all your own. You can customise the layout of the AssistiveTouch menu too, or connect a Bluetooth mouse to control an onscreen pointer for navigation. Set up instructions here.
– AssistiveTouch for Apple Watch now enables users with upper-body limb differences to navigate an onscreen motion pointer on their Apple Watch display or answer calls using clench, double-clench and pinch gestures. Set up instructions here.
– Head Tracking: Head Tracking uses the camera on your device to follow the movement of your head to control a pointer on your screen, and tracks your facial movements to perform actions. On Mac, Pointer Control already allows you to use different methods, like head movements, to control your cursor and mouse button. Set up instructions here.
– Eye Tracking: iPadOS now supports third-party eye-tracking devices, making it possible for people to control iPad using just their eyes. Compatible devices will track where a person is looking on screen and the pointer will move to follow the person’s gaze, while extended eye contact will perform an action, like a tap. Set up instructions here.
– Back Tap: Meet the easiest shortcut ever. Back Tap lets you double‑tap or triple‑tap the back of your iPhone to automatically perform a range of customised tasks – from opening your favourite app to taking a screenshot. Choose from 24 different actions, such as mute or open camera, or create your own automated shortcuts to simplify your everyday tasks. Set up instructions here.