We all dream of an inclusive world, don't we? A world that makes life easier, especially for people living with disabilities. Technology can certainly help build that world by empowering those who face physical challenges, and Apple has recently taken a step in that direction. iPhones can now detect doors and help people follow everyday audio more easily. Yes, you read that right. Two new features from the tech giant address two of the most common difficulties faced by people with visual or hearing impairments: Door Detection and Live Caption. These cutting-edge features are now available globally on the latest devices, such as the iPhone 13 Pro and iPhone 12 Pro, as well as the Apple Watch Series 6, and they are expected to roll out to Apple's other devices through software updates by the end of 2022.
Door Detection: A blessing for people with visual challenges
You must have come across people with dark glasses struggling to find their way out of a shopping mall or a hotel lobby. In all probability, you have helped them too! Now, thanks to Apple's new Door Detection feature, they will be able to locate doors on their own. The feature allows people with low vision to find a door at an unfamiliar location and judge how far away it is. That's not all. It can also tell them whether the door is open or closed, and identify signs such as "Entrance" or "Exit". Wondering how Apple made all this possible? The answer is a combination of on-device machine learning, the camera, and the LiDAR scanner. LiDAR (Light Detection and Ranging) measures distance by aiming laser pulses at an object and timing how long the reflected light takes to return to the receiver (here, the iPhone or iPad).
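The arithmetic behind that time-of-flight measurement is simple: the distance to the object is the speed of light multiplied by the pulse's round-trip time, divided by two. Here is a minimal illustrative sketch of that calculation (not Apple's actual implementation; the function name and example timing are our own):

```python
# Time-of-flight ranging, the principle a LiDAR scanner uses.
# Illustrative sketch only -- not Apple's implementation.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance in metres, given the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse reflected by a door about 3 metres away returns
# in roughly 20 nanoseconds.
print(round(distance_from_round_trip(20e-9), 2))  # → 3.0
```

Because light covers metres in nanoseconds, the scanner needs extremely precise timing, which is why this is done in dedicated hardware rather than software.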
Live Caption: A boon for those with hearing loss
Can you imagine life without phone calls to your friends and family? What is unimaginable for you is the everyday reality for many people living with hearing loss. Now there's good news for them: Apple's Live Caption feature, available on iPhones, iPads and Macs. What does it do? It helps people with hearing difficulties follow any audio on their devices through real-time captions, whether that's a phone call or a video. Wait, there's more. During a real-life conversation, users can simply type a response and the device will speak it aloud instantly.
Something for all
Apart from making life easier for people with hearing and vision difficulties, Apple is also addressing the challenges of the community living with motor disabilities. Users can now mirror Apple Watch apps to their iPhone, or use hand gestures (for instance, a double pinch) to perform actions on their devices such as taking a photo or answering a phone call.