Tech News & Podcast | Africa
Google releases accessibility tools for Assistant, Maps, and Search.

Google has announced a number of accessibility upgrades coming to Maps, Search, and Assistant, along with making some camera-based Pixel features more widely available. Wheelchair accessibility is a primary focus this time: a new feature, progressively rolling out on iOS and Android, lets Maps users request walking routes without stairs. Google says the feature will also help people travelling with strollers or luggage. It will be available everywhere, provided the company has enough data for the area.

Google Maps enhances accessibility features for wheelchair users

Google notes that walking routes will also automatically have the wheelchair-accessible option applied if you have it enabled in your transit options. If not, you can activate stair-free directions when you request a walking route by tapping the three dots at the top of the screen and selecting the “wheelchair-accessible” option.

In addition, wheelchair-accessibility information will be available on more Google products, specifically Maps for Android Auto and cars with Google built in. When you search for a location and tap on it, a wheelchair icon will appear if the place has accessible parking, restrooms, step-free entry, or seating.

Maps and Search should also make it easier to find and patronise businesses run by people with disabilities. If a business chooses to identify itself as “disabled-owned,” that label will appear in Maps and Search results. Google had already introduced similar labels for Asian-, Black-, Latino-, LGBTQ+-, veteran-, and women-owned businesses.

In other news, Google has enabled screen reader support in Lens in Maps (formerly known as Search with Live View), an augmented reality feature that uses your phone’s camera to help you find places such as ATMs, restaurants, and bathrooms. While in an unfamiliar place, you can point your phone at your surroundings and tap the camera icon in the search bar.

Eve Andersson, senior director of Google’s Products for All team, stated in a blog post that “if your screen reader is enabled, you’ll receive auditory feedback of the places around you with helpful information like the name and category of a place and how far away it is.” This Lens in Maps feature, designed for those with low vision and blindness, will launch today on iOS and later this year on Android.

The Magnifier app on Pixel devices uses the camera to zoom in on details from a distance, or to adjust the brightness, contrast, and colour of text on menus and documents to make them easier to read. The app works on the Pixel 5 and later models, but it is not compatible with the Pixel Fold.

Google adds that in addition to recognising faces, the most recent version of Guided Frame, which debuted on the Pixel 8 and Pixel 8 Pro earlier this month, can now identify dishes, dogs, and documents to assist those who are blind or have impaired vision in taking clear, sharp pictures. Later this year, Pixel 6 and Pixel 7 devices will receive the Guided Frame update.

Meanwhile, Google is making Assistant Routines more customisable. According to the company, you will be able to upload your own photos, adjust the size of a routine, and add it as a shortcut to your home screen. “Research has shown that this personalization can be particularly helpful for people with cognitive differences and disabilities and we hope it will bring the helpfulness of Assistant Routines to even more people,” Andersson stated. Google’s developers drew inspiration from Action Blocks for this feature.

Finally, Google added a feature to the desktop Chrome address bar earlier this year that detects typos and suggests websites based on what it thinks you meant. Starting today, Chrome for iOS and Android will support the feature as well. The goal is to make it easier for people with dyslexia, language learners, and anyone prone to typos to find what they’re looking for.
