The app uses artificial intelligence to identify objects seen through the smartphone's camera.
The search engine giant has previously stressed the importance of designing for accessibility, and has released several accessibility apps in recent years to improve the experience for people with disabilities. If Google Lens can identify a dog's breed from a photo, there's nothing stopping Google from using the same technology to help visually impaired people, and that's where Lookout comes in.
Lookout helps people in situations where they might need to ask for assistance, including learning about a space for the first time, reading text, and completing daily routines like cleaning, cooking, and shopping.
Users will have three modes to select from - Explore, Shopping, and Quick Read - and the app uses spoken feedback to tell them what objects are nearby and where they are.
"As with any new technology, Lookout will not always be 100 percent ideal," Clary said. Users are advised to keep the phone in a front shirt or coat pocket so the camera can see what is around them. The company hopes to make the app more widely available, eventually bringing it to more devices, countries, and platforms.
Google acknowledges that the app won't always work with 100% accuracy, and says it will continue to improve it as it receives more feedback from users.
Sound Amplifier - which was also announced at last year's Google I/O - uses a phone and a set of headphones to filter, augment, and amplify sounds so that users can better hear conversations or announcements in noise-heavy environments.