AI has become a significant part of modern industry, with innumerable applications: it allows machines to make intelligent decisions and complex software to learn from its surroundings. One long-standing application is building apps for the visually impaired.


Microsoft launched an app with very similar functionality, named Seeing AI, in 2017. This time, however, the focus is on Google, which has also built an app of its own that uses its powerful AI engine to detect objects in the surroundings. Last year, Google announced a new app to help the visually impaired, named Lookout. The app uses AI to identify objects through your phone’s camera. It can also read text in signs and labels, scan barcodes, and identify currencies. This week, Google announced that Lookout will finally be available to download — though only for Pixel devices in the US.


Since announcing the app last year, Google says it’s been “testing and improving the quality” of its results. The company cautions that, as with all new technology, Lookout’s results will not always be “100 percent perfect,” but it’s soliciting feedback from early users.

It comes with three modes: Explore, Shopping and Quick Read. Explore, its default mode, gives users audio cues about their environment, telling them if there’s a chair or a cute dog blocking the way, for instance.

Shopping can read barcodes and currency, giving users a way to, say, make sure they’re truly holding a $5 bill. Finally, Quick Read can read signs and labels, making it easy to find Exit doors or goods in a grocery store. In other words, the app can be especially useful for learning the layout of a new space for the first time, or for reading documents and completing daily tasks around the house.
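Google hasn’t published how Lookout’s barcode pipeline works, but as a small illustration of what happens after a barcode’s digits are decoded from the camera image, here is the standard EAN-13 check-digit validation — a hypothetical helper sketch, not Google’s code:

```python
def ean13_check_digit(digits12: str) -> int:
    """Compute the EAN-13 check digit for the first 12 digits.

    Weights alternate 1, 3, 1, 3, ... from the leftmost digit.
    """
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(digits12))
    return (10 - total % 10) % 10


def is_valid_ean13(code: str) -> bool:
    """Return True if a 13-digit string passes the EAN-13 checksum."""
    return (len(code) == 13
            and code.isdigit()
            and ean13_check_digit(code[:12]) == int(code[12]))


# A scanner would reject a misread before looking the product up:
print(is_valid_ean13("4006381333931"))  # a valid EAN-13
print(is_valid_ean13("4006381333932"))  # one digit off: checksum fails
```

The check digit lets the app discard camera misreads cheaply before querying any product database, which is why virtually all retail barcode formats include one.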
