How AI Is Delivering a Little Independence to the Visually Impaired

Most of us are lucky to have our sight and couldn’t imagine how we would cope without it. In Australia alone, there are more than 350,000 vision-impaired citizens. Simple daily tasks, such as recognizing what is in the cupboard or reading someone’s emotions, are either difficult or impossible.

But thanks to a group of hackers working at a hackathon, their lives are about to change significantly. An app developed at that event has just been released by Microsoft. Called Seeing AI, it leverages the ubiquity of the smartphone and its built-in camera.

The app uses the power of AI to recognize a whole host of objects and describes them to the user via the built-in speaker. One of its main features is recognizing human faces. Simply directing the camera towards someone allows the AI not only to assess the sex of the subject, but to estimate their age, hair color and mood as well. The app even tells the user roughly where the other person is and how far away.

Another great feature is its recognition of currency. When holding a bill in front of the camera, the app will respond with the denomination, something highly useful in countries where there is little color variation between banknotes.

The AI also has some clever experimental features, such as describing the scene in front of the camera, identifying colors and even deciphering handwritten notes.

For those looking for a more integrated solution, the MyEye 2.0 from Orcam will be of interest. This small device can be clipped to the arm of a pair of glasses and is fitted with a camera and microphone.

If the wearer points at the text on a menu, the internal AI recognizes this gesture as a desire to have the text spoken, upon which it will read the menu aloud. The device can also recognize faces, money and other objects. For example, it can be trained to identify certain items, which it will later point out when the wearer visits the local supermarket.

Other hand gestures are also available to trigger other common tasks. For instance, pointing a finger at the wrist, as we would when checking our watch, will trigger the device to announce the time of day.

It is applications like these where AI truly comes into its own. By recognizing simple everyday items, recalling faces and providing support in daily tasks, such apps and devices can provide the visually impaired with a few degrees more independence.

Varsha Shivam

Varsha Shivam is Marketing Manager at Arago and currently responsible for event planning and social media activities. She joined the company in 2014 after graduating from the Johannes Gutenberg University of Mainz with a Master’s Degree in American Studies, English Linguistics and Business Administration. During her studies, she worked as a Marketing & Sales intern at IBM and Bosch Software Innovations in Singapore.