Blind users get Seeing AI with enhanced features
Microsoft’s Seeing AI has just got better. The wonderful app that lets blind and low-vision folks convert visual data into audio feedback will now allow users to use touch to explore the objects and people in photos.
“This new feature enables users to tap their finger to an image on a touch-screen to hear a description of objects within an image and the spatial relationship between them,” wrote Seeing AI lead Saqib Shaikh in a blog post. “The app can even describe the physical appearance of people and predict their mood.”
The app can now do the following:
- Speaks text as soon as it appears in front of the camera
- Provides audio guidance to capture a printed page, and recognizes the text along with its original formatting
- Gives audio beeps to help locate barcodes, then scans them to identify products
- Recognizes friends and describes people around you, including their emotions
- Describes the scene around you (an experimental feature)
- Identifies currency bills when paying with cash
- Generates an audible tone corresponding to the brightness of your surroundings
- Describes the perceived color
- Reads handwritten text
The app lets users tap around a photo to find where objects are. Details that didn’t make it into the overall description may also surface on closer inspection, such as flowers in the foreground or a movie poster in the background.
The app now supports the iPad, which is certainly going to be nice for the many people who use Apple’s tablets as their primary interface for media and interactions.
Lastly, there are a few interface improvements that let users reorder items in the app to their preference.