New Echo Show feature helps blind and low-vision customers identify common household pantry items

Amazon is introducing a new feature to its Echo Show that helps blind and low-vision customers identify common household pantry items by holding them in front of Alexa's camera and asking what they are. To recognize the objects the Echo Show sees, the feature uses a combination of computer vision and machine learning.
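As a rough illustration of the kind of computer-vision step involved, the sketch below classifies a photo using a pretrained image model. Amazon has not published the models or pipeline behind this feature, so this is purely a sketch of the general technique, using torchvision's off-the-shelf ResNet-50 and a hypothetical image file name:

```python
# Illustrative only: classify a photo of a pantry item with a pretrained
# model. Amazon has not detailed the models behind this Echo Show feature;
# this just demonstrates the general computer-vision technique.
import torch
from PIL import Image
from torchvision import models, transforms

# Load an off-the-shelf ImageNet classifier and its matching preprocessing.
weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights)
model.eval()
preprocess = weights.transforms()

def identify(image_path: str) -> str:
    """Return the most likely ImageNet label for the object in the photo."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # add a batch dimension
    with torch.no_grad():
        logits = model(batch)
    class_idx = int(logits.argmax(dim=1))
    return weights.meta["categories"][class_idx]

# "pantry_item.jpg" is a hypothetical file name for a photo of the product.
print(identify("pantry_item.jpg"))
```

A production system would go further than a generic ImageNet classifier, for example by training on product packaging and reading label text, but the basic classify-and-speak loop is the same idea.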

The Echo Show is the version of the Alexa-powered smart speaker that tends to end up in customers' kitchens, where it helps with cooking tasks such as setting timers, playing recipe videos, or providing music or TV in the background.

But the Show now has a new job for blind customers: helping them identify the household pantry items that are difficult to distinguish by touch, such as cans, boxed goods, or spices. To use the feature, customers can simply say something like "Alexa, what am I holding?" or "Alexa, what's in my hand?" Alexa will also give verbal and audio cues to help customers position the item in front of the camera.
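For a sense of how a voice interaction like this is typically wired up, here is a minimal sketch using the Alexa Skills Kit SDK for Python. To be clear about the assumptions: the real feature is built into Alexa rather than shipped as a third-party skill (skills cannot access the Echo Show's camera), and the `WhatAmIHoldingIntent` and `recognize_object_from_camera()` names are invented for illustration.

```python
# A minimal, hypothetical sketch of the voice-interaction pattern using the
# Alexa Skills Kit SDK for Python. The real feature is built into Alexa, not
# a third-party skill; skills cannot access the Echo Show camera.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name


def recognize_object_from_camera() -> str:
    """Hypothetical stand-in for a camera capture plus vision lookup."""
    return "a can of tomato soup"


class WhatAmIHoldingHandler(AbstractRequestHandler):
    """Handles a hypothetical 'what am I holding?' intent."""

    def can_handle(self, handler_input):
        return is_intent_name("WhatAmIHoldingIntent")(handler_input)

    def handle(self, handler_input):
        label = recognize_object_from_camera()
        speech = f"It looks like you are holding {label}."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(WhatAmIHoldingHandler())
lambda_handler = sb.lambda_handler()  # entry point if deployed on AWS Lambda
```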

Amazon says the feature was developed in collaboration with blind Amazon employees, including its principal accessibility researcher, Josh Miele, who gathered feedback from both blind and low-vision customers as part of the development process.

The company also worked with the Vista Center for the Blind in Santa Cruz on early research, product development, and testing. Echo devices aren't available globally, and even where they are sold, a device may not support the local language. The feature itself is also U.S.-only at launch.
