21 November 2019

Amazon has recently started the international rollout of its Show and Tell feature, designed to use the camera in an Echo Show smart display to help people who are blind identify everyday objects.

The feature works by using the built-in camera of the Echo Show to visually identify products, with the information then read out to the user. The feature was originally released in September to first and second-generation Echo Show devices in the USA, but anecdotal feedback suggests products in other countries can also be identified.

To use Show and Tell on a supported Echo Show model, say “Alexa, what am I holding?” or “Alexa, what’s in my hand?” This kicks off verbal and audio cues that guide you to place the item you’d like to identify in front of the Echo Show’s camera.

It’s encouraging to see large companies such as Amazon continuing to find new and innovative ways to use their popular smart devices to add functionality that assists people with disability.

The Centre for Accessibility is a joint project by Media on Mars, DADAA and Dr Scott Hollier and is funded by the Department of Communities, Disability Services.
