Hacked Amazon Echo Controls a Wheelchair

Watch as Amazon Echo thinks it's turning lights on and off when it's really controlling a wheelchair.


Amazon Echo is a voice-activated, cloud-connected wireless speaker that can be your personal assistant. Think of Echo as Siri for your home.

Echo, which is designed around your voice, answers to “Alexa” and can tell you the score of the game, read your book, play your music, or check your calendar. And if you have a smart home, Echo can turn off your lights and integrate with other smart technology.

Bob Paradiso, however, wondered if he “could push Echo’s utility a little further.” He certainly did. In the video above, you’ll see Paradiso turned an electric wheelchair into a voice-controlled wheelchair using Echo, a Raspberry Pi, and an Arduino Uno.

Echo thinks it’s turning lights on and off, but it’s really controlling the wheelchair. Paradiso says, “Alexa, turn on left 4” and the wheelchair spins. He then says, “Alexa, turn on forward 4” and the wheelchair moves forward. Paradiso wrote a detailed piece about his build, but here’s an interesting snippet:

For my Echo-based implementation, I had the constraint of using Echo’s “Alexa, turn on X” and “Alexa, turn off X” commands. Having this relatively long phrase makes control more cumbersome than simple moment-to-moment LEFTs and RIGHTs or just naming a destination. That, combined with Echo not responding instantly (voice is processed on Amazon’s servers) and sometimes not recognizing speech correctly, makes a real-time micromanaged system impractical. I also wanted to do something ‘relatively’ simple, so for me that ruled out destination specification and all the mapping, pathfinding, and obstacle-avoiding work that involves. That pushed me toward something in between. I opted for the user specifying an action (direction) and a duration in seconds. This did mostly work, as shown in the video, though it can get tedious, and a little frustrating if you over- or undershoot what you really wanted to do.

So what led to the “Alexa, turn on X” constraint I was working under? Well, unless you use the official Amazon Echo SDK (the Alexa Skills Kit, linked below) and host your server code for Amazon’s servers to talk to, you’re limited to the types of interactions Echo can already do. Further, it seemed to me that going down the Echo SDK road didn’t change the paradigm in any useful way. All voice commands would still have to follow the format of “Alexa, [action word] [target]” or something very similar, so they’d be no faster to say, and no faster for Echo to process and respond to.
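Paradiso’s write-up has the full details, but to make the idea concrete, here’s a minimal Python sketch of what the Raspberry Pi side of a setup like this might look like: Echo “switches on” a fake smart-home device whose name encodes a direction and a duration, and the Pi translates that into a short serial command for the Arduino driving the chair. The device names, serial port, and one-letter command protocol below are illustrative assumptions, not Paradiso’s actual code.

```python
# Illustrative Pi-side handler (an assumption-laden sketch, not Paradiso's code).
# Echo believes it is switching a "light" named e.g. "forward 4" on or off; we
# parse that fake device name into a direction and a duration in seconds, then
# send a short command to the Arduino over USB serial.

import serial  # pyserial

DIRECTIONS = {"forward": "F", "back": "B", "left": "L", "right": "R"}

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port is an assumption

def on_fake_switch(name, turned_on):
    """Called when Echo 'turns on' or 'turns off' the fake device `name`."""
    if not turned_on:
        arduino.write(b"S\n")  # treat any "turn off" as an immediate stop
        return
    direction, _, seconds = name.partition(" ")
    code = DIRECTIONS.get(direction.lower())
    if code is None or not seconds.isdigit():
        return  # ignore device names that don't encode a move
    # e.g. "Alexa, turn on forward 4" -> b"F4\n": drive forward for 4 seconds
    arduino.write("{}{}\n".format(code, int(seconds)).encode())
```

One consequence of the “turn on X” constraint is that every direction-and-duration pair Echo should understand would have to be exposed as its own named device, which is a big part of why the scheme works but gets tedious.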

We’ll have to keep an eye out for other cool hacks using Amazon Echo, which is $30 off today only.

Hack away!




About the Author

Steve Crowe is managing editor of Robotics Trends. Steve has been writing about technology since 2008. He lives in Belchertown, MA with his wife and daughter.
Contact Steve Crowe: scrowe@ehpub.com




Comments

Steve Crowe · November 20, 2015 · 4:12 pm

Lindsey,

You make some great points. Thought I’d pass along this piece that details more capabilities of Echo using an automation app called IFTTT.

http://www.androidcentral.com/ifttt-now-lets-you-add-trigger-phrases-amazon-echo?utm_campaign=trueAnthem:+Trending+Content&utm_content=564de8a404d3017695837e2e&utm_medium=trueAnthem&utm_source=twitter

Lindsey Lewis · November 20, 2015 · 3:53 pm

This is a good start.

I have always thought that the Amazon Echo could be great for elder or disabled care situations if hooked up to various devices. Thing is, the Echo needs to see. That way it could remind an identified person of care that they need to do something, or it could evaluate their situation using dynamic recognition, similar to how self-driving cars work.

Another thing is - you would not want someone else to be able to tell the Echo to control the wheelchair without your permission.

And - you should be able to tell the Echo where you want to go by identifiable location names - like “navigate to kitchen”, or more specifically “navigate to refrigerator”, or “bathroom”.  Stuff like that.



