After building a map of its surroundings and being taught the location of each room — where the kitchen is, where the bathrooms are — the wheelchair can understand natural, conversational language that users feed it through a standard headset and microphone.
This includes direct commands ("Take me to the kitchen.") as well as more indirect requests ("I'm hungry."). Although it already knows the layout of its space, it also carries rangefinders at roughly ankle height to ensure it doesn't bump into people or other obstacles.
The team employed Amazon's Mechanical Turk platform to help the wheelchair understand the wide variety of input it could get from users. Mechanical Turk is a service that lets you pay a small fee to have humans do things that computers can't yet do reliably, like write product descriptions or pick out the most aesthetically pleasing photograph from a set. People on the other end of the service watched videos of the wheelchair carrying out various tasks, then came up with the possible commands that might have prompted those actions.
For example, in response to a video of the wheelchair going into a bedroom, Mechanical Turk workers would type "Take me to the bedroom" or "Take me to bed" or "I'm tired" or whatever else seemed appropriate. This effectively builds out the wheelchair's vocabulary and helps ensure it understands what its users want to do.
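The idea of turning crowdsourced paraphrases into a vocabulary can be sketched in code. The following is a minimal, hypothetical illustration — the training pairs, action names, and the simple word-overlap matching are all assumptions for demonstration, not the research team's actual method:

```python
from collections import defaultdict

# Hypothetical (utterance, action) pairs of the kind Mechanical Turk
# workers might produce after watching videos of the wheelchair.
TRAINING_PAIRS = [
    ("take me to the bedroom", "go_to_bedroom"),
    ("take me to bed", "go_to_bedroom"),
    ("i'm tired", "go_to_bedroom"),
    ("take me to the kitchen", "go_to_kitchen"),
    ("i'm hungry", "go_to_kitchen"),
    ("where is the bathroom", "go_to_bathroom"),
]

def build_index(pairs):
    """Map each word to the actions whose training utterances contain it."""
    index = defaultdict(lambda: defaultdict(int))
    for utterance, action in pairs:
        for word in utterance.lower().split():
            index[word][action] += 1
    return index

def interpret(utterance, index):
    """Score each known action by word overlap with the utterance."""
    scores = defaultdict(int)
    for word in utterance.lower().split():
        for action, count in index.get(word, {}).items():
            scores[action] += count
    return max(scores, key=scores.get) if scores else None

index = build_index(TRAINING_PAIRS)
print(interpret("I'm hungry", index))   # overlaps most with kitchen paraphrases
```

In this toy version, every new paraphrase added to the training set broadens the range of phrasings the system can resolve, which is the point of collecting many variants per action from many workers.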