There's just one snag – how does a robot tell the difference between an elderly and vulnerable patient who has collapsed, and a similarly shaped object – such as a large duffel bag – lying on the floor?
The problem of teaching machines to distinguish between an everyday situation and a possible emergency is now being tackled by a £7m EU-funded project conducted at six universities in Britain and abroad.
The project, known as STRANDS (Spatio-Temporal Representations and Activities for Cognitive Control in Long-term Scenarios), is focused on programming robots to learn about their environment and recognise when something is amiss.
The first major phase of the study took place this summer at the University of Lincoln, where researchers from Birmingham and Aachen universities gathered for a week of intensive programming.
Within five years, the scientists hope that a robot placed in an unfamiliar care home will be able to learn about its surroundings and recognise patterns of everyday activity, such as doors opening and closing or furniture being moved.
Operating without any input from humans for up to three months at a time, the robots should be able to tell the difference between a normal situation, such as someone leaving their room during the day, and an abnormal one, such as doing so in the middle of the night.
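The idea of learning what is normal for a given time of day can be illustrated with a toy sketch. This is not the STRANDS software; it simply assumes the robot logs timestamped events and flags any event that is rare for the hour at which it occurs:

```python
from collections import defaultdict

class ActivityModel:
    """Toy model: count how often each event type occurs in each hour of
    the day, and flag events that are rare for that hour as anomalous."""

    def __init__(self, threshold=0.05):
        self.counts = defaultdict(lambda: defaultdict(int))  # event -> hour -> count
        self.totals = defaultdict(int)                       # event -> total count
        self.threshold = threshold

    def observe(self, event, hour):
        """Record one occurrence of `event` at the given hour (0-23)."""
        self.counts[event][hour] += 1
        self.totals[event] += 1

    def is_anomalous(self, event, hour):
        """An event never seen before, or rarely seen at this hour, is flagged."""
        total = self.totals[event]
        if total == 0:
            return True
        # fraction of this event's occurrences that happened at this hour
        return self.counts[event][hour] / total < self.threshold
```

After weeks of observing residents leave their rooms only in daytime hours, such a model would accept a door opening at noon but flag the same event at 3am. The real project would need far richer spatio-temporal models, but the principle of learning a baseline and flagging deviations is the same.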
A second arm of the project will see the robots deployed as security guards in office buildings, patrolling and picking up signs of unusual activity such as open windows or people moving around at night.
"When a human security guard is working, they learn about their environment – where things usually are, what people do – they get this common sense understanding of the environment over a long period of time," Dr Nick Hawes, coordinator of the project from the University of Birmingham, said. "The intellectual challenge is, could you enable a robot to operate the same way?"
The major difficulty will be teaching the machines which environmental changes they should consider normal, preventing them from interpreting a repositioned chair or a missing stapler as a security threat.
Professor Tom Duckett, Director of the Lincoln Centre for Autonomous Systems Research, explained: "What we are trying to do is enable robots to learn from their long-term experience.
"If you were to go into a busy restaurant at lunchtime and see a chair that was out of place, that would not concern you at all because it would be what you would expect. But furniture moved in an office late at night would be suspicious."
Prof Duckett's role in the project is to oversee the creation of "four-dimensional" mapping software, which will allow the robots not just to navigate their way around, but also to recognise how the environment changes during the day and over longer periods.
The machines have a 360-degree laser at floor level which tells them where they are in relation to walls and doors, and are also equipped with a camera similar to a Microsoft Kinect on their heads, allowing them to recognise objects they have seen before and to spot when something is out of its usual place.
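Spotting that an object is out of its usual place amounts to comparing what the camera sees against a previously learned map. A minimal sketch of that comparison, with illustrative object names and positions rather than anything from the project's software, might look like this:

```python
import math

def displaced_objects(known_map, observed, tolerance=0.5):
    """Compare observed object positions (metres, in the room's frame)
    against a previously learned map, and report objects that have
    moved beyond `tolerance`, disappeared, or newly appeared."""
    changes = []
    for name, (x0, y0) in known_map.items():
        if name not in observed:
            changes.append((name, "missing"))
            continue
        x1, y1 = observed[name]
        if math.hypot(x1 - x0, y1 - y0) > tolerance:
            changes.append((name, "moved"))
    for name in observed:
        if name not in known_map:
            changes.append((name, "new"))
    return changes
```

Whether a flagged change matters would then depend on the learned context: a moved chair in a busy restaurant at lunchtime is expected, while the same change in an office at night is not, which is exactly the distinction the four-dimensional maps are meant to capture.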
It is not hard to see how security companies like project sponsors G4S could benefit from intelligent robots capable of performing repetitive and monotonous tasks, explained Marc Hanheide, a member of the Lincoln programming team.
“It could act as an assistant to a human security guard, that would help them and flag things up, informing them to check things,” he said.
“They have got these kinds of tiring duties, like patrolling, and this is something that could help.”
While coping with the variation of furniture in an empty office is hard enough, getting to grips with normal daily activities in a care home with hundreds of residents poses the tougher of the two challenges.
Some hospitals already use basic robots for jobs like delivering medicines, but developing machines that can react to different situations and perform a variety of tasks is a large step forward.
"In the care home, we want the robot to detect when people have fallen over, but also do things like passing messages and assisting around the place," Dr Hawes said. "Learning that, for example, residents all have a cup of tea at 3pm and being around to help at that time.
"If someone tries to leave with their shopping bag at 3am, the robot should be able to say, 'this isn’t what should be going on', and raise an alarm."
But despite the technical difficulties, perhaps the most difficult part of the project will be encouraging the residents of the Haus der Barmherzigkeit care home in Austria, where the technology will be tested, to accept their new staff.
The robots have a head with two blinking "eyes" on top of their conical bodies, not for technical reasons but to make them more approachable and to help humans interact with them.
"If it's just a box on wheels, it’s a lot harder for people to understand what’s happening and how to interact with the robot," Dr Hawes explained.
"The head gives them a focal point and the eyes will indicate where the robot is looking. If the camera points somewhere the eyes should look in the same direction, so a human should have an intuitive understanding of what it is doing."
Although the project is purely for research purposes, the scientists behind it intend to form a spin-out company and market the software before others have the same idea.
"The security aspect I think is easier because they can get a lot more value out of limited functionality, so I would say after the end of the project you would maybe have another year or two of development to commercialise these things," Dr Hawes said.
"Care homes I think would be another four to six years of work, so maybe in 10 years' time you might start seeing these things."
Full list of project partners: University of Birmingham (UK); University of Lincoln – School of Computer Science (UK); G4S Technology Ltd (UK); Akademie für Altersforschung am Haus der Barmherzigkeit (Austria); Royal Institute of Technology (Sweden); RWTH Aachen University (Germany); Vienna University of Technology (Austria); University of Leeds (UK).