A beer-pouring robot servant isn't just a fantasy from a decadent version of The Jetsons — it's a cutting-edge machine from Cornell researchers that predicts human movements with a motion-sensing camera, and responds accordingly.

Research in personal robotics has made great strides in recent years — robots can swarm, provide hygiene feedback, offer virtual pet therapy, teach autistic children social skills, deliver goods aerially, conduct lab work, and help paralyzed people walk.

Robot servants like Rosey are still a long way off, however. While personal robots can be programmed to respond within a limited range of possibilities, they can struggle with tasks that seem simple, like pouring a beer or opening a refrigerator door, when those tasks depend on judging complex variables, such as feedback from the humans they are meant to help.

Researchers in Cornell University's Personal Robotics Lab tackled that problem by fitting a PR2 named Kodiak, a personal robot built by the robotics company Willow Garage, with a Microsoft Kinect 3D camera. Using the Kinect's motion sensor, the PR2 watches the movements of a person in the room and compares them against a database of 3D videos showing how objects can be used.

After identifying the person's current activity, such as reaching for a cup or walking toward a fridge, the robot uses an algorithm to predict the person's most likely future movements based on what the identified objects are used for. It then decides how it can help, whether by pouring beer into the cup or rolling over to open the fridge door, and continually updates its predictions as the action unfolds.
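To make the idea concrete, here is a minimal toy sketch of that observe, predict, and assist loop in Python. The activity labels, transition probabilities, confidence threshold, and action names are all invented for illustration; they are not the Cornell lab's actual code, data, or algorithm (the real code is linked from the lab's website, as noted below).

```python
# Toy sketch of an "observe, predict, assist, update" loop for an assistive robot.
# All labels, probabilities, and action names here are made up for illustration.

# Which future activities each observed activity tends to lead to, given the
# objects involved (a crude stand-in for learned object affordances).
TRANSITIONS = {
    "reaching_for_cup": {"drinking": 0.7, "placing_cup": 0.3},
    "walking_to_fridge": {"opening_fridge": 0.8, "passing_by": 0.2},
}

# How the robot could help with each anticipated activity.
ASSISTIVE_ACTIONS = {
    "drinking": "pour_beer_into_cup",
    "opening_fridge": "open_fridge_door",
}

def predict(observed_activity):
    """Return the most likely next activity and its probability."""
    candidates = TRANSITIONS.get(observed_activity, {})
    if not candidates:
        return None, 0.0
    most_likely = max(candidates, key=candidates.get)
    return most_likely, candidates[most_likely]

def choose_response(observed_activity, threshold=0.6):
    """Act only when the prediction is confident enough to be helpful."""
    next_activity, prob = predict(observed_activity)
    if next_activity and prob >= threshold:
        return ASSISTIVE_ACTIONS.get(next_activity)
    return None  # keep watching and re-predict as the action develops

if __name__ == "__main__":
    for observed in ("reaching_for_cup", "walking_to_fridge"):
        print(observed, "->", choose_response(observed))
```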

"We extract the general principles of how people behave," said Ashutosh Saxena, Cornell professor of computer science, in a news release. "Drinking coffee is a big activity, but there are several parts to it."

The personal robot breaks an activity like drinking coffee down into a "vocabulary" of the small sub-activities that make it up, so those components can be recombined into a wide array of possible variations, as sketched below.
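As a rough illustration of that decomposition, assuming an activity can be treated simply as an ordered list of sub-activities drawn from a shared vocabulary (the labels below are made up, not the study's actual set):

```python
# Toy decomposition of high-level activities into a shared "vocabulary" of
# sub-activities; the labels are invented for illustration only.

VOCABULARY = {"reach", "grasp", "move", "pour", "drink", "place", "open", "close"}

ACTIVITIES = {
    "drinking_coffee": ["reach", "grasp", "move", "drink", "place"],
    "pouring_a_beer":  ["reach", "grasp", "move", "pour", "place"],
    "getting_food":    ["move", "open", "reach", "grasp", "close"],
}

def is_composed_from_vocabulary(activity):
    """Check that an activity uses only known sub-activities, so new
    variations can be assembled from the same building blocks."""
    return all(step in VOCABULARY for step in ACTIVITIES[activity])

if __name__ == "__main__":
    for name in ACTIVITIES:
        print(name, is_composed_from_vocabulary(name))
```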

For now, the Cornell team's algorithm becomes less accurate the further into the future it has to anticipate. Still, it performs significantly better than chance even when predicting a full 10 seconds ahead.

In a study of the robot's ability to respond to anticipated human activities, Kodiak correctly predicted actions in 82 percent of trials when looking one second ahead, 71 percent of the time three seconds ahead, and 57 percent 10 seconds ahead.

Along with Cornell graduate student Hema Koppula, Saxena will present the latest research at the International Conference on Machine Learning in Atlanta starting on June 18, and at the Robotics: Science and Systems conference in Berlin, Germany, beginning on June 24.

Saxena told Wired that a robot servant is not quite ready to serve beer or open doors in your apartment, but code for the algorithm used in the study is available on his lab website for developers who want to tweak it for their own robots.

"Even though humans are predictable, they are only predictable part of the time," Saxena said in the news release.

"The future would be to figure out how the robot plans its action. Right now we are almost hard-coding the responses, but there should be a way for the robot to learn how to respond."

Source: Beer-pouring robot programmed to anticipate human actions. Cornell University. 2013.