BEFORE CONSUMERS SEND ROOMBA ROBOTIC VACUUM CLEANERS to the manufacturer in St. James, N.Y., for repair, they often etch their names on the machines in the hopes of getting their own robots back. For the manufacturer, that’s not ideal—it’s cheaper to replace the squat, disc-shaped floor and carpet sweepers than to fix them. Yet, somehow, owners grow attached to the contraptions and worry that a new robot will have a different personality.

“People are grateful that the Roomba improves their lives, so they reciprocate by giving it attention like they would a pet,” says Ja-Young Sung, a Ph.D. student at Georgia Tech who surveyed 379 Roomba owners in 2007 on their attitudes toward the robotic device. Sung found many owners who gave their Roombas names and painted them, adorned them with stickers or dressed them in costumes. Although the Roomba’s “more primitive programming” causes it to bump into furniture as it randomly—and not always efficiently—cleans, some find that trait endearing.

“Human actions aren’t predictable, so the Roomba’s randomness feels more human,” says Sung, who admits that she talks to her own Roomba. And like parents coaching a child to perform a song, Roomba owners frequently turn on the machines to entertain friends, while neighbors pit their Roombas against each other on a homemade racetrack.

That willingness to interact with an object that, if not inanimate, is still hardly human, presents both a challenge and an opportunity for creators of devices that go far beyond housekeeping to imitate the actions of people. As difficult as it is to design a robot that can assemble a Toyota or handle toxic waste, it’s even harder to make a walking, talking machine that’s “socially assistive.” That phrase was coined in 2004 by Maja J. Matarić, director of the Center for Robotics and Embedded Systems at the University of Southern California, and her research group to describe machines that could be therapeutically useful through social interaction—coaching, motivating and monitoring people with cognitive and physical disabilities.

A socially assistive robot could, for example, tirelessly encourage a stroke patient to do rehabilitation exercises. It might walk next to someone with dementia, giving directions to help navigate the hallways of an assisted-living facility while chatting companionably. Or it could be a nonthreatening catalyst to teach children with autism how to interact with humans.

Service robots aren’t new; rehabilitation machines used primarily to push or pull stroke patients’ limbs have been around for a decade. But such devices are heavy, expensive and not particularly good company, Matarić says. Only recently have roboticists been able to go further, delving into the complex realm of human-robot interactions, as much a study of human psychology as of engineering. For a human to relate to a robot, the machine must be capable of any number of qualities: expressing a personality, discerning the user’s emotions and intentions, displaying feelings such as empathy, or following social conventions. “A socially ignorant robot always takes a direct path, stops if something is in its way and interrupts at any point to do its task,” explains Kerstin Dautenhahn, research professor in the School of Computer Science at the University of Hertfordshire in the United Kingdom. “But a socially interactive robot modifies its path to avoid getting too close to a human, waits until the right time to talk and fetches items without being asked.”

Research on such machines is in its earliest stages, with relatively few scientists involved and funding scarce. Roboticists must first work with clinicians and potential users to develop technologies; then they must build expensive prototypes for “trials” to test the technology on real users. Only then can they hope to get support for a larger study, and even then eventual commercial viability is a long shot. “There are a handful of researchers working on socially assistive robotics, versus thousands working on robot navigation, particularly for military applications,” Matarić says. “We are doing early studies to show that robotics can have a positive therapeutic effect, and we’re hoping that the National Institutes of Health will eventually approve funding for preclinical studies. But right now we have to battle a lot of skepticism.”

Still, whether they’re chunky vacuum cleaners or upright machines that can, after a fashion, walk, talk and respond, robotic creations seem to fascinate their human companions, and that, increasingly, is helping them ambulate toward new roles as medical caregivers just when there’s a growing need. “The number of younger adults for every older adult is decreasing dramatically, and we’ve never before seen these percentages of people over 85,” says Martha E. Pollack, dean and professor at the University of Michigan’s School of Information. “Robots will never replace human interaction, but they can augment it.”

ROBOTICISTS ALREADY KNOW HOW TO BUILD machines that could serve patients in several ways, and though most such robots remain prohibitively expensive, the cost is coming down. Machine vision, portable lasers, infrared sensors, and radio frequency identification (RFID) tags that use a microchip, an antenna and radio waves to instantly store and retrieve data are now smaller and cheaper than ever and process data in real time, according to Matarić. “We can write algorithms to allow the robot to sense what a person is doing so it can respond immediately, appropriately and safely,” she says. “That wasn’t possible 10 years ago.”

Sensors attached to a person’s wrist, elbow or clothing, for example, can allow a robot to detect the human’s movements and respond. A heat sensor can instruct the machine to turn or move toward a warm body, enabling it to participate in a game of chase or create the appearance that a person has its full attention. “Artificial audition” technology has improved so that a robot can now track one conversation when several people in a room are talking. And eventually some machines might even provide a hug. To make that happen, François Michaud, professor of electrical and computer engineering at the University of Sherbrooke in Quebec, is building a prototype with an elastic element inside its motor so that it responds to feedback from the environment. And if you’re not in a hugging mood? “The robot could sense how you’re responding,” Michaud says. “If it felt pushback, it would abandon the hug.”
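
One way to picture how a heat sensor could steer the robot toward a person is a few lines of code. The sketch below is a hypothetical illustration in Python: the sensor layout, temperature readings and thresholds are invented for the example, not taken from any system described here.

```python
# Hypothetical sketch: steer a robot toward the warmest direction it senses.
# Sensor layout, temperatures and thresholds are invented for illustration.

def pick_heading(readings, ambient_c=22.0, min_delta_c=4.0):
    """Return the bearing (degrees) of the strongest heat source,
    or None if nothing reads warm enough to be a person."""
    bearing, temp = max(readings.items(), key=lambda item: item[1])
    if temp - ambient_c < min_delta_c:
        return None          # no warm body detected; stay put
    return bearing

# Simulated readings from five infrared sensors spaced around the robot.
ir_readings = {-90: 22.5, -45: 23.0, 0: 22.8, 45: 30.5, 90: 24.1}

heading = pick_heading(ir_readings)
if heading is not None:
    print(f"Turn {heading} degrees toward the warm body and approach.")
else:
    print("No one nearby; keep wandering.")
```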

Building a robot that can “see” has been one of the stiffest challenges. “A robot that is much shorter than you are will have a hard time tracking and processing your facial expressions,” Matarić says. As an alternative, her research team is creating machines that interpret other physiological data—such as skin temperature, heart rate and galvanic skin response, which is a measure of electrical resistance that corresponds to heightened emotions. The researchers have found they can use galvanic skin response to reliably predict frustration or boredom and warn a robot that its human companion is losing interest. (Mounting a camera on the robot and analyzing facial expressions proved much less accurate.)
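
The idea of flagging waning interest from a physiological signal can be reduced to a simple baseline-and-threshold rule. The snippet below is an illustrative toy, not Matarić's actual model; the window length, drop threshold and conductance values are all assumptions.

```python
# Toy sketch: flag possible boredom or frustration from galvanic skin response.
# Window length, threshold and sample values are invented for illustration.
from statistics import mean

def engagement_alert(gsr_samples, baseline_window=10, drop_fraction=0.15):
    """Compare recent skin-conductance samples against an earlier baseline.
    A sustained drop is treated as a sign the user may be losing interest."""
    if len(gsr_samples) < 2 * baseline_window:
        return False  # not enough data yet
    baseline = mean(gsr_samples[:baseline_window])
    recent = mean(gsr_samples[-baseline_window:])
    return recent < baseline * (1.0 - drop_fraction)

# Simulated conductance readings (microsiemens) trailing off over a session.
samples = [5.2, 5.3, 5.1, 5.4, 5.2, 5.3, 5.1, 5.2, 5.3, 5.2,
           4.6, 4.4, 4.3, 4.2, 4.1, 4.0, 4.0, 3.9, 3.9, 3.8]

if engagement_alert(samples):
    print("Warn the robot: the user may be bored; change the activity.")
```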

The better a robot can “perceive” and respond to human signals, the more likely a person will be to take its direction. A robot that seems to convey empathy, for example, may be able to gain someone’s trust, be a credible coach and maintain a “natural” social relationship. In this vein, it’s important that responses not be static. “People may be open to direct encouragement when they’re fresh and require more empathy when they’re tired,” says Reid Simmons, research professor in robotics and computer science at Carnegie Mellon University in Pittsburgh. “So the robot may need to change its speech and expression, just as a good therapist would.” That means either being able to draw from a large menu of prerecorded speech or being preprogrammed with a giant database of words and grammar that lets the robot generate sentences on the fly.
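
The menu-of-prerecorded-speech approach is easy to picture: index the recorded lines by how the user seems to be doing and pick one to match. The snippet below is a hypothetical sketch; the fatigue score, threshold and phrases are invented for illustration.

```python
# Hypothetical sketch of the prerecorded-speech menu: choose a line that
# fits how fresh or tired the user seems. Phrases and threshold are invented.
import random

ENCOURAGEMENT = {
    "fresh": ["Great pace. Let's try two more repetitions.",
              "You're doing well; push a little further."],
    "tired": ["That was hard work. Take a breath; we'll go gently now.",
              "You've done a lot already. One easy stretch to finish."],
}

def pick_line(fatigue_score, threshold=0.6):
    """Return a direct line for a fresh user, an empathetic one otherwise."""
    mood = "tired" if fatigue_score > threshold else "fresh"
    return random.choice(ENCOURAGEMENT[mood])

print(pick_line(fatigue_score=0.8))
```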

It also helps if the machine’s “personality” matches the user’s. Working with stroke patients, Matarić has found that those who are extroverted prefer outgoing robots who stand close and speak faster with higher-pitched voices than average and exhort them with such phrases as “You can do more than that, I know it” and “Concentrate on your exercise.” Introverts like a bit of distance between themselves and the machine and respond well to gentle nudging: “I know it’s hard, but it’s for your own good” and “Very nice; keep up the good work.”
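
One minimal way to picture this matching is a small table of interaction parameters keyed to the user's style. In the sketch below, the standoff distances, speech rates and pitch offsets are invented placeholders; only the example phrases come from the interviews above.

```python
# Illustrative sketch: pick interaction parameters to suit the user's
# personality. Numeric values are invented placeholders, not measured ones.
from dataclasses import dataclass

@dataclass
class CoachingStyle:
    standoff_m: float       # how far the robot keeps from the user
    speech_rate: float      # words per second
    pitch_shift: float      # semitones relative to the default voice
    phrases: list

STYLES = {
    "extrovert": CoachingStyle(
        standoff_m=0.6, speech_rate=3.0, pitch_shift=2.0,
        phrases=["You can do more than that, I know it.",
                 "Concentrate on your exercise."]),
    "introvert": CoachingStyle(
        standoff_m=1.2, speech_rate=2.2, pitch_shift=0.0,
        phrases=["I know it's hard, but it's for your own good.",
                 "Very nice; keep up the good work."]),
}

style = STYLES["introvert"]
print(f"Stand {style.standoff_m} m away and say: {style.phrases[0]}")
```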

While most children seem to have an easy time working with robots, adult users are more demanding. Still, the majority of people eventually embrace having a robotic companion and may even start playing games with it, hiding from it or trying to trick it into thinking they’ve already completed their exercises. “The stroke patients we’ve tested have been very open-minded, perhaps because they were focused on their health needs and saw how the robot could help them,” Matarić says.

A robot must also be able to follow certain social conventions, such as getting on an elevator without mowing down other passengers. “Parents teach their children to wait until other people get off an elevator before stepping in,” Simmons says. “But that’s not the true rule, because if you wait until everyone gets off, you’ll stand there forever. The real question is, which people are intending to get off?”

Those are particular challenges for robots designed to accompany cognitively impaired individuals. Walking side by side with someone is much harder for a robot than following or leading, according to Simmons. “The robot has to keep pace, not bump into the person, and know how to give directions based on the side of the person it’s on,” he says. “When you’re walking with someone and talking, you give nonverbal cues to change direction, such as gesturing or moving slightly ahead. And if you’re going through a doorway, who goes first? These are the social signals we have to teach a machine.”
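
A toy version of the "keep pace, don't bump" part of that problem is a controller that matches the person's forward speed while gently holding a sideways gap. The sketch below is a rough illustration under assumed gains, offsets and sensor values, not Simmons's actual system.

```python
# Toy sketch: a side-by-side "keep pace" rule. The robot matches the
# person's forward speed and holds a fixed lateral offset. Gains, gaps
# and sensor values are invented for illustration.

def side_by_side_command(person_speed, lateral_gap, desired_gap=0.8,
                         gap_gain=0.5):
    """Return (forward_speed, sideways_speed) for the robot.
    person_speed: the person's forward speed in m/s (from tracking).
    lateral_gap: current sideways distance to the person in metres."""
    forward = person_speed                              # keep pace, don't race ahead
    sideways = gap_gain * (desired_gap - lateral_gap)   # drift in or out gently
    return forward, sideways

fwd, side = side_by_side_command(person_speed=0.9, lateral_gap=1.1)
print(f"Drive forward at {fwd:.1f} m/s, sideways at {side:.2f} m/s.")
```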

BEYOND DETERMINING WHAT SOCIALLY ASSISTIVE robots can do, researchers must consider how the machines should look. A phenomenon called the “uncanny valley,” a phrase coined in 1970 by Japanese roboticist Masahiro Mori, suggests that the more humanlike the robot, the more humans will relate to it—but only up to a point at which that good rapport suddenly drops off (the valley). “Machines that are almost, but not quite, like a person are worse than those that are either completely humanlike or a bit further away,” Simmons says. Asked to suspend disbelief that they are interacting with a nearly human machine, people tend to develop unrealistic expectations and are disappointed when the robot can’t deliver. What’s more, the robot’s strangeness stands out when its behavior is judged by human standards, preventing people from empathizing with it. “People think they want to interact with a robot that totally mimics a human, but they really don’t,” Simmons says.

That’s particularly true of robots designed to work with autistic children, who often have trouble looking at human faces and want something decidedly machinelike. Kaspar, for example, a diminutive robot being tested with autistic children in the United Kingdom, has a minimally expressive face and wires sticking out of its neck and wrists to make it clear to the kids that they’re playing with a robot. “We tested another robot that looked like a doll with eyelashes and color on its lips, and the children didn’t like that one as much at first,” says Dautenhahn, who headed the team that created Kaspar.

Certainly no one would mistake CosmoBot, a 16-inch-tall robot designed by AnthroTronix, an engineering company in Silver Spring, Md., for a person. And that seems to suit Libby, an autistic six-year-old, just fine. Before being introduced to CosmoBot, Libby couldn’t imitate even the most basic actions. But after several weeks of playing with the robot, she was mirroring its motions as it led her through a Simon-says game of raising her arms, patting her head and clapping.

“Her mother and the professionals who saw this were in tears,” says Carole Samango-Sprouse, director of the Neurodevelopmental Diagnostic Center for Young Children at George Washington University. “It was incredibly encouraging that the robot, through repetition and predictable behavior, was successful in getting her to perform the motions she had seen adults doing for years.”

Robots can also teach children with special needs how to play with one another. If a child is touching the robot inappropriately—slapping, say, instead of stroking—the robot may back away or emit a warning beep to encourage the child to change his behavior. Then, as the child begins to master interactive skills, the robot’s behavior may become increasingly unpredictable, preparing the child for dealing with humans.
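
A rough way to picture that escalation is a small policy that reacts to how the robot is touched and, as the child progresses, begins to vary its responses. Everything in the sketch below is hypothetical: the touch categories, the responses and the skill-level knob are invented for illustration.

```python
# Hypothetical sketch of a touch-response policy for play sessions.
# Touch categories, responses and the skill-level knob are invented.
import random

def respond_to_touch(touch, skill_level):
    """touch: 'stroke' or 'slap'. skill_level: 0.0 (novice) to 1.0 (skilled).
    With higher skill, the robot's reactions become less predictable."""
    if touch == "slap":
        return random.choice(["back away", "beep a warning"])
    gentle_responses = ["purr", "wiggle", "light up", "say thank you"]
    if random.random() < skill_level:
        return random.choice(gentle_responses)   # surprise a skilled child
    return gentle_responses[0]                   # stay predictable for novices

print(respond_to_touch("stroke", skill_level=0.2))
```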

Dautenhahn has noticed that autistic children playing with Kaspar may also spontaneously begin interacting with their teachers. “One withdrawn boy who never played with other children or his teacher became very interested in Kaspar’s eyes,” she says. “He pointed to Kaspar’s eyes, then to his own, and then, smiling, to his teacher’s eyes. This was an invitation to share, and the boy and his teacher played together.”

Children with physical disabilities, too, respond well to robots. In three schools in Austria, PlayROB gives children with cerebral palsy and other severe disabilities the chance to play independently. Controlling the robot with a joystick, buttons, their mouths or even just head movements, the children can direct it to build LEGO structures and do additional activities that let them experience the creative expression, spatial recognition and accomplishment that other children get from playing.

Now PlayROB’s developer is designing a robot that will help disabled children play with other kids. “We have 24 play scenarios that the robot can do, and some need more than one child,” says Gernot Kronreif, head of advanced service robotics at PROFACTOR, an applied research company in Steyr, Austria. The robot is being developed as part of a €3 million ($4 million) project, Interactive Robotic Social Mediators as Companions. “We’re hoping to keep the price well below €10,000 [$13,300] so schools can afford it,” Kronreif says.

MAKING ASSISTIVE ROBOTS AFFORDABLE IS one hurdle; proving that the machines can be truly useful is another. To help on both counts, Michaud, the Canadian researcher, has created a comparatively simple and inexpensive device, a videophone on a mobile platform that can follow its user around the house. In a recent test involving an actress playing the role of someone returning home after hip surgery, the actress was able to see the face of a clinician in her office and could talk to her without holding a phone. And the clinician was able to assess how well the actress was navigating the house. “From here,” Michaud says, “we can move toward much more complicated artificial intelligence applications for social interactions with a robot.”

“We already have the technology,” Michaud continues. “Now we just have to show clinicians what robots can do so they can tell us which features are needed to provide the right solution to a particular disorder.” And, of course, getting the grants to prove that socially assistive robotics is ready to move out of laboratories and into schools and homes is also crucial. “Right now, funding is available to create robots to support the elderly and the very young, but what happens to people in the middle?” Dautenhahn asks. “Older children who are autistic or in wheelchairs grow up to be adults with those disabilities. I’m waiting for others to identify those needs so we can analyze how robots can help.”