
Want to Get Humans to Trust Robots? Let Them Dance

A performance with living and mechanical partners can teach researchers how to design more relatable bots


Georgia Institute of Technology’s interactive FOREST robots perform with Kennesaw State University dancers.

Gioconda Barral-Secchi

A dancer shrouded in shades of blue rises to her feet and steps forward on stage. Under a spotlight, she gazes at her partner: a tall, sleek robotic arm. As they dance together, the machine’s fluid movements make it seem less stereotypically robotic—and, researchers hope, more trustworthy.

“When a human moves one joint, it isn’t the only thing that moves. The rest of our body follows along,” says Amit Rogel, a music technology graduate researcher at Georgia Institute of Technology. “There’s this slight continuity that almost all animals have, and this is really what makes us feel human in our movements.” Rogel programmed this subtle follow-through into robotic arms to help create FOREST, a performance collaboration between researchers at Georgia Tech, dancers at Kennesaw State University and a group of robots. The project premiered last month and was performed in concert last week.
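The article doesn’t detail Rogel’s implementation, but as a rough sketch, follow-through can be approximated by letting each downstream joint echo its parent joint’s motion with a short time lag and a decaying gain. The joint count, gain and lag constants below are illustrative assumptions, not the FOREST code:

```python
import numpy as np

def add_follow_through(primary_angles, n_joints=4, gain=0.35, lag_steps=5):
    """Propagate a primary joint's trajectory to downstream joints.

    Each successive joint echoes the previous one's motion with a short
    time lag and a decaying gain, approximating the whole-body
    "continuity" Rogel describes. Angles are offsets from each joint's
    rest pose; all constants are illustrative.
    """
    trajectories = [np.asarray(primary_angles, dtype=float)]
    for _ in range(1, n_joints):
        prev = trajectories[-1]
        # Delay the parent joint's motion by a few control steps...
        delayed = np.concatenate([np.full(lag_steps, prev[0]), prev[:-lag_steps]])
        # ...and scale its deviation from the start pose, so each joint
        # down the chain follows more and more subtly.
        trajectories.append(prev[0] + gain * (delayed - prev[0]))
    return np.stack(trajectories)  # shape: (n_joints, timesteps)

# Example: a smooth reach on the base joint ripples down the arm.
t = np.linspace(0.0, 2.0, 200)
reach = 0.8 * (1.0 - np.cos(np.pi * t / 2.0)) / 2.0  # ease-in/ease-out, radians
arm_motion = add_follow_through(reach)
```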

The goal is not only to create a memorable performance, but to put into practice what the researchers have learned about building trust between humans and robots. Robots are already widely used, and the number of collaborative robots—which work with humans on tasks such as tending factory machines and inspecting manufacturing equipment—is expected to climb significantly in the coming years. But although they are becoming more common, trust in them is still low—and this makes humans more reluctant to work with them. “People may not understand how the robot operates, nor what it wants to accomplish,” says Harold Soh, a computer scientist at the National University of Singapore. He was not involved in the project, but his work focuses on human-robot interaction and developing more trustworthy collaborative robots.


Although humans love cute fictional machines like R2-D2 or WALL-E, the best real-world robot for a given task may not have the friendliest looks or move in the most appealing way. “Calibrating trust can be difficult when the robot’s appearance and behavior are markedly different from humans,” Soh says. However, he adds, even a disembodied robot arm can be designed to act in a way that makes it more relatable. “Conveying emotion and social messages via a combination of sound and motion is a compelling approach that can make interactions more fluent and natural,” he explains.

That’s why the Georgia Tech team decided to program nonhumanoid machines to appear to convey emotion, through both motion and sound. Rogel’s latest work in this area builds on years of research. For instance, to figure out which sounds best convey specific emotions, Georgia Tech researchers asked singers and guitarists to look at a diagram called an “emotion wheel,” pick an emotion, and then sing or play notes to match that feeling. The researchers then trained a machine learning model—one they planned to embed in the robots—on the resulting data set. They wanted to allow the robots to produce a vast range of sounds, some more complex than others. “You could say, ‘I want it to be a little bit happy, a little excited and a little bit calm,’” says project collaborator Gil Weinberg, director of Georgia Tech’s Center for Music Technology.
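Weinberg doesn’t describe the model’s interface, but one way to picture that kind of fractional blending is as a weighted mix over per-emotion sound parameters. The presets and parameter names below are hypothetical stand-ins for the team’s learned model, not its actual output:

```python
# Illustrative sketch only: the real FOREST system trains a machine
# learning model on musicians' recordings. These hand-picked presets
# and parameter names are assumptions for demonstration.
EMOTION_PRESETS = {
    "happy":   {"pitch_hz": 440.0, "tempo_bpm": 132.0, "brightness": 0.80},
    "excited": {"pitch_hz": 523.3, "tempo_bpm": 160.0, "brightness": 0.90},
    "calm":    {"pitch_hz": 261.6, "tempo_bpm": 72.0,  "brightness": 0.35},
    "sad":     {"pitch_hz": 220.0, "tempo_bpm": 60.0,  "brightness": 0.20},
}

def blend_emotions(weights):
    """Mix sound parameters as a weighted average over emotion presets.

    `weights` maps emotion names to non-negative intensities, e.g.
    {"happy": 0.4, "excited": 0.3, "calm": 0.3}.
    """
    total = sum(weights.values())
    params = {key: 0.0 for key in next(iter(EMOTION_PRESETS.values()))}
    for emotion, weight in weights.items():
        for key, value in EMOTION_PRESETS[emotion].items():
            params[key] += (weight / total) * value
    return params

# "A little bit happy, a little excited and a little bit calm":
print(blend_emotions({"happy": 0.4, "excited": 0.3, "calm": 0.3}))
```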

Next, the team worked to tie those sounds to movement. Last year the researchers demonstrated that combining movement with emotion-based sound improved trust in robotic arms in a virtual setting (a constraint imposed by the pandemic). But that experiment required the robots to perform only four gestures to convey four emotions. To broaden a machine’s emotional-movement options for his new study, which has been conditionally accepted for publication in Frontiers in Robotics and AI, Rogel waded through research on human body language. “For each one of those body language [elements], I looked at how to adapt that to a robotic movement,” he says. Then dancers affiliated with Kennesaw State University helped the scientists refine those movements. As the performers moved in ways intended to convey emotion, Rogel and fellow researchers recorded them with cameras and motion-capture suits, then developed algorithms so the robots could match those movements. “I would ask [Rogel], ‘Can you make the robots breathe?’ And the next week, the arms would be kind of ‘inhaling’ and ‘exhaling,’” says Kennesaw State University dance professor Ivan Pulinkala.
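The study’s algorithms aren’t reproduced in the article, but the “breathing” Pulinkala describes can be pictured as a slow sinusoid layered onto the arm’s resting pose, with each joint slightly out of phase so the breath appears to travel up the arm. The breath rate, amplitudes and rest pose below are illustrative assumptions:

```python
import numpy as np

def breathing_offsets(t, rate_hz=0.25, amplitudes=(0.06, 0.10, 0.04)):
    """Return small joint-angle offsets (radians) for an inhale/exhale sway.

    A slow sinusoid (0.25 Hz is roughly 15 breaths per minute) expands
    and contracts the arm around its rest pose; the per-joint amplitudes
    and phase offsets are assumptions, not the published values.
    """
    phase = 2.0 * np.pi * rate_hz * np.asarray(t)
    # Shift each joint's phase slightly so the "breath" ripples up the arm.
    return np.stack([amp * np.sin(phase - 0.4 * i)
                     for i, amp in enumerate(amplitudes)])

# Layer the breathing on top of a static rest pose for a 10-second clip.
t = np.linspace(0.0, 10.0, 500)
rest_pose = np.array([0.0, -1.2, 0.9]).reshape(3, 1)  # radians, assumed
joint_angles = rest_pose + breathing_offsets(t)       # shape: (3, 500)
```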

Pulinkala choreographed the FOREST performance, which put into practice what the researcher-dancer team learned about creating and deploying emotion-based sounds and movements. “My approach was to kind of breathe a sense of life into the robots and have the dancers [appear] more ‘mechanized,’” Pulinkala says, reflecting on the start of the collaboration. “I asked, ‘How can the robots have more emotional physicality? And how does a dancer then respond to that?’”

According to the dancers, this resulted in machines that seemed a little more like people. Christina Massad, a freelance professional dancer and an alumna of Kennesaw State University, recalls going into the project thinking she would be dancing around the robots—not with them. But she says her mindset shifted as soon as she saw the fluidity of the robots’ movements, and she quickly started viewing them as more than machines. “In one of the first rehearsals, I accidentally bumped into one, and I immediately told it, ‘Oh my gosh, I’m so sorry,’” she says. “Amit laughed and told me, ‘It’s okay, it’s just a robot.’ But it felt like more than a robot.”

Soh says he finds the performance fascinating and thinks it could bring value to the field of human-robot relationships. “The formation and dynamics of trust in human-robot teams is not well-understood,” he says, “and this work may shed light on the evolution of trust in teams.”