The FOREST robotic arms lined up with colored lights for a performance.



FOREST is the performative outcome of an NSF-funded project aimed at enhancing trust between humans and robots through sound and gesture. As part of the project, we trained a deep learning network to generate emotion-carrying sounds to accompany robotic gestures. We also developed a rule-based AI system for creating emotional, human-inspired gestures for non-anthropomorphic robots. The performance aims to create trusting connections between human and robotic musicians and dancers, leading to novel creative and artistic ideas for both machines and humans.
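A rule-based system like the one described above might, in spirit, map an emotion label to motion qualities for a robot arm. The sketch below is purely illustrative: the emotion labels, parameter names, and values are assumptions, not the mappings actually used in FOREST.

```python
from dataclasses import dataclass

@dataclass
class GestureParams:
    """Hypothetical motion qualities for a non-anthropomorphic robot arm."""
    speed: float      # joint velocity scale, 0..1
    amplitude: float  # range-of-motion scale, 0..1
    jerkiness: float  # trajectory smoothness, 0 (smooth) .. 1 (abrupt)

# Illustrative rule table: emotion label -> motion qualities.
# (Example values only; not the parameters used in the project.)
EMOTION_RULES = {
    "joy":     GestureParams(speed=0.9, amplitude=0.8, jerkiness=0.3),
    "sadness": GestureParams(speed=0.2, amplitude=0.3, jerkiness=0.1),
    "anger":   GestureParams(speed=0.8, amplitude=0.6, jerkiness=0.9),
    "calm":    GestureParams(speed=0.3, amplitude=0.5, jerkiness=0.05),
}

def gesture_for(emotion: str) -> GestureParams:
    """Look up motion parameters for an emotion, defaulting to calm."""
    return EMOTION_RULES.get(emotion, EMOTION_RULES["calm"])
```

A downstream motion planner could scale its trajectories by these parameters, so the same nominal gesture reads as joyful or sad depending on the active rule.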

Signature Areas of Study

An artistic illustration of a human and a robot face to face.

Human-Robot Interaction

Human-Robot Interaction (HRI) is the study of interaction between humans and robots. In FOREST, this interaction is augmented by the non-verbal emotional communication channels of sound and gesture.

A singer and a trumpet player interact on stage with robots dancing in the background.

Prosody and Emotional Contagion

Prosody comprises the elements of speech, such as pitch, intonation, stress, and rhythm, that do not carry linguistic meaning. The robots in FOREST use these prosodic elements to convey emotion. Through them, we study emotional contagion, the process by which emotions spread spontaneously between humans, and how robots can induce this contagion through sound and gesture.
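One way to think about prosody in robot sound is as an emotion-dependent pitch contour: high-arousal states get wide, rising contours, low-arousal states get narrow, falling ones. The sketch below illustrates that idea only; the emotion profiles and numeric values are assumptions, not the sound model used in FOREST.

```python
import math

def prosodic_contour(base_pitch_hz: float, emotion: str, n: int = 8) -> list:
    """Sketch: generate a pitch contour (in Hz) shaped by an emotion label.
    Rising, wide contours for high-arousal emotions; falling, narrow ones
    for low-arousal emotions. Illustrative values only."""
    # emotion -> (pitch span in semitones, overall slope direction)
    profiles = {
        "excited": (7.0, +1),  # wide range, rising
        "sad":     (2.0, -1),  # narrow range, falling
        "neutral": (3.0, 0),
    }
    span, slope = profiles.get(emotion, profiles["neutral"])
    contour = []
    for i in range(n):
        t = i / (n - 1)  # normalized time 0..1
        # linear trend plus a gentle intonation wave
        semitones = slope * span * t + 0.5 * span * math.sin(2 * math.pi * t)
        contour.append(base_pitch_hz * 2 ** (semitones / 12))
    return contour
```

Feeding such a contour to a synthesizer would make the same utterance sound excited or sad without any linguistic content, which is the premise behind prosody-driven emotional contagion.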

FOREST Technologies & Research

Generative Choreography

Choreography that is generated by computational processes

Mo-cap Suit

A wearable set of sensors that captures human movements


Music Information Retrieval

The extraction of information from musical signals

Computer Vision

An AI field that derives data from images and videos

See Our Work


Watch our first performance with the FOREST robots and dancers.

The research team behind FOREST includes undergraduate and graduate students from the Music Technology programs: Richard Savery, Michael Verma, Raghav Sankaranarayanan, Amit Rogel, Jumbo Jiahe, Rose Sun, Nikhil Krishnan, Sean Levine, Mohammad Jafari, Nitin Hugar, Qinying Lei, and Jocelyn Kavanagh.

Georgia Tech’s robots performed with Kennesaw State University dancers Christina Massad, Darvensky Louis, Ellie Olszeski, Bekah Crosby, and Bailey Harbaugh.

Behind the Research

Gil Weinberg, director of the Center for Music Technology, and Ivan Pulinkala, professor of dance and interim provost and vice president for Academic Affairs at Kennesaw State University, explain the “disruptive collaboration” between dancers, researchers, and robots.

FOREST Concert

The FOREST project will be presented in concert on December 11, 2021, at 7:30 p.m. in the Caddell Flex Space.

