New Robot Musician Makes Modern Magic with the Wave of Eight Arms

Video: Center for Music Technology
The medusai debut at Trilith Studios.
Ann Hoevel | January 22, 2024 – Atlanta, GA

The robot medusai knows where you are. It must—because it plays music with you.

Made from beautifully fabricated steel with eight mobile arms, medusai can play percussion and strings alongside human musicians, dance with human dancers, and move in time with multiple human observers.

It uses AI-driven computer vision to know what human observers are doing and responds accordingly through snake gestures, music, and light. Gil Weinberg, the director of Georgia Tech’s Center for Music Technology, knows it’s unsettling.

“There is a lack of trust and general wariness about AI and robotics,” Weinberg said. “In the last year people have really started to understand both the promises and the risks.”

The Greek myth of Medusa best encompasses this current anxiety, he said.

The widely known Medusa narrative goes like this: Once a mortal maiden, Medusa was known for her beauty and piety. The lecherous god Poseidon targeted Medusa and raped her in Athena’s temple. Medusa was an innocent victim, but Athena was furious at the desecration of her temple. She cursed Medusa, turning her into a deadly monster with snakes for hair.

“Something can start out beautiful, with good intentions, promise of being godly, and then things can go wrong,” Weinberg said. “Medusa was abused and punished and became something that threatened humans. Even after Perseus severed her head, it was still used as a deadly weapon, turning those who looked at it into stone.”

A Robot Musician of Mythic Proportions

Video: Center for Music Technology
A medusai collaborative performance called "Play."

Amit Rogel, a Ph.D. student in Music Technology who researches human-robot interaction in the Robotic Musicianship Lab, has worked on medusai since its inception.

“Our goal with medusai is to address the risks of AI and robots,” Rogel said. “For every risk, there are two or three benefits. We think the risks are worth the benefits. And it's worth acknowledging and mitigating the risks.”

Much like Athena sprang to life from the head of Zeus, medusai manifested from Rogel and Weinberg’s last project, FOREST.

An NSF-funded project that researched trust between humans and robots, FOREST transformed industrial robot arms into musical, dancing, tree-like objects that reacted to human collaborators.

While FOREST was a positive, gentle, and beautiful robot/human interaction, medusai is intentionally designed to be uncomfortable, Rogel said.

“The sculptural aspect of medusai does not look approachable. Tristan Al Haddad designed the face asymmetrically; it’s not what you go for if you’re trying to make something elegant and inviting. The robot arms staring at you and coming at you with Tristan’s sculpted snakeheads are intimidating. The sounds it makes are not natural, calming melodies.”

These choices are designed to induce fear, or to represent it metaphorically, Rogel said. But the beautiful thing about medusai is how it changes over time.

“There's so much it can do,” Rogel said. “Every time you play with it, it's different. Drum on the face and it responds with drums. Or the next time it responds with lights. Or the arms are mostly gesturing and another time the arms are playing a melody that's as complicated as it can be.”

Emotion and the Monster Project

Video: Center for Music Technology
A performance with a prototype version of medusai.

The Center for Music Technology’s Robotic Musicianship Lab already has a stack of robot projects that could be called “greatest hits.”

Ever heard of Shimon, the rapping, marimba-playing, movie-scoring, singing, four-armed robot who can jam with jazz musicians? Or the Skywalker Hand, a prosthetic that can play piano? The Guitar-bot, delicate enough to play a guitar with human-level expression? How about the Robotic Drumming Third Arm or the Robot Drumming Prosthesis that helped a drummer earn a Guinness World Records title?

To the outside world, this lab is full of modern-day wizards. For music technology students and faculty, it’s all about robotics, algorithms, and new music that enhance real-time collaboration between humans and robots.

Ripken Walker, a dual bachelor’s and master’s student in Music Technology, is part of the team making medusai one of those hits. He studies how lighting affects interactions between humans and robots.

After less than a semester in the Robotic Musicianship Lab, and with encouragement from his fellowship benefactors Colonel Stephen and Pam Hall, Walker convinced Weinberg to let him apply his lighting ideas to medusai.

“Ripken used lights as anticipatory cues to let you know where to look,” Weinberg said. “Whenever a string is about to be plucked, you will have lights indicating where to look and showing you what's going to happen.”
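The anticipatory-cue idea can be sketched in a few lines: schedule each string's light cue a fixed lead time before its pluck, so the audience's eyes arrive before the sound does. This is a hypothetical illustration, not code from the medusai system; the event names, string numbers, and half-second lead time are invented.

```python
LEAD_TIME = 0.5  # seconds of warning before each pluck (illustrative value)

def schedule_events(pluck_times):
    """Given (time, string) pluck events, return a time-ordered list of
    (time, event, string) tuples where each light cue precedes its
    pluck by LEAD_TIME."""
    events = []
    for t, string in pluck_times:
        events.append((t - LEAD_TIME, "light_on", string))
        events.append((t, "pluck", string))
    return sorted(events)

# Example: plucks on strings 2 and 5, a quarter second apart.
timeline = schedule_events([(1.0, 2), (1.25, 5)])
```

Sorting the merged list is what interleaves the cues correctly: string 5's light comes on before string 2 is even plucked, which is exactly the "where to look next" effect Weinberg describes.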

Walker wants to build his career around crafting experiences for people.

“Being able to influence someone’s emotions with technology, that’s what the magic is,” Walker said. “We can build as much technology as we want, but if it doesn’t evoke any sort of emotion, then what is it really for?”

medusai Surrounded by Movie Magic

Video: Center for Music Technology
A medusai performance called "Dance."

Hannah Schilsky from Lux Stage at Trilith Studios agrees. She’s part of the team that programs and operates one of the world’s largest 360-degree LED sound stages, where movies like The Avengers are filmed.

Lux Stage partnered with Weinberg to provide an interactive environment for medusai’s intimate debut. Schilsky collaborated with Rogel, Walker, and the rest of the medusai team to program the soundstage walls in ways that connected with medusai’s AI programming.

Linking her Medusa-inspired watery palette to Rogel’s programs was like musicians jamming, she said.

“When I saw the data driving the animations in real-time for the first time, that was like magic. Amit’s mesh had the correct rotational values, the data was seamlessly streaming into Unreal, and then seeing the robot’s guitar picking trigger events in Unreal, all of these layers of interactions and choreographed movement being performed by medusai synchronously affecting the visuals on the LED wall,” she said, “that was incredible for me. I have never worked with robots.”

Schilsky was impressed by the musicianship of the medusai team. “This is not just installation art. They were making real music with robots. This is next level,” she said.

The crossover between the medusai team’s technical research and their natural musicianship, combined with real-time interactive effects, is part of the future of the entertainment industry, Schilsky said. “More and more technology is developing to empower performers and audiences to become a part of the digital creation process in ways that were never before possible. It's becoming one big immersive experience.”

“The accompanying visuals are generated and rendered in real-time using Unreal Engine 5,” said a Lux Stage spokesperson.

“The data that is driving medusai’s movement is also being sent to Unreal Engine and distributed across different systems within Unreal to drive the CG robotic arm's animation, Niagara particles, and material fx. Concurrently, the audio from this performance is analyzed in real-time using Unreal Engine and the resulting spectral analysis dynamically updates parameters within the Niagara particle systems.”
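The audio-reactive pipeline the spokesperson describes — spectral analysis of the live audio dynamically updating particle parameters — can be approximated in a short sketch. This is a hypothetical illustration under stated assumptions: NumPy stands in for Unreal's built-in analysis, and the parameter names (`spawn_rate`, `velocity_scale`) are invented, not actual Niagara parameter names.

```python
import numpy as np

def spectral_bands(frame, sample_rate,
                   bands=((20, 250), (250, 2000), (2000, 8000))):
    """Return the mean spectral magnitude of an audio frame in each
    frequency band (low / mid / high)."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return [spectrum[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]

def particle_params(frame, sample_rate):
    """Map band energies to illustrative particle-system parameters."""
    low, mid, high = spectral_bands(frame, sample_rate)
    total = max(low + mid + high, 1e-9)  # avoid division by zero on silence
    # Low-end energy drives emission; high-end energy drives motion.
    return {"spawn_rate": 100.0 * low / total,
            "velocity_scale": 1.0 + high / total}

# Example: a 500 Hz tone (1024 samples at 16 kHz) concentrates its
# energy in the mid band, so the low-driven spawn rate stays small.
sr = 16000
t = np.arange(1024) / sr
params = particle_params(np.sin(2 * np.pi * 500 * t), sr)
```

Running this per audio frame, in sync with the motion data stream, gives the same structure the spokesperson describes: one analysis step feeding several visual systems at once.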

“The Unreal Engine programming and VFX were executed and operated by Lux Machina and run on the Lux Stage at Trilith.”

Credits

Robotic Musicianship Group

Formation Studio

Lux Stage

Concept and music: Gil Weinberg
Project leader, robotics: Amit Rogel
Interactive lighting: Ripken Walker
Spatial sound and performance: Nicollet Cash
Computer vision: Emily Liu
Gesture design and dance: Hope Phan
Performers: Jiahe Qian, Joe Cleveland
Construction: Marcus Parker, Keshav Parthasarathy, Xinyi Yang 

Design: Tristan Al Haddad
Construction: Chastain Clark, Jared Abrahamian, Devin Lohman, John Wilson

Animation Programming: Hannah Schilsky and Jeptha Valcich
Art Direction and VFX: Hannah Schilsky
Systems TD: German Perl
LED Engineer: Andrew Paul
Virtual Production Supervisor: Jason Davis
Coordinator: Dominique Moxie
Operator: Wesley Goins

For more information about the medusai project, including photos and video, visit medus.ai

Media Inquiries

Ann Hoevel
Director of Communications
College of Design
E-mail Ann