Shimon

Shimon is an improvising robotic marimba player designed to create meaningful and inspiring musical interactions with humans, leading to novel musical experiences and outcomes. The robot combines computational modeling of music perception, interaction, and improvisation with the capacity to produce melodic acoustic responses in a physical and visual manner.
Real-time collaboration between human and computer-based players can capitalize on the combination of their unique strengths to produce new and compelling music. The project therefore aims to combine human creativity, emotion, and aesthetic judgment with the algorithmic computational capabilities of computers, allowing human and artificial players to cooperate and build on each other’s ideas. Unlike computer- and speaker-based interactive music systems, an embodied anthropomorphic robot can create familiar, acoustically rich, and visual interactions with humans. The generated sound is acoustically rich due to the complexities of real-life physical systems, whereas in computer-generated audio, acoustic nuances require intricate design and are ultimately limited by the fidelity and orientation of speakers.
Moreover, unlike speaker-based systems, the visual connection between sound and motion allows humans to anticipate, coordinate, and synchronize their gestures with the robot. To create intuitive as well as inspiring social collaborations with humans, Shimon analyzes music based on computational models of human perception and generates algorithmic responses that are unlikely to be played by humans. When collaborating with human players, Shimon can therefore facilitate a musical experience that is not possible by any other means, inspiring players to interact with it in novel expressive ways, which leads to novel musical outcomes.
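Shimon's actual perception and improvisation models are far richer than can be shown here, but the call-and-response idea can be illustrated with a minimal, hypothetical sketch: learn first-order pitch transitions from a human phrase, then walk them to generate a reply that echoes the input without repeating it verbatim.

```python
import random

def build_transitions(notes):
    """Learn first-order pitch transition counts from a human phrase."""
    table = {}
    for a, b in zip(notes, notes[1:]):
        table.setdefault(a, []).append(b)
    return table

def respond(notes, length=8, seed=0):
    """Generate a response phrase by walking the learned transitions."""
    rng = random.Random(seed)
    table = build_transitions(notes)
    current = notes[-1]          # start where the human left off
    phrase = []
    for _ in range(length):
        choices = table.get(current)
        if not choices:          # dead end: restart from any heard pitch
            choices = notes
        current = rng.choice(choices)
        phrase.append(current)
    return phrase

human_phrase = [60, 62, 64, 62, 60, 67, 64, 60]  # MIDI note numbers
print(respond(human_phrase))
```

Because the response only ever uses pitches the human actually played, it stays recognizably related to the input while the stochastic walk keeps it from being an exact imitation.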
Shimon has performed with human musicians in dozens of concerts and festivals, from DLD in Munich, Germany, to the U.S. Science Festival in Washington, D.C., to the Bumbershoot Festival in Seattle, Washington, and Google IO in San Francisco. It has also performed over video link at conferences such as SIGGRAPH Asia in Tokyo and the Supercomputing Conference in New Orleans.
View videos of the project here
- Bretan, M., and Weinberg, G. “A Survey of Robotic Musicianship,” Communications of the ACM, Vol. 59 No. 5, pp. 100-109, 2016.
- Cicconet, M., Bretan, M., and Weinberg, G. “Visual Cues-Based Anticipation for Percussionist-Robot Interaction,” in Proceedings of HRI 2012, 7th ACM/IEEE International Conference on Human-Robot Interaction, Boston, Massachusetts, 2012.
- Hoffman, G., and Weinberg, G. “Interactive Improvisation with a Robotic Marimba Player,” Autonomous Robots, Vol. 31, Springer. 2011.
- Hoffman, G., and Weinberg, G. “Gesture-based Human-Robot Jazz Improvisation,” extended abstract in Proceedings of the International Conference on Machine Learning (ICML 2011), Seattle, USA. 2011.
- Marketplace Tech. Robots that play like Monk, and growth in the online job market. January 4, 2013.
- NY Magazine. The Rise of Robot Jobs. June 9, 2013.
- BBC. Is the US Losing Its Innovative Edge? October 2012.
Robotic Drumming Prosthesis
The robotic drumming prosthesis attaches to an amputee’s arm and holds two drumsticks. The first stick is controlled both physically, by the musician’s arm, and electronically, using electromyography (EMG) muscle sensors. The second stick “listens” to the music being played and improvises.
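The prosthesis's actual EMG pipeline is not described here; a common approach to turning a raw muscle signal into stroke triggers (all parameter values below are illustrative assumptions) is to rectify and smooth the signal into an amplitude envelope, then fire a stroke each time the envelope rises through a threshold:

```python
def emg_envelope(samples, alpha=0.1):
    """Rectify and low-pass the raw EMG signal into an amplitude envelope."""
    env, out = 0.0, []
    for s in samples:
        env = (1 - alpha) * env + alpha * abs(s)
        out.append(env)
    return out

def stroke_events(samples, threshold=0.5):
    """Emit a stroke index each time the envelope crosses the threshold upward."""
    events = []
    prev = 0.0
    for i, e in enumerate(emg_envelope(samples)):
        if prev < threshold <= e:
            events.append(i)
        prev = e
    return events

# Two simulated muscle contractions separated by rest produce two strokes.
signal = [0.0] * 20 + [1.0] * 30 + [0.0] * 30 + [1.0] * 30
print(stroke_events(signal))
```

Triggering only on the upward crossing, rather than on the level itself, prevents one sustained contraction from producing a burst of repeated strokes.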
View a video of the project here
- Gopinath, D., and Weinberg, G. “A generative physical model approach for enhancing the stroke palette for robotic drummers,” Robotics and Autonomous Systems, Vol. 86, pp. 207–21. 2016.
- Bretan, M., Gopinath, D., Mullins, P., and Weinberg, G. “A Robotic Prosthesis for an Amputee Drummer,” preprint, arXiv:1612.04391 [cs.RO], 2016.
- Tech Crunch. Play That Funky Music, Cyborg. March 5, 2014.
- NBC News. Robotic Drumming Prosthesis Gives Musician an Extra Hand. March 5, 2014.
- The New York Times. An Environmental Film Festival and High-Tech Music. Feb 24, 2014.
Robotic Drumming Third Arm
The Robotic Drumming Third Arm project explores how a shared control paradigm between a human drummer and a wearable robotic third arm can influence, and potentially enhance, performance. A wearable system lets us examine interaction beyond the visual and auditory channels explored in non-wearable robotic systems such as Shimon, or in systems that attach actuators directly to the drums. Here, we extend the interaction to the physical channel and leverage the communicative functions that are inherently available through the constant physical connection between human and robot. The primary research challenges addressed are design and usability: developing a prototype that allows for comfort and robust functionality, and designing functions that increase the autonomy of the robot while decreasing the cognitive load on the human.
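The vibration-suppression work cited below uses command shaping. As a minimal, hypothetical sketch of the core idea, a Zero-Vibration (ZV) shaper splits each motion command into two delayed impulses sized so the second cancels the residual oscillation excited by the first; the frequency and damping values here are purely illustrative, not measured from the arm:

```python
import math

def zv_shaper(wn, zeta):
    """Zero-Vibration shaper: two impulses that cancel residual vibration
    of a mode with natural frequency wn (rad/s) and damping ratio zeta."""
    K = math.exp(-zeta * math.pi / math.sqrt(1 - zeta**2))
    td = math.pi / (wn * math.sqrt(1 - zeta**2))   # half damped period
    amps = [1 / (1 + K), K / (1 + K)]              # amplitudes sum to 1
    times = [0.0, td]
    return amps, times

def shape(command, amps, times, dt):
    """Convolve a sampled command with the shaper impulses."""
    delays = [round(t / dt) for t in times]
    out = [0.0] * (len(command) + max(delays))
    for a, d in zip(amps, delays):
        for i, c in enumerate(command):
            out[i + d] += a * c
    return out

amps, times = zv_shaper(wn=10.0, zeta=0.05)  # illustrative mode parameters
shaped = shape([1.0] * 50, amps, times, dt=0.01)
```

The shaped command reaches the same final value as the original (the impulse amplitudes sum to one), at the cost of a small added delay of half the damped vibration period.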
View a video of the project here
- Khodambashi, R., Weinberg, G., Singhose, W., Rishmawi, S., Murali, V., and Kim, E. “User Oriented Assessment of Vibration Suppression by Command Shaping in a Wearable Robotic Arm,” IEEE-RAS International Conference on Humanoid Robots, Cancun, Mexico, 2016.
- Washington Post. Scientists Created a Three-Armed Cyborg to Play the Drums Like No Human Can. February 18, 2016.
- CNET. Robotic Limb Turns Drummer Into Three-Armed Musical Cyborg. February 19, 2016.
- IEEE Spectrum. Cybernetic Third Arm Makes Drummers Even More Annoying. February 18, 2016.
Shimi

Shimi is a smartphone-enabled robotic musical companion that can respond to and enhance your musical experiences. Developed in collaboration with the Media Innovation Lab at IDC Herzliya, Shimi is controlled by an Android phone, using the phone’s built-in sensing and music-generation capabilities. This allows easy development of additional custom mobile apps.
The first application developed for Shimi allows the robot to listen to and analyze rhythms played by humans, and to respond by choosing songs similar in beat and tempo from the phone’s music library. The robot then dances to the music using a set of expressive gestures that fit the detected tempo and genre. Future functionalities will include expressive search and discovery, interactive improvisation, gaming, and educational applications.
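Shimi's actual rhythm analysis runs on the phone and is more involved, but the tempo-matching step can be sketched under simple assumptions: estimate beats per minute from the median interval between tapped onsets, then pick the library song whose tempo is closest (the song metadata below is hypothetical):

```python
def tempo_from_taps(tap_times):
    """Estimate tempo (BPM) from the median interval between tap times (s)."""
    intervals = sorted(b - a for a, b in zip(tap_times, tap_times[1:]))
    median = intervals[len(intervals) // 2]
    return 60.0 / median

def pick_song(library, bpm):
    """Choose the library song whose tempo is closest to the tapped tempo."""
    return min(library, key=lambda song: abs(song["bpm"] - bpm))

library = [  # hypothetical metadata; Shimi reads this from the phone's library
    {"title": "Track A", "bpm": 92},
    {"title": "Track B", "bpm": 120},
    {"title": "Track C", "bpm": 140},
]
taps = [0.0, 0.5, 1.0, 1.52, 2.01]  # slightly uneven tapping near 120 BPM
print(pick_song(library, tempo_from_taps(taps))["title"])  # → Track B
```

Using the median interval rather than the mean makes the estimate robust to a single mistimed tap, which matters when the "click track" is a human hand.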
View a video of the project here
- Bretan, M., Hoffman, G., Weinberg, G. “Emotionally Expressive Dynamic Physical Behaviors in Robots,” In International Journal of Human-Computer Studies. 2014.
- Bretan, M., Weinberg, G. “Chronicles of a Robotic Musical Companion,” submitted to Proceedings of the New Interfaces for Musical Expression Conference (NIME 2014), London, UK.
- Popular Science. Meet the Next Generation of Smartphone-Based Robot Companions. February 11, 2013.
- NY Times. Our Talking, Walking Objects. January 26, 2013.
- TechCrunch. Georgia Tech’s Musical Robots. July 31, 2012.