This past spring at the University of Virginia, a first-time joint class was offered that brought graduate students from the Virginia Center for Computer Music (VCCM) together with undergraduates in the School of Architecture. The undergraduate Robotic Ecologies class merged with the Emergent Systems in Music graduate class, and was co-taught by professors Jason Johnson (architecture) and Matthew Burtner (music), with assistance from music graduate student Troy Rogers. I had the opportunity to participate in this exciting new venture between our departments. The goal of this year’s class was for students to create and fabricate “performative spatial and acoustic instruments that sense, compute and interact to/with emergent atmospheric inputs.” The class’s group collaborations resulted in three new robotic sonic-spatial instruments. Movies and descriptions of the instruments appear below; the descriptions were written by the groups, and the video footage was provided by Jason Johnson.
E.X.S.O. (Emergent Proximity Sensing Object)
Team Members: Scott Barton, Jaime De La Ree, Steven Johnson, Steven Kemper, Kezia Ofiesh
E.X.S.O. is designed for human participation in the production of rhythms. As people interact with the moving arms, the arms respond in an immediate one-to-one fashion and additionally generate rhythms played on resonant tubes. The tempo of these rhythms is based on proximity to the device. As an arm moves in relation to the human participants, the pitch of its tube changes. At first, participants will notice a one-to-one relationship between their proximity and the rhythms produced, but as time goes on, the system begins to react on its own to the humans in the room, working with them, working against them, or ignoring them completely.
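The proximity-to-tempo relationship described above can be sketched in a few lines. This is a hypothetical mapping; the distance range, BPM bounds, and function name are all assumptions for illustration, not the group's actual Max/MSP patch:

```python
# Hypothetical sketch of E.X.S.O.'s proximity-to-tempo mapping: closer
# participants produce faster rhythms. Ranges are assumed, not measured.

def proximity_to_tempo(distance_cm, min_cm=20, max_cm=150,
                       min_bpm=40, max_bpm=180):
    """Map a sensed distance (cm) to a rhythm tempo (BPM), linearly
    inverted: a nearby participant yields a fast tempo, a distant one
    a slow tempo."""
    # Clamp the reading to the sensor's useful range.
    d = max(min_cm, min(max_cm, distance_cm))
    # Normalize so 0.0 = far away, 1.0 = as close as possible.
    closeness = (max_cm - d) / (max_cm - min_cm)
    return min_bpm + closeness * (max_bpm - min_bpm)
```

A second layer of logic (not shown) would then gradually decouple the tempo from this direct mapping to produce the autonomous, "ignoring them completely" behavior.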
The skin that connects one arm to the next is a sub-structural system intended to create lateral structural stability and also to serve as a generative spatial component. As the arms move independently of one another, the skin takes on dynamic shapes that conform to the three arm positions. The structural skin can take on many spatial qualities driven by the proximity sensors’ input. While the infrared sensors suit the scale of a small presentation, the input could come from any type of sensor; at a larger scale, this could make the space-changing quality of the arms a more functional design component.
Arm movement is controlled by a DC motor attached to gears that interface with the part of the arm that enters the tube; this motor simultaneously changes the tube’s pitch and the arm’s position. A solenoid drives a beater that strikes the tube to produce sound, which is captured and amplified by electret microphones at the ends of the tubes. LEDs attached to the arm inside the tube illuminate when the arm moves, providing a visual trace of each arm’s movement and a visual notation of the sound being produced. The entire process is controlled by a computer running Max/MSP, which interfaces with an Arduino microcontroller attached to the sensors, motors, and LEDs. Software parses the data received from the sensors, and internal algorithmic processes produce emergent behavior as the arms react to their human observers.
Medusa
Team Members: Steven Brummond, Taylor Burgess, Yuri Spitsyn, Jonathan Zorn, Susanna Wong
In Greek mythology, Medusa was the most beautiful woman in the world until she angered the goddess Athena, who turned her into a hideous monster whose hair was made of snakes. She could turn any man to stone with a single look. The hero Perseus eventually defeated her by cutting off her head, from which Pegasus, the winged horse, was born.
Medusa is an emergent instrumental environment that reacts to human force. It depends on a field of modules that are individually activated by the touch of a person; when one module is activated, it changes the states of its neighbors. State changes are registered by the humming of the module. Each module consists of a half-spherical acrylic structure, a single solenoid in the center, a drum head, LED lights, a rotating motor on one side, and a piezo disc connected to piano wire on the other side. The basic module is triggered when a person hits the piano wire. This in turn triggers the solenoid, which hits the drum, effectively changing the state of the module. The state of the module refers to the humming, which is produced by a gear that rubs against a guitar string, sending vibrations into the drum head and generating sound. The speed of the motor is a function of the force a person applies to the piano wire. Once a module is triggered, a delay prevents it from being triggered again for another ten seconds. The emergence of Medusa develops from an array of people hitting the piano wires with different forces: the modules continuously change state and react with different motor speeds, and the myriad reactions begin to develop a pattern of emergence through the variation and consistency of those reactions.
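The trigger-and-lockout behavior described above can be modeled schematically. The class name, lockout constant, and half-strength neighbor propagation below are all assumptions for illustration, not the installation's actual firmware; note how the ten-second lockout also prevents two neighboring modules from retriggering each other endlessly:

```python
# Schematic model of one Medusa module: a strike on the piano wire sets
# the motor speed in proportion to the strike force and locks the module
# out for ten seconds. A trigger also propagates a (hypothetical)
# half-strength strike to each neighbor.

LOCKOUT_S = 10.0

class Module:
    def __init__(self):
        self.last_trigger = -LOCKOUT_S  # allow an immediate first trigger
        self.motor_speed = 0.0          # 0.0 (off) .. 1.0 (full speed)
        self.neighbors = []

    def strike(self, force, now):
        """Register a hit of the given force (0.0..1.0) at time `now`
        (seconds). Returns True if the module fired, False if it was
        still locked out."""
        if now - self.last_trigger < LOCKOUT_S:
            return False
        self.last_trigger = now
        self.motor_speed = max(0.0, min(1.0, force))
        # A trigger changes the state of neighboring modules as well,
        # modeled here as a half-strength strike on each neighbor. The
        # lockout stops the propagation from bouncing back forever.
        for n in self.neighbors:
            n.strike(force * 0.5, now)
        return True
```

In the real installation the timestamp would come from a clock and the force from the piezo disc; here `now` is passed in explicitly so the behavior is easy to test.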
Panta Rhei
Team Members: Andrew Hamm, Lanier Sammons, Jen Siomacco, Wendy Stober, Peter Traub
The concept of Panta Rhei derives from the philosophy of Heraclitus, the pre-Socratic Ionian philosopher. Translated, Panta Rhei means “everything is in a state of flux.” Heraclitus is well known for his belief that constant change is central to the state of the universe.
Panta Rhei is an audio/visual instrument capable of displaying an emergent system in light, allowing human interaction with that system, and translating the resulting information into both music and robotic choreography. Human interaction happens within the grid as observers insert their hands to block the flow of light between LEDs and corresponding photoresistors. The sonic elements of the piece are realized with Max/MSP. The brightness levels of individual LEDs (or groups of LEDs) may be made musical in several ways. In the current incarnation, LEDs are tied to a bank of oscillators whose envelope and pitch are determined by the level of light. A Mylar skin manipulated by solenoids provides the robotic choreography. The solenoids also respond to changes in the light level of the LED/photosensor grid. Data from the grid is monitored in Max/MSP and relayed to the solenoids through a microcontroller.
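The mapping from light level to oscillator parameters might be sketched as follows. The 10-bit input range mirrors a typical Arduino analog reading, and the frequency bounds, like the function name, are hypothetical rather than taken from the group's patch:

```python
# Minimal sketch, under assumed ranges, of how a photoresistor reading
# could drive one oscillator in the Panta Rhei grid: the light level
# sets both the oscillator's pitch and its amplitude, so a hand blocking
# an LED yields a lower, quieter tone.

def light_to_oscillator(reading, lo=0, hi=1023,
                        min_hz=110.0, max_hz=880.0):
    """Map a raw light reading to (frequency_hz, amplitude 0..1)."""
    level = max(lo, min(hi, reading))
    norm = (level - lo) / (hi - lo)   # 0.0 = dark, 1.0 = full brightness
    freq = min_hz + norm * (max_hz - min_hz)
    return freq, norm
```

In the actual piece this role is played by a bank of oscillators in Max/MSP, with the same data stream also relayed to the solenoids that move the Mylar skin.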
Instrument Materials: acrylic, piano wire, plastic zip-ties, Mylar, metal brad connectors. Hardware: 12 solenoids, 4 Arduino microcontrollers, 18 LEDs, and 18 photosensors. Software: Max/MSP