Music in Motion is an innovation in musical performance that builds a relationship between audio, gesture, and dynamic imagery. The project captures a musician's movements during a performance and uses that information to manipulate music and graphics in real time.

The performance system consists of two parts. The first captures the movements of an electric guitarist during a performance and uses the motion data to control audio effects and parameters such as tone, pitch, volume, and overdrive. For example, raising the neck of the guitar can sweep a wah pedal or increase gain and distortion. The second part creates a visual presentation driven by the music and motion data. This real-time dynamic display can be projected during a show: on large screens behind the performers, at the sides of the stage, or on the floor and ceiling, depending on the venue. With both parts working together, a new genre of performance emerges, one that connects the audience to the musician through dynamic images and the physical movements of the body. The system encourages a highly expressive performance that leaves a lasting impression.

Literature Review

Body Movements and Gesture Control

In 2003, Sofia Dahl and Anders Friberg conducted a movement study entitled "What can body movements reveal about a musician's emotional intention?" [3]. They filmed a marimba player performing with the intent to convey sadness, anger, happiness, and fear. From the recordings they created four clips focusing on key areas of movement and non-verbal communication. The clips were cropped to show the torso, the head, everything except the... center of the paper... of South Carolina University. The study concluded that low frequencies are commonly associated with darkness or dimness, while high frequencies are associated with sensations of visual brightness [2].
Ultimately, I will have a real-time dynamic visual demonstration of how we perceive and memorize the music we are listening to.

Sources of Evidence

Many studies and experiments relate closely to the audio, visual, and motion variables in the problem I am proposing: Dahl and Friberg's findings on what body movements reveal about a musician's emotional intent [3], Tom DeWitt's early oscilloscope experiments generating flower-like visual shapes driven by a synthesizer [4], and Golan Levin's painterly interfaces for audiovisual performance [8]. However, I have yet to find a study that combines all three elements.
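As a concrete illustration of the motion-to-parameter mapping described in the system overview (raising the guitar neck sweeping a wah pedal), the core of such a mapping is a clamped linear rescaling of a sensor reading into an effect-parameter range. The angle range and frequency bounds below are illustrative assumptions, not values from the project itself:

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def neck_angle_to_wah_hz(angle_deg, f_min=400.0, f_max=2000.0):
    """Map guitar-neck tilt (0 degrees horizontal to 45 degrees raised)
    to a wah filter centre frequency in Hz.

    The 0-45 degree span and the 400-2000 Hz sweep are hypothetical
    defaults chosen for this sketch; a real system would calibrate them
    per performer and per effect unit.
    """
    return scale(angle_deg, 0.0, 45.0, f_min, f_max)
```

In practice the same pattern generalizes: each tracked gesture axis feeds one `scale` call whose output range is a parameter of the audio engine (gain, pitch shift, overdrive amount), with clamping preventing out-of-range values when the performer moves beyond the calibrated span.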