TensioSonics
Interactive Algorithmic Improvisation/Composition through Motion Control
Project Overview:
This project explores the intersection of technology, performance, and real-time music composition by integrating SuperCollider with inertial measurement units (IMUs), using the GyrOSC iPhone application as the bridge between phone and computer. The concept is grounded in using body movements to interact with and shape musical output, extending the expressive potential of the performer. By employing two iPhones as motion controllers, performers can map their gestures to various musical parameters, creating an interactive environment in which physical movement directly influences sound production.
The left-hand iPhone controls overarching musical elements such as tempo, volume, and rhythm, providing a framework for the piece. The right-hand iPhone modulates more intricate musical features, such as pitch, timbre, and scale adjustments, offering fine control over melodic and harmonic content. This dual-control setup allows performers to engage in a fluid and dynamic musical experience, bridging the gap between physical expression and digital composition.
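As a rough illustration of how such a mapping could be wired up, the sketch below receives GyrOSC messages in SuperCollider and routes left-hand attitude to tempo and volume, and right-hand attitude to a pitch offset. The OSC address prefixes (/left, /right), the assumption that the gyro message carries attitude as pitch/roll/yaw in radians, and the specific ranges are illustrative choices, not the project's actual configuration.

```supercollider
// Minimal control-layer sketch, assuming both phones run GyrOSC, send to
// SuperCollider's default language port (57120), and are configured with
// distinct OSC prefixes ("left" and "right" here, a hypothetical choice).
(
~tempoClock  = TempoClock.default;
~masterVol   = 0.5;
~pitchOffset = 0;

// Left hand: coarse control of tempo and volume.
OSCdef(\leftHand, { |msg|
    var pitch = msg[1], roll = msg[2];
    // Map device pitch (-pi/2..pi/2 rad) to tempo (60..180 bpm) and roll to volume (0..1).
    ~tempoClock.tempo = pitch.linlin(-1.57, 1.57, 60, 180) / 60;
    ~masterVol = roll.linlin(-1.57, 1.57, 0.0, 1.0);
}, '/left/gyro');

// Right hand: finer control of pitch material.
OSCdef(\rightHand, { |msg|
    var roll = msg[2];
    // Map roll to a scale-degree offset of roughly +/- 7 steps.
    ~pitchOffset = roll.linlin(-1.57, 1.57, -7, 7).round;
}, '/right/gyro');
)
```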
The system combines algorithmic generation and transformative improvisation to create music that responds organically to the performer’s movements. Sound synthesis techniques, including granular synthesis and the Karplus-Strong algorithm, are employed to enrich the sound palette, enabling complex and varied musical textures. The project not only demonstrates the technical feasibility of motion-based music control but also investigates the relationship between physical gestures and musical expressivity.
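The sketch below shows, in hedged form, how the two synthesis techniques named above can be expressed in SuperCollider: a Karplus-Strong voice built on the Pluck UGen and a granular texture built on GrainBuf. Parameter names, defaults, and envelope choices are illustrative rather than taken from the project.

```supercollider
// Illustrative synthesis-layer sketch: Karplus-Strong plucked string and a
// granular texture. Values are placeholders, not the piece's actual settings.
(
SynthDef(\ks, { |out = 0, freq = 220, amp = 0.3, decay = 4|
    var excitation, sig;
    // Short noise burst excites the string model.
    excitation = PinkNoise.ar(Decay.ar(Impulse.ar(0), 0.01));
    // Pluck implements the Karplus-Strong delay-line/feedback model.
    sig = Pluck.ar(excitation, Impulse.ar(0), 0.2, freq.reciprocal, decay, 0.4);
    DetectSilence.ar(sig, doneAction: 2);
    Out.ar(out, Pan2.ar(sig * amp));
}).add;

SynthDef(\grains, { |out = 0, buf = 0, rate = 1, density = 20, grainDur = 0.1, pos = 0.5, amp = 0.2|
    var trig, sig;
    trig = Dust.kr(density);                        // stochastic grain onsets
    sig = GrainBuf.ar(2, trig, grainDur, buf, rate,
        pos + LFNoise1.kr(2, 0.05));                // jittered read position
    Out.ar(out, sig * amp);
}).add;
)
```

Once a sound file is loaded into a buffer, a Synth(\grains, [\buf, b]) or Synth(\ks, [\freq, 330]) instance can have its controls set from the OSC callbacks above, so gestures reshape the texture while it sounds.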
For a version of the demo without narration, please scroll to the bottom of the page.
Key Features:
Motion-Controlled Music: Performers control both global and fine-grained musical parameters through physical movement. The left hand manages features such as tempo and volume, while the right hand modulates pitch, timbre, and scale changes.
Algorithmic and Interactive Composition: The system incorporates both generative and transformative algorithmic methods, producing dynamic musical sequences responsive to the performer’s gestures (a minimal pattern sketch follows this list).
Sound Synthesis Techniques: The project employs synthesis methods such as granular synthesis and the Karplus-Strong algorithm, providing diverse timbral options.
Expressive Physical Interaction: Inspired by research on the role of body movements in music, the system links physical effort and posture to musical tension and release, creating an expressive and engaging performance.
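To suggest how generative sequencing and gesture input can meet, the pattern sketch below reads the motion-driven variables from the control sketch above on every event. The scale, rhythm values, and instrument choice are placeholders rather than the piece's actual material.

```supercollider
// Minimal generative layer, assuming the ~tempoClock, ~masterVol and
// ~pitchOffset variables defined in the control sketch above.
(
Pdef(\motionMelody,
    Pbind(
        \instrument, \ks,
        \scale, Scale.minorPentatonic,
        // Each event reads the latest gesture-driven offset and volume.
        \degree, Pwhite(0, 7, inf) + Pfunc { ~pitchOffset },
        \dur, Prand([0.25, 0.5, 0.5, 1], inf),
        \amp, Pfunc { ~masterVol * 0.3 }
    )
).play(~tempoClock);
)
```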
Research Areas:
Algorithmic Composition
Human-Computer Interaction
Motion-Based Interfaces
Real-Time Sound Synthesis
Digital Musical Instruments
Future Goals:
Instrument-Specific IMUs: Adapt the system to incorporate IMUs for different musical instruments, enhancing the versatility and expressiveness of traditional performance.
3D Spatial Simulations: Integrate 3D sound environments that respond to IMU movements, enabling users to navigate and shape musical landscapes.
Responsive Visuals: Develop a visual component that reflects user movements, adding another layer of feedback and artistic interaction.
Non-Narrated Demo Video