ConcertVR

An Immersive VR Music Experience

ConcertVR is a virtual reality project centered on an immersive live performance of a percussion piece by Iannis Xenakis. Designed for deep auditory and visual engagement, ConcertVR employs spatial audio and multiple microphone placements to capture the dynamics of the performance, enveloping users in a rich, interactive soundscape.

Project Overview

ConcertVR creates a sensory-rich experience by integrating spatial audio from multiple sources with dynamic visual effects. Using Unity, the VR setup transports viewers into the performance, allowing them to experience the percussive rhythms from within the performance space itself.

Key Contributions

Spatial Audio Design: Placed multiple microphones strategically around performers to capture nuanced sound variations and enhance spatial immersion.
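
To illustrate the principle behind this kind of spatial rendering, here is a minimal, engine-agnostic sketch in Python. It computes per-ear gains for a point source from inverse-distance attenuation and a simple constant-power pan; the positions, reference distance, and panning model are illustrative assumptions, not the project's actual audio pipeline.

```python
import math

def spatial_gains(source, listener, ref_dist=1.0):
    """Return (left, right) gains for a point source on the horizontal plane.

    Positions are (x, z) tuples in the listener's frame (+z is straight
    ahead, +x is to the right). Uses an inverse-distance attenuation law
    and constant-power panning by azimuth. Illustrative only.
    """
    dx = source[0] - listener[0]
    dz = source[1] - listener[1]
    dist = max(math.hypot(dx, dz), ref_dist)
    atten = ref_dist / dist                 # inverse-distance law
    az = math.atan2(dx, dz)                 # azimuth: 0 = straight ahead
    pan = (az / math.pi + 1.0) / 2.0        # map [-pi, pi] -> [0, 1]
    theta = pan * math.pi / 2.0             # constant-power pan angle
    return atten * math.cos(theta), atten * math.sin(theta)
```

A source directly ahead yields equal gains in both ears; moving it to the right or farther away shifts or lowers the gains accordingly, which is the effect the multiple microphone placements preserve at capture time.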

Audio-Responsive Visuals: Developed a particle system in Unity that responds dynamically to audio input from each microphone. Visual elements, including a “portal” effect, shift in real-time based on audio parameters, creating a visual representation of sound intensity and texture.
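
The core mapping behind such a system can be sketched outside Unity. The following Python snippet maps one microphone's frame-by-frame loudness (RMS) to a particle emission rate, with exponential smoothing so the visuals track intensity rather than flicker on transients. The class name and the `base_rate`, `gain`, and `smoothing` parameters are illustrative assumptions, not values from the ConcertVR build.

```python
import math

def rms(frame):
    """Root-mean-square amplitude of one audio frame (samples in [-1.0, 1.0])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

class AudioDrivenEmitter:
    """Maps per-mic loudness to a particle emission rate (illustrative sketch)."""

    def __init__(self, base_rate=20.0, gain=500.0, smoothing=0.8):
        self.base_rate = base_rate   # particles/sec when the mic is silent
        self.gain = gain             # how strongly loudness scales emission
        self.smoothing = smoothing   # exponential smoothing factor in [0, 1)
        self.level = 0.0             # smoothed loudness state

    def update(self, frame):
        """Feed one audio frame; return the new emission rate."""
        self.level = (self.smoothing * self.level
                      + (1.0 - self.smoothing) * rms(frame))
        return self.base_rate + self.gain * self.level
```

In a Unity implementation the same mapping would run per frame, with the returned value written to a ParticleSystem's emission rate (and analogous mappings driving parameters of the "portal" effect).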

Visual Enhancements: Leveraged Unity Asset Store effects and a custom skybox to create an atmospheric environment, with plans for future optimization to improve contrast and synchronization.

Project Challenges

Lighting and Visual Contrast: The bright environment limited contrast. A darker setup with focused lighting is planned for future iterations.

Audio-Visual Synchronization: Timing between audio cues and visual responses needs refinement to further improve immersion.

Unity Limitations: Unity's VR rendering pipeline constrains visual fidelity; working within these limits while maintaining performance remains an ongoing challenge.

Research Areas

Virtual Reality and Immersive Audio

Spatial Audio Design

Real-Time Audio-Driven Visualization

Team and Contributions

Team: ConcertVR was developed as part of the Immersive Media class in Winter 2024, under the direction of Principal Investigator Anıl Çamcı.

My Contribution: I contributed to both the recording and post-processing stages. My primary focus was developing the audio-responsive particle system and optimizing human-computer interaction for VR. These particle systems react dynamically to audio input, enhancing immersion by visually representing the intensity and rhythm of the performance.
