Composing Music in 360 Degrees

Last summer (2019), I was introduced to ambisonics at EMPAC as part of their “Spatial Audio Summer Seminar.” Soon after, as an invited participant in the Opera Hackathon organized by the San Diego Opera Company, I encountered the VR work of the Spatial Audio Research and Development Group at UCSD.

Overlapping with these exciting workshops, I was finalizing “Ruptures,” for solo piano and 8 speakers, written for pianist Marilyn Nonken (premiered at NYU, Feb 2020). Composing music along spatial paths is not a new concept, but the necessary software has become much more affordable; I am using Ableton Live and the Envelop plug-in.

In “Ruptures,” the pianist and audience are placed inside the circle of speakers so they can hear the movement of sound. While the piece is not ambisonic, I planned the trajectories of the moving sound in my pre-composition, notating each path as a sine wave, and then wrote in the signal locations for the performer (see score).
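For anyone curious how a sine-wave path like this translates into speaker levels, here is a minimal sketch in Python, assuming an equal-power pairwise panning law on a ring of 8 speakers. The names and the panning law are my illustration, not Envelop’s actual algorithm:

```python
import numpy as np

# Ring of 8 speakers at 0, 45, ... 315 degrees around the listener.
N_SPEAKERS = 8
SPACING = 2 * np.pi / N_SPEAKERS
speaker_az = np.arange(N_SPEAKERS) * SPACING

def pan_gains(azimuth: float) -> np.ndarray:
    """Equal-power panning between the two speakers nearest the azimuth."""
    gains = np.zeros(N_SPEAKERS)
    i = int(azimuth // SPACING) % N_SPEAKERS   # speaker just below the azimuth
    frac = (azimuth % SPACING) / SPACING       # position within the pair, 0..1
    gains[i] = np.cos(frac * np.pi / 2)        # cos/sin law keeps power constant
    gains[(i + 1) % N_SPEAKERS] = np.sin(frac * np.pi / 2)
    return gains

# Sine-wave trajectory: the source sweeps back and forth across the front arc.
t = np.linspace(0, 10, 1000)                           # 10 seconds of path
azimuth_path = (np.pi / 2) * np.sin(2 * np.pi * 0.1 * t)  # +/-90 deg, one cycle per 10 s

gain_matrix = np.array([pan_gains(a % (2 * np.pi)) for a in azimuth_path])
# Each row of gain_matrix holds the 8 speaker gains at one moment;
# multiply by the mono source signal to render the moving sound.
```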

In early 2020, I wrote my first ambisonic piece for VR using audio captured by a Vuze 360 camera. In this case, I placed sounds at fixed locations within the captured soundscape, and the notation I used corresponded to coordinates in the visual. This was much easier than notating spatial paths. I was working with Miriam Peskowitz on head tracking for this Google experience, but we didn’t fully realize that part of the project.
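For the technically inclined: pinning a sound to a fixed direction is what first-order ambisonic (B-format) encoding does under the hood. Here is a minimal sketch, assuming the classic FuMa channel ordering and weights; the function name and test signal are illustrative, not taken from any particular tool:

```python
import numpy as np

def encode_foa(signal: np.ndarray, azimuth: float, elevation: float) -> np.ndarray:
    """First-order ambisonic (B-format, FuMa) encoding of a mono source
    fixed at the given azimuth/elevation (radians, azimuth 0 = front)."""
    w = signal * (1 / np.sqrt(2))                      # omnidirectional component
    x = signal * np.cos(azimuth) * np.cos(elevation)   # front-back
    y = signal * np.sin(azimuth) * np.cos(elevation)   # left-right
    z = signal * np.sin(elevation)                     # up-down
    return np.stack([w, x, y, z])

# E.g. a source pinned 45 degrees to the left, level with the listener,
# matching a coordinate read off the 360 video frame.
mono = np.random.default_rng(0).standard_normal(48000)  # 1 s of placeholder audio
bformat = encode_foa(mono, azimuth=np.pi / 4, elevation=0.0)
```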

I have to admit, it is taxing to imagine and then notate spatialized sound when composing these pieces, and I would love to hear how other composers do it. Ideally, I would like a visual way to see both spatial paths and stationary placements. Please feel free to comment!

Ways to notate spatial information when composing.

Notation from "Ruptures"