This week I moved out of 2D stereo and into 3D, learning how to make 360 spatial audio, or ambisonic, recordings with the Zoom F8 MultiTrack Field Recorder and AMBEO VR microphone. The AMBEO contains four capsules pointing in different directions for four-channel recordings. My sketch pairs this audio with 360-degree video footage shot simultaneously with a Ricoh Theta M15. Though this was first and foremost a sound assignment, I devised a scenario that embeds listeners within a sphere of ongoing visual movement and audio. My goal was to give viewers reasons to move around and explore the scene.
Creating this was a multipart process, each part with multiple steps. Here’s a quick summary with a few tips for future endeavors.
Part 1: Equipment Setup & Recording
Both the recorder/mic and the camera need to be set up separately. Ideally, you want them to “face” the same direction so that the sound is mapped correctly (although this can be fixed in post-production). Be sure to test and adjust your audio levels so that you’re not clipping. While it’s easy to control the camera through a Ricoh app on your phone, you’ll still have to start the recording devices separately, which means marking the recording with a loud clap of the hands (what’s the official name for this, filmmakers?) to line up the tracks later. (Although I later trimmed this out of my video and used the visual cues of the waveforms to sync up the tracks in Reaper.) I am very fortunate to attend a school with an amazing film program and superb sound studios, to which I gained access. I recorded in a “dead” room, insulated from all the surrounding city sounds.
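If you’d rather not eyeball the waveforms, that clap also makes the offset between the two recordings easy to find numerically with cross-correlation. Here’s a minimal numpy sketch; the synthetic impulses stand in for real clap recordings, and `find_offset` is a hypothetical helper, not part of any of the tools mentioned here:

```python
import numpy as np

def find_offset(ref, other, sr):
    """Estimate how many samples `other` lags `ref` via cross-correlation.

    Returns (lag_in_samples, lag_in_seconds); a positive lag means the
    event happens later in `other` than in `ref`.
    """
    corr = np.correlate(other, ref, mode="full")
    lag = int(np.argmax(corr)) - (len(ref) - 1)
    return lag, lag / sr

# Synthetic demo: a "clap" (impulse) at different positions in two tracks.
sr = 48000
ref = np.zeros(sr)
ref[1000] = 1.0     # camera audio: clap at sample 1000
other = np.zeros(sr)
other[1300] = 1.0   # field-recorder track: clap 300 samples later
lag, seconds = find_offset(ref, other, sr)
print(lag)  # 300 -> shift the recorder track 300 samples earlier to align
```

On real recordings you’d want to correlate a short window around the clap rather than the whole files, both for speed and to avoid spurious peaks from other loud events.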
Part 2: Processing Files
I transferred the video to my laptop via Image Capture and stitched it together using Ricoh’s desktop app. I then dragged that into the Facebook 360 Spatial Workstation Reaper template project file, which gave me a readout of just the audio track. No processing was necessary for the sound recording; it was already a 4-channel B-Format AmbiX file, which I dropped into a spatialized track in the project file. (However, I still had to open the spatial plugin on that track and set the format to B-Format 1st order AmbiX.) I aligned the AmbiX track to the video’s audio, then muted the latter so I could edit, add some compression, and layer in another soundtrack I found elsewhere. This post was a useful resource in reminding me of that syncing workflow. Finally, I rendered the final version as a .wav file.
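One thing that helped me reason about B-format audio: a first-order AmbiX file’s four channels (in ACN order: W, Y, Z, X) describe the whole sound field, so fixing a mic-versus-camera facing mismatch is just a rotation of the X/Y channels. The Spatial Workstation plugin handles this for you; the sketch below only illustrates the idea, and the rotation sign convention here is an assumption that may need flipping depending on which way you measure yaw:

```python
import numpy as np

def rotate_yaw(ambix, degrees):
    """Rotate a first-order AmbiX (ACN order: W, Y, Z, X) sound field
    about the vertical axis, e.g. to re-align the mic's "front" with
    the camera's. `ambix` is a (4, n_samples) array."""
    w, y, z, x = ambix
    t = np.radians(degrees)
    # W (omni) and Z (up/down) are unchanged by yaw;
    # X (front/back) and Y (left/right) rotate like 2-D coordinates.
    x2 = np.cos(t) * x - np.sin(t) * y
    y2 = np.sin(t) * x + np.cos(t) * y
    return np.stack([w, y2, z, x2])

# Demo: a source dead ahead (pure X) rotated 90 degrees lands on the
# left/right axis (pure Y).
field = np.zeros((4, 8))
field[3] = 1.0  # X channel only: source directly in front
rotated = rotate_yaw(field, 90.0)
```

This is also why the earlier note about facing the mic and camera the same way is a convenience rather than a hard requirement: the misalignment is recoverable in post.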
Part 3: Encoding
Using the Facebook 360 Spatial Workstation Encoder, I combined the video processed in Part 2 with my new .wav file into a file ready for YouTube. Unfortunately, uploading it to YouTube collapsed my added soundtrack from stereo to mono, but the spatialized audio was still intact.
Give it a whirl above! Literally. Click and drag your cursor to move around the video. How many marbles can you follow? Headphones recommended.
Credit: Thanks to Anthony Ouradnik for his Ballpark Organ Music. Go Twins! ♥️