ScanIR version 2 is out!

ScanIR version 2 is now available on GitHub

This tool, developed at NYU MARL, helps with the measurement of acoustic impulse responses, whether for room acoustics or for binaural filters. The new version adds SOFA output for saved measurements, automatic measurement sequences using an Arduino and a rotating stepper motor, acoustic analysis metrics, and more customization options.
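If you want to inspect a SOFA measurement outside of ScanIR, the file is a standard netCDF/HDF5 container and can be read with generic tools. Below is a minimal Python sketch using the netCDF4 package; the filename measurement.sofa is just a placeholder, and the variable names (Data.IR, Data.SamplingRate) follow the general SOFA convention, so check them against your own export.

    # Minimal sketch: inspect a SOFA impulse-response file with the netCDF4 package.
    # "measurement.sofa" is a placeholder filename; Data.IR and Data.SamplingRate
    # are the standard SOFA variable names for the impulse responses and sample rate.
    from netCDF4 import Dataset

    sofa = Dataset("measurement.sofa", mode="r")

    irs = sofa.variables["Data.IR"]        # dimensions: (measurements, receivers, samples)
    fs = float(sofa.variables["Data.SamplingRate"][:][0])

    print(f"{irs.shape[0]} measurements x {irs.shape[1]} receivers, "
          f"{irs.shape[2]} samples at {fs:.0f} Hz")

    sofa.close()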

Accompanying publication here

GitHub Link below:

NVSonic Headtracker NYU


What is a head-tracker? Why should I care?

Head-trackers send accelerometer and gyroscope data from the tracker hardware on your head to the software of your choice (in this case Reaper), allowing you to listen to your mix the same way your audience would hear it if they were wearing an HMD. Why should I care? VR music and video experiences are becoming extremely popular, but there is currently no solution for mixing or mastering inside VR, so a head-tracker is the next best thing! In our case, we decided to go with the NVSonic head-tracker because it was the easiest to use and the most affordable. A little heads-up before you start on your journey building one: in order to load the boot software onto the chip on the tracker you will need access to a Windows computer, since the bootloader by my friend Tomasz Rudzki is only available for Windows at the moment. I borrowed a friend’s Windows computer for this step; it only takes a second.
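To make the data path concrete: the tracker hardware feeds a small bridge application, and the bridge forwards the orientation values to Reaper as OSC messages over UDP. If you ever want to see exactly what is arriving (handy if the connection drops, as mentioned in the notes below), a tiny OSC listener is all you need. The sketch below uses the python-osc package and the default port 9001 described later in this guide; quit Reaper (or pick another port) before running it, since only one application can listen on a port at a time, and the exact address patterns depend on how your bridge is configured.

    # Minimal OSC listener sketch (python-osc): print every message the bridge sends.
    # Port 9001 is the default used later in this guide; your setup may differ.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def print_message(address, *args):
        # Expect something like "/Roll 12.3" as you rotate the tracker.
        print(address, *args)

    dispatcher = Dispatcher()
    dispatcher.set_default_handler(print_message)  # catch all addresses

    server = BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher)
    print("Listening for OSC on port 9001 (Ctrl+C to stop)...")
    server.serve_forever()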

Why did you write this piece? 

While Tomasz’s site has a great tutorial on how to set up and operate the tracker, I thought we should give you some more information about the complications you might run into along the way and what we did to overcome them. Happy tracking! 🎧

NVSonic Head Tracker Instructions

  • Build the tracker and bootload it
  • Some notes: 
    • We had a wiring problem; you might need to test different gauges of wire if communication gets lost repeatedly.
    • We also recommend getting some zip ties and wireless headphones to make work more pleasant. 
  • Install Reaper https://www.reaper.fm/index.php
    • There is a free trial.
    • $60 for a full license.
      • We use Reaper here because of its built-in OSC control support. There is another open-source head tracker called MrHeadTracker, but its build is more complicated and it communicates via MIDI instead of OSC.
  • Install 360 FB Plug-in  https://facebook360.fb.com/spatial-workstation/
    • The Spatial Workstation, not the SDK.
  • Open the Reaper template found in your applications folder. 
    • /Applications/FB360 Spatial Workstation/Reaper/SpatialWorkstation

    • Its file name should be ‘SpatialWorkstation.RPP’.

  • Note that templates are protected by Reaper, which means you cannot save changes directly to this template. If you want to make your own template, use the save-as-template option found under the File menu.
  • When you open the template, start by saving it as a new project under whatever name you want to call it, then proceed to make your changes.
  • The first four tracks of the template are identical in terms of routing; the only difference is that the second track has the input of the FX VST set to first-order ambisonics, while tracks 1, 3 and 4 expect a mono file (a bit of an oversight by FB there).
  • This page https://www.reaper.fm/sdk/osc/osc.php tells you how to access OSC preferences in Reaper; this is the gist:
    • To enable network communication between REAPER and an OSC device, go to Options->Preferences->Control/OSC/Web, and add a new OSC (Open Sound Control) control surface “mode”.
      • You can always get to preferences with the shortcut (⌘ + ,).

  • Call the Device Name whatever you want. Enable “Receive on port” and match the port number to the Bridge Application’s port number (9001 by default on Mac OS X).

  • Enable “allow binding messages to REAPER actions and FX learn”; this setting is found in the same window where you entered the port number.
  • Hit Ok.
  • In the Reaper mixer window, click on FX on the Control Plugin track to show the FB plug-in.
    • Drag down to expand the track if not visible.
    • Alternatively, use the shortcut (⌘ + M) to show the mixer.
  • Click “Get From Video” to disable the feature; the yaw, pitch and roll should now be modifiable.
  • Mute pitch and yaw in the bridge application by clicking on the M; it should turn red.
  • Click on the parameter you want to learn, which is roll (Y-axis).
  • Click on Param in the FB plug-in; it’s at the top of the window.
  • Click Learn; you should see the parameter you are trying to automate in the command text window: “/Roll”.
    • Hit Ok.

  • Do the same thing for the other parameters, muting the messages you don’t want to pass and assigning them one by one to the correct Listener parameter: roll, pitch, and yaw.
  • After all the parameters are assigned, you may need to click on each parameter again to refresh the communication. If you want to test the Reaper side of the connection without the tracker, see the sketch after this list.
  • Have fun!
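
If the Listener parameters refuse to move and you want to rule out the Reaper side, you can also drive the binding by hand, without the tracker attached. The sketch below (python-osc again) sends a slow sweep to the “/Roll” address on port 9001; if the Roll knob in the FB plug-in follows it, Reaper and the plug-in are set up correctly and any remaining problem sits between the tracker and the bridge application. The address, port, and value range used here are only the defaults described above, so adjust them to your own setup.

    # Minimal sketch that stands in for the bridge application: sweep "/Roll"
    # so you can confirm Reaper's OSC binding without the tracker connected.
    import time
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9001)   # Reaper's "Receive on port" value

    for step in range(200):
        roll = -180.0 + step * 1.8                # example sweep from -180 to +180 degrees
        client.send_message("/Roll", roll)
        time.sleep(0.05)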

AES Award to Immersive Audio Group Members

Members of the Immersive Audio Group, working in collaboration with the Cooper Union, were awarded the Bronze Saul Walker API Bootstrap award at the 143rd AES Convention Student Design Competition in New York for their work on a cost-effective ambisonics microphone designed with MEMS capsules (full paper can be found here).

Congratulations to Gabriel Zalles, Yigal Kamel, Ian Anderson, MingYang Lee, Chris Neil, Monique Henry, Spencer Cappiello, and Dr Charlie Mydlarz!