Let's High Fidelity this:
What would be your All Time, Desert Island Top 5 feature requests?
1) Equi-Rectangular Overlay. This is really a "must". Ideally you would be able to see/modify multiple objects at a time and gang them together for overall rotation, "size", and so on (much like how Spanner works). I think the AudioEase guys have done a pretty good job with the overlay feature on their new spatial panner, but I've yet to actually use it...
2) High-Resolution HMD Playback. One of the constant battles we're fighting is lip sync. With a mono equi 1080 picture in PT, it's impossible to see sync, and the 1080 mono Dolby player version is only moderately better. We need high-definition, synchronized playback so we can really nail the lip sync.
3) Visual Feedback Option in the Player. Currently all of the VR tools are more or less "guess and check" systems. You mix in Pro Tools, to a mono equi video, guessing at how far away people are, the lip sync, what that little thing in the corner is... and then you play it back on the HMD, realize all the things that aren't right, and take another pass at the same process. I would love to see more visual feedback in the HMD to assist with mixing. Maybe object balls that move around in the viewer, much like the Atmos Monitor app? Maybe a metering overlay? Maybe even the ability to pan objects while in the HMD with your gaze? The options here are really endless, but this is something that, to my knowledge, is not a part of any other tool at the moment.
That's all that's off the top of my head... I'll chime in with 4 and 5 when they come to me.
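The "gang them together for overall rotation" idea in request 1 is simple to sketch in code. This is only an illustration of the math, not any tool's actual API; the coordinate convention (azimuth/elevation in degrees, azimuth 0 straight ahead) and all function names are assumptions:

```python
import math

def sph_to_vec(az_deg, el_deg):
    # Azimuth/elevation (degrees) to a unit vector: x forward, y left, z up.
    az, el = math.radians(az_deg), math.radians(el_deg)
    return (math.cos(el) * math.cos(az), math.cos(el) * math.sin(az), math.sin(el))

def vec_to_sph(v):
    # Unit vector back to (azimuth, elevation) in degrees.
    x, y, z = v
    return (math.degrees(math.atan2(y, x)),
            math.degrees(math.asin(max(-1.0, min(1.0, z)))))

def rotate_yaw(v, yaw_deg):
    # Rotate a vector about the vertical (z) axis.
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

def gang_rotate(objects, yaw_deg):
    # One shared yaw offset applied to every object's (azimuth, elevation),
    # so the whole group turns together while keeping its internal layout.
    return [vec_to_sph(rotate_yaw(sph_to_vec(az, el), yaw_deg))
            for az, el in objects]

# Two objects rotated together by 90 degrees of yaw: both shift 90 degrees
# in azimuth, elevations are untouched.
print(gang_rotate([(0.0, 0.0), (45.0, 30.0)], 90.0))
```

Pitch ("tumble") and roll would just be rotations about the other two axes, composed the same way.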
In the 2.2.1 release this is possible to do, although you'll have to setup the L and R of your music track as objects.
I'm not sure what Travis meant by overlay, but it might be something that I was also thinking of. It would be very helpful if, when viewing the video in the PT player in equi-rectangular view, you could view the equi-rectangular panner grid on top of the picture and/or see the yellow ball as well. This would make automation very fast and simple. You could just set the ball on top of the object making the sound and drag it in real time to follow the movement of said object, recording the movement. Maybe this would be a global window that would display all the objects on the same screen, or maybe you would only view one object at a time. This may best be achieved by including a separate player app, the way the monitor and the renderer are separate apps. A separate app could also accommodate picture rotation in lieu of HMD monitoring.
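What makes this overlay idea so tractable is that an equirectangular frame maps pan angles to pixels linearly. A minimal sketch of the two conversions such a player would need (the frame size and the convention that azimuth 0° sits at the horizontal centre are my assumptions, not anything from the toolset):

```python
def pan_to_pixel(az_deg, el_deg, width=3840, height=1920):
    # Equirectangular mapping: azimuth -180..+180 deg spans the full width
    # (0 deg at centre), elevation +90..-90 deg spans top to bottom.
    x = (az_deg + 180.0) / 360.0 * width
    y = (90.0 - el_deg) / 180.0 * height
    return x, y

def pixel_to_pan(x, y, width=3840, height=1920):
    # Inverse: where the user drags the yellow ball -> panner coordinates.
    az = x / width * 360.0 - 180.0
    el = 90.0 - y / height * 180.0
    return az, el

# Dead centre of a 3840x1920 frame is straight ahead at eye level:
print(pixel_to_pan(1920, 960))  # (0.0, 0.0)
```

Drawing the ball is `pan_to_pixel` on each object's automation value; recording a drag is `pixel_to_pan` on the mouse position, once per automation pass.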
By the way, the spatialization algorithm sounds outstanding. The Z axis especially is very convincing.
@sebatian The video player remembering the IP of the local renderer should be fixed now, along with some new parameters in that window that will help you out.
@Red Bull if you render your mix out of Atmos VR to B-format, then you can use the newest iteration of the FB360 Encoder to marry the video to the B-format mix, and even add a head-locked stereo file separately. It will encode the file specifically for compatibility with Facebook so you can hear spatial audio (on the mobile app only as of right now). I haven't actually done this myself, but I don't see why it wouldn't work. But yeah, going straight to a .tbe file would be really cool, as I find it sounds better.
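For anyone wondering what a B-format render actually carries, here's a sketch of standard first-order ambisonic encoding (ambiX convention: ACN channel order, SN3D gains). This is generic textbook math, not the internals of the Atmos VR renderer or the FB360 Encoder: a mono source at a given direction is spread across four channels by simple trig gains, which is why the whole sound field can later be rotated with head tracking.

```python
import math

def encode_ambix(sample, az_deg, el_deg):
    # First-order ambiX encode of one mono sample at azimuth/elevation
    # in degrees (ACN order W, Y, Z, X; SN3D normalization).
    az, el = math.radians(az_deg), math.radians(el_deg)
    w = sample                                 # omnidirectional
    y = sample * math.sin(az) * math.cos(el)   # left-right
    z = sample * math.sin(el)                  # up-down
    x = sample * math.cos(az) * math.cos(el)   # front-back
    return [w, y, z, x]

# A source dead ahead puts equal gain in W and X, nothing in Y or Z:
print(encode_ambix(1.0, 0.0, 0.0))  # [1.0, 0.0, 0.0, 1.0]
```

The head-locked stereo stream sits outside this representation entirely, which is why it stays pinned to your ears when you turn your head.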
It probably goes without saying, but I'd love to see a Mac-compatible video player that supports the Oculus DK1 / DK2 HMD for monitoring 360 video and the spatialized audio mix in real time, so the whole session can run off the same machine. I say DK1 / DK2 because those HMDs are compatible with Mac OS, and there is precedent in the FB360 player for this option.
For FB360 compatibility, check the Dolby Mixes on a Mac thread here. There is a sample session attached to that posting that would enable you to create an FB360-compatible mix directly in our toolset. While export to .tbe is not supported, this makes it pretty easy to utilize our tools to create content, giving you an Atmos-ready mix as well as output to other formats.