
Atmos integration with Unreal Engine for linear cinematic content

Hello, Atmos VR Team.


I have been experimenting with the beta for a few weeks and am enjoying the experience so far. I’m about to start work on remixing and adding spatial audio to an animated, cinematic VR project being built in Unreal Engine. The client creates a build of the project for playback on an HTC Vive while controlling/syncing motion seats. It’s pretty cool!


My question is: how can I integrate my Atmos mix into the Unreal Engine build of the cinematic experience?

I saw the Knowledge Base post saying that Atmos isn’t supported for gaming/interactive experiences because it’s designed for linear content. Since this is a linear experience that happens to use Unreal Engine, is it possible?


Thanks so much for your time and support.


Cheers,

Daniel


Good question.


To understand fully, we'd need to delve a little deeper into the implementation side of things. At a basic level, what you seem to be describing is using Unreal as an animation render engine to output a movie that also happens to interact with external devices for motion. In this use case you may be able to use Atmos for the content creation and playback, depending on how the movie file will eventually be played.
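
To make that concrete, here is a minimal sketch of that kind of setup in Unreal Engine C++: a component that opens a pre-rendered movie through the engine's Media Framework and starts playback, with a placeholder hook for cueing the motion seats. The component class, the file path, and NotifyMotionSeats() are hypothetical illustrations, not part of any Dolby tooling; how the Atmos soundtrack itself is carried and decoded depends on the playback platform.

// Minimal sketch, assuming the pre-rendered movie is played back through
// Unreal's Media Framework (requires the "MediaAssets" module). The class,
// the file path, and NotifyMotionSeats() are hypothetical; Atmos soundtrack
// delivery depends on the target platform.
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "MediaPlayer.h"
#include "FileMediaSource.h"
#include "CinematicPlaybackComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UCinematicPlaybackComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Media Player asset assigned in the editor; its associated Media Texture
    // and Media Sound components handle the actual video and audio output.
    UPROPERTY(EditAnywhere, Category = "Playback")
    UMediaPlayer* MediaPlayer = nullptr;

    void StartExperience()
    {
        // Point at the pre-rendered movie (placeholder path).
        UFileMediaSource* Source = NewObject<UFileMediaSource>(this);
        Source->SetFilePath(TEXT("Movies/CinematicExperience.mp4"));

        if (MediaPlayer && MediaPlayer->OpenSource(Source))
        {
            MediaPlayer->Play();
            NotifyMotionSeats(0.0f); // cue the external seat controller from t = 0
        }
    }

private:
    void NotifyMotionSeats(float StartTimeSeconds)
    {
        // Placeholder: send the start timecode to the motion-seat hardware here.
    }
};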


We have other experiences that were made in Unreal in this manner and are being played back with Atmos soundtracks on publicly available platforms (one such example is the History of Flight piece on Jaunt, which was mixed by Benedict Green at EccoVR; the visuals were generated in Unreal Engine).


Please feel free to drop an email to vrcontent@dolby.com and we'll dig into this particular project in more detail offline.


Ceri



Will do. Many thanks, Ceri.
