VR Recording Studio Webstory

A development blog article about the creation of VR Audio Engineer, an application where people can learn and experience mixing in a virtual recording studio.

Unity3D development

Preface

In 2013 I was hired by a company to do the research and project management on a virtual reality experience being developed for an indoor attraction park. The Oculus DK1 had not yet been released and room-scale VR solutions did not exist yet. A lot of time went into research and into trying out different hardware and software combinations. We ended up building a markerless motion-tracking system for four people at a time, plus a modified DK1 driven by a laptop in a backpack, to create 6-DOF room-scale VR. Wireless HDMI was available, but with a latency of 40 ms it was not really an option. After two years the project sadly came to an end: by the end of 2015, VR equipment and software were still not good enough, or affordable enough, for our client to commit to a full theme-park installation. The work and research done for this project gave me experience with markerless motion capture, the Unity3D game engine and the possibilities of virtual reality.

After several projects, I explored opportunities to create a VR project of my own and thought about what kind of experience I would like to see in VR. I wanted something that could evoke creativity. I came up with something close to my own liking: a virtual reality recording studio.

The first step was to research whether it was possible to create a working mixing desk in a game engine, and to continue from there. I did a quick, successful test in late 2015. Due to limited time, the project stayed on hold until 2017. In between, I worked on many different client projects, which improved my Unity, programming and 3D-modelling skills.

Prototype

In 2017 I started creating a full-scale prototype in Unity3D/VR. The first step was to research and decide what kind of equipment I would try to replicate. The centrepiece of the studio had to be the console, so my first idea was to build a Focusrite (inspired by the Focusrite documentary).

The story of the Focusrite Consoles

But after finding little reference material or information on Focusrite consoles on the internet, I kept looking for an alternative. Another legendary console to replicate would be a Neve (Sound City documentary), but those consoles came in many different variations and I didn't find enough documentation available online for them either.

Sound City documentary film trailer


While looking for information on Neve consoles, I did manage to find a PDF manual of the AMS Neve VR Legend. A console named "VR Legend" for use in VR seemed like a logical step. Luckily, there was a lot of reference material available on the internet. I started modelling and building the console in 3D based on pictures and drawings, and began programming it using the manual.

After a while I had the basics of the console working (level, EQ, monitoring), with a virtual 24-track Studer as a WAV/MP3 file player. I quickly created a mock-up virtual control room to give a sense of space and scale in VR. Then came the hard part: the following three challenges took the most development time.

1st challenge

The first challenge was optimising graphics and code for VR. As a console has many buttons, a lot of elements need to be drawn to the VR headset at a minimum of 90 frames per second, rendered twice (once for each eye). I had to find the most efficient way of doing so, which involved optimising 3D models, using instancing and re-texturing objects.
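To put a rough number on that constraint, here is a back-of-the-envelope sketch in C++ (the control count of 500 is an assumption for illustration, not a measured figure from the project):

```cpp
// Rough VR frame-budget arithmetic: at 90 Hz each frame has about
// 11.1 ms of CPU/GPU time, and the scene is rendered once per eye.
// Spreading that budget over the visible console controls shows why
// per-object draw cost has to come down (instancing, merged meshes,
// shared textures).
double PerControlBudgetMicroseconds(double refreshHz, int controlCount) {
    double frameBudgetMs = 1000.0 / refreshHz;      // ~11.1 ms at 90 Hz
    return frameBudgetMs * 1000.0 / controlCount;   // microseconds per control
}
```

At 90 Hz with 500 visible controls that works out to roughly 22 µs of budget per control per frame, before anything else in the scene is drawn.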

2nd challenge

The second challenge was reducing audio CPU time. Although my first test in 2015 worked, it used too much DSP CPU power to allow for expansion. I learned a bit of C++ to modify audio plugins for Unity, doing parts of the processing in more efficient native code and leaving enough CPU headroom for VR.
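As an illustration of the kind of per-sample work that benefits from native code (a generic sketch, not the project's actual plugin code; all names here are hypothetical), a one-pole low-pass filter processing an interleaved stereo buffer in place, similar in shape to the process callback of a Unity native audio plugin:

```cpp
#include <cstddef>
#include <cmath>

// Hypothetical per-buffer DSP routine: filter an interleaved audio
// buffer in place with a simple one-pole low-pass per channel.
struct OnePoleLP {
    float a;        // smoothing coefficient (0..1)
    float z[2]{};   // one filter state per channel

    OnePoleLP(float cutoffHz, float sampleRate) {
        a = 1.0f - std::exp(-2.0f * 3.14159265f * cutoffHz / sampleRate);
    }

    void process(float* buffer, std::size_t frames, int channels) {
        for (std::size_t f = 0; f < frames; ++f)
            for (int c = 0; c < channels; ++c) {
                float& s = buffer[f * channels + c];
                z[c] += a * (s - z[c]);   // y[n] = y[n-1] + a*(x[n]-y[n-1])
                s = z[c];
            }
    }
};
```

Keeping tight per-sample loops like this in compiled C++ leaves the managed C# side free for game logic and VR rendering.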

3rd challenge

The third challenge was developing a natural VR interface to control the relatively small buttons, sliders and pots with VR controllers. This went through many iterations before settling on the current implementation, in which haptic feedback and visual magnification are used in conjunction with natural gestures (turning, sliding and pressing).
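The magnification idea can be sketched in a few lines (hypothetical names and values, not the shipped code): a controller twist maps to a normalised knob value, and a magnification factor slows the mapping down so the same wrist motion makes a finer adjustment while a control is grabbed.

```cpp
#include <algorithm>

// Hypothetical knob-gesture mapping: a controller twist of `deltaDegrees`
// moves a normalised knob value (0..1). A full physical sweep is assumed
// to be 270 degrees; `magnification` > 1 slows the mapping down for
// fine control while the knob is grabbed.
float TurnKnob(float value, float deltaDegrees, float magnification) {
    const float fullSweepDegrees = 270.0f;
    value += (deltaDegrees / fullSweepDegrees) / magnification;
    return std::clamp(value, 0.0f, 1.0f);   // C++17
}
```

The same pattern works for sliders by swapping angular delta for linear hand travel.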

Research

While on holiday I came across an article in an old issue of Tape Op magazine by the audio engineer and author John La Grou, who back in 2014 predicted a remarkably similar VR application for the studio environment.

“The future of audio engineering”

John La Grou – Tape Op magazine

The following is an excerpt from his conclusion:

By 2050, post houses with giant mixing consoles, racks of outboard hardware and patch panels, video editing suites, box-bound audio monitors, touch screens, hardware input devices, and large acoustic control rooms will become historical curiosities. We will have long ago abandoned the mouse. DAW video screens will be largely obsolete. Save for a quiet cubicle and a comfortable chair, the large, hardware-cluttered "production studio" will be mostly a quaint memory. Real space physicality (e.g., pro audio gear) will be replaced with increasingly sophisticated headworn virtuality. Trend charts suggest that by 2050, headworn audio and visual 3D realism will be virtually indistinguishable from realspace.

Microphones, cameras, and other front-end capture devices will become 360-degree spatial devices. Post-production will routinely mix, edit, sweeten, and master in headworn immersion.
[…]

Future recording studios will give us our familiar working tools: mixing consoles, outboard equipment, patchbays, DAW data screens, and boxy audio monitors. The difference is that all of this "equipment" will exist in virtual space. When we don our VR headgear, everything required for audio or visual production is there "in front of us" with lifelike realism. In the virtual studio, every functional piece of audio gear (every knob, fader, switch, screen, waveform, plugin, meter, and patch point) will be visible and gesture-controllable entirely in immersive space.

Music postproduction will no longer be subject to variable room acoustics. A recording’s spatial and timbral qualities will remain consistent among any studio in the world, because the “studio” will be sitting on one’s head. Forget the classic mix room monitor array with big soffits, bookshelves, and Auratones. Headworn A/V allows the audio engineer to emulate and audition virtually any monitor environment, including any known real space or legacy playback system, from any position in any physical room.

You can find the entire article in the Tape Op magazine archives or at:
https://www.stereophile.com/content/audio-engineering-next-40-years

Development halt

So far the project had been self-funded. But as a work in progress it didn't generate any income, which limited the amount of development time I could spend on it. To dedicate more of my time and resources to development, the project needed funding. And although I found quite a few people who were interested, none of them were able to supply funding at that point.

You can see on my portfolio page what the application could do at that time.
https://www.anymotion.nl/portfolio/vr-sound-engineer-retro-recordings/

Almost all development came to a halt at the end of 2018; since I needed to provide income with paying jobs in video and photography, I had less time to work on this project.

In April 2020, when normal work (photography, video editing, filming) came to a complete standstill due to the coronavirus pandemic, I had just learned about the new XR modules and the ECS system for audio and data processing in the Unity3D 2018 and 2020 versions. This gave me new energy to continue working on the project once more.

Current Status

The current status of the development is that I'm rebuilding the entire application and all 3D models in the latest versions of Unity and Blender. This is to make it work with all XR/VR headsets currently on the market, while offering a standard interface across devices. It is also being done to port all the audio processing, previously a mix of C++ and C#, to the new C# audio ECS in Unity3D. This should not only expand the possibilities, but also make it a lot easier and more efficient to program and manage all the code.

In the next blog post, I'll go further into possible use cases for this application, and how other developers can help with development.

#audiocomponentsystem #mixingconsole #development #ecs #audioengineer #mixing #netherlands #neve #Oculus #nevevrlegend #recording #soundengineer #soundcity #recordingstudio #unity3d #Virtualreality #Vive #xr #application
