
VR/AR/MR and XR Interview with Scott Millar, Technical Director of LongerDays.IO


Scott Millar

June 3, 2020 – Today, Scott Millar talks to VFX Online about Virtual Reality/Augmented Reality/Mixed Reality and Extended Reality in our industry.

Scott Millar, Technical Director of LongerDays.IO, is one of the proponents of disguise xR technology in live production. He discusses how Augmented and Mixed Reality workflows will shape the future of the industry after the COVID-19 global pandemic.

From Scott Millar, Technical Director of LongerDays.IO:

How do you describe yourself professionally?

From roots in engineering, I’m a technical producer and director for shows that focus mostly on video in live events, installations, TV and film.

How did you enter the industry? What sparked your interest in working with reality?

I started in the AV industry about seven years ago, coming from a background in engineering, and worked predominantly on large experiential projection-mapping shows in the early years.

Over the last few years I have worked with disguise R&D, playing a key part in the development of its xR technology. That grew out of my work to get into TV broadcast, where we worked together on the studio sets of Bloomberg and ITV Sport. The idea was to build a solution that delivers compelling stories with live interactive graphics and real-time CG played off large LED backgrounds, to enhance the delivery of news and other live broadcast events.

I always wanted to work in engineering, particularly in film and TV, and when I was younger I wanted to build large-scale mechanical sets for movies. Now, thanks to the virtual realms brought to life in production by technology, I can straddle both of these passions!

What are your favourite VR/AR/MR/XR Projects and why? What was the Best VR/AR/MR/XR Project you worked on?

Recent productions I’ve been involved with include some of the most groundbreaking live events to incorporate AR and MR into their production, one being the Pakistan Super League Opening Ceremony – the longest ever AR-centred broadcast. With disguise, we were able to composite all six camera feeds with a delay, with AR and with the graphics on top, and present that composited output to the vision mixer on the camera returns – and we were able to do that for two cameras per server.
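To make the delayed-compositing idea concrete, here is a minimal Python sketch – not the disguise pipeline, and with the class name, delay length and frame format all invented for the example – of buffering a camera feed by a few frames so that a graphics layer can be rendered and alpha-composited on top before the result is sent on:

```python
# Hypothetical sketch of frame-delayed compositing: the camera feed is held in a
# short delay line so AR/graphics layers can be composited over the delayed frame.
from collections import deque
from typing import Optional

import numpy as np

DELAY_FRAMES = 3                  # illustrative latency budget for graphics rendering
FRAME_SHAPE = (270, 480, 3)       # small RGB frames, just for the demo


class DelayedCompositor:
    """Buffers a camera feed and alpha-composites graphics over the delayed frame."""

    def __init__(self, delay: int = DELAY_FRAMES):
        self.buffer = deque(maxlen=delay)  # fixed-length delay line of camera frames

    def push(self, camera_frame: np.ndarray, graphics: np.ndarray,
             alpha: np.ndarray) -> Optional[np.ndarray]:
        """Add the newest camera frame; return a composited frame once the
        delay line is full, otherwise None while it is still filling."""
        self.buffer.append(camera_frame)
        if len(self.buffer) < self.buffer.maxlen:
            return None
        delayed = self.buffer[0]  # oldest frame in the delay line
        # Simple "over" composite: graphics layer on top of the delayed camera frame.
        return (graphics * alpha + delayed * (1.0 - alpha)).astype(np.uint8)


if __name__ == "__main__":
    comp = DelayedCompositor()
    for frame_idx in range(6):
        cam = np.full(FRAME_SHAPE, frame_idx * 40, dtype=np.uint8)   # stand-in camera feed
        gfx = np.full(FRAME_SHAPE, 255, dtype=np.uint8)              # stand-in AR/graphics layer
        mask = np.zeros(FRAME_SHAPE, dtype=np.float32)
        mask[:64] = 1.0                                              # graphics only in the top band
        out = comp.push(cam, gfx, mask)
        print(f"frame {frame_idx}: {'composited' if out is not None else 'buffering'}")
```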

Another highlight for me was the HP Omen Challenge tournament – the first esports event to use xR – which played out to a live audience in a studio in London and was broadcast to millions of viewers worldwide. The event organisers wanted to put the players and casters into the game world, and while we could have done that with green screen, the fact that we did it with LED meant that when people were in the game they could see themselves and what was going on in real time. Using the xR workflow on an LED stage also allowed the live audience to engage with the production, and that perfect line-up could only be done with disguise.

The ability to use xR in live productions opens up a whole new world for creatives and video directors. With the large LED screens seen in touring concert shows, it’s the perfect medium for applying these new tools. In broadcast or corporate events, AR can be used to bring objects and products to life in a way not seen before, with minimal additional equipment.

Pakistan Super League

What challenges did you face when creating VR/AR/MR/XR Projects? Any fun stories from behind the scenes?

Overall, with the industry moving so decisively into the virtual world, one of the biggest challenges we face is breaking the cycle of pre-existing workflows for large-scale entertainment shows. These productions have been running for years, sometimes for over a decade; they know their budgets and their audience, and are reluctant to change, preferring to play a relatively safe game.

When trying to break down those timelines and workflows, our goal is to help educate producers and directors to try something new and explore the options on offer to them to push the envelope and break new boundaries in production. That education can go a long way.

Take the Eurovision Song Contest 2019, broadcast to over 80 million people around the world from Tel Aviv. disguise’s software allowed teams onsite to pre-programme the playout with cameras and automation, giving the production team the ability to programme one track while another was playing, plus the power to easily feed content to DMX screens and calibrate projectors quickly. This education and preparation allows teams to be extremely time- and cost-efficient, and is the foundation of new visions in broadcast production.
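As a rough illustration of the “programme one track while another is playing” idea – again, not how disguise implements it; the structure below is purely hypothetical – playback can read from a published snapshot of the show while operators edit a working copy that is only swapped in when committed:

```python
# Hypothetical sketch: playout reads an immutable "live" snapshot of the show,
# while operators edit a working copy; commit() publishes the edits atomically,
# so programming one track never interrupts the track that is playing.
import copy
import threading


class ShowProgrammer:
    def __init__(self, tracks: dict[str, list[str]]):
        self._live = copy.deepcopy(tracks)     # snapshot used by playback
        self._editing = copy.deepcopy(tracks)  # copy the operator modifies
        self._lock = threading.Lock()

    def play_cue(self, track: str, cue_index: int) -> str:
        with self._lock:
            return self._live[track][cue_index]  # playback never sees in-progress edits

    def edit_cue(self, track: str, cue_index: int, content: str) -> None:
        self._editing[track][cue_index] = content  # safe while another track plays

    def commit(self) -> None:
        with self._lock:
            self._live = copy.deepcopy(self._editing)  # publish edits in one step


if __name__ == "__main__":
    show = ShowProgrammer({"track_1": ["intro", "verse"], "track_2": ["opening", "finale"]})
    show.edit_cue("track_2", 1, "finale_v2")   # re-programme track 2...
    print(show.play_cue("track_1", 0))         # ...while track 1 still plays: "intro"
    show.commit()
    print(show.play_cue("track_2", 1))         # after commit: "finale_v2"
```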

What are the most utilised VR/AR/MR/XR Applications from your clients? What demand are you seeing?

In a time BC (before COVID-19!) we were experiencing demand for big AR events, particularly for esports, and this demand was also crossing over into the world of sports. A particularly popular option was pushing xR stages so that untrained participants didn’t have a green screen around them but instead a sense of place inside LED cubes, and could engage with and watch the sports action happening around them.

We were seeing a theme of the industry being somewhat reluctant to move with the times, and it was a challenge to drive through our core concepts of mixed and extended reality. It was really In-Camera VFX that piqued people’s interest in what can be done with real-time content.

As previously seen in movies from the Star Wars universe, disguise technology gives production teams the luxury of not having to render video beforehand – the concept of real-time rendering. disguise is positioned nicely in the middle of this, bringing streams of pre-rendered content together and amalgamating the output on screen, and there is no media playback tool quite like disguise for this.

In a PC (post-COVID) world, the complete shutdown of a predominantly travel-orientated industry has driven home the question of how we can shoot high-quality content with minimal people and support. It raises the difficult questions that producers and directors have always thought about but never asked, and makes them consider the technology available to them so that productions can continue to keep audiences gripped in a ‘restricted new world’.

HP Omen Esports Championships

How is xR utilised in the current state of the film and gaming industries?

We’ve all seen the recent shifts to the online concept of interacting, but as of now it’s still hard to replicate social norms like going to a nightclub, pub or bar. COVID-19 has opened people’s eyes to how traditional workplaces will change in light of the pandemic, and with production on hold for non-essential programmes, it has put workflows under the magnifying glass: how to be more efficient, in both staffing and technology. COVID has accelerated this re-evaluation of methodologies, and it will be interesting to see its longer-term effects on the industry a year or two from now; it might spur some changes of hearts and minds, and open up the adoption of more virtual concepts.

How do you make sure you are always helpful to 3D artists and VFX artists as well as industry professionals?

The biggest challenge with the move to In-Camera VFX shoots will be that VFX artists are working before a shoot rather than after. With this in mind, we should be asking how to find a balance and make sure they’re comfortable in that new world.

Interesting advances in render engines are making it easier to make changes to content on site, so VFX artists can now be a bigger part of the conversation while filming is happening. They’re no longer restricted to working with the camera plate given to them in the edit, and this revolutionises production – putting the artist in the room and providing a much faster turnaround time.

In providing artists with these wonderful new tools, we just have to ensure that they’re comfortable with them. It’s all part of the education process I discussed before: making sure everyone feels comfortable with these new technologies in the new world.

How is AR or XR changing the VFX and Animation Industry?

AR and xR are driving forward the concept of real time, since they generally only work in real-time environments. It’s exciting from our side to see the drive towards real-time, high-frame-rate, photorealistic graphics, and to see that all of these industries are now looking to virtual environments.

Sharing ideas across industries isn’t going to fix every production out there, but if we utilise this shared knowledge pool and the technology, we can make huge steps forward, and make it interesting, not just realistic. Who knows, we might end up in a ‘Ready Player One’-style dystopian world, as we aren’t restricted by the laws of physics so much anymore!

We would like to thank Scott Millar for the great interview. For more information on disguise xR (Extended Reality) technology, please visit the disguise website.


Written by VFX Online

VFX Online has been writing about visual effects, animation and gaming on the VFX Online Blog since 2016. VFX Online is based in India.

