Gaming Interview with Gary Edwards, Principal Animator, nDreams

Gary Edwards

July 6, 2020

nDreams are an award-winning VR studio, dedicated to high-end games and experiences.

Today, we spoke to Principal Animator Gary Edwards about the team’s latest project – the VR stealth/action game, Phantom Covert Ops.

What can you tell us about the latest VR game you have been working on?

Phantom Covert Ops is a stealth action game that we are creating for the Oculus Rift and Quest. Dispatched into remote, hostile wetlands in your tactical kayak, you play as a Phantom, an elite and deadly covert operative with a single night to prevent all-out war.

What has been the most rewarding aspect of developing games in VR?

For me personally, the most enjoyable part of working on this game has been being involved in creating the immersive illusion you experience as you kayak through the fantastic environments. You really do lose yourself within the VR world as you paddle through the reeds; out of the corner of your eye you may momentarily catch sight of a lizard running along a log, or see droplets of water dripping from an old pipe into the water below, creating lifelike ripples on the river’s surface.

I view the immersive difference between a VR experience and using a traditional controller and console as being like comparing Peter Jackson’s CG-heavy King Kong with the original 1933 version. They both have their merits, of course, but I personally see the difference as being that great in terms of believability.

Have you encountered any challenges in development, and how did you overcome them?

Probably the biggest challenge for the team has been trying to deliver the same level of experience on the Quest as you get on the Rift; the Art and Code teams have been working very hard to deliver on this.

A lot of work has gone into the first-person body too. We haven’t gone for the ‘glove’ approach but created fully working arms and torso; it all adds to the believability. You can even push yourself away from objects using the oars, as you might in real life when kayaking. It’s very clever, and I have no idea how the programmers make it happen; it’s witchcraft!

When did you start using motion capture on the project?

Xsens has been used from day one of the project. I joined the team after the prototype had been made and the product was going into full production, so I’ve been using it personally for about eight months. I was amazed at how easy it was to use; as with all technology there’s that initial learning phase, but there were no real difficulties as long as you followed the correct methodology when setting it up for a shoot.

Have you had experience using motion capture solutions before using Xsens?

My own personal experience of using motion capture has been very limited throughout my career; I’m a traditional hand-key animator, not a motion editor. That said, I did first use motion capture on a product for the very first PlayStation console, when motion capture was in its infancy around 20 years ago. We used a fully rigged camera system by Vicon, but the data we got back wasn’t that great and we ended up having to hand-animate everything. The turnaround on getting the data back was quite slow too. So although it was a fascinating experience working with this new technology, I wasn’t fully impressed with the results at the time!

The agile approach we adopted to make Phantom would not have been possible without the Xsens system; the freedom of spontaneity it provided for design is something we could only have dreamed of a few years ago.

How does this experience compare?

Before being asked to take on this project I already had an interest in the Xsens solution; I had seen clips of it being used and became fascinated by its ability to provide excellent data without the need for a large, expensive camera setup. This type of technology has been long overdue.

Last year I was working at Framestore Film and had thought of trying out a VR project, preferably using the Xsens system. Coincidentally, at this time I was kindly invited down to the nDreams studio by some ex-colleagues that I’d previously worked with (on Silent Hill: Shattered Memories), as they wanted to show me what they had been creating. Whilst visiting I mentioned how interesting Xsens looked, being completely unaware that they were using it at the time. I was delighted when they told me that was exactly what they were using to create their animations. They kindly let me try out some of their projects and it was an absolute blast! Four hours later, as I was leaving for home, I remarked to them that I felt like I had been on a vacation; it was incredible.

Xsens enables the team to have a very fluid workflow. We can design a set of moves from week to week, with no need to plan an expensive external shoot at a motion capture studio with a list of six months’ worth of animations that we may want to change later in production. We instantly get the data, convert it on the same day and start editing the next. It’s perfect for our workflow.

Can you tell us about your workflow when developing VR titles? How does motion capture fit in?

The workflow on this project has been quite straightforward for the most part. First, the design team lets us know their ideas and requirements, and then the animation team creates a list of animations that should fulfil the vision of the project.

We implement the animations in stages, creating the enemy locomotion first, then any additional bespoke animations after. We film everything in-house using Xsens, dividing the animations up into blocks of importance. The data is then converted to FBX files. So far it’s a pretty standard workflow, but our next step is quite different to many other studios’.

We don’t use MotionBuilder to edit the animation; instead we have our own in-house Maya tools called the Dewe tools, named after our talented technical animator. The Dewe tools enable us to import and edit straight into Maya, so we can use all of its fantastic animation tools. The animation team has found this approach to be an estimated four times faster than using MotionBuilder. A bold statement, but it’s true.

We use the Epic ArtRig to create our base skeleton and rig. The ArtRig controls lean much more towards how you’d create a conventional rig, enabling us to easily hand-key any animations when needed. It also keeps us using one set of Maya animation tools when animating bipeds or creatures, rather than having to bounce between MotionBuilder, Maya and Unreal. The cleaned-up animation is finally exported back out of Maya into Unreal using the Dewe tool.
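The pipeline Gary describes – capture takes in prioritised blocks, convert to FBX, clean up in Maya, export to Unreal – can be sketched in plain Python for illustration. This is not nDreams’ actual tooling (the Dewe tools are Maya plugins we have no code for); all names and stages here are hypothetical, just modelling the flow of a take through the pipeline.

```python
# Illustrative model of the described animation pipeline. Takes move through
# four stages: captured (Xsens) -> fbx (converted) -> maya_edit (Dewe tools)
# -> unreal (in engine). Priority blocks are processed first, as in the text.
from dataclasses import dataclass, field

@dataclass
class Take:
    name: str
    priority: int                 # lower number = more important block
    stage: str = "captured"       # current pipeline stage
    history: list = field(default_factory=list)

def advance(take: Take, stage: str) -> Take:
    """Record the old stage and move the take to the next one."""
    take.history.append(take.stage)
    take.stage = stage
    return take

def run_pipeline(takes):
    # Process the most important animation blocks first.
    for take in sorted(takes, key=lambda t: t.priority):
        advance(take, "fbx")        # Xsens data converted to FBX
        advance(take, "maya_edit")  # imported and cleaned up in Maya
        advance(take, "unreal")     # exported into the game engine
    return takes

takes = run_pipeline([
    Take("guard_locomotion", priority=1),   # enemy locomotion comes first
    Take("bespoke_takedown", priority=2),   # bespoke animations after
])
print([(t.name, t.stage) for t in takes])
```

The point of the sketch is simply the ordering: locomotion blocks clear the whole pipeline before bespoke work, matching the staged approach described above.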

I feel very fortunate to have a great team around me. Artists, designers, coders and animators all contributed to us winning the critics’ award at E3 for the best VR product on show, but we know we can’t afford to be complacent; there’s a lot of work ahead to deliver the best experience we can.

Xsens is very much a pivotal part of this fast turnaround, which even 10 years ago would not have been possible.

What do you think the future of VR looks like?

I personally see VR becoming more portable and affordable, with ease of use and shared experiences also key to its success. Each year the headsets will definitely become much more powerful, allowing us to create even more engaging and realistic experiences. I also see headset designs becoming less bulky, more comfortable and more widely accepted aesthetically.

I would love to see virtual animated TV shows becoming the norm. Obviously the narrative and staging would need careful attention, but I believe it’s certainly doable; just imagine being in the Simpsons’ house, for instance, whilst the show is running.

I definitely see VR becoming much more widely used in the healthcare and engineering industries, be that training surgeons and vets using virtual theatres, training oil rig workers, or helping cure people of phobias such as the fear of heights or spiders. The training sector, I believe, is limitless.

We may even see headsets being used in retail, enabling buyers to see a virtual self in their chosen garments and accessories before they buy. This may well be done from home as we see more retail moving online.

Education is another sector it could certainly play a big part in – transporting children into historical environments, for example. Imagine being taught about Victorian politics (a rather boring subject as I remember it): if you could be transported onto the streets of London, observing events as if you were there (minus the hangings or Jack the Ripper), it would become incredibly fascinating even to the most anti-academic child.

I do have a small personal story to back up some of these opinions:

Last year I encouraged my 80-year-old mum to try the Oculus Go. She’s a traditional little old Welsh lady who I thought would be very much against trying this ‘new tech’, but she put on the headset and took to the experience right away.

She loved being able to go into space (‘Ooh, I’m floating in space; well, no wonder people want to be astronauts,’ she said), then sitting on a beach, doing simple physical exercises, meditating, and finally wandering around a haunted house. I was genuinely amazed. The whole family loved watching her reactions. So I think using these headsets in retirement homes is definitely a huge plus, especially for those who are no longer able-bodied.

VR may be in its infancy but it’s definitely here to stay, becoming more commonplace every year.

On reflection, what was the most rewarding aspect of developing the game?

For myself personally, and perhaps the company, it was proving the studio’s ability to shift up several gears in all aspects of production to deliver a successful AAA project.

The company had to expand rapidly to more than double its size to develop Phantom for both the Quest and Rift while keeping momentum with our prototypes for projects post-Phantom.

It was a large undertaking for all concerned: a lot of new staff had to learn the ropes of how the company works, pipelines needed re-designing on the fly, marketing had to keep apace with social media, IT had to cope with the technical complexities of rapid expansion, and of course HR needed to grow to keep up with the legal and cultural demands all this brings.

Did you have to complete the game during lock-down? Or had you already finished development?

We were at a critical moment in the development cycle when lockdown happened, as there were just four months to our release date.

Fortunately, the company had taken the threat of COVID-19 seriously very early on and had put in place a detailed plan for the scenario of a lockdown.

We had extra security software installed, couriers were put on stand-by for shipping equipment anywhere in the UK and additional equipment was ordered, such as web cameras and VR headsets.

Everything was logged and tracked meticulously. I was very impressed with how well it was handled.

The second phase was for each individual to get themselves set-up at home and get used to attending meetings using Microsoft Teams. Pretty quickly our new way of working on a daily basis seemed to become second nature and everyone embraced the challenge very well.

This way of working does of course require a large amount of trust that everyone is proactive and disciplined with their time.

Everyone has their own set of circumstances and dependents, of course, whether you’re a parent, have elderly parents dependent on you, or are working in a cramped shared household. All these things were taken into account, and people’s mental well-being has been an important issue to track.

With regards to the animation department we made sure we had enough data to keep us busy for the long haul.

Because we use Xsens, when we did need animations recorded, one of our team members was able to go into the studio on his own and capture what was needed. We followed the webinars on the Xsens website for doing this and it posed no problem at all.

Even though the sensors had been left dormant for several months, they only took half an hour to charge!

How do you think VR as a platform will fare in a post-lockdown world?

This is an interesting question.

Home entertainment has certainly grown a great deal over the past four months. The fact that people are not able to attend musical and sporting events, exhibitions and so on will, I think, fuel a surge in filming these for VR use.

There are also the obvious restrictions on the tourism sector, which could further help the growth in use of VR.

Companies and universities need to continue to train individuals, be that for hospitals, the armed services or architecture, for instance; and of course there are the heavy industries, where the scope for growth is staggering.

For all these positives, though, real growth requires portability and much better pricing to be seriously addressed; they currently cause a bottleneck.

We would like to thank Gary Edwards for the great interview. If you would like to know more, feel free to reach out via the nDreams website.

Written by VFX Online

VFX Online is a blog based in India, writing about visual effects, animation and gaming since 2016.