NVIDIA Edge Program Winners – November 2017

In partnership with NVIDIA, Epic Games created the NVIDIA Edge program, through which we award high-end hardware to incredible teams and individuals for creating outstanding projects in Unreal Engine. Each winning project will be awarded an NVIDIA GTX 1080 Ti.

This month's winners are:

Visage – SadSquare Studio

Visage is a wonderfully creepy first-person survival/psychological horror game, developed by SadSquare Studio. The visually impressive game takes place in a large house with a dark history.

Real-time Marching Cubes – Hirofumi Seo

Hirofumi Seo and the team at SCIEMENT created Real-time Marching Cubes, demonstrating some inspiring medical applications created within Unreal Engine. SCIEMENT also developed a Real-Time Heart Viewer, among other enlightening projects.

PatientX: Survivalist – DualFuelStudios

The team at DualFuelStudios is developing a gorgeous online, open-world, post-apocalyptic survival game called PatientX: Survivalist.

The Unreal Engine community never ceases to impress us with its unending talent and ability. Thank you for sharing your incredible projects with us.

Do you have an incredible project to share with us?

Share in-engine stills or video of your stunning projects on Twitter, Facebook and Instagram, and mention both the Unreal Engine and NVIDIA accounts in your social media posts. We'll be on the lookout!

It is always difficult selecting only three submissions. If you’ve entered previously, but haven’t been selected thus far, we encourage you to submit again.

Good luck!

Blasting Off with ‘Morgan Lives in a Rocket House’ in VR

Unreal Engine 4 has proven its versatility and benefits countless times across a wide range of creative disciplines. Animation is no different, and in 2015 Epic Games set out to showcase UE4's power as an animation tool with the release of “A Boy and His Kite”.

After watching the short, self-taught animator Peter Monga knew he had found the solution he needed to begin working on Morgan lives in a Rocket House, a fun and colorful pre-school show inspired by classic stop-motion and animated by none other than Monga himself.

Tempted by UE4's VR capabilities, Monga recently released Morgan lives in a Rocket House in VR, turning his show into an interactive experience for Oculus Rift and HTC Vive. We caught up with Monga to find out more about the project.

Tell us about your background in animation.

I'm a self-taught animator from Auckland, New Zealand. I got my first job as an animator in 2004 at Huhu Studios, a little (at the time) animation studio in Warkworth, NZ. There I worked on various TV and direct-to-DVD cartoons with Veggie Tales most likely being the show with the highest profile. In 2010, I moved back to Auckland to work at Oktobor Animation, which was doing work for Nickelodeon at the time. I was an animation director on shows such as Penguins of Madagascar and Kung Fu Panda: Legends of Awesomeness, as well as director on a few in-house projects.

What was the inspiration for Morgan lives in a Rocket House?

I've always wanted to create my own little animated series in the style of old stop-motion TV shows like The Wombles, Postman Pat, and especially Pingu. I also had a desire to design a show that would allow a small-scale team (i.e. just me) to create something that would hopefully be indistinguishable from contemporary CG preschool shows.

Before coming up with the actual concept, I came up with ideas on how to make a show with no budget. I wrote down a set of rules that would restrict the scope and design of ideas for the show.

My Self-Imposed Restrictions:

  • No hands or feet, so fewer bits to animate
  • Use narration, so there's no need for lip-sync
  • No superfluous, dangly things like loose clothing or tails
  • Animate at 12fps to mimic a stop-motion aesthetic and save on frames to polish
  • Simple textures
  • Only one location where all the stories take place
  • Limited cast of characters
  • Keep character-seconds low (i.e., fewer characters per shot)
  • Five-minute episodes


With those restrictions in mind, I sat down and just wrote pages and pages of ideas that would fit the criteria. I would then draw up sheets of concepts based on those ideas, whittling it down to what eventually became Morgan lives in a Rocket House.

You could say that Morgan is the direct result of intentionally trying to come up with a TV show developed with a distinct set of restrictions to allow rapid production. That makes it sound rather soulless, but I feel I have managed to create a show with heart despite its 'intentional' origin.

Apart from specifically designing the show in a way that would make it easier to produce, I also looked for software and techniques that could potentially speed up production. I was looking for solutions around the time Epic released “A Boy and His Kite”. The visual fidelity of that short was much higher than I was aiming for, and it convinced me that real-time rendering using UE4 was definitely the way to go.


How was Unreal Engine 4 incorporated into your workflow?

I use Unreal Engine 4 as a replacement for the shading, lighting, effects, rendering, and compositing stages of a usual animation pipeline.

The lighting is baked in a “master” map that I duplicate for each new episode. I tried to design the lighting so it could work with any camera angle, but occasionally I'll need to add an extra fill light on a shot-by-shot basis. In those cases, I'll just add a dynamic light and not worry about re-baking.

Using Sequencer, I bring my animation assets into UE4 and set up each scene with the various props, characters and cameras. I was originally using Matinee, but Sequencer is so much easier.


Post-processing effects allow me to get a look that would usually require extra work in After Effects. Things like colour grading, vignettes, chromatic aberration, bloom, etc., can all be done in-engine. I still do some effects in After Effects, such as when a character draws a line with a pencil. Little things like that could be done in UE4 if necessary, but some things are just quicker in After Effects.

I have not done much effects work in my career, but I found UE4's Cascade particle system easy to learn. Morgan isn't exactly effects-heavy, but I have done a few effects like pouring water out of a watering can and a glitter explosion, both of which turned out pretty well. I even made a swirling vortex of paper planes for the VR episode!


What are some of the benefits of using UE4 as an animation tool?

The main benefit of using UE4 is, by far, the time it saves. I can light, render, and composite an entire show in about three days. Each show is about 80 shots, and there is no way I'd be able to light, render, and composite 80 shots in three days using my old production pipeline.

Aside from time, UE4 also saves on machines (no need for render farms), which in turn saves on render licenses and power use.

Higher-grade rendering effects that I originally thought would be cost-prohibitive are easily achieved in Unreal, such as depth of field, global illumination, high-quality ambient occlusion, reflections, refraction, etc. If I were using offline rendering, I would have made compromises on effects like these to reduce render times.

Another advantage of using Unreal is faster iteration. If a change is needed, you don't have to re-render the entire shot. For example, if I need to nudge a camera over by half a field, I can shift it without any hesitation because I know it will only take a minute or so to re-output the shot, rather than waiting hours for a new render.


How would you describe the transition to working with UE4 for animation?

I would say the transition from animation to UE4 was fairly simple. Coming from Maya, I felt very at home in the Unreal Engine interface. Things just kind of made sense.

To learn Unreal Engine, I basically read the Docs top to bottom, and then followed various “Getting Started with UE4” tutorials on the Unreal Engine YouTube channel. Even though I had no plan to create interactive experiences when I first started, I felt it was important to know about all aspects of UE4.


As a one-person project, did you have to take any specific measures to manage scale?

In addition to the previously imposed restrictions, I try to keep scope in mind when writing stories. For example, I try to make sure the scenes don't have too many characters in them for too long. More characters on screen equals a longer time to animate. If there are multiple characters in a scene, I try to cut away to individual shots as much as possible. That cuts down on animation, but it also makes the scene feel more lively.

I also try not to introduce too many new props in an episode. The less pre-production I need to do on an episode, the better.

What motivated you to create a VR episode of the show?

I'm a big fan of VR. I think it's a lot of fun and seeing as my assets were already in UE4 I thought I may as well give it a go. I didn't go in thinking I had some grand plan to create a new form of storytelling or try to be on the cutting edge of a new medium. I just thought it would be fun.


What sort of challenges did you encounter in the shift to VR?

All of my assets for the flatscreen show are fairly high-resolution, so I thought I would have to manually down-res my assets to reach the performance targets for a VR game (11ms a frame), but Unreal Engine's built-in LOD system essentially made it a non-issue. Setting up the LOD for the models and choosing a Maximum Texture Size for textures was very easy to do and saved potentially hours of manual work.
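For context, the 11ms figure follows directly from the headset's refresh rate: at 90 Hz there are only about 11.1 milliseconds available to render each frame. A quick sketch of that arithmetic (the function name is just for illustration):

```python
# Frame-time budget: the time available to render one frame
# at a given display refresh rate.
def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

# The original Rift and Vive both refresh at 90 Hz, leaving
# roughly 11.1 ms per frame (the "11ms" target mentioned above).
for hz in (60, 72, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```

Any frame that takes longer than that budget is dropped or reprojected, which is why VR optimization targets are usually quoted in milliseconds rather than frames per second.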

Epic's various videos and livestreams detailing the production of Robo Recall really helped when it came to optimization. Learning about the forward renderer, MSAA, Quad Overdraw, composite textures, and not spawning things on a single frame all helped to increase performance in my project.


Storytelling in VR was something entirely new to me. I think it's more akin to theatrical productions than to the screen. I found it quite difficult to direct a story without the luxury of cuts. I had to find ways to get the viewer to look where I wanted, rather than just being able to cut to what I wanted them to see. I think I managed to pull it off using contrasting movement (i.e., having the thing I want the viewer to look at move faster than the other objects to grab their attention), characters' facing directions and pointing (people tend to look where other people are looking or pointing), and positional sound.

There's always the chance the viewer could just look in the wrong direction, but I figure that someone watching a TV screen could potentially look away from the TV too. It's just about keeping the viewer engaged so they won't want to look away.

I also couldn't really think of a good way to storyboard a three-dimensional space, so I ended up skipping storyboards. I blocked out where I wanted the action to occur using Unreal's VR editor, and then I drew a playbook-style game plan of where I wanted the characters to be. Animation layout was then based on this game plan.


Animating for VR was a challenge I hadn't really anticipated. I'm accustomed to animating at 12 or 24 fps, but anything below 90fps in the Vive or Rift stutters unbearably, so I had to make sure to polish at a higher frame rate. I ended up animating at 24fps, polishing at 48 and baking at 90. In hindsight I should have animated at 30fps so that the original frame rate divided evenly into 90, allowing the keys to match up better with the output frames.
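The frame-rate mismatch he describes comes down to simple divisibility: keys land exactly on output frames only when the baked rate is a whole multiple of the animation rate. A hypothetical check:

```python
# Keys line up with output frames only when the output rate
# is an exact multiple of the animation rate.
def keys_align(anim_fps: int, output_fps: int = 90) -> bool:
    return output_fps % anim_fps == 0

# 24 fps doesn't divide 90 evenly (90 / 24 = 3.75), so keys drift
# between output frames; 30 fps does (90 / 30 = 3), so every key
# lands exactly on an output frame when baking.
for fps in (12, 24, 30, 45):
    print(f"{fps} fps aligns with 90 fps: {keys_align(fps)}")
```

By the same logic, his original 12 fps stop-motion rate doesn't divide 90 evenly either (90 / 12 = 7.5), which is why re-timing for VR was unavoidable.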

Another challenge was that the user could potentially look at any character at any time. Normally you wouldn't animate a character if they were off-screen, but in VR there is no off-screen. So that meant more animation than I would usually expect to need for an episode of this length.


Do you have any tips for other animators considering incorporating Unreal Engine 4 into their own projects?

Try to get an overall view of the engine before diving in. Learn the unique terminology that you may not have encountered in offline-rendered animation, such as lightmap resolution, static and stationary lights, cascaded shadow maps, Lightmass, and Lightmass Importance Volumes; all of this information can be found in the docs. You can probably get away with not knowing everything about the engine, but once you learn the differences it's easier to see the similarities.


What's next for Morgan lives in a Rocket House?

I plan on releasing more episodes on YouTube soon via https://www.youtube.com/MorganlivesinaRocketHouse.

The VR episode is on Steam and the reception so far has been great. If it is popular enough, I'll look into creating more episodes and offer them as DLC.

I haven't actually thought this through properly yet, but maybe I could try writing episodes that work as both VR and flatscreen versions. Maybe the VR version could even reveal secrets not shown in the normal episode.

I'm also keeping a blog where I'll be posting about the production.