
Setting Up A Chroma Key Material in UE4

A bit over two years ago, someone on the UE4 AnswerHub asked about chroma keying (a.k.a. green screen) in a material. That post led to me mocking up a very basic chroma key function called Chroma Key Alpha that was added to UE4 a while ago. While it was in the release notes, our notes can be pretty long and it is easy to forget about features that are buried deep within them, especially if you had no reason to use them at the time.

AR is on the rise, with many new tools and devices being supported all the time; UE4's recent support of ARKit and ARCore is a testament to this. You can read about that, including some amazing projects, in Tim Sweeney's post about WWDC here.

With all this going on, it seems like a good time for some examples of how you can set up a chroma key material in UE4. While most AR demos are about compositing digital objects onto live video, some projects require mixed reality – projecting live subjects into digital scenes. This usually involves a green screen setup of some kind, which can be a bit more challenging since you won't have a built-in alpha.

The process of extracting an alpha from a green screen is referred to as chroma keying. We recently added a UE4 plugin called Composure which makes it easier to mix and match post-processing and rendering elements, and that could be a great place to try out using a live chroma key material.

[Image: GreenScreen Pic 1]

I will preface this by saying that getting high-quality chroma keying results is difficult and often requires different techniques to be mixed. Numerous software packages offer pretty advanced methods for this, such as Nuke. Many of the things Nuke will do to get a very high-quality chroma key alpha can be computationally intensive. Certain projects, such as live broadcasts, will tend to use expensive dedicated hardware solutions.

That means we need to assume that our real-time version will be fairly basic in comparison and will probably require lots of content-dependent tweaks to look acceptable. Still, being able to previs and test our chroma key right away in the editor is very useful.

The Basic Methodology

The idea behind chroma keying is to create a color comparison mask from which to generate an alpha mask. Then a secondary mask is used to 'despill' or remove the green cast from objects. The despill mask is usually just a softer, inverted version of the alpha mask result. Finally, a version of the despill mask can be used to add back in some fake ambient lighting, replacing the green cast with a color matching the environment to be composited into.

There are almost infinite ways to go about the relatively simple steps above. The first version I have made is pretty basic and has been in UE4 for a while. As mentioned above, this is Chroma Key Alpha, and a basic example of its usage appears like this:

[Image: GreenScreen Pic 2]

It takes an image color input, a chroma color, and a few inputs for alpha and despill masking. Rather than sticking to just examples using this node, I am going to show how each step of the function is performed, as well as some better versions of the original setup.

Color Extraction

The first step in generating the color comparison is to remove luminance from the image so that subtle shadows, creases, or lighting gradients on the green screen will not interfere. In the first version of this function, I ended up simply normalizing the colors. This works fairly well but results in some edge artifacts.

Another way to remove the brightness is to just divide by the component sum. This gives a nice even tone, but it tends to create floating edge halo artifacts. I realized that using a color layer in Photoshop did not give such edge artifacts, so I decided to think about how to remove them.

The answer was to use luminance-based saturation, which prevents very dark pixels from becoming fully saturated in the color map. As it turns out, the dark saturated edge pixels come from bicubic texture resizing.

This test image is from the TV show Lost in Time, which uses UE4 for rendering scenes. Note that this test image is not high-quality raw source; it is a low-quality, passed-down JPEG.

[Image: GreenScreen Pic 3]

To generate the luminance-based saturation map, the image is first desaturated, and a luminance curve is generated using a simple exponential function of e ^ -x. The luminance value x is scaled by a parameter for defining the strength of the luminance mask. It is normalized so that 1 is a good default and 0 will give a result exactly like the divide by sum version. Here is the code for that:

float3 ExtractColor(float3 Color, float LumaMask)
{
    float Luma = dot(Color, 1);
    float ColorMask = exp(-Luma * 2 * PI / LumaMask);
    Color = lerp(Color, Luma, ColorMask);
    return Color / dot(Color, 2);
}

Once the color map is extracted, the next step is to generate a mask by doing a comparison. This step is pretty simple. First, the ChromaColor should be run through the same ExtractColor function (or normalize, etc., if that approach is used). The difference between the color map and the ChromaColor is then taken, the length of that difference is calculated, and a sharp mask is extracted from the broad gradient to isolate the specific desired range of error.
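
Expressed as a minimal HLSL sketch building on the ExtractColor function above (the AlphaMin and AlphaStrength parameter names are illustrative, not necessarily what the packaged node uses internally):

float ChromaKeyAlpha(float3 Color, float3 ChromaColor, float LumaMask, float AlphaMin, float AlphaStrength)
{
    // Run the image color and the key color through the same extraction.
    float3 ColorMap  = ExtractColor(Color, LumaMask);
    float3 ChromaMap = ExtractColor(ChromaColor, LumaMask);

    // Broad gradient: how far this pixel's hue sits from the key color.
    float Diff = length(ColorMap - ChromaMap);

    // Sharpen the broad gradient into a usable alpha mask.
    return saturate((Diff - AlphaMin) * AlphaStrength);
}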

[Image: GreenScreen Pic 4]

Here is an example of that in nodes, with the ExtractColor function above:

[Image: GreenScreen Pic 5]

Note that in the above example I am using 'Chroma Alpha Strength' as a multiplier, which I think is a more intuitive way to specify the edge sharpness. In the UE4 packaged material function, I specified this using a Min and a Max. The result is the same, but Max always needs to be set higher than Min, so the two values have to be adjusted in tandem.
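
To see why the two parameterizations are equivalent, consider a two-line sketch (Diff being the length of the color difference from above; the parameter names are illustrative):

float MaskA = saturate((Diff - AlphaMin) * AlphaStrength);          // multiplier form
float MaskB = saturate((Diff - AlphaMin) / (AlphaMax - AlphaMin));  // Min/Max form; identical when AlphaStrength = 1 / (AlphaMax - AlphaMin)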

Despill

After extracting an alpha mask using the chroma key, the next important step is to despill the remaining pixels. That means getting rid of all the green cast on the silhouette of the subjects. This is necessary due to a variety of effects from both lighting and cameras. For one, the larger the green screen, the more green bounce light it will cast onto the subjects. Also, cameras tend to pick up a variety of lens artifacts which can cause a bit of bloom from bright pixels onto subjects, even if the bounce light is minimal on set.

A good start to handle the despill is to use the same setup as the alpha mask, but using a wider range of values to give a softer mask. In the built-in function, this is done by exposing a separate Despill Max. The Min used for despill will be the same as the one used for the alpha.

Once you have a despill alpha, it can be used to remove the cast of the chroma color from the image. Originally, I was just desaturating using the despill alpha, but better results can be had by subtracting the portion of the color that matches the chroma color from the source image.
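
Here is a hedged HLSL sketch of that subtraction – one possible formulation, not necessarily the exact math used by the built-in function:

float3 Despill(float3 Color, float3 ChromaColor, float DespillAlpha)
{
    // Find the portion of the pixel that lies along the chroma color's direction.
    float3 ChromaDir  = normalize(ChromaColor);
    float3 SpillColor = dot(Color, ChromaDir) * ChromaDir;

    // Subtract that portion where the despill mask is active,
    // clamping so no channel goes negative.
    return lerp(Color, max(Color - SpillColor, 0), DespillAlpha);
}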

[Image: GreenScreen Pic 6]

Here is the basic logic for how the Despill Alpha can be used to remove chroma spill from the image:

[Image: GreenScreen Pic 7]

Note that for this image, a much higher quality result can be achieved by using two Chroma Colors and two Chroma Key Alpha nodes. You can tell because the color map has a pretty different color on the floor versus the wall, so when using only a single chroma comparison, a much wider cutoff must be used. This means you lose flexibility in the edge softness, which makes things like the motion-blurred lever tricky to fix. To use two, the result of each alpha would be combined using Min. The Despill Alphas would be combined using Max.
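
In shader terms, that combination might look like this, with Alpha1/Alpha2 and Despill1/Despill2 standing in for the outputs of the two Chroma Key Alpha evaluations:

// Keep a pixel only if both keys keep it; despill wherever either mask is active.
float CombinedAlpha   = min(Alpha1, Alpha2);
float CombinedDespill = max(Despill1, Despill2);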

Adding fake bounce is pretty simple. It is a good idea to desaturate the base image colors before using them with a manually specified bounce or background color. That is for two reasons. First, you need to get rid of any existing cast in the source, and you really just want the luminance. Second, most of the cast will be from specular surfaces due to glancing angles, and for all non-metals, specular is completely desaturated and only colored by lighting.
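
Here is a minimal sketch of that idea, assuming standard Rec. 601 luminance weights and an illustrative BounceColor parameter:

float3 AddBounce(float3 Color, float3 BounceColor, float DespillAlpha)
{
    // Desaturate first: we only want the source luminance, since most
    // of the cast comes from desaturated specular response.
    float Luma = dot(Color, float3(0.3, 0.59, 0.11));

    // Tint the luminance with the chosen environment color and blend
    // it in using the despill mask.
    return lerp(Color, Luma * BounceColor, DespillAlpha);
}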

[Image: GreenScreen Pic 8]

Somebody who does compositing professionally may have to get pretty detailed about how they mask and add fake lighting, including using separate rendered elements to help pick up the right lighting response. That goes a bit beyond the scope of this article but hopefully this at least gives an idea for how you can approach handling live compositing.

Here is another example, this time using a higher quality image that was given to me by Joe Wilson from Epic's video production team. This is a shot of Tools Programmer Lauren Ridge taken during the making of the Star Wars VR Experience demo shown at WWDC (link at top of page). With a higher quality source, you can get much better results. This example uses the existing version of Chroma Key Alpha.

[Image: GreenScreen Pic 9]

The added bounce in the last frame may be difficult to perceive except around the edges, especially on the darker reflective bits such as the Vive headset and controllers. Notice that the added bounce brings back reflections of the sky on those elements, which helps them fit into the comp.

Here is the whole material for the above setup with despill and bounce/edgebleed color added:

[Image: GreenScreen Pic 10]

Composure

At the start I mentioned Composure briefly. I have not yet tried adding a chroma key mode to Composure, but I have dug around in the example and it looks like it would be pretty simple to set up. There is a Final Compositing Material that handles the various compositing setups. For the example project, all of the color masks were imported from a program like Nuke, but the usage in the material could just as easily be replaced with the chroma key node setup described here. 

That is everything for now, and we look forward to seeing what the community does with mixed reality and Unreal Engine.

EDITOR’S NOTE: This blog post first appeared on Ryan’s personal blog, ShaderBits.com, and has been repurposed here for broader Unreal Engine community consumption. You can follow Ryan on Twitter, @ShaderBits.

Revving the Engine: Everspace

If you are wondering what Everspace is, one enthusiastic fan put it best, saying, “Everspace is like Freelancer and FTL made sweet love and had a baby!” Even without the exclamation points of fandom, Everspace is an impressive new breed of space game – a narrative-focused roguelike that is easy to pick up and play, yet challenging to the very end.

The core of Rockfish Games is best friends Michael Schade and Christian Lohr. The pair have been making beautiful digital imagery since their student days in the late 80s, eventually becoming the talent behind one of the leading German CGI studios of the 90s, creating architectural visualizations using game engines. From there, they moved into mobile game development as Fishlabs Entertainment, producing highly rated titles like Rally Master Pro and Snowboard Hero. Their biggest hit was Galaxy on Fire 2, a space fighting sim. Despite predicting the gaming potential of mobile, they failed to see the market’s shift to free-to-play, and Fishlabs folded. Undeterred, Lohr and Schade embarked on independent development for PC and console games. Everspace is their big comeback.

Everspace is currently available on Steam, Xbox One and PS4. The team at Rockfish Games is now working on a large add-on featuring a new player ship, weapons, questlines and characters, adding about ten hours of gameplay. Though the add-on is set for release in October, Michael Schade was kind enough to take the time to answer our questions about the game. It’s a deep dive, so I’ll cut the preamble short and get right to it.

[Image: Everspace Pic 1]

Why develop space sims? What’s your attraction?

Well, we have been creating 3D space shooters since 2004 and some 30 million gamers have played the Galaxy on Fire games. Galaxy on Fire 2 and its HD version became wildly popular, so we thought we might be able to reactivate our fan base and bring them over to PC and console. Plus, everyone on the core team has always been a big fan of Wing Commander and Freelancer, so we felt there was more than enough space for yet another space game, even next to genre juggernauts like EVE Online, Elite: Dangerous, No Man’s Sky and Star Citizen.

Space sims seem to be having a bit of a resurgence, especially in the indie sphere. Can you ruminate on why space sims fell from favor, and why they’re coming back now?

I think it all started back in 1990 with Wing Commander from Chris Roberts, which brought a good-looking three-dimensional game experience to the PC for the very first time. When GPUs were coming to the PC during the fifth generation of consoles in the mid-90s, we saw the release of lots of great 3D space games like Star Fox 64, Colony Wars, Descent and Rogue Squadron, just to name a few.

The reason for this is obvious. Rendering performance was still the bottleneck, but it only takes a few polygons moving around in 3D space in front of a beautiful backdrop to make a good-looking game. And, unlike in first- or third-person shooters, there is no character animation to worry about. Actually, this is pretty much why we started making Galaxy on Fire for Sony Ericsson and Nokia phones back in 2004, when we had to deal with a 100 MHz CPU, no GPU whatsoever, a screen resolution of 128 x 128 pixels and a maximum file size of 350 KB for the whole game.

[Image: Everspace Pic 2]

With the rapid improvement of rendering power on PCs and the seventh console generation, in combination with sophisticated game engines and 3D design tools, first-person shooters quickly stole the thunder from space shooters. We believe it is because of the immersive experience you get when playing in first-person view, combined with less complex controls and better orientation thanks to mostly moving on a plane instead of dogfighting in a 3D space environment with six degrees of freedom and enemies approaching from all angles.

The last major space game hit of that era probably was Freelancer. There were other great space games afterwards but with less commercial success. Publishers turned away from the genre. Ironically, in spite of the commercial success of Elite: Dangerous, No Man’s Sky and even Star Citizen, the latter making over $150M in revenue even before being released, the reluctance towards space games had not changed one bit when we pitched Everspace to publishers with a rock solid pre-Alpha about two years ago.

There certainly still is a large audience for space games out there. And since the big publishers haven’t served that niche for quite a while, it was just a matter of time until indies took the risk of filling that void. Lastly, since we’ve been getting lots of praise for Galaxy on Fire from more than 30 million players on mobile phones for over a decade now, we were sure that we’d been onto something all along. Obviously, Chris Roberts, David Braben and Sean Murray thought so, too.

Can you speak a bit about your decision to marry the space sim to the roguelike? It seems an unusual choice at first, but once you get into the game, it’s a perfect accompaniment. What about the mechanics of roguelikes appeals to you?

Our core team has been playing a lot of roguelikes and has become quite fond of the genre over the years. A roguelike 3D space shooter had not been done before, so we figured that roguelike elements were something that would set us apart from all the other space games out there. Also, since many roguelikes are indie productions with 2D retro graphics we felt that we could offer something new and draw a wider audience, including players who’ve never even played a roguelike before, by making the game as visually appealing as possible.

What we love most about roguelike mechanics is the high replay value you get with randomization and procedural generation, and that you can’t get better at these games by learning certain encounters by heart and trying the exact same thing over and over again. As you’re always going to be thrown into different situations, you only get better by gradually learning more and more about the game mechanics and developing a better understanding of what works best under which circumstances. Roguelikes demand a lot of fast decision making and improvisation – they keep you on the edge of your seat at all times and can be quite hard, but it’s a great feeling when you manage to get out of yet another sticky situation.

[Image: Everspace Pic 3]

How do you approach narrative in Everspace, considering the roguelike elements?

Everspace takes place in the distant future, so we made use of the cloning theme, which is a perfect fit for a game in which you die a lot. While this constant resurrection is explained by the player being a clone pilot, the randomly generated areas are the result of a newly plotted course through space at the start of each run. There was no need to explain the roguelike elements in Everspace, but since they were a good match for the story we wanted to tell, we took advantage of that and brought both together.

While the player is in a constant loop of dying and reviving, the characters he meets are not. We carefully placed encounters along the player’s journey that, in combination with the flashback cinematics, slowly reveal both the player character’s past as well as the nature of his mission.

One could say that we embraced the logical implications that come with certain roguelikes and tried to explain them within the narrative structure. The sci-fi theme just seemed like a perfect fit.

The game looks gorgeous. How’d you go about making space visually sumptuous?  

We had been creating 3D space shooters for low-spec feature phones all the way up to powerful smartphones for more than ten years, so we knew how to make space look cool. However, we knew there was no point in trying to compete with the photorealism in Elite: Dangerous or Star Citizen with their AAA budgets and huge teams of hundreds of developers. More importantly, we wanted to have an art style that fits the character of an arcade space shooter, featuring vibrant colors and stunning visual FX for firing weapons, big explosions, natural hazards and whatnot. It had to make you want to dive right into the action. For this, Guardians of the Galaxy was one of the greatest influences.

So we went for a more romanticized interpretation of a fantasy space setting with a unique color palette for each of the sectors. These define the colors of up to three suns per solar system and the predominating tint of the sector. As is typical for roguelikes, the jump targets of each sector are procedurally generated levels, filled with various types of backdrops, planets, nebulas, asteroids, space stations, shipwrecks and other points of interest.

[Image: Everspace Pic 4]

These assets are combined following fairly strict rules to create a specific look and feel, with enough freedom to make every location look unique. Now, to achieve that AAA-ish look, all 3D props were designed in a classic art pipeline, starting out quite rough with concept art thumbnails, followed by 3D block-out models with several variants and complexities for playtesting, concept art overpaintings to define details of hero assets, 3D high-poly meshes to create textures and maps for various shaders, and finally in-game low-poly models with five LODs each.

A lot of effort also went into creating top-notch visual FX by taking full advantage of modern GPUs. This is where the Blueprints in Unreal Engine shine. Marco Unger, our Lead and Technical 3D Artist, had great fun going crazy combining all sorts of shaders to render GPU particles like there is no tomorrow. Don’t ask me how exactly he does it. Every time you walk past his workplace, you’ll see amazing stuff on his monitor. The design process for making top-notch VFX in UE4 is so fast there is almost no need for pre-production.

Was the process similar for the sound design?

Often in game development, creating and implementing great sound is neglected. We believe a captivating sound experience is almost as important as having stunning visuals in the game. Especially in a space shooter without a lot of things around you, firing an energy weapon or ballistic gun not only has to look good, it also has to sound satisfying. How disappointing would it be watching baddies blow up in a huge explosion without a massive KABOOOM!? Of course, we still get smart comments that there are no explosions or sound in space. Don’t tell George Lucas, I guess.

The sound design in Everspace goes way beyond those basics. Gero Goerlich, our Sound Director, has been working with us for ten years, mostly on space shooters, and he has spent countless hours researching cool real-world sounds, tweaking and mixing them with electronic samples in our in-house sound studio until every single weapon, device, ship engine, shield charge, damage impact, warning message and every little blip got just the right sound. In total, there are over 900 unique sounds and more than 60 originally composed music tracks in Everspace, most of which are blended in real time, depending on what’s going on in the game.

[Image: Everspace Pic 5]

Your past work was primarily on mobile. Why make the move to indie? How did the transition go? Learn any valuable lessons?

We didn’t plan on going indie. When our previous mobile games studio was in financial trouble, we fought to adapt to the massive changes in mobile gaming. Ironically, we had to be forced out of the studio before we sat down and asked ourselves what we were good at, what we would really like to do and what our opportunities were.

The transition from mobile to PC went smoothly. We had worked on the PC and Mac versions of Galaxy on Fire 2 before, so we already had a glimpse of what to expect. When the time came, the switch from a proprietary engine running on restrictive devices to a high-end cross-platform engine on the most powerful platform was exciting for us. Our mobile legacy helped us contain ourselves and take small steps. After all, we suddenly had much more power, and with more power comes more responsibility – gamers on PC and consoles expect a different level of quality than mobile gamers, who are more inclined to forgive graphical glitches, crashes and choppy performance. However, in the end it all boils down to transforming a good idea into a fun and contemporary game, regardless of the platform you are developing for.

Are there any unexpected benefits or challenges to using Unreal Engine 4?

When we entered mobile gaming in 2004, there was no tech available to develop 3D mobile games. We had to create our own proprietary 3D mobile games engine. It was super lean on features but heavily optimized for performance. With the rapid performance gains of smartphones, it was an uphill battle for our small engine team to keep up with the feature set and functionality of Unreal. Our artists pushed towards making the move. Eventually, we had to use Unreal Engine 4 because the AAA publisher we signed with had licensed it and we had lost the rights to our previous tech. In hindsight, this was one of the best things that has ever happened to us.

Working with Unreal is a totally different experience. We have everything in one place: meshes, textures, materials, UI elements, code, Blueprints and so on. Pressing play lets us immediately test things, which allows for very fast iteration cycles. Cooking times for release builds are high, though, and since some bugs only occur in those builds, we often have to wait half a day to see if a change has fixed a problem.

Engine updates are also a double-edged sword. They bring a ton of cool features and bug fixes, but also introduce new bugs and sometimes dictate new ways of doing things that require refactoring on our end. We often delay before migrating the project to a new major engine version.

We were surprised by the number of features that artists can carry out themselves. Our technical artist took great pleasure in constructing smaller Blueprints and became so good at it that he is now able to create complex Blueprints without any help. Even artists who don’t want to delve into the Blueprint system at least understand the greater context and how assets are related to each other, and can thus find and fix smaller issues. This is a great benefit compared to multiple proprietary tools that demand specialized operators.

This series of articles is made possible through the generous sponsorship of Epic’s Unreal Engine 4. Every month, we profile the recipient of an Unreal Dev Grant. While Epic puts us in touch with our subjects, they have no input or approval in the final story.

Unreal Engine at EGX 2017

With nearly 80,000 attendees storming through the doors of the NEC in Birmingham, it was another record-breaking year for EGX and a perfect way to mark the event's 10th anniversary!

I headed up to the show to give a talk on Real Time Rendering for Artists at Ukie’s careers bar. In case you couldn’t make it, I’ve converted my slides to a blog post which you can read here.
[Image: Talk at EGX]

With my talk successfully given, I started scouting for all of the playable Unreal Engine games on the show floor – and my goodness, I was kept busy, with 41 Unreal Engine-powered titles all over EGX! Here’s my EGX experience in pictures:
[Image: Panel 1]

From top left: 2000 to 1: A Space Felony by National Insecurities, Absolver by Sloclap, Alchemaic by Broken Pixel Studios, Among the Stones by Bluedoor Games, Ark: Survival Evolved by Studio Wildcard, Ascend by Final Forge, Battalion 1944 by Bulkhead Interactive, Beyond Arm’s Reach by Round Squares Studios & Death is a Desk Job by Grim Inc.
[Image: Panel 2]

From top left: Dragon Ball FighterZ by Bandai Namco, Exposure by Drift, Falling Sky by NFTS, Genesis Alpha One by Team 17, Hero Science! And the Crystal Conundrum by Foxtrot203, Hidden Agenda by Supermassive Games, Honeypot Espionage by Pocket Sized Hands, I Am Here by Lavalamp Games & Jeff’s Tower by Filthy Fresh Studios.
[Image: Panel 3]

From top left: King of the Castle by 8Bit10, Moo Moo Move by Mochi Software, Moss by Polyarc Studios, PlayerUnknown’s Battlegrounds by Bluehole, Prowl by Ethereal, QUBE2 by Toxic Games, Sea of Thieves by Rare, Shape of the World by Hollow Tree Games & Shift Quantum by Fishing Cactus.
[Image: Panel 4]

From top left: Shooty Fruity by nDreams, Six-Slided Slime by Slime Time Studios, Skye by Puny Astronaut, Stellarmania by Dark Planet, Street Fighter V by Capcom, Switchblade by Lucid, Tetra: Elemental Awakening by Ocean Spark Studios, The Occupation by White Paper Games & The Turing Test by Bulkhead Interactive.
[Image: Panel 5]

From top left: This Dead Winter by Nocturnals Games, Tracks – The Train Set Game by Excalibur Studios, Unbox: Newbie’s Adventure by Prospect Games, Unjust by Static Shell Studios & Vert by Inside Out Games.

Before we wrap up, there are also a couple of shoutouts we need to make. First to 8Bit10 and their game King of the Castle – congratulations on placing second at Dare Academy and we hope you enjoy your trip to China! Second, it was awesome to hear White Paper Games being featured in BBC Radio One’s Newsbeat coverage of EGX!

We hope you enjoyed the show!

Epic Games Goes to GitHub Universe

[Image: GitHub Universe]

From October 10-12th, GitHub Universe will bring its community of developers and business leaders to Pier 70, in San Francisco, for advanced skill building sessions, deep dives into how some of the most successful teams use GitHub, and keynotes from leading software company executives.

It’s the Unreal Engine team’s first visit and we’ll be on the floor, demoing a number of projects:

Ghost Paint

A VR graffiti app that allows live air brushing and spray painting onto a variety of surfaces, Ghost Paint was developed by Epic’s Shane Caudle.

VR Editor

The VR Editor enables you to design and build worlds in a virtual reality environment using the full capabilities of the editor toolset combined with interaction models designed specifically for VR world building, and we’ll have our beach scene on hand for you to have some fun in the sun.

AR

ARKit and ARCore samples will be on display, running on Apple and Google devices for attendees to try. We’ll have everything running in the editor as well, so everyone can see how it all works!

Be sure to take a look at the full program of sessions that will take place. We hope you’ll stop by and see us at our booth!

NVIDIA Edge Program Winners – September 2017

With the NVIDIA Edge program, Epic Games has partnered with NVIDIA to award high-end hardware to individuals or teams whose projects are pushing the boundaries and striving for visual excellence. For their hard work, each winner will be receiving a GTX 1080 Ti or GTX 1080.

This month's winners are:

Armstrong Townhouse – Michael Banks, Tim Polyak, and Elizabeth Smith

Armstrong Townhouse is a beautiful architectural visualization of a contemporary living space featuring custom craftsmanship and detailed amenities.

Dynamic Grass System – Sławek Krężel

Sławek Krężel created an outstanding Dynamic Grass System that is reactive to wind, fire, and other elements.

Wrench – Alec Moody

Wrench is a VR experience, by Alec Moody, in which you repair and modify cars for motorsports. 

As always, the submissions this month were incredible. Thank you for showing us your projects and inspiring those around you in this great community. Congratulations to this month's winners!

Want to submit your project?

Share in-engine stills or video of your stunning projects on Twitter, Facebook and Instagram, and mention both the Unreal Engine and NVIDIA accounts in your social media posts. We'll be on the lookout!

It is always difficult selecting only three submissions. If you’ve entered previously, but haven’t been selected thus far, we encourage you to submit again.

Good luck!

Exploring the Tactics Behind TINY METAL

It is said that imitation is the greatest form of flattery. This is true for Area 35 – a small team of independent developers who have taken on the task of developing a spiritual successor to the beloved cult classic turn-based strategy title Advance Wars. Their game, TINY METAL, will be instantly recognizable to fans of the aforementioned franchise when it arrives this fall for Nintendo Switch, PlayStation 4, Windows, and Mac OS X. Of course, the team is also looking to attract new players who might have missed out on the ‘Wars’ series itself or have never touched the genre to begin with.

I recently caught up with Producer Hiroaki Yura of Area 35 to find out more about the origin of TINY METAL, what actually makes it different from its inspiration and how Unreal Engine 4 is helping a tiny team accomplish its biggest mission yet.

Q: How did the initial concept for TINY METAL come about?

Quite simply, we missed Advance Wars, and seeing that our favourite title hasn’t come back in some time, we thought we’d carry it on with a spiritual successor. However, we believed that making a straight clone would be extremely bland and that it needed to fit in with the times, so we added flavours of our own. We’re a small team, so we couldn’t do everything we wanted to, but we believe it has enough originality to pleasantly surprise gamers and Advance Wars fans alike.

Q: Why is TINY METAL a must-play for fans of the turn-based strategy genre?

We’ve fine-tuned TINY METAL to be simple but fun. We’ve taken a lot of inspiration from our favourite games such as Advance Wars and Final Fantasy Tactics but kept it simple and fun to play. Although Japanese games in this genre tend to be in 2D, we’ve made our game in 3D and put in some great animations as well as a simple but intriguing story in between missions.

Quite simply, it is, we hope, a true spiritual successor to Advance Wars.

[Image: TinyMetal Pic 1]

Q: What all-new mechanics does TINY METAL bring to the genre?

We have several interesting mechanics, most of which are designed to prevent players from pinning each other down and clogging up the map.

Facings

Facings are a very important facet of the game. If a unit is attacked from the flank or rear, it will receive more damage, and the possibility of a critical strike increases.

Fog of War

A common mechanic in RTS/TBS games, this makes having units with good sight important. Having an infantry unit up on a hill increases sight range, and without sight of an enemy prior to movement, a unit cannot attack any unit it finds after the movement phase.

Radar

A radar truck allows the display of blips of enemy units in the fog of war and allows units to attack the blip even without vision.

Focus Fire

This allows multiple units to attack one enemy unit. Should the enemy unit survive the onslaught, they can only return fire at the unit that initiated the focus fire.

Hero Units

These are collectable, unique units that you can find throughout the campaign missions.

The units are stronger than their vanilla counterparts and may also come with extra weapons and parts.

The Hero Units are called in from Comm Towers and can be deployed anywhere you have vision on the map.

Also, these units are capable of retaining their veterancy rank at the end of the battle.

[Image: TinyMetal Pic 4]

Q: What can you tell us about the game’s story?

We kept the story simple enough for kids but intriguing enough for adults.

The story revolves around a young Artemisian Lieutenant, Nathan Gries, who gets caught up in an outbreak of war. However, the reason for the outbreak seems out of place, and Nathan finds himself in a position to find out why the war broke out. The story will allow the player to delve further into the world of TINY METAL and, with the help of Nathan and his friends, find the mastermind of the war and put an end to it.

Q: TINY METAL has a fantastic visual style. Did you land on this right away or did it take several iterations to achieve?

We had a very clear vision of the visual style but, of course, we tested a few variations of it to make sure we were happy with it. Hopefully, if this title sells, we will be adding more units and environments to the game.

[Image: TinyMetal Pic 3]

Q: While visually impressive, the game has very accessible system requirements on PC and will reach a broad audience on console as well. How important was accessibility to the overall project goals?

We feel that players shouldn’t be restricted by the spec of their PC. We’d love for people to be able to play on a wide range of consoles and on PC and Mac.

Q: You recently showed the game at PAX West in Seattle. What has feedback from the community been like and how have you incorporated that feedback into the project?

The feedback has been astoundingly positive. We did receive a lot of feedback on usability and UI, and we will be making changes to deliver those quality-of-life improvements.

[Image: TinyMetal Pic 5]

Q: What can you tell us about the industry experience of the team behind TINY METAL?

A lot of the devs are very experienced, having worked on major titles in Japan and abroad, although the majority are based in Japan.

Many are freelancers who have the same kind of life goals. We’re all amazed at what Western indies can do, and we all feel that we want to become challengers in this industry again.

Q: Why did the team choose Unreal Engine 4 for this project?

The decision was based on Epic Games’ strong commitment to building a solid foundation for Unreal Engine 4. Our decision was made after the announcement of Linux support, something only possible if the engine source code is well maintained and designed. Our experience has not betrayed this expectation. Upgrade after upgrade, Unreal's strong foundation has us focusing on making TINY METAL the best game we can make.

[Image: TinyMetal Pic 2]

Q: Have there been any aspects of Unreal Engine that have proven to be particularly useful throughout development?

UE4 has incredible tooling and workflow for artists and non-programmers. Thanks to these tools our programmers have been able to focus on player-impacting gameplay or UI without wasting any time on plugins or developing artist tools.

Q: Where can people go to learn more about TINY METAL?

You can follow us on @tinymetalgame on Twitter, https://www.facebook.com/tinymetalgame on Facebook or simply go to tinymetal.com

A Layered Approach to AI Locomotion in ‘The Occupation’

Hello, this is James Burton – Technical Artist at White Paper Games. This post intends to show the process we used to create the locomotion system for our AI characters in our upcoming game The Occupation. I worked on this system with Jonny Pickton, our AI designer; Robert Beard, our animator; and Martin Cosens, our programmer. We should preface this by saying that we, as a team, had never had any experience with character animation (or characters at all, for that matter!) prior to making this system. We don’t claim to be pioneering this work; we simply landed on a solution that worked for our project, and we thought it’d be good to share with you all. We’d love any feedback or comments on how we could improve our setup! Our contact info is at the bottom of this article.

The key things we needed from this locomotion system were:

  • Modular, non-destructive elements (adding or removing layers wouldn’t stop the character from working or affect the other layers).
  • A small coding requirement, since we only have one programmer.
  • A tone fitting our game – there is no combat and the AI is calm and observable, so strafing animations won’t work.
  • A system that sticks to UE4 defaults as much as possible.
  • A system that sticks to a constant speed and Turn Rate where possible.

Before doing this, we tried a couple of other methods:

A Full Root Motion Setup: Looked great; however, it didn’t give us the precision we needed for our AI to complete other tasks.

Root Motion / Capsule-Driven Hybrid: Used Root Motion for turns and capsule-driven movement for everything else. Blending from Root Motion velocity back to the capsule was problematic at the time, so we decided to ditch it.

In the end, we went for a fully capsule-driven system with a couple of tricks that allowed us to match the speed of the capsule to the animation playing – but more on that later.

In any case, here is what the UE4 defaults look like with no editing. You can see the character just has a walk cycle and navigates the scene, but doesn’t exactly look natural:

So here’s our attempt at fixing this.

Layer 1 – Turning on the Spot

First we focused on getting the base layer setup. This way, designers could work with the characters and we had a basic level of fidelity. For us, this meant getting the character turning on the spot to face the next target point in the Nav path:

  • Animations were authored to a constant rotation rate.
  • We scaled the Play Rate of the animation depending on the angle the character needs to turn. To do this, we divide the angle we need to turn by the rotation rate of the character. This gives us the desired rotation time for the turn. We then divide the length of the original animation (in seconds) by the desired rotation time, and this becomes the Play Rate of the animation for the turn.
  • It’s a very basic setup – it might even be shippable for some games.

[Image: Play Rate formula]
[Image: Scale Play Rate for spot turns]
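
Restated in plain text, the formula captured in the images above works out to:

Desired Rotation Time = Turn Angle / Rotation Rate
Play Rate = Animation Length / Desired Rotation Time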

Here is a video with the result, as well as some GIFs with the authored animations for this layer:

[GIF: Turn On Spot 45]   [GIF: Turn On Spot 90]   [GIF: Turn On Spot 180]

Layer 2 – Walk Starts

The next layer would consist of walk starts – this way, the character could accelerate into the base walk speed without just popping into it. Since we’re dealing with acceleration here, and we wanted to keep everything capsule driven, we wrote a small tool. It’s very simple – since my experience with coding is very, very limited, it was made in MEL. The tool writes a curve that describes the character’s speed at each frame, mapping the acceleration. So how does this work? Well, we let our animator make an animation and then sample through it with simple maths:

[Image: Speed formula]

Please note that TranslateX.F2 – TranslateX.F1 simply means subtracting the previous frame’s value from the current frame’s value, to work out how much movement there has been between the frames. We then divide that by the time it takes to go through one frame at your desired FPS (some animators work at 24fps, others at 30, etc.).
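
Putting that together, the speed value keyed at frame F2 is:

Speed(F2) = (TranslateX.F2 - TranslateX.F1) / (1 / FPS)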

Thankfully, we can key this data to an attribute on our Skeleton’s Root Bone and, when importing into UE4, make sure “Import Custom Attributes” is on. This will ensure it comes in as a curve in the animation asset. I should mention this is a SUPER cool feature in Unreal Engine that opens the doors to a lot of other cool setups!

[Image: Custom speed attribute]

We then use this Curve to set the max speed of the character, and we’re good to go!

Here are the results of this as well as some GIFs of the authored animations:

[GIF: Start Walk FWD]   [GIF: Start Walk 90]   [GIF: Start Walk -90]

Layer 3 – Walk Stops

This one is pretty simple, since the setup for it is exactly the same as the Walk Starts. However, in order for the character to know when to start playing the “Stop Animation”, we needed our programmer to write a bit of code in UE4 to get it working.

The code checks for the last point in the path and fires an event to “play stop animation” at the required distance from the end point (this is specified by our animator when making the animation, but try to keep it short!).

There’s a tiny bit of foot sliding due to the acceptance radius of the final point – in the example below it’s noticeable, but with a bit of tweaking you can get it just right. For us, it was important that the character reached the final point perfectly, since other animations will play off that, so we snap them to the point if there is a bit of error to make sure the rest works well.

Here is the video of the Result as well as a GIF of the authored animation:

[GIF: Walk Stop Forward]

Layer 4 – Faking Curved Paths

This was the toughest issue to fix and required a bit of code, but it’s also the layer that makes the biggest difference in the overall look of the locomotion system.

Here, we essentially take the character off the nav path and allow her forward vector to drive her movement while she turns until she faces the next point. This is better explained with an image so here it goes:

[Image: Turn explanation diagram]

The important thing is to take advantage of the constant rotation and speed of the character capsule – it makes the maths easier.
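
As an illustration of why (not necessarily the exact math the team used): with a constant linear speed v and a constant turn rate ω in radians per second, the curved section is simply a circular arc of fixed radius:

Turn Radius r = v / ω

Knowing this radius up front makes it straightforward to compute where to leave the nav path so the arc ends with the character facing the next point.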

Here is a video of the result:

Luckily we didn’t have to author any animations for this as it looked fine without them!

Layer 5 – Turn Anticipation (Oh! A piece of candy!)

This was a nice add-on in order to make the character feel like she was anticipating a turn. For us, this consisted of having her slightly turn her head and body in the direction of the turn before she actually started turning.

This required a little ingenuity, since we couldn’t simply use her next point and Lerp the rotation towards it.

We ended up making a system that uses a “lure” that Lerps between the Nav points ahead of the character and makes the character’s body and head always aim towards it using an AimOffset. In the example below you’ll see a yellow ball that the character is constantly chasing after (unfortunately she can never get it! Hehe).

Here is the result as well as the Blueprint setup for it and a GIF of the Aim Offset:

[Image: Piece of candy Blueprint setup 1]

[Image: Piece of candy Blueprint setup 2]

[GIF: Walk Leans Aim Offset]

So there you have it – this is the locomotion system we’re using in The Occupation. We’ll leave the final AnimGraph for this setup below, as well as a video comparing a path traversed with our locomotion system vs. the UE4 defaults.

[Image: AnimGraph]

[Image: Anim Event Graph]

We hope this is useful to you, and if you have any questions or comments, please leave them below or feel free to reach us through Twitter (@whitepapergames) or email us at contact@whitepapergames.com.

Unreal Engine 4.18 Preview 1 Released

The release of Unreal Engine 4.18 is rapidly approaching, with many fantastic updates for you to try out. The first Preview build is now available on the Epic Games launcher, so you can experiment with brand new changes, such as volumetric lightmaps, enhanced static skylight directionality, major clothing tool upgrades, the revamped Physics Asset Editor and Media Framework 3.0. In addition, you can access improved support for both Google's ARCore and Apple's ARKit. The desktop forward renderer on iOS, used in the Wingnut AR demo at WWDC 2017, is now available as well (recommended for devices with an A10 or later processor running iOS 10.3 or newer).

To get Unreal Engine 4.18 Preview 1, head to the Library section on the launcher, select “Add Versions” and choose 4.18 Preview 1.

[Image: 4.18 Preview 1 in the launcher]

A more detailed list of upcoming changes in the build has been posted on our forums by Stephen Ellis. Please keep in mind that Preview releases are intended only to provide a sample of what is going to be released in the update and are not production-ready. We recommend waiting for the full release of 4.18 before updating your projects to the new engine version.

Epic Games Heads to Oculus Connect 4

This October, Oculus is bringing back Oculus Connect, its annual VR conference held at the San Jose McEnery Convention Center. Developers can expect the latest news and insights surrounding all things Oculus while catching talks and connecting with industry professionals.

Similar to previous years, the Unreal Engine team will be in attendance to host sessions, answer technical questions and connect with you, the VR community! Here’s a breakdown of our involvement at the show.

[Image: Oculus Connect 4]

OC4 Unreal Engine Office Hours

Have technical questions about development with UE4? We’ll have a room filled with experts to help you through any challenges you’re facing. If you’d like to schedule a time to talk with someone from the Unreal Engine team, shoot us a message at oc4@unrealengine.com with the following info:

  • Name of Developer / Team:
  • Project Name:
  • Technical issue/Questions:
  • Is this based on Gear VR or Oculus Rift:
  • Will you have your project with you?
  • Do you have a prior relationship with Epic? If so, who is your business contact? (This will help us know more about your project.)
  • What is the nature of your question?
    • Gameplay
    • Performance
    • Unreal features
    • Platform-specific questions (Oculus Rift, Gear VR, etc.)
    • Getting started with Unreal Engine 4

Due to time limitations, we can’t guarantee that every request will be fulfilled, but we will do our very best!

Wanting to connect but don’t have a technical issue? We’ll have folks from our business and community teams on site as well. You can use the same email alias for those appointments, too.

Under the Hood of Robo Recall & UE4 Roadmap

Our very own technical director of AR and VR, Nick Whiting, will be giving an in-depth technical postmortem on the development of Robo Recall, Epic’s first-person action game for Oculus Rift + Touch released earlier this year. Here’s the official session title and description:

The Road to Release: A Technical Postmortem for Robo Recall

Nick Whiting, Technical Director, AR and VR, Epic Games

Achieving high-end visuals while maintaining 90FPS for VR is a delicate balance and requires both deliberate technical decisions and bleeding-edge technology. This session will cover optimization techniques employed during the development of Robo Recall as well as the technical advances made in Unreal Engine 4, including huge upgrades to both the rendering and VR framework code. We'll also take a look at the latest development roadmap to see what's coming to developers in Unreal Engine 4.

OC4 Unreal Developer Highlight: Future Lighthouse

Future Lighthouse, creator of the Unreal Engine-powered storytelling experience MELITA, will discuss evangelism and advocacy for the VR community. They will analyze what is happening now and how we can help move the industry forward in “Building Global Communities,” a panel which brings together developers from around the globe: Roberto Romero Pérez of Future Lighthouse (Spain), Judith Okonkwo of Ìmísí 3D Creation Lab (Nigeria), and Nausharwan Mir of Psychic VR (Japan). Sara Vogl of VRBase (Amsterdam/Berlin) will moderate.

Read more about Future Lighthouse and their technical strategies for MELITA right here.

Check out the conference schedule to view the full line-up. We hope to catch you at Oculus Connect 4 and hear all about what you’re building in VR. Attending the show? Mention @UnrealEngine on Twitter and let us know!

OC4 Unreal Developer Highlight: Future Lighthouse

At Oculus Connect 4, Future Lighthouse, creator of the Unreal Engine-powered storytelling experience MELITA, will discuss evangelism and advocacy for the VR community. They will analyze what is happening now and how they can help move the industry forward in a panel, “Building Global Communities,” which will bring together developers from around the globe to share their experiences working across regional communities.

Speakers include Roberto Romero Pérez of Future Lighthouse (Spain), Judith Okonkwo of Ìmísí 3D Creation Lab (Nigeria), and Nausharwan Mir of Psychic VR (Japan). Sara Vogl of VRBase (Amsterdam/Berlin) will moderate.

[Image: MELITA screenshot]

Future Lighthouse’s cinematic experience follows MELITA, an advanced AI, and Anaaya, a brilliant female scientist, who are appointed to find a new planet that humans can inhabit while embarking upon a personal journey exploring the meaning of life.

“As the immersive technology industry continues to mature, women are playing an essential part in elevating the medium to reach its true potential,” said Malia Probst, Partner at VR Scout and Women in XR, a venture-capital fund focusing on women-led startups in immersive technology. “The fact that MELITA, one of the most ambitious VR projects to date, features a strong female lead is more than fitting.”

[Image: MELITA screenshot]

The MELITA story consists of three parts, with the second and third still in the development stage and currently seeking funding in the Venice Film Festival’s Co-Production program. In addition to premiering at the VFF, which provides an opportunity to qualify for The Oscars, the project will also be featured at OC4.

“What really sets this apart from other VR experiences is a combination of cinematic movement, character development, actors, and dialogue,” said Nicolás Alcalá, director of MELITA and CEO of Future Lighthouse. “The experience pushes the boundaries of VR cinematic narratives and feels as though the viewer has been placed inside any of your favorite animated films.”

[Image: Melita Lake]

Throughout the three-part experience, Future Lighthouse explores new ways to interact with an immersive story by developing two different camera views. The “Director’s Mode” guides viewers through the emotion and narrative of the experience in a cinematic way. The “Dollhouse Mode” allows you to see everything from the perspective of an invisible giant, setting the point of view yourself. In addition to this, there will be ways to interact with different scenes and characters out of the main storyline.

“Melita is a technical challenge like no other, and we wanted to push the limits,” said Roberto Romero, CTO of Future Lighthouse. “Building this in the Unreal Engine allowed us to push the boundaries on all fronts, and add interactive elements to explore throughout the film.”

[Image: MELITA screenshot]

Technical Strategies with Roberto Romero Pérez, Chief Technology Officer, Future Lighthouse:

Holograms: We use material parameter collections and our own Blueprint systems to better manage our Hologram effects, giving us control over their individual and global settings such as opacity, intensity, colors, etc., instead of having to animate each element individually in Sequencer. This was a must, considering the iterative process that each scene had to go through, both in terms of the events taking place and changes to the assets themselves.

Level Streaming: It wasn’t until the latest stages of the project that we were forced to use level streaming. Up until then, our 13 scenes were separate and loaded sequentially, but we obviously needed continuity in the soundtrack, and the scene changes had to happen seamlessly without loading times. We had a master sequence with the music track, independent from the rest of the sequences, that also handled the basic controls.

We wrote out the names of each scene in order, along with its sub-scenes (lighting, dynamic and static), and added them to a data structure composed of an array of strings that determined the order of each scene within the application and the total duration of each sequence. We read this structure in real time to know whether we had to load or unload a scene, and we also controlled its visibility at all times, to manage the memory load and optimize performance.

[Image: MELITA screenshot]

Transitions: Besides the fade transitions included with Sequencer, we had to implement another method for transitioning in and out of scenes using the controls at any time. To do this, we used a global post-process effect that activates gradually, only at the right moment, to achieve the effect the director wanted.

Story Expansion: The interactive exploration in MELITA reveals additional bits of the history of Melita in different locations. We created different interactive elements with characters – for example, we can caress a bear, play with a butterfly marking a path or reveal audio logs that expand the story of Anaaya and Melita through sound clips associated with these elements.

For more information on the upcoming OC4 panel, head to the official Oculus Connect website. Additional information on MELITA and Future Lighthouse can be found here.