Unreal Engine Developers Take Center Stage at The Game Awards 2017

The games industry gathered in Los Angeles last night at The Game Awards to celebrate the most outstanding creative and technical achievements of 2017 while looking forward to what’s to come in 2018 and beyond. Unreal Engine developers from around the world were featured throughout the evening as projects like Ninja Theory’s Hellblade: Senua’s Sacrifice took home well-earned hardware and others, such as Bandai Namco’s Soul Calibur VI, surprised everyone with delightful debuts.

With 12 UE-powered nominees across 10 different categories and several of the evening’s most talked-about announcements and reveals, the presence of Unreal Engine developers at the industry’s biggest annual award show was certainly felt.

We would like to congratulate all of the developers that made 2017 such a success as we recap the Unreal Engine-powered nominees, winners and special announcements that took place during The Game Awards 2017:

AWARD NOMINEES/WINNERS:

FallingSky.jpg
Falling Sky | Jonathan Nielssen, Nikolay Savov, Mohsen Shah / National Film and Television School
NOMINEE: Best Student Game

Farpoint.jpg
Farpoint | Impulse Gear / Sony Interactive Entertainment
NOMINEE: Best VR/AR Game

Fortnite.jpg
Fortnite | Epic Games
NOMINEE: Best Multiplayer

Hellblade.jpg
Hellblade: Senua’s Sacrifice | Ninja Theory
NOMINEE: Best Narrative, Best Independent Game
WINNER: Best Audio Design, Best Performance (Melina Juergens), Games for Impact

Hollowed.jpg
Hollowed | Erin Marek, Jerrick Flores, Charley Choucard / University of Central Florida – Florida Interactive Entertainment Academy
NOMINEE: Best Student Game

Injustice2.jpg
Injustice 2 | NetherRealm Studios / Warner Bros. Interactive Entertainment
WINNER: Best Fighting Game

MvC_Infinite.jpg
Marvel vs Capcom: Infinite | Capcom
NOMINEE: Best Fighting Game

PUBG.jpg
PLAYERUNKNOWN’S BATTLEGROUNDS | PUBG Corp.
NOMINEE: GOTY, Best Ongoing Game
WINNER: Best Multiplayer

RocketLeague.jpg
Rocket League | Psyonix
NOMINEE: Best Esports Game

TEKKEN7.jpg
TEKKEN 7 | Bandai Namco Studios / Bandai Namco Entertainment
NOMINEE: Best Fighting Game

WhatRemainsofEdithFinch.jpg
What Remains of Edith Finch | Giant Sparrow / Annapurna Interactive
NOMINEE: Games for Impact, Best Independent Game
WINNER: Best Narrative

XCOM2WotC.jpg
XCOM 2: War of the Chosen | Firaxis Games / 2K
NOMINEE: Best Strategy Game

UNREAL REVEALS

While The Game Awards helps the industry celebrate the most outstanding titles from the year that was, the show also serves as a prominent platform for announcements and surprising reveals. These Unreal Engine-developed projects were in the spotlight throughout the night and continue to generate a ton of buzz online:
 

A Way Out | Hazelight /  EA
The highly anticipated, narrative-driven EA Originals title A Way Out debuted a new trailer and announced that players can download the friends pass free trial to enjoy the entire co-op experience with a friend who has purchased the game. A Way Out will be available as a digital download for players worldwide beginning March 23, 2018 on Xbox One, PlayStation 4 and PC.
 

Fade to Silence | Black Forest Games / THQ Nordic
During The Game Awards, THQ Nordic announced the character-driven survival game Fade to Silence. Currently in development for PC and consoles at THQ Nordic studio Black Forest Games, Fade to Silence is launching on Steam Early Access on December 14, 2017.
 

Fortnite: Battle Royale | Epic Games
Epic’s Donald Mustard took to the stage at The Game Awards to reveal Fortnite: Battle Royale’s brand-new, limited-time 50v50 mode, which was made available immediately. He also confirmed that the game has over 30 million players and has achieved 1.3 million concurrent users.
 

PLAYERUNKNOWN’S BATTLEGROUNDS | PUBG Corp
Teased throughout the night, PUBG’s new desert map, Miramar, showcased the intense action and arid environments that players will inhabit when it makes its way to PC and Xbox One on December 12.
 

Soul Calibur VI | Bandai Namco
Bandai Namco’s Katsuhiro Harada and Motohiro Okubo celebrated the success of the UE4-powered TEKKEN 7 before announcing the long-awaited and highly anticipated Soul Calibur VI, which debuted with a fantastic trailer.
 

Witchfire | The Astronauts
The Astronauts, makers of the critically acclaimed The Vanishing of Ethan Carter, revealed their latest Unreal-powered project with a visually stunning teaser during The Game Awards. Entitled Witchfire, the game is a dark fantasy shooter that’s heading to PC.

Congratulations once again to all of the amazing Unreal Engine developers who are creating award-winning, highly anticipated and inspiring content around the globe. We look forward to seeing how all UE4 projects progress in 2018 and beyond, so be sure to give us a shoutout on Twitter @UnrealEngine using the hashtag #UE4 when sharing your beautiful screenshots and videos.
 

Discussing Multiplayer Infrastructure with GPORTAL

Crafting a compelling online experience can be a challenge and having to manage online infrastructure for your multiplayer-focused game adds an extra layer of complexity that many developers might not be equipped to handle. Recently released as a free plugin via the Unreal Engine Marketplace, GPORTAL Connector is providing free dedicated test servers in all global locations in order to ensure that Unreal Engine developers are able to think about their game’s online infrastructure early and often.

We caught up with Markus Schneider, CMO, GPORTAL, to learn more about best practices and how the plugin can assist Unreal Engine developers.

770_GPORTAL_Connector_Pic1.jpg

In your experience, how early in the development process of a multiplayer-focused title should server implementation be evaluated and tested?

Studios developing multiplayer-focused games should start with testing in a real environment with servers of data center quality as early as possible. Since the gaming experience for players is heavily reliant on server performance, we strongly recommend searching as early as possible for the right partner to set up multiplayer infrastructure with the studio.

What are some of the biggest mistakes that small teams make when planning to go live with a multiplayer-focused game?

The biggest mistake in our opinion is that many studios do not research potential partners when it comes to multiplayer infrastructure, instead thinking that it is something that can be easily resolved late in development. This ultimately ends up robbing the team of crucial development time and resources.

As an example, by the time we took over hosting for Citadel: Forged with Fire from our partner Blue Isle, they had already invested significant development resources into update and deployment systems. Because this is handled by our systems for over 100 titles, the studios working with us do not need to manage this aspect and can instead concentrate on developing the game and its core features.

Please tell us about the GPORTAL Connector Plugin that recently arrived on the Unreal Engine Marketplace. What does it do and who is it for?

Right now GPORTAL Connector helps us get in contact with studios working on multiplayer-focused titles as early as possible. Using the plugin, they can simply apply to receive a free server and consulting. As the official host for Conan Exiles and the first hosting provider to offer in-game server rental on a console (Conan Exiles on Xbox One), we also have a lot of experience that we love to share with studios.

Why do we do this? It’s pretty simple – we generate revenue by selling game servers to players, so if there are more successful multiplayer titles for players to enjoy it is a win/win for everyone.

770_GPORTAL_Connector_Pic2.jpg

You’ve worked to support teams of varying sizes. What types of projects do you think your services are best suited for?

We think our services are best suited for indie studios with a professional approach. As we see with our partners — Unreal Engine developers like Blue Isle (Citadel: Forged with Fire), Gamepires (SCUM) and Funcom (Conan Exiles) — size is not the deciding factor. That said, I would estimate that projects with 5-150 people are typically the best fit for working with us when establishing multiplayer infrastructure.

What are the direct benefits that Unreal Engine developers stand to gain from using the GPORTAL Connector plugin?

They get a direct connection to our team and our consulting. They can apply for free test hardware in one of our global locations, whether that is US West, US East, EU, Asia, South America or Russia. Nine out of 10 applicants receive a free server, so all interested teams should give it a try.

Why did you want to offer the GPORTAL Connector plugin on the Unreal Engine Marketplace?

As a company, we do a lot of research to connect with studios at the earliest possible stage so that we can establish a relationship and get their game off to a positive start when it comes to multiplayer infrastructure. Despite our efforts, though, studios often get in touch with us 6-8 weeks before they launch, which is far too late. We recommend connecting with us to evaluate your specific needs 6-12 months before closed alpha launches. We are offering this plugin to facilitate that connection with the Unreal Engine community.

How can Unreal Engine 4 developers apply for free test servers in all global locations?

Simply download the GPORTAL Connector plugin. Fill out the application and we will get back to you within 48 hours!
 

Building Inspiration with Stonehenge VR SANDBOX

VR has the power to transport us to far-flung regions of the globe, into the pages of history, and beyond the limits of our own physicality. Experience is only one side of the coin though. VR can also be a gateway to learning, forging personal connections otherwise impractical or even impossible.
 
Energized by their own experiences with VR, Jessica Villarreal and Christian Bretz immediately recognized that potential and shifted course to co-found VoyagerVR. But first, they had their own groundwork to construct. Filmmakers by trade, neither had any experience in software development before embarking upon their journey to create Stonehenge VR.
 
Initially fueled only by inspiration and DIY spirit, Stonehenge VR has since exhibited at the Pacific Science Center and currently resides at the Museum at Prairiefire. Thanks to an overwhelmingly positive reception, VoyagerVR recently released Stonehenge VR SANDBOX, a standalone title that expands upon the original with the ability to build, paint, and share your own monuments.
 

Tell us about VoyagerVR and the history of the studio.
 
Christian Bretz – We were at E3 in 2013 when I was pulled into a private demo of the Oculus DK1 with an early build of EVE: Valkyrie. Sitting in the cockpit of a spaceship and looking down to see myself in another body was an instantaneously life-changing experience.
 
Jessica Villarreal – It was so early in the VR cycle that we couldn't find anyone who shared our vision in wanting to be at the forefront of something revolutionary, or willing to take on our ideas. We knew we wanted to change careers, but we weren't developers, nor did we know any.
 
Christian – My first instinct was to approach companies that had financed our past projects, but none agreed. In 2004, before YouTube, I had created one of the first viral videos, “how to be: emo.” I had to explain to studio executives what a video “view” was, and there I was doing the same thing with VR.
 
I knew there was a small window of opportunity to make a mark. Instead of waiting for the executives to catch up, I decided to do it myself. I deleted my Facebook account, locked myself in a room, and watched every Unreal Engine tutorial I could find. Three months later, we had our first build of Stonehenge VR.
 
Jessica – We never anticipated the heartwarming feedback we got. We heard from a man who was able to show his terminally ill father a place he had always wanted to visit. We heard from a woman who grew up near Stonehenge in real life, and was always going there on school field trips, but had a changed perspective on it after seeing it in VR. It was stories like theirs that motivated and inspired us to keep pushing forward with our ideas.
 
Christian – Throughout the whole process we've done whatever it took to make it work. When we installed our first VR exhibit at the Pacific Science Center, we loaded up our car with the computer and the Vive and personally drove it from Los Angeles to Seattle. I've always made the analogy that starting VoyagerVR was like that scene in Indiana Jones when he has to step onto the invisible bridge, just believing the next step will be there.
 
UnrealBlog_VoyagerVR_-(5).jpg

Why did you choose Stonehenge as the subject of this project?
 
Christian – I have a vivid memory as a kid of my dad flipping to a picture of Stonehenge in a history book and saying, “This is why you need to study, so you're not the type of person that looks at this and says, 'It's just a bunch of rocks in a field, so what?'” That always stuck with me.
 
There was also practicality from a design standpoint. I was eager to start a project of my own, but I knew the scope had to be something I could accomplish completely on my own with no budget. Had I picked something like the Roman Colosseum, there's no way I would have done it justice.
 
Jessica – Stonehenge also benefits in several ways from being viewed in VR. The scale of the stones is part of the appeal, and in VR you get to see what nine-meter-tall stones look like, which simply doesn't translate to video. We've also heard from some people who have visited Stonehenge in real life who said that they like our version better, because the atmosphere of the real location can be ruined by all the other tourists and you cannot get close to the stones.
 
UnrealBlog_VoyagerVR_-(11).jpg

What sort of research went into recreating Stonehenge for VR?
 
Christian – The whole thing was recreated based on real measurements, aerial photographs, 360-degree photos taken from the ground, and documentary footage. I did all the research for the script and wrote it myself, which was a bit more complicated than might be expected.
 
Jessica – So little is actually known about the monument because it's all prehistory, and it is very difficult to find facts that you can definitively say are true. There ends up being a lot of “it is believed” and “evidence suggests” type statements.
 
Christian – I was nervous taking it to our first museum, the Pacific Science Center in Seattle, because I had done all the research myself, built it in my bedroom, and there we were installing it in one of the most iconic museums in the country. I later had archaeologists from the University of Manchester review the script. To my relief, they approved and enjoyed it very much.

UnrealBlog_VoyagerVR_-(6).jpg

How did the idea for SANDBOX come about, and what can players do in the mode?
 
Christian – There's a credit sequence when you complete the original Stonehenge tour. To add a little entertainment, I included a table with a miniature version of Stonehenge to play with while the credits scroll. Some people would knock the whole thing over and others would try to rearrange the stones. Many guests suggested we add a mode for people to build their own monuments.
 
Jessica – The concept of being able to construct or destroy your own version of the monument came about very organically. It's funny, because just recently we were watching a Twitch streamer play the game and she suggested the exact same thing when she got to the miniature version.
 
Christian – At first, I added the ability to spawn and place stones, but then you had to have a selection of objects, and if you were going to spend all that time building then there had to be a save system, and if you wanted to make a large structure you needed to be able to change the size of the character. The concept snowballed for months until it became what it is now.
 
We also wanted to give people as many ways to share their SANDBOX Mode creations as possible. For instance, we just added a new feature using NVIDIA's Ansel plugin that allows you to export stereoscopic 360-degree photographs of what you build. These photos can be shared on Facebook and viewed on mobile VR devices. This is just one example of the great tools that are supplied in the Unreal Engine.

UnrealBlog_VoyagerVR_-(4).jpg 
Were you ever concerned about the tour and SANDBOX modes conflicting?
 
Christian – We considered this, but a few things helped alleviate our concerns. The first is the personal connection created through VR. Our goal was to instill a greater appreciation for the monument, and in theory the ability to construct and design your own monument should make that connection even stronger.
 
The second is that many people have never tried VR. We wanted to show the variety of experiences it could offer. For example, the ability to be as tall as a building or to change the sun and simulate any time of the year with a push of a button. Those are amazing feelings that we didn't have the opportunity to add when first designing the tour.
 
Jessica – We've never wanted our educational VR experience to be strictly about memorizing a bunch of facts. It's about the emotion and connection to the subject, and the SANDBOX Mode is an extension of that design philosophy.
 
UnrealBlog_VoyagerVR_-(10).jpg 
Why did you choose Unreal Engine 4 to help create this project?
 
Christian – While comparing engines, the UI seemed so welcoming and well thought out. Sometimes you can open other 3D design tools and you're presented with an all grey interface with 10,000 buttons, and it can appear a bit overwhelming for someone new. The clean interface of UE4 really makes it much more approachable, yet the possibilities of what it can do are basically infinite.
 
I'm continually blown away by the engine's speed and stability, and the performance of real-time rendering. One “wow” moment came while rendering our first trailer. It had a 20-second shot with hundreds of thousands of blades of grass all rendering in 4K at 60fps on a four-year-old computer. That process took minutes to set up, when previously it could have taken days.
 
UnrealBlog_VoyagerVR_-(3).jpg
 
Tell us about a favorite tool or feature of Unreal Engine 4 and how it aided development.
 
Christian – I fell in love with Blueprints. While I've always been heavily involved in computers, I never learned to code. My background was in directing, VFX, and writing, so for something to give me the ability to develop software on my own, in such a robust way, is insanely cool.
 
The Unreal Engine Marketplace was also a huge resource that opened a whole new world for our small team. We didn't know anyone in the game development community who could create assets for us, but they're all right there in the Marketplace. In Stonehenge VR SANDBOX, we used the Rama Save System, Ultra Dynamic Sky, and the Ultimate Rocks to get us started building our product, just to name a few.
 
I also have to mention the online communities, including the r/unrealengine subreddit, the Unreal Forums, and all the YouTubers that post tutorials, like Tesla Dev and Mitch's VR LAB. When I was just starting out and had no idea what I was doing, the answers I needed were a click away. Someday, I hope that we can help others in a similar way.
 
Jessica – Over the last two years, Christian and I have had our lives changed because of the abilities that Unreal Engine has given us. We went from an idea of wanting to start a VR company with no software development background, to creating VR software that is now on display in a museum. Since then, we've also been a launch title on HTC's Viveport, been featured on several prominent blogs, and even been invited to colleges to give lectures to their design students on VR software development. There's no way we could have achieved what we've done, with virtually no budget, without the tools available in Unreal Engine 4.
 
UnrealBlog_VoyagerVR_-(7).jpg

How do you envision the future of VR for educational and explorative experiences?
 
Jessica – I envision VR giving people of all ages the chance to see, learn, and connect with something that's out of reach. Money, time, health, and so many other factors can play into why a person can't travel or explore, and VR is going to bridge that gap. The emotional response a person has from VR is vastly different from reading a textbook or watching a screen, and it is a very powerful way to absorb new information.  
 
Places like the depths of space, the inside of the human body, and historic battles are impossible to explore in person. But the impossible becomes possible with VR, and I envision a future where there are more opportunities for the masses to adopt this way of learning, to create empathy in people and make some changes in the world.
 
Christian – In the next 5-10 years or so, I see VR as a way to spark students' interest. They could virtually visit a place, take photographs, and hang out with classmates to form real memories and build connections with the material. My ultimate dream for VoyagerVR software is for it to become a deeper hub of experiences across many subjects.

UnrealBlog_VoyagerVR_-(12).jpg
 
What do you hope people gain from the experience of playing your game?
 
Jessica – Seeing players get lost in our world by learning about and understanding what Stonehenge was theorized to be has been so rewarding. We pride ourselves in giving people that “wow” experience in whatever project we do, and we hope that people walk away feeling exactly that.
 
Christian – My hope for what people gain from the experience, besides a greater appreciation for Stonehenge, is that they see the wonderful opportunity we have right now to make learning experiences like Stonehenge VR a form of mainstream entertainment. There have been some successful attempts at this idea in the past when other technologies emerged, such as the invention of the PC, the CD-ROM, etc., but this time seems different because the technology is so compelling, and we love sharing that experience with people.
 

The 2017 Epic MegaJam Results Are In

What an incredible jam! We had a record-breaking 279 game submissions, with more than 700 developers participating. It’s inspiring to see the amount of innovation and talent that shines through in all the games from our community.

EpicMegaJam_Promo_Sponsors_1031_770.jpg

The Epic MegaJam began on November 2nd with the announcement of the theme: ‘However vast the darkness, we must supply our own light.’ Over the next seven days, a diverse range of creations emerged: games in which you avoid krakens in the night, give hugs to bring back joy to your neighbors, or travel between 3D light and 2D dark worlds. Many teams even documented their ongoing development via Twitter, allowing everyone to see their progression from concepts, to prototypes, to their final submissions.

MegaJam_Collage.jpg

Plenty more to see using #ue4jam!

All that hard work resulted in many outstanding games, and we had a blast playing through all of the submissions. Lucky for you, all the games are posted on the announcement thread, where you can download and play them as well!

It was a challenge to narrow the field down to a short list of winners, and we’d like to congratulate everyone who participated. After playing through all of the submissions, our judges were tasked with selecting three overall winners and three special category winners, amongst whom the Epic MegaJam prizes would be divided. Without further ado, the 2017 Epic MegaJam results:

Overall Winners

Team Spectrum – ASSIMILATION

Assimilation_770.jpg

Beard Envy – Filament

Filament_770.jpg

AnnieNation – Light Bearers (Direct Download)

LightBearers_770.jpg

Special Category Winners

Something, Something Reality: Ray the Robot – Matthew Palaje

RayTheRobot_770.png

Best Cinematic: CaveBoy – Jisan

CaveBoy_770.jpg

Army of One: Bureau of Happiness – Succubi in Hats

BureauOfHappiness_770.jpg

Once again, the high-quality and imaginative projects that the community creates are beyond impressive, and we look forward to seeing your future endeavors.
 

Want to learn more about the winners? Check out the livestream of the Epic MegaJam results for our playthroughs of the winning games!

Many thanks to our wonderful sponsors, without whom we wouldn’t be able to offer fantastic resources and prizes to our jammers: Intel Software, Falcon Northwest, SideFX, Allegorithmic, Assembla, IKinema, SpeedTree, GameTextures, Crowdforge.io, Panda Dev Studios, and Blue Man.

Thank you again to all those who participated, making this our biggest #ue4jam yet! We hope you learned a lot and enjoyed the fruits of your labor as much as we did. Make sure to keep an eye on @UnrealEngine and #ue4jam for ongoing Unreal Engine updates and information about the next big event!

Real-time is the Future, Why Change Now?

We’re in the midst of a massive transformational shift in the way design visualization content is created. In the burgeoning age of interactive experiences, static images aren’t always effective in conveying the full depth of a designer’s vision. When it comes to meeting the needs of stakeholders and clients, having the speed and flexibility to iterate quickly is invaluable in product design and architectural visualization.

1_WhyRealTimeNow.jpg

Tadao Ando’s Church of the Light is brought to stunning life by Fabrice Bourrelly in Unreal Engine

That’s why many of today’s designers and visualization specialists are turning away from slower, traditional offline rendering methods in favor of adopting more efficient real-time workflows that streamline the process of crafting great 3D design.

Understanding the power of real-time

Real-time design is making a world of difference for designers seeking a competitive edge across fast-paced creative industries. As they discover what’s possible with real-time tools, many are finding the old way of doing things just doesn’t cut it anymore.

With offline rendering techniques, making changes to a complex design can mean spending minutes, hours, or even days re-rendering with each iteration cycle. Additionally, assets created this way are limited to their initial intended use. By creating real-time assets, designers can unlock a range of possible use cases and targeted experiences. Adopting real-time techniques allows designers and artists to extract more creative value from a given effort — better design, better flexibility, and far, far faster feedback. What’s not to like?

2_WhyRealTimeNow.jpg

BMW Design uses Unreal Engine to explore interior alternatives

Experience-driven design lets clients step into a scene, explore it from different perspectives, and customize it to get what they need from it. There’s never been a more effective way for designers to sell someone on their vision.

Why real-time, and why change now?

Simply put: the future is already here. Real-time engines are changing the competitive landscape across architectural visualization, product design, engineering, industrial design, and other creative industries.

Faster iteration speeds and greater design flexibility are putting early adopters of real-time technologies at a major cost advantage over artists who stick with traditional techniques for asset development. Offline rendering takes significantly longer to produce the same quality of work, eating up time that could be better spent improving the end result.

3_WhyRealTimeNow.jpg

Boeing uses Unreal Engine to create a virtual training simulator for the Starliner project

As clients experience design in newer, more immersive ways — such as VR and interactive live rendering — their expectations are growing, too. Teams who ignore this fundamental shift risk falling behind the competition and missing out on many opportunities in a rapidly evolving design space.

Finding accessible solutions today

Adopting new design techniques can be intimidating, but getting up to speed with real-time engines for design is now easier than ever.

For designers seeking artist-friendly solutions and frictionless workflows, Unreal Engine is a foundational building block that makes adapting real-time design to the needs of any creative industry a seamless process. The platform is fully expandable, and powerful tools like Blueprints — Unreal Engine’s accessible visual scripting system — let designers build on top of Unreal without having to write code. This has opened the door to bringing real-time capabilities to a much broader range of industries beyond games.

4_WhyRealTimeNow.jpg

Rotor Studios built a sales configurator for Toyota showrooms using Unreal Engine

Unreal Engine provides an easy on-ramp for integrating real-time design into many diverse industries, empowering designers to craft their own visualization strategy with the platform to get exactly what they’re looking for. From architects and automotive builders to product designers and marketing teams, it’s helping teams bridge the gap into real-time and gain a competitive edge in numerous design fields.

Ready to learn more about what Unreal Engine and real-time design can do for you? Watch our free webinar series to help you get started!
 

Google Poly API Announced, Poly Toolkit SDK for Unreal Released

Today, Google has launched the Poly API, a REST API that allows developers to give their apps instant access to a growing collection of 3D assets. As part of the API launch, Google has also released an SDK, the Poly Toolkit for Unreal Engine, that provides runtime access to public, remixable Blocks models and OBJ models on Poly (as well as a user’s own assets).

Poly Toolkit for Unreal Engine allows apps to interact directly with Poly to search, download and import objects dynamically across desktop, mobile, virtual reality and augmented reality platforms such as ARCore.

770_GooglePolyAPIArtboard-15.jpg

Google is supporting the launch with an example project in Unreal Engine as well as a dev site containing sample code showing how to use the API on Android and iOS, including ARCore and ARKit samples, and a sample showing how to use the API from web apps.
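
For a rough sense of what talking to the Poly REST API from UE4 C++ could look like, here is a minimal sketch using the engine's HTTP and Json modules (add "HTTP" and "Json" to your module's dependency list). This is not the Poly Toolkit or Google's sample code, and the endpoint, query parameters and JSON field names shown here are assumptions, so check the dev site for the authoritative interface.

// QueryPolyAssets.cpp -- hypothetical sketch, not official Poly sample code.
#include "Http.h"
#include "GenericPlatform/GenericPlatformHttp.h"
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

void QueryPolyAssets(const FString& Keywords, const FString& ApiKey)
{
    // Assumed endpoint and parameters for listing public Poly assets.
    const FString Url = FString::Printf(
        TEXT("https://poly.googleapis.com/v1/assets?keywords=%s&format=OBJ&key=%s"),
        *FGenericPlatformHttp::UrlEncode(Keywords), *ApiKey);

    TSharedRef<IHttpRequest> Request = FHttpModule::Get().CreateRequest();
    Request->SetVerb(TEXT("GET"));
    Request->SetURL(Url);
    Request->OnProcessRequestComplete().BindLambda(
        [](FHttpRequestPtr /*Req*/, FHttpResponsePtr Response, bool bSucceeded)
        {
            if (!bSucceeded || !Response.IsValid())
            {
                UE_LOG(LogTemp, Warning, TEXT("Poly request failed"));
                return;
            }

            // Parse the JSON response and log each asset's display name
            // (field names assumed from the REST API described above).
            TSharedPtr<FJsonObject> Root;
            const TSharedRef<TJsonReader<>> Reader =
                TJsonReaderFactory<>::Create(Response->GetContentAsString());
            if (FJsonSerializer::Deserialize(Reader, Root) && Root.IsValid())
            {
                const TArray<TSharedPtr<FJsonValue>>* Assets = nullptr;
                if (Root->TryGetArrayField(TEXT("assets"), Assets))
                {
                    for (const TSharedPtr<FJsonValue>& Asset : *Assets)
                    {
                        UE_LOG(LogTemp, Log, TEXT("Poly asset: %s"),
                            *Asset->AsObject()->GetStringField(TEXT("displayName")));
                    }
                }
            }
        });
    Request->ProcessRequest();
}

The Poly Toolkit for Unreal Engine wraps this kind of request, plus downloading and importing the geometry, behind a ready-made plugin interface; the sketch above is only meant to make the REST layer concrete.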

You can find out more about Poly Toolkit for Unreal Engine by visiting this post on the Google AR/VR blog.
 

Debugging UFunction::Invoke

Back in October, Nick Penwarden highlighted several improvements that we made to Unreal Engine 4 while working on Fortnite Battle Royale.

One of the things that didn't make the list in his article is a little debugging gem I added while working on Battle Royale. The team made heavy use of Blueprint-based Widgets, so we would often see C++ callstacks like the following, which don’t capture who in Blueprints is actually making the native call.

 

[Your C++ Code]
UFunction::Invoke(...)  
UObject::ProcessEvent(...)  
AActor::ProcessEvent(...)  
[Rest of the stack]

I was working with a lot of code in Fortnite that I was unfamiliar with, so it wasn't always obvious where the calls might be coming from. To aid in my search, I added a command you can run from the Immediate Window in Visual Studio that prints the current Blueprint/scripting callstack.

When running an Editor build of your game, you can now use the command:

{,,UE4Editor-Core}::PrintScriptCallstack()

In a monolithic build on 4.18 and beyond, you can simply use:

::PrintScriptCallstack()

770_DebuggingUFunction_Invoke.jpg

I hope you find this as useful as I have in getting a live snapshot of the callstacks in your UE4 projects.
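
To make the scenario concrete, here is a minimal, hypothetical example of the kind of BlueprintCallable native function a widget graph might call into; the class and function names are invented for illustration. Set a native breakpoint in the function body, then run the Immediate Window command above to see which Blueprint actually made the call.

// MyDebugLibrary.h -- hypothetical example; names are for illustration only.
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "MyDebugLibrary.generated.h"

UCLASS()
class UMyDebugLibrary : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()

public:
    // Called from Blueprint widgets. A breakpoint here shows only
    // UFunction::Invoke / ProcessEvent frames above this one in C++.
    UFUNCTION(BlueprintCallable, Category = "Debug")
    static void DoSomethingNative()
    {
        // Break here, then run {,,UE4Editor-Core}::PrintScriptCallstack()
        // in Visual Studio's Immediate Window to print the Blueprint callstack.
    }
};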

Unity Technologies Acquires Game Hosting Division of Multiplay from GAME Digital, PLC

SAN FRANCISCO & SOUTHAMPTON, England, November 28, 2017 — Unity Technologies (https://unity3d.com/), creator of the world’s most popular creation engine that reaches nearly three billion devices worldwide, announced today that the company has entered into a definitive acquisition agreement under which Unity Technologies has acquired Multiplay Ltd and its game hosting business from GAME Digital plc (LON:GMD) for £19M. The events business and its associated IPs will remain with GAME.
