news

Harvard Researcher Uses AI to Fight the Most Common Cause of Childhood Blindness

Doctors can prevent one of the most common causes of blindness in young children, but only when they can detect it.

Beautiful, Stylish, and Smart: The Mercedes-Benz A-Class Debuts with AI Technology and 3D Graphics from NVIDIA

Last week, Amsterdam broke all the records. More canals than Venice. More bicycles than people. And a new smart Mercedes-Benz with a bigger brain than any other car, thanks to a cockpit built on NVIDIA artificial intelligence (AI) technology.

Deep Learning Helps Diagnose Breast Cancers That Mammograms Miss

Mammography saves lives by catching breast cancer at an early stage, except in those cases when it cannot detect it.

Time Hack Your VFX Pipeline with Unreal Engine

Real-time rendering technologies are transforming the way VFX studios in the film and TV industry approach production. With Unreal Engine putting more flexibility and creative power into the hands of artists making changes further down the pipeline, real-time is becoming the technique of choice for VFX innovators seeking a competitive edge.

If you’re eager to gain a better understanding of how real-time rendering with Unreal Engine can revolutionize VFX production workflows on film and TV, our latest masterclass is a must-watch primer packed with insights to help you excel.
 

Last month, Epic partnered with the Gnomon School of VFX in California to present Unreal Engine for VFX: Building and Optimizing Worlds for Real-Time — a masterclass featuring over two hours of in-depth knowledge from Epic Games technical artist Homam Bahnassi. Broken into two sections, the presentation covers everything you need to know to build a better VFX pipeline using Unreal Engine, along with tips for optimizing your production cycle with real-time rendering techniques.

Missed the live class? No sweat. The info-packed replay is now available for you to watch online in its entirety for free!
 

Focusing on how to build a better VFX pipeline for Unreal Engine 4, the first part offers a comprehensive overview of the content pipeline and explores the fundamental concepts and workflows that drive real-time performance. It explains and visualizes the entire rendering process and provides a bigger picture look at performance and workflows for real-time in Unreal Engine.

Part two digs deeper into the VFX rendering pipeline in Unreal Engine, focusing on best practices for rendering scenes in real time and tips for getting the most out of your workflows. It also delivers guidelines and profiling techniques for CPU and GPU debugging so you can get top performance from your VFX pipeline.
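Neither the macros nor the console commands below come from the masterclass itself; they are just a minimal sketch of the kind of CPU profiling hook the techniques in part two build on. The AMyVFXActor class and its UpdateSimulation call are hypothetical.

// Minimal profiling sketch, assuming a hypothetical AMyVFXActor defined elsewhere in
// the project: declare a custom CPU stat so this actor's per-frame work shows up
// under "stat Game".
#include "Stats/Stats.h"

DECLARE_CYCLE_STAT(TEXT("MyVFX Update"), STAT_MyVFXUpdate, STATGROUP_Game);

void AMyVFXActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    SCOPE_CYCLE_COUNTER(STAT_MyVFXUpdate); // time spent below is attributed to this stat
    UpdateSimulation(DeltaSeconds);        // hypothetical per-frame VFX work
}

// Console commands that pair well with custom stats:
//   stat unit   - frame, game, draw and GPU times at a glance
//   stat gpu    - per-pass GPU timings
//   ProfileGPU  - a detailed single-frame GPU capture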

Ready to dive in and uncover the best techniques for taking your VFX pipeline to the next level with Unreal Engine and real-time rendering? Watch the full masterclass presentations above.

Interested in attending a training event like this? We'll be at the Vancouver Film School next. Details here.

51VR Crafts Photorealistic City Streets in Unreal Engine for AI Training

Tapping into the potential of using digital worlds for AI training, predictive analysis, and navigation, 51VR is on a mission to build a virtual domain for a brighter future. The task of meticulously re-creating interactive real-world locations in VR is no simple feat, but using Unreal Engine and photogrammetry technology, the company’s Chengdu Studio is blurring the line between the real world and the digital one with unparalleled fidelity.

With R&D teams located around the world, 51VR is one of China’s largest and fastest-growing virtual reality-focused technology innovators. The company pairs Unreal Engine with VR and other technologies across disciplines, from real estate to education and beyond. More recently, the team’s impressive work re-creating a photorealistic, interactive 440-meter stretch of city street in Unreal Engine showcased its dedication to detail in bringing the real world into VR.

Blog-body-img1.jpg

Despite looking like a photograph, this is really a virtual Chengdu Street running at 40+ FPS in Unreal Engine

“The so called ‘Chengdu Street’ is merely just the art part of a whole project,” explains 51 Chengdu Studio’s Kevin Guo. “We used photo scanning technology to recreate the majority of the material and small items of this 440-meter-long street. Since the whole project is designed to run trainings for artificial intelligence used for autonomous driving, it has a high requirement for the fidelity of street recreation in order to match the situation in real life and reduce the uncertainty in VR simulation.”

To fully capture and rebuild the level of detail required for this specialized project, the team scouted the location, taking over 3,000 3D photos as a reference for modeling and material creation. From street materials and prop objects right down to plants and trees, photogrammetry was used to scan and build an in-depth reference base to help make the digital version of the street as photorealistic as possible.

The process was painstaking, but the high fidelity of the final result holds up amazingly well next to the actual reference photography. “In order to recreate the street, we went through a massive modeling and material making process…it took the five of us a month and a half to finish,” says Guo.

Blog-body-img2.jpg

The smallest details of Chengdu Street were captured in the Unreal Engine experience created by 51VR

“For a better performance in Unreal Engine 4, we made LOD for the models as well as other optimizations,” he adds. “Due to the final usage of this project, a very high video game FPS target is not what we aimed for. Instead, we focused on how to recreate and present all the information contained in this street on a GTX1080 setup while keeping the frame rates at 40 to 60 frames per second.”

All told, the team used nearly five million textures, two million shaders, and fourteen million triangles to recreate the Chengdu street in Unreal Engine, and the stunning results speak for themselves.

Importing large volumes of design data into Unreal Engine for major projects doesn’t have to be a time-consuming ordeal. Looking for a tool that streamlines and speeds up that process? Join the Unreal Datasmith beta today to shave hours, if not days, off your real-time design workflows in Unreal Engine!

Industries Collide, Collaborate at Build: London 2017

The stunning headquarters of the British Academy of Film and Television Arts (BAFTA) were buzzing with enthusiasm throughout Build: London 2017 as a diverse group of creative professionals discovered how they are each tapping into the power of Unreal Engine to unlock new potential in their disciplines.
 

Presentations from Industrial Light & Magic, Realtime UK, McLaren Automotive, The Mill, C4X Discovery, Soluis, and Abyssal demonstrated how each of these cutting-edge innovators is blazing new trails to success in their respective industries using Unreal Engine’s real-time capabilities.

Check out the video recap of the event above and visit the Enterprise portal on our site to learn more.

How Unreal Engine Helped Q.U.B.E. 2 Achieve Stunning Environments and Level Design

Introduction

Hi everyone, my name is Harry Corr and I am the Art Director and Level Designer at Toxic Games studio in the United Kingdom. With the upcoming release of our new puzzle adventure game and sequel, Q.U.B.E. 2, I’m excited to finally be able to describe our team’s experience using Unreal Engine 4 to develop and ship a multi-platform game.

qube2image1.jpg
The Oasis Scene in Q.U.B.E. 2 which introduces a natural element into the environment.

This was both my first time working in the role of director and my first time using Unreal on a full production. The art team on Q.U.B.E. 2 consisted of just myself and a talented junior artist named Connor Stanley. As you can imagine with a team that small, there were times during development when the number of elements we had to work on felt overwhelming: level design; lighting; materials; particles; set-pieces; gameplay; performance; UI. The game also required a whole range of features that we lacked experience with. However, ultimately nothing proved insurmountable, largely thanks to Unreal’s superbly designed interface and user-friendly tools.

The ‘Baptism by Fire’ approach to learning the engine during active development was made significantly easier by the absolutely incredible amount of free Unreal content available online. The online community provided so much invaluable support and information which made learning the tools a relatively quick and easy process. I think it’s also great to see so many other creators sharing their work and techniques online, and being able to actually engage with them on their process. This would often inspire us to try something we may never have considered otherwise.

Art Direction

qube2image2.jpg
A corridor in the Industrial environment which showcases the lighting and reflections.

Early on in development, a key decision was made to maintain the minimalist aesthetic and cube structure of the original game. We didn’t want to alienate fans of the first game just for the sake of a new and advanced style, so Unreal Engine 4 became the biggest driving factor in evolving the visual style beyond the game’s predecessor. We looked at the engine’s strengths, which are undoubtedly lighting and materials, and utilised these elements to push the original game’s art style as far as possible.

With a small art team and around 5-7 hours worth of environments to build, we decided to keep the unique asset count to an absolute minimum and focus our time on making the geometry of each room feel organic and bespoke. We drew inspiration from the excellent shape and lighting designs of sci-fi artists, such as John Harris and Nicolas Bouvier, to create the organic, architectural style of the towering QUBE structure the game takes place in. Using this approach, we managed to make every room feel unique and interesting, something which I’m super pleased with considering all of the rooms have been created with a very small number of unique assets.

qube2image3.jpg
We introduced a diverse range of environments in the sequel.

In contrast to the first game, there are several different areas in Q.U.B.E. 2 that the player will visit. To make these areas visually distinctive without creating entirely new environment assets, we created bespoke materials, colour palettes and lighting styles that gave each area its own distinctive characteristics and reinforced how we wanted the player to feel at a particular time. For example, at the beginning of the game we wanted the player to feel anxious, lost and curious about their situation and the environment they find themselves in. To induce these feelings, we applied a cold metallic material to the environment and lit all of the rooms with dramatic red spotlights that left much of the environment in shadow.

Following this area, once the player is comfortable with the game’s mechanics and narrative, we change the environment to a far brighter area. This environment is comprised of large open spaces built from clean white cubes and lit by warm sunlight to influence a positive feeling in the player. So although both of these areas were created with identical assets, they feel very different from one another.

qube2image4.jpg
Above: The Industrial Environment showcasing the cold, metallic materials.

Below: The Minimalist Environment showcasing the clean, white cubes.
qube2image5.jpg

In terms of our process for creating the environments, it was quite straightforward. Our puzzle designer, Dave Hall, had designed all of the puzzles in empty square rooms connected in a straight line. When he was happy with all of the puzzles in a level, we would take those rooms and rebuild everything that surrounded the puzzle gameplay elements. Now, this wasn’t simply a case of building whatever looked the coolest. Obviously, the player’s experience is paramount so it was always the first thing to be considered: a room can’t look so crazy and organic that the player struggles to make sense of the space, adding unnecessary difficulty onto the already challenging puzzles. Similarly, when we light a room we can’t make it so dramatic that it’s difficult to actually see all of the puzzle elements. Accordingly, we tried to maintain a balance between what looks good and what is actually fun to play.

qube2image6.jpg
qube2image7.jpg
A comparison between the basic puzzle design and the resulting environment.

 

qube2image8.jpg
The modular kit used to construct all of the game’s environments.

Every environment in the game is constructed from a modular set of pieces. To achieve the appearance of uneven surfaces, each piece has two variations with cubes extruded at different levels. When we first begin working on a puzzle room, we take the whitebox and rebuild the level’s geometry using simple boxes, followed by a very quick lighting pass. We continue to iterate on this stage until the room offers a comfortable playing experience and the overall shape and composition of the room looks good. Once we’re happy, we then replace the boxes with pieces from the modular kit. Unfortunately, as our team lacks a tools programmer, we had no way of automating the placement of these parts, which meant everything had to be placed by hand. This proved to be very time-consuming but was ultimately worth it. When all of the pieces are in place, we then add additional ‘dress’ assets (light panels, sand buildup, foliage, etc.) before finishing the level with a lighting pass.
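Since the team mentions lacking a way to automate this step, here is a rough sketch, under our own assumptions rather than anything Toxic Games describes, of a helper that spawns modular kit pieces on a grid; the piece list, the 400-unit spacing and the random variation picking are all hypothetical.

// Rough sketch of a placement helper: spawn modular kit pieces on a cube-sized grid
// instead of dragging each one into place by hand. Mesh list, grid size and spacing
// are assumptions, not values from the Q.U.B.E. 2 project.
#include "Engine/World.h"
#include "Engine/StaticMesh.h"
#include "Engine/StaticMeshActor.h"
#include "Components/StaticMeshComponent.h"

void SpawnModularWall(UWorld* World, const TArray<UStaticMesh*>& KitPieces,
                      const FVector& Origin, int32 Columns, int32 Rows)
{
    const float CubeSize = 400.f; // assumed footprint of one modular piece
    if (!World || KitPieces.Num() == 0)
    {
        return;
    }

    for (int32 Row = 0; Row < Rows; ++Row)
    {
        for (int32 Col = 0; Col < Columns; ++Col)
        {
            const FVector Location = Origin + FVector(Col * CubeSize, 0.f, Row * CubeSize);
            AStaticMeshActor* Piece = World->SpawnActor<AStaticMeshActor>(Location, FRotator::ZeroRotator);
            if (Piece)
            {
                // Pick one of the surface variations at random to break up repetition.
                UStaticMesh* Mesh = KitPieces[FMath::RandRange(0, KitPieces.Num() - 1)];
                Piece->GetStaticMeshComponent()->SetMobility(EComponentMobility::Movable);
                Piece->GetStaticMeshComponent()->SetStaticMesh(Mesh);
            }
        }
    }
}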

Moving on to the Unreal Marketplace, I was initially somewhat sceptical about how useful the Marketplace would be, but it ultimately turned out to be hugely helpful, particularly for the creation of particles. In the beginning, neither Connor nor I had much experience with creating particles, so the Marketplace proved to be invaluable. For example, Polysplendor’s ‘Pyro Particle Texture Pack v1’ and Tharlevfx’s ‘VFX Weather Pack’ gave us a great starting point for the game’s fire and lightning effects respectively. Even though many of the assets from the store required modification to get them working or performing correctly for our needs, having a quality base asset to work and learn from saved us an enormous amount of development time.

qube2image9.jpg
The use of the oil and fire mechanics in Q.U.B.E. 2.

So after a lot of hard work, learning and perseverance, our team has managed to deliver a great looking game that we are all exceptionally proud of. I don’t believe our team could have achieved the level of quality we have with any other engine. The quality, ease of use and power of the tools Unreal provides have enabled our team to focus on building a fun and engaging puzzle adventure game.

Thank you so much for reading! If you want to see more, you can watch our Official Gameplay Trailer which will give you an idea of what to expect when Q.U.B.E. 2 launches this year on PC, PlayStation 4 and Xbox One. For more information, visit our website.

Divine Intervention – A Behind-the-Scenes Look at the Development of the AR God Simulator ARrived

EDITOR'S NOTE: Thanks to the team at Luden.io for taking the time to tell the development story behind ARrived – one of the App Store's Best of 2017. For more information on making AR experiences with Unreal Engine, head here.

September, 6 a.m., San Francisco – it's the last chance to get into the iOS 11 launch lineup before time runs out. We must make the deadline, our whole team is asleep on the other side of the planet, and a “Build Failed” message is staring at me from my screen. How did I get into this situation? Well, let me tell you.
 

We’ll Follow the Plan

We were experimenting with AI and drawing recognition and decided to make a VR god simulator game. All of us at Luden.io love Black & White to death; we love to draw in VR with our hands, and everyone thought it was a cool idea to control the little people on the ground. You draw a rain symbol in the sky — it starts to rain. Doesn’t that sound cool?
ARrived1.jpg

No Time to Explain

To make the story complete I should mention that we have been working in the VR field for quite a while, but we’ve all agreed that AR is more interesting. In June at WWDC, Apple announced that ARKit would be released in the autumn of 2017, and it took us five minutes to decide that we must jump on that train. What were we doing during those five minutes of decision making? Finding out if Unreal Engine supported ARKit.

Limitations

With the switch from VR to AR already eating up time, and with two months of development left before launch, we decided not to change the genre of the game. Some of the earlier development work, such as the neural networks, was still useful, but more on this later. Given the timeline and our capabilities, we decided to bet on the updated Sequencer, aiming to make a weekly interactive story about a tribe that lives in the player’s room. We wanted the tribe to look at the player and react to the camera.
ARrived2.jpg
There were a lot of risks. We did not yet know the conventions of good AR design, and it was clear that an AR game session lasting longer than 5–10 minutes would be hard on the phone’s battery. At first we thought that showing a large terrain site would be a good idea, but we soon realized that it was necessary to show as little as possible of the real world on the screen in AR. Therefore, we turned to our favorite trick — we created the whole scene from premade Unreal Engine Marketplace assets. We highly recommend this approach, as it allows you to see the whole picture at once and then work on the details.
ARrived3.jpg
Besides the programmers, we have two 3D artists and one game designer working in Blueprints. That’s why we built a system for assembling scenarios and the transition logic between them, so that those three team members could work in parallel and embed their scenes into a common system as simply as possible. After three weeks, the base script was implemented and we decided to run playtests, expecting to hear dozens of complimentary reviews about our ideas and magical new technology, and to gather general feedback.
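As an illustration only (this is our reconstruction, not Luden.io’s actual Blueprint system), a scenario table with explicit transition rules might look something like this in C++; all names and fields are hypothetical.

// Very simplified sketch of a scenario table plus transition logic, so several
// people can add scenes in parallel without breaking the overall flow.
#include "CoreMinimal.h"

struct FTribeScenario
{
    FName SceneToPlay;           // e.g. a Sequencer asset or sub-level to play
    TArray<FName> PossibleNext;  // scenarios this one is allowed to transition to
};

class FScenarioDirector
{
public:
    void Register(FName Id, const FTribeScenario& Scenario) { Scenarios.Add(Id, Scenario); }

    // Returns false if the requested transition is not wired up in the table.
    bool TransitionTo(FName NextId)
    {
        const FTribeScenario* Current = Scenarios.Find(CurrentId);
        if (Current && !Current->PossibleNext.Contains(NextId))
        {
            return false; // illegal jump: keeps parallel work from derailing the story
        }
        CurrentId = NextId;
        // ...play Scenarios[NextId].SceneToPlay through Sequencer here...
        return true;
    }

private:
    FName CurrentId;
    TMap<FName, FTribeScenario> Scenarios;
};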
ARrived4.jpg

The Game vs Reality

When the third player began to check the game out, we realised we were in trouble. We noticed all the players were poking the poor humans about five times per second. They wanted the little lads to react and to be able to zoom in with standard mobile gestures. Our wonderful script and story transitions did not particularly interest anyone. It was like a lightning strike for us, since there was only about a month left.
ARrived5.jpg
There was nothing we could do, so we began to rebuild the game into an interactive sandbox inspired by a rather popular game of its time, Pocket God. The good news is we were able to leverage light estimation from the device and use real-world lighting to affect what happened in the game. When it got dark in the room, we sent the tribe to bed and released fireflies into the scene. When we posted this on Twitter, it blew up as everyone liked this combination of AR and the real world.
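For readers curious how that day/night trick can be wired up, here is a hedged sketch against Unreal’s unified AR light-estimation API (which may differ from the interfaces available in the early ARKit plugin the team actually used); the lumen threshold, the AArrivedGameMode class and the tribe/firefly calls are hypothetical game code.

// Day/night sketch: read the ambient light estimate from the AR session and, when the
// room goes dark, send the tribe to bed and release fireflies.
// Header names may vary slightly across engine versions.
#include "ARBlueprintLibrary.h"
#include "ARTypes.h"

void AArrivedGameMode::UpdateDayNight()
{
    UARBasicLightEstimate* Estimate = Cast<UARBasicLightEstimate>(UARBlueprintLibrary::GetCurrentLightEstimate());
    if (!Estimate)
    {
        return; // no estimate yet (tracking not established)
    }

    const float AmbientLumens = Estimate->GetAmbientIntensityLumens();
    const bool bRoomIsDark = AmbientLumens < 200.f; // threshold picked by eye, not from the game

    if (bRoomIsDark && !bTribeIsAsleep)
    {
        SendTribeToSleep();   // hypothetical game code
        SpawnFireflies();     // hypothetical game code
        bTribeIsAsleep = true;
    }
}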

Rethinking the game's systems so late in development posed many new design challenges. We tried to keep everything we had made during the past month in the game, including animations, scenes, and transitions. To create this hybrid, we added selection buttons at the key moments of the game. Everything else was managed simply by tapping on the little guys.
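A minimal sketch of that tap-to-poke interaction, assuming a hypothetical ATribeMember actor class and OnPoked reaction (this is our illustration, not the shipped code), might look like this:

// Trace under the player's finger and forward the tap to whichever villager was hit.
#include "GameFramework/PlayerController.h"

void AArrivedPlayerController::HandleTap()
{
    FHitResult Hit;
    if (GetHitResultUnderFinger(ETouchIndex::Touch1, ECC_Visibility, /*bTraceComplex=*/false, Hit))
    {
        // ATribeMember is a hypothetical villager class declared elsewhere.
        if (ATribeMember* Villager = Cast<ATribeMember>(Hit.GetActor()))
        {
            Villager->OnPoked(); // hypothetical reaction: play an animation, grumble, etc.
        }
    }
}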
ARrived6.png

More Tech

The game also lacked a wow factor. We knew how to solve this, but we were afraid to do so because of the tight deadlines. However, after we finally gathered some courage, we decided to take the risk anyway and added the ability to recognize real-world objects in the game. Just imagine how cool it would be if a virtual tribe reacted to a dog from the real world. Or, if the tribe wants to eat, you can show them a banana and they will receive a virtual banana in the game. We integrated Apple's Core ML system into the game using a self-made Unreal Engine plugin.
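The plugin itself is not public, so the following is a purely hypothetical sketch of the kind of C++ boundary such a plugin might expose, with the actual Core ML and Vision calls assumed to live in Objective-C++ on the iOS side. Every name here is an assumption.

// Illustrative interface only: the game hands a camera frame to the native model and
// gets back a label plus confidence on the game thread.
#include "CoreMinimal.h"

DECLARE_DELEGATE_TwoParams(FOnObjectRecognized, const FString& /*Label*/, float /*Confidence*/);

class IObjectRecognizer
{
public:
    virtual ~IObjectRecognizer() = default;

    // The native (Core ML) implementation classifies the frame asynchronously and
    // fires the delegate with something like ("banana", 0.93f).
    virtual void ClassifyFrame(const TArray<uint8>& FrameRGBA, int32 Width, int32 Height,
                               FOnObjectRecognized OnResult) = 0;
};

// Usage sketch: if the tribe is hungry and the player shows a banana, grant a virtual one.
// Recognizer->ClassifyFrame(Frame, W, H, FOnObjectRecognized::CreateLambda(
//     [](const FString& Label, float Confidence)
//     {
//         if (Label == TEXT("banana") && Confidence > 0.8f) { /* give the tribe a banana */ }
//     }));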
ARrived7.png

Gamescom

We ran playtests as quickly as possible before catching a plane to Gamescom 2017. The new interactions did their job, and it was possible to start polishing the game. The wow effect from object recognition also worked like a charm; it was a truly new experience for players. We had to show the game to journalists with our fingers crossed, hoping it would not crash. We were very lucky that everything went smoothly, and our friends from VRFocus even believed the game was already stable, but we knew that was far from the truth.

Besides meeting with journalists at Gamescom, we had one more important task to take care of — speak with the Unreal Engine team. With the release of iOS 11 just prior, Apple had changed the format a bit. While we were able to test the game on our devices, we could not load it into TestFlight, which we needed to begin Apple's review process. After a chat with the Unreal Engine team and some follow-ups, the problem was solved.

PAX

The game became better and better each day, and our hands were not shaking as much when we demonstrated it at PAX. We even had some time to add an option to change the size of game objects in the same way you zoom photos on an iPhone. It was hilarious when people discovered this and made the little guys grow to the height of the Washington State Convention Center, where PAX Seattle 2017 was held.
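Here is one way such pinch-to-resize logic is commonly implemented on top of Unreal’s touch input; this is our own sketch, not the team’s code, and TribeRoot, StartPinchDistance, StartScale and the clamp range are hypothetical.

// Track two touches, compare the current finger distance to the distance when the
// pinch started, and scale the tribe's root actor accordingly (the photo-zoom gesture).
#include "GameFramework/PlayerController.h"
#include "GameFramework/Actor.h"

void AArrivedPlayerController::UpdatePinchScale(AActor* TribeRoot)
{
    FVector Touch1, Touch2;
    bool bPressed1 = false, bPressed2 = false;
    GetInputTouchState(ETouchIndex::Touch1, Touch1.X, Touch1.Y, bPressed1);
    GetInputTouchState(ETouchIndex::Touch2, Touch2.X, Touch2.Y, bPressed2);

    if (!bPressed1 || !bPressed2 || !TribeRoot)
    {
        StartPinchDistance = 0.f; // pinch ended or never started
        return;
    }

    const float Distance = FVector2D::Distance(FVector2D(Touch1), FVector2D(Touch2));
    if (StartPinchDistance <= 0.f)
    {
        StartPinchDistance = Distance;                 // remember where the pinch began
        StartScale = TribeRoot->GetActorScale3D().X;   // assume uniform scale
        return;
    }

    // Scale proportionally to how far the fingers have spread, within sane limits.
    const float NewScale = FMath::Clamp(StartScale * (Distance / StartPinchDistance), 0.1f, 50.f);
    TribeRoot->SetActorScale3D(FVector(NewScale));
}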
ARrived8.png
A week after PAX, the launch date for iOS 11 was announced. For us, this meant the game had to be submitted no later than September 15th. Nervous as we were, it seemed we would manage to get everything done in time; we even went through the interfaces and added hints about the game's possibilities in the form of quests in the upper-right corner. The player could then simply do whatever crazy things they wanted, or just complete fun quests like “smash the tree with lightning”.

In addition, we realized that lighting varies a lot from room to room, so it is important to add a tracking-quality indicator for the player and give recommendations on how to improve tracking. Otherwise, the player will think there is a bug in the game when the entire 3D scene drifts somewhere it is not supposed to go.
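A minimal sketch of such a tracking-quality hint, using the engine’s AR tracking state (the ShowTrackingHint/HideTrackingHint widget calls and the AArrivedGameMode class are hypothetical), could look like this:

// Map the AR session's tracking quality to a short on-screen recommendation.
// Header names may vary slightly across engine versions.
#include "ARBlueprintLibrary.h"
#include "ARTypes.h"

void AArrivedGameMode::UpdateTrackingHint()
{
    switch (UARBlueprintLibrary::GetTrackingQuality())
    {
    case EARTrackingQuality::NotTracking:
        ShowTrackingHint(TEXT("Move the phone slowly and point it at a well-lit, textured surface."));
        break;
    case EARTrackingQuality::OrientationOnly:
        ShowTrackingHint(TEXT("Tracking is limited - try adding more light to the room."));
        break;
    default: // OrientationAndPosition: tracking is good, hide the hint
        HideTrackingHint();
        break;
    }
}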

Launch

ARrived9.png
On the day of the launch, we thought everything was good. I was in San Francisco, and the guys in Moscow and Cyprus had created a build. Since they were very tired from this three-month race, they put together the final build and went to sleep. I watched the sunrise, uploaded the build to iTunes Connect and immediately got rejected.

You know, there are some special words in Russian that help express the intention to solve a complex problem when it is unclear how to solve it. Using those words, clearing the cache, and rebuilding the editor, I was able to create the necessary build in three hours. iTunes Connect approved it. I think we made the deadline with only a few hours to spare.

ARrived10.jpg

After the iOS 11 launch, our game was right on the main page of Apple's App Store. We were crazy happy and hyped about this! A few fun surprises followed later, such as Apple’s award at the end of 2017, but those are stories for another day.

ARrived11.jpg

Conclusions

So what conclusions can be made based on this story?

  • It is very important to show the player the quality of the AR tracking and explain how to improve it, so the game is more comfortable to play.
  • Most players expect AR objects to be interactive.
  • A gaming session in AR should be short, about 5 to 10 minutes, because holding the device up for long is not very comfortable and the battery will not withstand longer sessions.
  • It is better to build a game around the interaction of virtual and real objects than to build one massive scene.
  • New technologies, such as object recognition and light estimation, can attract players. They like everything interesting and technological.
  • Be careful when experimenting with new technologies. Our custom recognition system recognized a journalist's cat as a dog and, as a result, we received an average rating.
  • With the Marketplace, you can build a prototype of a project very quickly. Use it.
  • Run playtests — the sooner the better. Players perceive the game differently than we developers do.
  • Communicate with the engine team. They want to help you if they can.
  • Make games you love. Otherwise, each obstacle on the way to launch will give you an excuse to give up.

ARrived12.jpg

NVIDIA Edge Program Recipients: January 2018

The NVIDIA Edge program, run jointly by Epic Games and NVIDIA, recognizes teams and individuals creating projects with outstanding visuals in Unreal Engine and rewards them with high-end hardware. Each winning project receives an NVIDIA GTX 1080 Ti. And remember: if you have submitted your project to the program before without being selected, you are always welcome to submit it again.

With that, congratulations to this month's recipients!

Red Matter – Vertical Robot

RedMatter_VerticalRobot_1080.jpg

Red Matter, developed by Vertical Robot, is an outstanding story-driven VR puzzle adventure game set in a dystopian, sci-fi Cold War era. In the game, you play an astronaut sent to a distant frozen planet to explore an abandoned moon base.

Coffee Shop – Imersys

CoffeeShop.jpg

This stylish coffee shop visualization is just one of the many beautiful projects created by the Imersys team. Specializing in AR and VR development, Imersys has shown off its talents across numerous education, training, visualization, and tourism projects.

Action Sandbox Game – Santeri Oksanen

ActionWIP_Sertani.jpg

Although still a work in progress, Santeri Oksanen's action sandbox game already shows off stunning visuals and expansive environments. Follow him on Twitter for more work-in-progress images and videos.

Thanks again to the Unreal Engine community for sharing your wonderful work with us. Having to pick just three recipients each month always leaves us torn. You never cease to amaze us.

Want to submit your project?

Head over to the NVIDIA Edge hub for full details on how to submit your work.

Good luck, everyone!