New VR Games Awarded Funds In Epic’s $1 Million Unreal Engine Dev Grant

Epic Games’ latest batch of developer grants has been revealed, and several VR companies are included in the list.

The grants, which collectively total $1 million, are awarded to developers using the company’s Unreal Engine for their upcoming games and apps. Among the 37 teams securing money in the range of $5,000 to $50,000 today are Blacksmith Studios, New Reality Co. and VitaeVR, all of which are working on VR projects.

Blacksmith is working on an intriguing puzzle game called Desolate Sands, in which players aim to dig deep by moving obstacles and using levers. VitaeVR, meanwhile, has a similarly interesting product in VStore, which hopes to become a full VR supermarket that can be used to screen for early indicators of dementia. Players are asked to search the supermarket for specific items and then pay for them with the correct amount of money; their performance is measured and scores are delivered to clinicians.

Finally, New Reality Co. is working on an unannounced VR project focused on storytelling and art. The studio is headed up by Milica Zec and Winslow Porter, who previously collaborated on VR experiences such as Tree, which has appeared at film festivals.

Another recipient in this round of grants is Mundfish, which is working on strange shooter Atomic Heart and released VR shooter Soviet Lunapark earlier this year. The latter title is in Early Access, so hopefully some of the funding will go towards making that a bigger and better experience, too.

Foveated Rendering Doesn’t Need Eye Tracking Hardware Thanks to Some Oculus/Unreal Engine Wizardry

When VRFocus has reported on foveated rendering in the past, it has generally been in conjunction with some sort of eye tracking hardware from companies like Tobii. However, as Oculus recently explained in a developer blog post, that hardware isn’t strictly necessary.

[Image: Robo Recall MBFR comparison – masked (left), ground truth (right), reconstructed (centre)]

For those unaware of the technique, foveated rendering is a graphical process that helps reduce the processing load on the GPU. It does this by lowering the rendering quality in a player’s peripheral vision while keeping maximum clarity in the area they are looking at. As mentioned, the technique works very well with eye tracking, which tells a system exactly where someone is looking so that quality can be reduced accurately everywhere else.

Oculus has now highlighted a technique called Mask-based Foveated Rendering (MBFR) that ‘decreases the shading rate of the peripheral region of the eye buffers by dropping some of the pixels, based on a checkerboard pattern.’ Developers can choose how many pixels to drop: in the image above, 50 percent of the pixels have been dropped, with the left portion showing the masked image, the right showing the original – or ground truth – image, and the centre showing the reconstructed result.

Oculus explains: “MBFR reduces GPU pixel shading cost by dropping a subset of the pixels in the world rendering passes. But it also introduces extra cost in the post-processing passes for reconstructing the dropped pixels.” In other words, developers can expect GPU performance savings of more than 10 percent, partially offset by the extra cost of reconstructing those dropped pixels.
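
To make the idea concrete, below is a minimal CPU-side sketch of the two steps – checkerboard masking in the periphery, then neighbour-average reconstruction. It is illustrative only: Oculus’ actual implementation runs on the GPU inside Unreal Engine’s rendering and post-processing passes, and the Image/IsDropped/Reconstruct names here are ours.

```cpp
// Illustrative CPU-side sketch of the MBFR idea: drop peripheral pixels in a
// checkerboard pattern, then reconstruct them from their shaded neighbours.
// NOT Oculus' UE4 implementation, which runs in the engine's GPU passes.
#include <cmath>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<float> pixels; // single-channel for simplicity

    float& at(int x, int y) { return pixels[y * width + x]; }
    float  at(int x, int y) const { return pixels[y * width + x]; }
};

// True if (x, y) lies in the periphery and falls on the dropped half of the
// checkerboard (a 50 percent drop rate, as in the Robo Recall example).
bool IsDropped(int x, int y, int width, int height, float fovealRadius) {
    float dx = x - width * 0.5f;
    float dy = y - height * 0.5f;
    bool peripheral = std::sqrt(dx * dx + dy * dy) > fovealRadius;
    bool maskedCell = ((x + y) & 1) == 1; // alternating checkerboard pattern
    return peripheral && maskedCell;
}

// "Post-process" pass: fill each dropped pixel by averaging its neighbours,
// which all sit on the opposite checkerboard cell and were therefore shaded.
Image Reconstruct(const Image& shaded, float fovealRadius) {
    Image out = shaded;
    for (int y = 0; y < shaded.height; ++y) {
        for (int x = 0; x < shaded.width; ++x) {
            if (!IsDropped(x, y, shaded.width, shaded.height, fovealRadius))
                continue;
            float sum = 0.0f;
            int count = 0;
            const int offsets[4][2] = {{-1, 0}, {1, 0}, {0, -1}, {0, 1}};
            for (auto& o : offsets) {
                int nx = x + o[0], ny = y + o[1];
                if (nx < 0 || ny < 0 || nx >= shaded.width || ny >= shaded.height)
                    continue;
                sum += shaded.at(nx, ny);
                ++count;
            }
            if (count > 0) out.at(x, y) = sum / count;
        }
    }
    return out;
}
```

That reconstruction step is exactly where the extra post-processing cost comes from: every dropped pixel has to be filled back in before display.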

While demonstrated using Epic Games’ Robo Recall on Oculus Rift, the technique would most benefit mobile headsets like Oculus Go, due to the restricted processing power of those devices.

For studios wishing to experiment with MBFR, it is currently included in Oculus’ UE4 GitHub repository and works with Unreal Engine 4.19 and 4.20-preview versions. For any further updates on Oculus’ latest developer techniques, keep reading VRFocus.

Improved AR and VR Features Available to All with Unreal Engine 4.19 Release

Unreal Engine remains one of the most popular engines for creating virtual reality (VR) and augmented reality (AR) content, and is considered powerful and versatile enough that it has even seen use in Hollywood. The latest version introduces a host of new features and improvements to make the engine even more indispensable to developers and content creators.

Unreal Engine has received many updates and new features from its creators at Epic Games, but thanks to its vast and engaged community, many fixes in this release were submitted by Unreal Engine developers on GitHub. Unreal Engine 4.19 was previously available to developers as a preview and has now gone live for general users.

Among these updates are a number of features that will be of interest to VR and AR content creators. Chief among them is the Unified Unreal AR Framework, which provides a framework for building AR apps for both Google and Apple portable platforms. It lets developers create an AR app that works with both ARKit and ARCore using a single code path, without needing to create two separate projects.
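
To make “a single code path” concrete, here is a minimal sketch of the idea under illustrative names of our own (IARSession, CreatePlatformARSession – not Epic’s actual API): app code targets one abstract session interface, and the platform backend is selected at build time.

```cpp
// Sketch of the single-code-path idea: one abstract AR session interface,
// two platform backends. Names are illustrative, not Epic's 4.19 API.
#include <memory>
#include <vector>

struct TrackedPlane { float centerX = 0, centerY = 0, centerZ = 0;
                      float extentX = 0, extentZ = 0; };

class IARSession {
public:
    virtual ~IARSession() = default;
    virtual bool Start() = 0;
    virtual std::vector<TrackedPlane> GetTrackedPlanes() const = 0;
    virtual float GetAmbientLightEstimate() const = 0;
};

// Desktop stub so the sketch is self-contained; on device this would wrap
// ARKit (iOS) or ARCore (Android).
class StubSession : public IARSession {
public:
    bool Start() override { return true; }
    std::vector<TrackedPlane> GetTrackedPlanes() const override {
        return {TrackedPlane{}};
    }
    float GetAmbientLightEstimate() const override { return 1.0f; }
};

std::unique_ptr<IARSession> CreatePlatformARSession() {
#if defined(__ANDROID__)
    // return an ARCore-backed session here
#elif defined(__APPLE__)
    // return an ARKit-backed session here
#endif
    return std::make_unique<StubSession>();
}

int main() {
    auto session = CreatePlatformARSession();
    if (!session->Start()) return 1;
    // Identical app logic on both platforms: one code path, two backends.
    for (const TrackedPlane& plane : session->GetTrackedPlanes()) {
        (void)plane; // place content on the detected plane here
    }
    return 0;
}
```

The framework’s value is exactly this shape: the platform-specific work lives behind the interface, so the app logic only has to be written once.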

Another new feature is Motion Controller Component Visualisation. This lets developers simply and quickly add a display model Static Mesh to their Motion Controllers. By default the model matches the device driving the motion controller, such as an Oculus Touch, though more complicated custom models can also be loaded.
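
In code this amounts to toggling a flag on the component. A minimal sketch, assuming the property names given in the 4.19 release notes (bDisplayDeviceModel, with DisplayModelSource and CustomDisplayMesh for overrides); treat them as approximate:

```cpp
// Minimal sketch: enable the default device model on a motion controller.
// Property names follow the 4.19 release notes; treat them as approximate.
#include "MotionControllerComponent.h"
#include "Engine/StaticMesh.h"

void SetupControllerVisualization(UMotionControllerComponent* Controller,
                                  UStaticMesh* OptionalCustomMesh)
{
    // Show the model matching the driving device (Oculus Touch, Vive wand...).
    Controller->bDisplayDeviceModel = true;

    if (OptionalCustomMesh)
    {
        // Or override the default with a custom Static Mesh instead.
        Controller->DisplayModelSource = UMotionControllerComponent::CustomModelSourceId;
        Controller->CustomDisplayMesh = OptionalCustomMesh;
    }
}
```

The same options appear as checkboxes on the Motion Controller Component in the editor, so no code is strictly required.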

The Live Link plugin has also received substantial updates, allowing Unreal Engine users to establish a connection between Maya and Unreal Engine 4 and preview changes made in Maya in real-time from within the Unreal Engine 4 editor. Motion Controllers are also now compatible with Live Link.

Support for the HTC Vive Pro is included in the new version, though performance will need to be checked given the headset’s higher resolution. In addition, the latest SDK releases from Oculus, SteamVR, Android, iOS and others have all been integrated.

The full release notes for Unreal Engine 4.19 can be found on the Unreal Engine blog. VRFocus will continue to keep you updated on the latest Unreal Engine news.

Developer Creates The Oasis In 2018 By Living In Virtual Space For 30 Days

The virtual world of the Oasis, as seen in the blockbuster film Ready Player One, is set in the year 2045, and many have wondered how long it will be until they can experience that level of immersion. For Enea Le Fons, an Italian developer, that day is now: he has spent the last 30 days living inside virtual reality (VR).

Building VR Studio with Vive Pro

Le Fons would already spend many hours in VR, but as part of his exceptional “30 Days in VR” journey, timed around the film’s theatrical release, he went all in. Other than eating, sleeping and using the bathroom, Le Fons has spent every waking moment in VR for work, communication, exercise and entertainment, making the distant future of the Oasis a close reality.

To help complete this impressive challenge, Le Fons “moved” his physical studio space into a VR environment by copying the layout and size of each piece of furniture, including tables, chairs and even tableware. Thanks to the use of an SLR camera and some free development tools, Le Fons was able to successfully copy and paste the studio into the virtual world, making the “30 Days in VR” challenge a little more comfortable. Of course, in virtual space the limits of the VR studio are endless, meaning Le Fons was free to expand beyond the bounds he was used to, enabling architectural design, exercise and leisure activities all directly within VR and the space he had available.

Meditating with Focus

Inspired by the HTC Vive and the experiences created for it, Le Fons’ 30-day project was made possible by the technology readily available to him. With products such as the HTC Vive Pro and Vive Focus, Le Fons was able to explore the universe on an afternoon walk, dance in a club and even try some meditation, all within the virtual world. All of this could be created and designed as he wanted, using the tools available to him to build new experiences and locations to explore.

“Everything I developed will be shared to the entire developer community as an open source. Inspired by HTC Vive, my goal is to foster the VR ecosystem with the community together by creating content favoring everyone, making VR an essential part of our daily life: work, education, leisure,” said Le Fons, who has been fascinated by VR’s possibilities for over 20 years. “Thanks to Vive, I was able to precisely track all the objects in the VR environment; with Vive trackers, free movement was enabled even when I was wearing the headset, what’s more, with the advent of a comfortable device supporting long-term and intensive use like the Vive Pro, my productivity was also dramatically increased.”

From the 30-day challenge Le Fons captured a number of data points which provide some interesting insights. For example, his Fitbit used to record around 700 steps a day on average when he worked as an Unreal Engine developer, but since he started using the HTC Vive for content development he has walked over 10,000 steps a day. It is not a full workout, but it does show that there are health benefits to long-term VR usage thanks to technology such as room-scale tracking.

“I am very pleased to see that our products were able to meet the needs of such an intensive project, and also excited to see that Enea finished the program healthier and happier than ever,” said Alvin W. Graylin, China Regional President of Vive, HTC. “I am getting more confident that VR can become an essential part of our daily life in the future. I welcome more VR enthusiasts and practitioners to challenge themselves for new breakthroughs in the industry.”

For more stories like this in the future, keep reading VRFocus.

The Weather Channel Will Be Making Weather More Immersive

In an attempt to really show the impact of various weather conditions, particularly those at the extreme ends, The Weather Channel’s parent company is teaming up with The Future Group to present broadcasts in mixed reality (MR).

The broadcasts are built on Unreal Engine, one of the most popular engines for VR and MR videogame experiences, and one which has begun to see use in industry outside of videogames.

The aim of using MR is to improve public understanding of extreme weather conditions and how they can impact people’s daily lives. The Future Group has worked extensively on immersive and interactive projects, including big-name brands such as Star Wars.

The Weather Channel has previously experimented with augmented reality (AR) in its broadcasts, particularly for sports coverage, using a 3D model of a sports field and explaining how weather can affect the event.

“Our immersive mixed reality (IMR) presentations will combine 360 HD video and augmented and virtual reality elements that are driven by real-time data and our expert on-air talent to transport our audience into the heart of the weather,” said the Vice President of Design for The Weather Group, Michael Potts. “Using The Future Group’s Frontier powered by Unreal Engine for weather broadcasting has never been done before. We are excited to continue our investment in the latest technologies that are not just cutting-edge, but on the bleeding edge of design and science.”

The Weather Channel has said that it is always looking for new ways to convey important safety and warning messages to viewers. By using Unreal Engine and MR technology, it hopes to continue pushing the boundaries of immersive presentation by showing things like tornadoes and storm surges in detail.

For further news on new and innovative uses of mixed reality technology, keep checking back with VRFocus.

Intel Creates ‘How to’ Guide for Getting Started in VR Development Using Unreal Engine

Getting started in the videogame business can seem like a daunting prospect, even more so when it comes to figuring out virtual reality (VR). But those first steps don’t need to be as difficult as they may seem, thanks to videogame development engines like Unreal Engine. Epic Games’ popular middleware is used by amateur and professional teams alike to build the latest VR experiences (Robo Recall, Ace Combat 7, ARK Park, Moss), and getting started is free. So over at Intel’s Developer Zone the company has put together a quick guide to VR development using Unreal Engine.

The ‘How To Get Started in VR with Unreal Engine‘ guide doesn’t look at Unreal Engine as a whole – it’s way too big for a brief guide. Instead it focuses on one particular mode that Epic Games has been developing over the last couple of years, the VR Editor. The idea goes that while creating VR content on a normal 2D screen is difficult, being able to put yourself in the actual virtual world you’re creating can make the whole process that little bit easier, especially for newcomers.

Visualising a virtual world and deciding where things should go is easier when you can stand inside it: don a headset and see what works and what doesn’t. With Unreal Engine’s marketplace and strong community it’s also straightforward to find and use assets – some free, some paid – to help bring your creation to life.

Naturally, the guide assumes that anyone developing VR content will have a head-mounted display (HMD) of some sort – otherwise, what’s the point – with the instructions based on the HTC Vive, although headsets like the Oculus Rift can also be used. Once you’ve got Unreal Engine downloaded and installed – an easy, free process – and your headset is up and running, Intel goes through the basics of building your first scene.

After that you’re on your own – well, not quite, as there’s plenty of other literature online – with Intel wisely suggesting a look at Unreal Engine’s visual scripting system, Blueprints.

During last week’s Game Developers Conference (GDC) 2018, Epic Games held its annual State of Unreal keynote, detailing future plans and showcasing the latest videogames as well as new features for the engine. VRFocus will continue its coverage of Epic Games and Unreal Engine, reporting back with the latest announcements.

Unreal Engine 4 Showcases VR, AR and MR In New Trailer

Opening its Game Developers Conference (GDC) 2018 presence, Epic Games has released a new trailer for Unreal Engine 4 featuring a number of virtual reality (VR) titles.

Though packed to breaking point with all manner of videogames and experiences built using Unreal Engine 4, the trailer features a number of VR titles that showcase the power of the software and offer an insight into the types of experiences developers can create for a whole range of platforms thanks to the engine.

One VR title featured that many are sure to know is Moss, the hit PlayStation VR videogame which recently released. Taking players into a fantasy world inside a book, it introduces them to Quill, with whom they must work together to complete an epic adventure. Developer Polyarc Games used Unreal Engine 4 to create a highly detailed fantasy world and deliver stunning animations for Quill, including the use of sign language. The studio has since taken Quill onto iOS with a set of animated stickers.

VRFocus’ Staff Writer Rebecca Hills-Duty reviewed Moss saying: “Moss is a flawlessly crafted experience starring a character that absolutely deserves to be the face of modern VR. Every inch of the world shows attention to detail, and a story is woven that draws you in, making you truly invested in the world and in Quill as a person.”

The League of Legends: Elder Dragon mixed reality (MR) experience, used during the League of Legends Worlds 2017 Grand Final, also features in the trailer. Developer Riot Games has been looking at new ways to bring its hit title to more platforms for some time, and the Elder Dragon crashing the party was a stunning experience for fans of the title and newcomers alike.

Upcoming arcade flight combat title Ace Combat 7 is also making use of the VR capabilities of Unreal Engine 4 and is set to be playable in full with a VR head-mounted display (HMD). VRFocus’ Senior Staff Writer Peter Graham went hands-on with Ace Combat 7 last July, saying: “In its present state Ace Combat 7: Skies Unknown promises high octane flying action for PlayStation VR players that’s sure to encourage legions of fans to try the VR compatibility. As long as Bandai Namco manage to provide enough content to satisfy then PlayStation VR is likely to have another killer exclusive title when the videogame finally arrives next year.”

Another VR experience built with Unreal Engine 4 is Star Wars: Secrets of the Empire, the full-body experience created by The Void and ILMxLAB. Players enter the world of Star Wars to live out a whole new story in hyper-real detail, wearing an untethered VR system and moving freely around a multi-sensory experience as they try to survive against all odds and recover imperial intelligence vital to the rebellion’s survival. Unreal Engine 4 was a key part of the experience’s development, allowing the teams at The Void and ILMxLAB to bring Star Wars to a new reality.

Unreal Engine 4 was also used during the French elections by news channels, in conjunction with green screens, to deliver real-time MR to viewers. The full Unreal Engine 4 showcase trailer can be watched below and goes to show just how powerful and flexible the engine really is.

VRFocus will be bringing you all the latest from the Game Developers Conference throughout the week, so make sure to check back regularly for more.

Watch: ILMxLab’s Star Wars Real-Time Ray-Tracing Demo on Unreal Engine is Stunning

At the Unreal Engine presentation at GDC 2018 today, Epic Games CTO Kim Libreri introduced several projects highlighting the scalability and performance of Unreal technology, including a partnership with ILMxLAB and Nvidia. This was the first live demonstration of interactive, real-time ray tracing with Unreal Engine.

Nvidia announced RTX on Monday, a real-time ray-tracing solution designed for their latest Volta GPU architecture. Today, we were treated to a live stage demo using Unreal Engine, Nvidia RTX, and ILMxLAB’s Star Wars assets. Aside from some slight connection problems (the image was being streamed from a PC to the iPad on stage), this was a stunning demonstration, and seemed to be far less noisy than the Northlight footage from Remedy.

Tony Tamasi, Senior Vice President of Content and Technology at Nvidia, explained some of the technology behind the scenes, describing ray tracing as “the holy grail” of rendering that solves many of the fundamental problems of traditional rasterization.

“We partnered with our friends at Microsoft to deliver an industry standard API called DirectX Raytracing,” he said. “That API is perfectly integrated with RTX to enable ray-tracing through interfaces and APIs that game developers are familiar with, and on top of that we’ve layered in some GameWorks technology to give developers a kickstart for things like de-noising and reconstruction filters.”

Despite the fact that this demo was running on a supercomputer (an Nvidia DGX Station running four Volta GPUs connected with NVLink), Tamasi thinks the technology is not too far away from reaching consumers. “We are at the crux of real-time ray-tracing being a practical reality right now,” he said. “I expect you’re going to see games shipping with real-time ray-tracing this year.”

GDC 2018: Magic Leap One Getting Unreal Engine 4 Support

Today at GDC, Epic announced that Unreal Engine 4 will officially support the Magic Leap One. You can read more about it on the Unreal Engine blog, but the basic gist is that you can now use Epic’s UE4, one of the leading game engines on the market, to create content for Magic Leap One.

The news comes on the heels of this week’s launch of Magic Leap’s Lumin SDK Preview and Creator Portal, making this a great time to get started on any ideas you might have for developing Magic Leap content. Companies like Epic itself, as well as Framestore, ILMxLAB, Schell Games, and Peter Jackson’s Wingnut AR, are all working on content for the Magic Leap One.

Reportedly, Unreal Engine 4 support for Magic Leap One will include the following features:

  • Head tracking
  • Eye tracking
  • Gesture and hand tracking
  • Room scanning and meshing
  • Spatialized audio
  • Microphone input
  • 6DOF hand controller (Totem) tracking
  • Vulkan and OpenGL support
  • Use of Unreal Engine 4’s desktop and mobile forward rendering paths

“Unreal has long been the go-to development engine for some of the top game studios in the world,” said Rio Caraeff, Chief Content Officer at Magic Leap. “This partnership with Epic will enable more creators to join Magic Leap on its journey, creating content for the next computing platform.”

Let us know what you think of the news down in the comments below!

Unreal Engine 4.19 Introduces New and Improved VR and AR Features

Unreal Engine is one of the most popular and versatile videogame engines, and has seen use in a huge range of titles in virtual reality (VR) and augmented reality (AR). The newest update brings improvements and new features for VR and AR developers and creators.

The newest version of Unreal Engine features improvements from the team at Epic Games alongside many submissions from the expansive and dedicated Unreal user community.

New features aimed at VR and AR development include the Unified Unreal AR Framework. This framework gives AR apps for both Apple and Android mobile devices a single development path, allowing developers to use a single code base for their apps instead of having to build two different versions for the two mobile platforms.

The Unified Unreal AR Framework includes functions that support Alignment, Light Estimation, Pinning, Session State, Trace Results, and Tracking. In addition, the Unified Unreal AR Framework Project Template has been introduced, whose HandheldAR Blueprint provides a complete example project demonstrating the new functionality.
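
As a rough illustration of how a couple of those functions fit together in an app – the names below are hypothetical stand-ins, not the framework’s real signatures – a Trace Result from a screen touch can be used to Pin content to real-world geometry:

```cpp
// Hypothetical sketch: a Trace Result from a screen touch is used to Pin
// content to tracked geometry. All names are illustrative stand-ins.
#include <optional>
#include <string>

struct Vec3 { float x, y, z; };
struct TraceResult {
    Vec3 worldLocation;            // where the trace hit in world space
    std::string trackedGeometryId; // the plane/feature that was hit
};

// Stub: a real implementation would ask the ARKit/ARCore session what
// real-world geometry lies under the touch point.
std::optional<TraceResult> TraceFromScreenPoint(float /*x*/, float /*y*/) {
    return TraceResult{{0.0f, 0.0f, -0.5f}, "plane-0"};
}

// Stub: "pinning" keeps content attached to the geometry even as tracking
// refines its estimate of the world over time.
void PinContentToGeometry(const std::string& /*geometryId*/, const Vec3& /*at*/) {}

void OnScreenTouched(float x, float y) {
    if (auto hit = TraceFromScreenPoint(x, y)) {
        PinContentToGeometry(hit->trackedGeometryId, hit->worldLocation);
    }
}
```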

The Live Link plugin provides a common interface for streaming and bringing in animation data from outside Unreal Engine, such as motion capture rigs. With Unreal Engine 4.19, Live Link can now also capture information from motion controllers, such as the Oculus Touch or HTC Vive wands.
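
Conceptually, Live Link is a publish/subscribe hub for named “subjects”: sources push frames of animation data, and the editor applies them in real time. A minimal sketch of that shape, with illustrative names rather than the actual ILiveLinkSource API:

```cpp
// Conceptual sketch of the Live Link shape: sources (a Maya plugin, or as of
// 4.19 a motion controller) push frames for a named subject; the editor
// subscribes and applies them in real time. Names are illustrative only.
#include <functional>
#include <map>
#include <string>
#include <vector>

struct Transform { float pos[3]; float rot[4]; };
struct LiveLinkFrame { double worldTime; std::vector<Transform> bones; };

class LiveLinkHub {
public:
    using Handler = std::function<void(const LiveLinkFrame&)>;

    // A source pushes frames for a subject ("MayaRig", "LeftController"...).
    void PushFrame(const std::string& subject, const LiveLinkFrame& frame) {
        for (auto& handler : subscribers_[subject]) handler(frame);
    }
    // ...and the editor/preview subscribes to apply each frame to a skeleton
    // or component as it arrives.
    void Subscribe(const std::string& subject, Handler handler) {
        subscribers_[subject].push_back(std::move(handler));
    }

private:
    std::map<std::string, std::vector<Handler>> subscribers_;
};
```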

The release notes also confirm that all Unreal Engine 4 functionality compatible with SteamVR is also compatible with the forthcoming HTC Vive Pro. No modifications are needed to port an Unreal Engine 4 project to the HTC Vive Pro, but developers will need to check the performance of their projects, as the Vive Pro operates at a higher resolution.

A full list of release notes and changes for Unreal Engine 4.19 is available on the Unreal Engine blog. The new version of the engine is available to creators and developers now. VRFocus will bring you further updates on Unreal Engine as they become available.