Unreal Engine Improves OpenXR Support, Now Production-Ready

The OpenXR plugin for Unreal Engine has been updated in the latest 4.27 release and is now production-ready.

OpenXR is an open standard that provides a common API for VR and AR content, allowing game engines and developers to write code that works across multiple hardware platforms. Previously, companies like Facebook, Valve and Microsoft each used their own separate APIs, which meant extra effort for developers who wanted to support multiple headsets.
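Under the hood, that cross-platform compatibility comes from a single vendor-neutral C API that every conformant runtime implements. Below is a minimal, illustrative sketch of creating an OpenXR instance; the application and engine names are placeholder assumptions, not anything from Epic's plugin.

```cpp
// Minimal OpenXR bootstrap sketch (illustrative only). The same code runs
// against any conformant runtime (SteamVR, Oculus, Windows Mixed Reality)
// because they all implement the same API.
#include <cstdio>
#include <cstring>
#include <openxr/openxr.h>

int main()
{
    XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    std::strcpy(createInfo.applicationInfo.applicationName, "MyXRApp");  // placeholder name
    std::strcpy(createInfo.applicationInfo.engineName, "MyEngine");      // placeholder name
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance = XR_NULL_HANDLE;
    if (xrCreateInstance(&createInfo, &instance) != XR_SUCCESS) {
        std::fprintf(stderr, "No OpenXR runtime available\n");
        return 1;
    }

    // Ask the active runtime to identify itself; the runtime name is the only
    // part of this flow that differs between platforms.
    XrInstanceProperties props{XR_TYPE_INSTANCE_PROPERTIES};
    xrGetInstanceProperties(instance, &props);
    std::printf("Running on OpenXR runtime: %s\n", props.runtimeName);

    xrDestroyInstance(instance);
    return 0;
}
```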

Platforms have slowly been adding support for OpenXR and transferring over to the new API. SteamVR added full OpenXR support earlier this year, while Facebook got fully on board just under two months ago.

The latest release of Unreal Engine, 4.27, improves the OpenXR plugin and brings it to a production-ready state. This means Epic Games considers OpenXR support essentially ready to use and able to meet the demands of VR developers using Unreal. In 4.27, Epic added support for stereo layers, splash screens, querying playspace bounds and more; you can find more detail here.
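For context, "querying playspace bounds" maps to a single call in the underlying OpenXR API, sketched below. This is the raw C API rather than Unreal's own wrapper, which is exposed differently inside the engine.

```cpp
// Hedged sketch: querying the stage (playspace) bounds directly via OpenXR.
// Assumes an already-created XrSession; error handling trimmed for brevity.
#include <cstdio>
#include <openxr/openxr.h>

void PrintPlaySpaceBounds(XrSession session)
{
    XrExtent2Df bounds{};
    const XrResult result =
        xrGetReferenceSpaceBoundsRect(session, XR_REFERENCE_SPACE_TYPE_STAGE, &bounds);

    if (result == XR_SUCCESS) {
        // width is the X extent, height the Z extent, both in metres.
        std::printf("Play space: %.2f m x %.2f m\n", bounds.width, bounds.height);
    } else if (result == XR_SPACE_BOUNDS_UNAVAILABLE) {
        std::printf("The runtime has no bounds configured (e.g. a seated setup)\n");
    }
}
```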

The release also includes redesigned templates for VR and AR projects, which serve as “a starting point” for developers to jump off when developing a project using OpenXR, ARCore or ARKit.

The new VRTemplate for OpenXR projects includes encapsulated logic for teleport locomotion, snap rotation, grab components, a VR spectator camera and a menu. The template supports Quest 1 and 2, Quest via Link, Rift S, Valve Index, HTC Vive and Windows Mixed Reality platforms. You can read more about the template here.
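To give a sense of what "encapsulated logic" such as snap rotation involves, here is a rough Unreal C++ sketch of a snap turn. The shipping template implements this in Blueprints, and the class and member names below (AMyVRPawn, VROrigin, Camera) are assumptions made for the example, not the template's actual names.

```cpp
// SnapTurnPawn.h -- illustrative sketch only, not the VRTemplate's real code.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "SnapTurnPawn.generated.h"

UCLASS()
class AMyVRPawn : public APawn
{
    GENERATED_BODY()

public:
    // Rotate the play space by a fixed angle (e.g. +/-45 degrees) around the headset,
    // so the player pivots in place instead of arcing around the tracking origin.
    void SnapTurn(float Degrees)
    {
        const FVector Pivot = Camera->GetComponentLocation();
        const FRotator Delta(0.f, Degrees, 0.f);

        FVector Offset = VROrigin->GetComponentLocation() - Pivot;
        VROrigin->SetWorldLocation(Pivot + Delta.RotateVector(Offset));
        VROrigin->AddWorldRotation(Delta);
    }

protected:
    UPROPERTY(VisibleAnywhere) USceneComponent* VROrigin = nullptr;  // parent of the camera, represents the play space
    UPROPERTY(VisibleAnywhere) UCameraComponent* Camera = nullptr;   // driven by the HMD pose
};
```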

The redesigned handheld AR template, designed for developers using ARCore or ARKit on Android and iOS devices, includes essentials such as a basic UI and touch interface, simple setup, and model scaling and rotation features.

Previously, Epic added OpenXR support to the Unreal Engine 5 Early Access build. You can read more about OpenXR in Unreal 4.27 here.


Unreal Engine 5 Enters Early Access, Supports OpenXR via New VRTemplate


A year ago Epic Games unveiled the first details of Unreal Engine 5, the next version of its hugely popular videogame development engine. This week the company announced the launch of Unreal Engine 5 Early Access so studios can begin playing with the new features, which include continued virtual reality (VR) support utilising the OpenXR framework.


While the announcement concentrates on the amazing fidelity videogame developers will be able to bring to future titles thanks to the virtualized micropolygon geometry system Nanite and the global illumination solution Lumen, tucked away in the release notes were some VR-specific additions.

The Unreal Engine team has created a new VRTemplate utilising the OpenXR framework as a launch point for VR development. “The template is designed to be a starting point for all your VR projects. It includes encapsulated logic for teleport locomotion and common input actions, such as grabbing and attaching items to your hand,” the release notes state. Epic Games goes so far as to recommend that developers “create your VR project using the VRTemplate in UE5, because the project settings and plugins are already configured for the best VR experience.”
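To make the grab-and-attach idea concrete, here is a hedged Unreal C++ sketch of grabbing and releasing a physics prop with a motion controller. The actual VRTemplate uses Blueprint grab components, so the function names here are purely illustrative.

```cpp
// Illustrative grab/release helpers (assumed names, not VRTemplate code).
#include "Components/PrimitiveComponent.h"
#include "MotionControllerComponent.h"

// Attach a physics prop to the motion controller so it follows the hand.
void GrabObject(UPrimitiveComponent* Prop, UMotionControllerComponent* Hand)
{
    Prop->SetSimulatePhysics(false);  // stop simulating while the object is held
    Prop->AttachToComponent(Hand, FAttachmentTransformRules::KeepWorldTransform);
}

// Detach the prop and hand control back to the physics engine on release.
void ReleaseObject(UPrimitiveComponent* Prop)
{
    Prop->DetachFromComponent(FDetachmentTransformRules::KeepWorldTransform);
    Prop->SetSimulatePhysics(true);
}
```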

Unfortunately, if you create a VR project whilst Unreal Engine 5 is in Early Access you won’t be able to use features like Lumen, because it’s not supported at the moment. Hopefully that will change in the future so VR titles can look as good as those sample screenshots. In any case, Early Access shouldn’t be used to build entire projects because UE5 is still in development and is likely to officially release in early 2022.


Thanks to VRTemplate supporting the OpenXR standard, it should be easy for developers to bring their titles to various VR headsets. VRTemplate currently supports the following devices:

  • Oculus Quest 1 and 2
  • Oculus Quest with Oculus Link
  • Oculus Rift S
  • Valve Index
  • HTC Vive
  • Windows Mixed Reality

In addition to VR, Unreal Engine 5 will continue its support of augmented reality (AR), from hardware such as Magic Leap to the ARKit and ARCore SDKs.

As further features and improvements are rolled out for Unreal Engine 5 during Early Access, VRFocus will keep you updated.

Epic Games Secures $1B Investment to Build Its Unreal Engine-powered Metaverse

Epic Games, the company known for its hit battle royale title Fortnite and the Unreal Engine game engine, has managed to attract a stunning $1 billion in its quest to establish a metaverse of linked games and services.

The new funding round, which was announced early last week, includes an additional $200 million strategic investment from Sony Group Corporation. This comes nearly a year after Sony initially invested $250 million in Epic in a pursuit to establish “real-time 3D social experiences leading to a convergence of gaming, film, and music,” Epic Games’ CEO and founder Tim Sweeney said in July.

The investment is said to “accelerate our work around building connected social experiences in Fortnite, Rocket League and Fall Guys, while empowering game developers and creators with Unreal Engine, Epic Online Services and the Epic Games Store,” Sweeney says.

Other investment partners include Appaloosa, Baillie Gifford, Fidelity Management & Research Company LLC, GIC, funds and accounts advised by T. Rowe Price Associates, Ontario Teachers’ Pension Plan Board, funds and accounts managed by BlackRock, Park West, KKR, AllianceBernstein, Altimeter, Franklin Templeton and Luxor Capital.

Epic’s massively popular title, Fortnite, has essentially transitioned from game to ‘social experience’ over the years with the addition of non-gaming content such as virtual concerts and film debuts. Linking those top Unreal Engine-driven games together into an overarching metaverse could spell even greater success for both companies, as they drive greater engagement, and inevitably also greater microtransaction sales.

The bulk of Epic’s success has undoubtedly been in flatscreen gaming, so it’s unlikely we’ll see a VR-specific metaverse from the company in the near term. Still, Unreal Engine is the second most popular game engine for building VR content, and has powered PSVR games such as Farpoint, Moss, Firewall: Zero Hour, and many more. Creating an early metaverse of flatscreen games, and more importantly building out the underlying architecture within Unreal Engine itself, could be a big step towards a large-scale VR metaverse as the companies inevitably expand its confines to include immersive headsets.


Create MetaHumans With Epic Games’ Early Access, Cloud-Based App


Creating realistic human characters for videogames isn’t easy; there are so many facets to get right, and if one is off the result just looks weird. That’s why fantasy, sci-fi and other genres are often employed, because then it doesn’t matter and you can be as creative as you like. Yet there are plenty of times when studios want to create super-realistic characters and avatars, and that process tends to be long and expensive. Epic Games has been working on a solution, its MetaHuman Creator, with Early Access launching today.


Epic Games offered a teasing peek at the software a couple of months ago, with a couple of free sample MetaHumans to play with. Now Unreal Engine developers can sign up to use the new tool and experiment for themselves, fully testing the level of detail that can be achieved and how easy MetaHuman Creator is to use.

Entirely cloud-based, MetaHuman Creator has three main components you can tweak to get your desired effect: Face, Hair and Body. These have various sub-options; with Hair, for example, you can alter the Head, Eyebrows, Eye Lashes, Beard and Mustache. As you can see from the screenshots, what’s incredible is the level of detail MetaHuman Creator can provide whilst still allowing you to edit in real-time. “The MetaHuman Creator uses a library of real scans of people, so when you move the handles around to create a character, you’re mixing together these different parts of real people to create something that’s still really believable,” says James Golding, Technical Director at Epic Games.

Once you’re happy with the result, your MetaHuman can be downloaded directly into Unreal Engine or transferred for further editing in applications such as Autodesk Maya. The tool should be especially beneficial to indie devs working in Unreal Engine, as this level of fidelity tends to only come from AAA studios with big enough budgets and time to spend on character realism.


MetaHuman Creator also features a ‘Level of Detail’ (LOD) option which simplifies the materials for each character, reducing quality in order to populate crowds, for example, or to ensure performance is maintained across a range of devices. Thus it could be a great tool for virtual reality (VR) developers looking to create immersive worlds with characters that look strikingly real. Imagine an Epic Games MetaHuman combined with the tracking features of the HP Reverb G2 Omnicept.
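As a rough illustration of how forcing a lower LOD looks in Unreal C++ (assuming the MetaHuman is rendered through a standard skeletal mesh component; this snippet is not part of MetaHuman Creator itself):

```cpp
// Hypothetical crowd-detail helper; names are assumptions for the example.
#include "Components/SkeletalMeshComponent.h"

void SetCrowdDetail(USkeletalMeshComponent* MetaHumanMesh, bool bIsCloseToCamera)
{
    // SetForcedLOD: 0 lets the engine pick the LOD automatically,
    // while a value N > 0 forces LOD index N-1. Distant crowd members can be
    // pinned to a coarse LOD so the renderer never pays for full facial detail.
    MetaHumanMesh->SetForcedLOD(bIsCloseToCamera ? 0 : 5);
}
```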

The online tool could also see the rise of more realistic Virtual Beings. These are AI-driven characters like Fable’s Lucy – who recently attended the Sundance Film Festival – which can respond to your actions and remember them, creating a dynamic narrative in the process.

You can request access to the MetaHuman Creator tool today, with applicants being gradually granted access. Epic Games will also provide 50+ ready-made MetaHumans, which you can download from Quixel Bridge to use in your project. For further updates keep reading VRFocus.

SteamVR Update Brings Full OpenXR 1.0 Support


With the update to version 1.16, SteamVR now offers full support for the OpenXR standard. This is an important step towards making it easier for developers to make the jump to different platforms.


The idea behind OpenXR is to create an interface that works across all the major platforms. If, for example, a game that uses OpenXR is developed for Steam, the same product could also be released on the Oculus store, which does not support SteamVR.

OpenXR has been in development for several years and has gained the backing of many key players in the XR space. Version 1.0 of the standard was announced in 2019 and is slowly but steadily finding its way into the most important VR platforms and game engines, including Oculus Quest & Rift, Windows Mixed Reality, Unity, Unreal Engine, SteamVR and more. However, since mobile platforms offer significantly less performance than a PC, developers will still have to put plenty of work into proper ports that do justice to the different devices.

Alongside the OpenXR support, the update also brings a number of general improvements to SteamVR. You can find the complete list of changes here.

(Source: Road to VR)


Make Sweet Electro Music in 2021 With Korg Gadget-VR


Out of nowhere, legendary synth and keyboard company Korg has announced it’s getting into virtual reality (VR). In a future product preview, Korg has unveiled all-in-one music production software Gadget-VR, with very few details other than a teaser video.


Korg has brought its Gadget app to numerous platforms over the years, whether you’re on a laptop, an iOS device or even a Nintendo Switch. This year it’ll be VR’s turn, with music producers able to surround themselves with virtual synthesizers, drum machines, samplers and other audio wizardry.

The company has yet to go into detail about how this will all work, or even how it will connect to other hardware and software you may have. At the beginning of the video there’s most definitely an Oculus Quest 2, so that support looks to be a given, and the app is being built using Unreal Engine. VRFocus would expect to see a PC VR version as well, but we’ll have to wait and see.

As with any of the Korg Gadget apps, it’s all about being able to chop and change between those various gadget instruments – Korg Gadget 2 features over 40, for example. Korg Gadget-VR definitely looks to include some of them, with the likes of ‘Miami’, a Monophonic Wobble Synthesizer; ‘Warszawa’, a Wavetable Synthesizer; and ‘Kingston’, a Polyphonic Chip Synthesizer all clearly appearing in the video.


The video does confirm a launch is slated for 2021, so hopefully further details will be released soon.

There are plenty of ways to get creative in VR when it comes to music, from entry-level gamified titles like Electronauts to full-blown DJ apps like TribeXR. When it comes to making music from scratch you have Modulia Studio, Virtuoso and AliveinVR. Korg is going to be a major addition to this genre thanks to its impressive history. As further details on Korg Gadget-VR are released, VRFocus will let you know.

Theia Interactive Secures Epic MegaGrant for Unreal Engine VR Tool ‘Bigroom’


Known for its immersive work in fields such as architecture, aerospace and entertainment, Theia Interactive’s latest project is Bigroom, a collaborative virtual reality (VR) tool for Unreal Engine. This week the company announced it’s been awarded an Epic MegaGrant to continue development.


This is the second MegaGrant Theia Interactive has received from Epic Games’ initiative; how much remains undisclosed. The money will aid the expansion of the team, helping to expedite research and development. Currently in closed beta, Bigroom allows Unreal Engine developers to meet within a project and edit the design together in real-time, whether in VR or via desktop.

“Theia is working toward bringing real-time rendering and collaboration tools to the virtual design review process for multiple industries,” said Bill Fishkin, founder & CEO of Theia Interactive in a statement. “This Epic MegaGrant will help us improve the integration with Unreal Engine and deliver the next generation collaborative VR platform with BigRoom.” 

Bigroom includes a range of features to aid architects, project managers and designers in the creative process. These include configurable and customizable options to help scale a project, presentation boards for easy visualisation, 3D scale models suited to VR, bookmarking and comparison tools, and task lists and Post-It notes.


After starting as a visualization company in 2014, Theia Interactive launched Optim in 2018, an Unreal Engine tool to help speed up content creation and improve workflow. With BigRoom still in development, the company has revealed it’ll be conducting an open beta soon.

Remote collaboration is nothing new in the VR industry, especially over the past year as more companies have looked for new ways to connect other than video calling. From industrial apps like SkyReal and product design platform Gravity Sketch all the way up to full international conferencing tools, communicating virtually has never been easier.

As Theia Interactive releases further details on Bigroom, VRFocus will keep you updated.

Oculus Unity & Unreal SDKs Deprecate Oculus Go Support

Facebook no longer supports Oculus Go in the latest version of the Oculus Unity & Unreal Integrations.

Oculus Go launched in May 2018, just over two years ago, as Facebook’s first standalone headset. Priced at $199, it’s primarily used for passive consumption of immersive and traditional media.

In January Facebook removed Go from its enterprise offering, and in June stopped selling it to consumers, vowing no more 3DoF VR products.

This deprecation means Go isn’t officially supported in v19 of the Unity & Unreal Oculus integrations, and could lead developers to stop updating Go versions.

Developers can still use v18 to develop for Go, but the Oculus Go Store will stop accepting app updates and new apps in December.

3DoF Input: Hard To Accommodate?

Go’s media-viewing use cases emerged around its wireless, decent-resolution experience and its major limitation, one which no other major headset has. Go can only track your head’s rotation, not its position. If you lean forward, backward, or to the side, the entire world moves as if attached to your head. This is an uncomfortable feeling and can make some people feel sick.
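The practical difference shows up clearly in code: a 3DoF view matrix is built from head orientation alone, while a 6DoF one also folds in the tracked head position. The sketch below uses GLM for the maths and is purely illustrative; it is not Oculus SDK code.

```cpp
// Illustrative only: why 3DoF feels like the world is glued to your head.
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
#include <glm/gtc/matrix_transform.hpp>

// 3DoF: rotation only. Leaning changes nothing, so the scene follows your head.
glm::mat4 ViewMatrix3DoF(const glm::quat& headOrientation)
{
    return glm::inverse(glm::mat4_cast(headOrientation));
}

// 6DoF: rotation plus tracked position, so you can lean and peer around objects.
glm::mat4 ViewMatrix6DoF(const glm::quat& headOrientation, const glm::vec3& headPosition)
{
    const glm::mat4 camera =
        glm::translate(glm::mat4(1.0f), headPosition) * glm::mat4_cast(headOrientation);
    return glm::inverse(camera);
}
```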


But what is more likely behind Go’s engine deprecation is its controller (there is only one), which has the same limitation. This means it works as a virtual laser pointer, not as hands.

Version 19 of the core Oculus Mobile SDK, used by developers of native-Android Quest apps or open source engines, does not deprecate support for Go.

The Oculus Unity & Unreal SDK integrations provide a number of modifiable helper scripts, assets, and examples to developers. As Facebook’s views and understanding of spatial design change over time, it may want to take paths that simply don’t work with a single rotational laser pointer locked in space.

Almost all VR Go apps are made with Unity or Unreal, so developers may choose to no longer release updates to Go versions. Facebook retired its Rooms social service on Go late last year and says it will support its upcoming Horizon social networking service on the Rift Platform and Quest, saying “full interactivity is core to the Facebook Horizon experience.”

While initially enthusiastic about 3DoF VR with Samsung Gear VR in 2014 and Go in 2018, the success of Quest means 3DoF just doesn’t seem to have a place in Facebook’s future VR plans.
