Epic Games is “exploring native Unreal Engine support” for Apple Vision Pro

Unity, maker of the popular game engine, announced earlier this week that it’s preparing to levy some pretty significant fees on developers, causing many to rethink whether it makes more sense to go with the main competition, Unreal Engine from Epic Games. Epic, it seems, isn’t wasting any time courting those creating projects for Apple Vision Pro.

According to Victor Lerp, Unreal Engine XR Product Specialist at Epic Games, the company is now “exploring native Unreal Engine support for Apple Vision Pro,” the upcoming mixed reality headset due to launch in early 2024.

Lerp says it’s still early days though, noting that it’s “too early for us to share details on the extent of support or timelines.”

Lerp posted the statement on Unreal Engine’s XR development forum. You can read it in full below, courtesy of Alex Coulombe, CEO of the XR creative studio Agile Lens:


During Vision Pro’s unveiling at WWDC in June, Apple prominently showcased native Unity support in its upcoming XR operating system, visionOS. Unity began offering beta access to its visionOS-ready engine shortly afterwards, which makes the new fees feel like something of a ‘bait and switch’ for developers already creating new games for Vision Pro or porting existing titles to it.

As explained by Axios, Unity’s new plan will require users of its free tier of development services to pay the company $0.20 per installation once their game passes both thresholds: 200,000 downloads and $200,000 in revenue. Subscribers to Unity Pro, which costs $2,000 a year, get a different fee structure that scales downwards as install counts grow. What constitutes an ‘install’ is still fairly nebulous at this point, despite follow-up clarifications from Unity. Whatever the case, the change is set to go into effect on January 1st, 2024.
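To make the reported arithmetic concrete, here is a minimal sketch of the free-tier rule using only the figures above. The function and its names are hypothetical, not Unity’s actual billing logic, and how an ‘install’ is counted remains the open question:

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical sketch of Unity's reported free-tier runtime fee:
// $0.20 per new install, charged only once a game has passed BOTH
// 200,000 lifetime installs and $200,000 lifetime revenue.
double MonthlyRuntimeFeeUSD(uint64_t lifetimeInstalls,
                            double lifetimeRevenueUSD,
                            uint64_t newInstallsThisMonth)
{
    const uint64_t kInstallThreshold = 200000;
    const double kRevenueThresholdUSD = 200000.0;
    const double kFeePerInstallUSD = 0.20;

    // No fee at all until both thresholds are crossed.
    if (lifetimeInstalls < kInstallThreshold ||
        lifetimeRevenueUSD < kRevenueThresholdUSD)
        return 0.0;

    return newInstallsThisMonth * kFeePerInstallUSD;
}

int main()
{
    // e.g. a free-to-play hit past both thresholds adding 50,000 installs
    std::printf("Fee this month: $%.2f\n",
                MonthlyRuntimeFeeUSD(250000, 300000.0, 50000));
    return 0;
}
```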

In the meantime, the proposed Unity price increase has caused many small to medium-sized teams to weigh whether to switch to the admittedly more complicated Unreal Engine, or pursue other game engines entirely. A majority of XR game studios fit into that category, and the fees could hobble teams looking to replicate free-to-play success stories like Gorilla Tag, which generated over $26 million in revenue after it hit the Quest Store late last year.

Epic’s MetaHuman Avatar Tool Can Now Import Scans of Real People… Maybe Even You, One Day

Epic Games, the company that makes Unreal Engine, recently released a substantial update to its MetaHuman character creation tool which, for the first time, allows developers to import scans of real people for use in real-time applications. The improvements offer a glimpse of a future where anyone can easily bring a realistic digital version of themselves into VR and the metaverse at large.

Epic’s MetaHuman tool is designed to make it easy for developers to create a wide variety of high quality 3D character models for use in real-time applications. The tool works like an advanced version of a ‘character customizer’ that you’d find in a modern videogame, except with a lot more control and fidelity.

A 3D character made with MetaHuman | Image courtesy Epic Games

On its initial release, developers could only start formulating their characters from a selection of preset faces, then use the tools from there to modify the character’s look to their taste. Naturally, many experimented with trying to recreate their own likeness, or that of recognizable celebrities. Although MetaHuman character creation is lightning fast—compared to creating a comparable model manually from the ground up—achieving the likeness of a specific person remains challenging.

But now the latest release includes a new ‘Mesh to MetaHuman’ feature which allows developers to import face scans of real people (or 3D sculpts created in other software) and then have the system automatically generate a MetaHuman face based on the scan, including full rigging for animation.

There are still some limitations, however. For one, hair, skin textures, and other details are not automatically generated; at this point the Mesh to MetaHuman feature is primarily focused on matching the overall topology of the head and segmenting it for realistic animation. Developers will still need to supply skin textures and do some additional work to match hair, facial hair, and eyes to the person they want to emulate.

The MetaHuman tool is still in early access and intended for developers of Unreal Engine. And while we’re not quite at the stage where anyone can simply snap a few photos of their head and generate a realistic digital version of themselves—it’s pretty clear that we’re heading in that direction.

– – — – –

However, if the goal is to create a completely believable avatar of ourselves for use in VR and the metaverse at large, there are still challenges to be solved.

Simply generating a model that looks like you isn’t quite enough. You also need the model to move like you.

Every person has their own unique facial expressions and mannerisms, which are easily identifiable by the people who know them well. Even if a face model is rigged for animation, unless it’s rigged in a way that’s specific to your expressions and able to draw on real examples of them, a realistic avatar will never look quite like you when it’s in motion.

For people who don’t know you, that’s not too important because they don’t have a baseline of your expressions to draw from. But it would be important for your closest relationships, where even slight changes in a person’s usual facial expressions and mannerisms could indicate a range of conditions like being distracted, tired, or even drunk.

In an effort to address this specific challenge, Meta (not to be confused with Epic’s MetaHumans tool) has been working on its own system called Codec Avatars which aims to animate a realistic model of your face with completely believable animations that are unique to you—in real-time.

Perhaps in the future we’ll see a fusion of systems like MetaHumans and Codec Avatars; one to allow easy creation of a lifelike digital avatar and another to animate that avatar in a way that’s unique and believably you.

Oculus Unreal Engine 5 Branch Now Available, But Key Features Don’t Work In VR

An Oculus branch of Unreal Engine 5 is now available, enabling development for Quest 2.

Unreal Engine 5 shipped earlier this month after launching in Early Access last year. On modern PCs and next-gen consoles, its ‘Nanite’ geometry system brings a radically new approach to how games are made and rendered. In previous engines, artists import reduced-detail versions of the original assets they create. When the camera moves far enough away from those assets, an even lower-detail version generated in advance (either manually or automatically) is displayed instead. This is called LOD (Level of Detail).

Nanite upends this approach. Artists import the full movie-quality assets and the geometric detail is scaled automatically in real time based on your distance from the model. Virtual objects look incredibly detailed up close, and don’t “pop in” or “pop out” as you move away from them.
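For contrast, here is a rough sketch of the traditional LOD selection step that Nanite does away with. The types and distance thresholds are hypothetical, but the distance-based mesh swap is exactly the mechanism described above:

```cpp
#include <vector>

// Hypothetical sketch of classic LOD selection: each asset ships with a
// few pre-authored meshes at decreasing detail, and the renderer picks
// one per frame from the camera distance. Nanite removes this step by
// scaling geometry continuously from a single full-detail source mesh.
struct Mesh { /* vertex/index buffers elided */ };

struct LODLevel {
    float maxDistance;  // use this mesh while the camera is closer than this
    Mesh mesh;
};

// Levels are assumed sorted nearest-to-farthest; the visible "pop"
// happens the instant cameraDistance crosses one of these boundaries.
const Mesh& SelectLOD(const std::vector<LODLevel>& lods, float cameraDistance)
{
    for (const LODLevel& level : lods)
        if (cameraDistance < level.maxDistance)
            return level.mesh;
    return lods.back().mesh;  // farthest, lowest-detail fallback
}
```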

However, Nanite and its associated Lumen lighting system don’t work in VR and aren’t even available on Android. VR developers have to use the legacy geometry and lighting systems, negating many of the advantages of the new engine.

If you’re migrating from Unreal Engine 4, Epic has an important guide on how to do this.

Meta notes the following features are not yet implemented:

  • passthrough or spatial anchors
  • late latching
  • Application SpaceWarp
  • mobile tonemap subpasses
  • UE5 ports of sample and showcase projects

As such, Meta still recommends sticking with Unreal Engine 4.27 for serious app development.

To access the Oculus UE5 branch you first need to register your GitHub ID with Epic – if you don’t do this you’ll get a 404 error when trying to access it.

Epic Games Raises $2B to Further Expand Its Metaverse Ambitions

Epic Games announced it’s doubled the amount of cash it raised last year to kick off its metaverse ambitions. Now the company, maker of Unreal Engine and the popular battle royale shooter Fortnite, says it’s secured $2 billion to further build out its future metaverse platforms.

Nearly a year after its stunning $1 billion raise, Epic has secured dual $1 billion investments from Sony Group Corporation and KIRKBI, the holding and investment company behind The LEGO Group.

Sony has become something of a serial investor in Epic. In 2020, Sony invested $250 million in Epic with the goal of creating “real-time 3D social experiences leading to a convergence of gaming, film, and music.” That was before the ‘metaverse’ buzzword erupted, but it’s clear where Sony was headed.

Now KIRKBI, the private holding and investment company of the Kirk Kristiansen family, owners of The LEGO Group, is looking to jump into the rapidly expanding metaverse trend.

“Epic Games is known for building playful and creative experiences and empowering creators large and small,” said Søren Thorup Sørensen, CEO, KIRKBI. “A proportion of our investments is focused on trends we believe will impact the future world that we and our children will live in. This investment will accelerate our engagement in the world of digital play, and we are pleased to be investing in Epic Games to support their continued growth journey, with a long-term focus toward the future metaverse.”

Tim Sweeney, CEO and founder of Epic, says the investment will “accelerate our work to build the metaverse and create spaces where players can have fun with friends, brands can build creative and immersive experiences and creators can build a community and thrive.”

Epic says it will continue to have a single class of common stock, and remains controlled by its CEO and founder, Tim Sweeney. This brings Epic’s post-money equity valuation to $31.5 billion.


Epic Confirms PSVR 2 Unreal Engine 5 Projects In Development

During its State of Unreal presentation this week, Epic Games confirmed that there are PSVR 2 titles in development using Unreal Engine 5.

Back in May 2020, Epic Games unveiled the next generation of its engine, Unreal Engine 5, which promised big graphical and technical leaps. That demo focused particularly on two big new technologies — Nanite (a geometry technology that lets artists create highly intricate details on objects and surfaces) and Lumen (a dynamic global illumination system for realistic lighting).

Epic Games confirmed to UploadVR at the time that Unreal Engine 5 would continue support for all current VR headsets.

As announced in yesterday’s presentation, Unreal Engine 5 is now available to all developers. At around the 26-minute mark of the presentation video, Dana Cowley, Communications Director, Technology at Epic Games, spoke in front of a large graphic featuring many game studio logos. She then said that Epic was “thrilled to confirm that all of the amazing studios that you see here are a part of the Unreal Engine 5 community, which continues to grow every day.”

Interestingly, the PlayStation VR2 logo was displayed in the top right, even though it’s a piece of hardware and not a game studio itself. That being said, the intent is clear — there are studios who are developing PSVR 2 games using Unreal Engine 5.

If you take a closer look, you’ll also spot a few other studios and publishers who have flirted with VR before. Most notably, Fracked and Phantom developer and Little Cities publisher nDreams is listed. Just last week, nDreams announced a $35 million investment and confirmed it was working on PSVR 2 titles.

Bloober Team is also listed, a studio that has previously brought some of its flatscreen titles, like Layers of Fear, over to VR headsets.

Devolver Digital, which previously published VR games like Gorn and, more recently, Tentacular, is also listed. Last but not least, 2K is listed too — a subsidiary of parent company Take-Two (which has said more VR titles are on the way) and publisher of the Borderlands franchise, which took its own brief dive into VR.

Keep your eyes open for more PSVR 2 news as the year progresses — or even just one eye, if you’re using eye tracking.

Women in the Metaverse: Veronica Lynn Harper

Multidisciplinary artist Veronica Lynn Harper knows no bounds. With over 15 years of experience in 3D character design and asset development, she’s worked with some of the gaming industry’s most notable trailblazers — including Sony, Electronic Arts and Atari. She’s even hosted themed events for Disney and Blizzard and spoken on a panel for women at the Sundance Film Festival.

With Web3 on the horizon, Veronica’s next ventures include exploring new forms of digital storytelling and turning them into more immersive experiences. She’s already created a roster of digital collections — such as downloadable stories, an upcoming NFT collection and much more.

As part of our ongoing series, we recently sat down with Veronica to chat about her humble beginnings, her upcoming projects in the Web3 space and what her idea of a more immersive, opportunistic internet might look like one day.

Beginnings

From her earliest memory, Veronica recalls having a distinct passion for art and various forms of expression — whether it was through sports, dance or creating things with her hands, voice or brain. From a childhood Disney fandom to attending Cirque du Soleil and Broadway musicals, she especially adored the stylisation behind animation and how it was able to captivate audiences — even when the characters weren’t real. 

Her passion for creative storytelling even persuaded her grandparents to bring her to MGM Studios — where she was able to watch creators “writing and drawing on the spot” and witness processes unfolding in real-time. “Once you study characters and emotions, it’s fun to create stories around them,” she says.

Veronica cites Dan Platt, an established character designer and clay artist, as her first mentor. He guided her through tools and techniques of sculptural form, equipping her with the right foundational knowledge to kickstart a career in 3D design.

Today, she’s found a way to marry her earlier passions for storytelling, practical art and 3D rendering. Amongst her latest projects are fine art statues, fashion textiles, and digitised iterations of her work across VR, AR and XR experiences.

A modern-day Rorschach

Veronica has often been referred to as a “modern-day Rorschach”. Converging sight and sound, she uses music to induce a flow state and allows it to conduct her creation process, crafting visual iterations of each sound and its vibrational pattern. Depending on her audio of choice, her shapes will differ. 

“Patterns teach people about flow state, passion and what they’re doing,” she says. When detailing her process, she uses the analogy of a surfer riding a wave — when they’re able to “carve out a shape during flow state, they communicate with the water. Their design will be different, depending on how they’re in that space and in that moment.” 

The same can be said for artwork: “If that’s translated in a pure way without any thoughts breaking up the flow state, you’ll get the truest, most real thing in that experience.”

Veronica has practised on her own materials with music, quickly digitising pieces and creating patterns that she feels will transfer easily across various industries (such as interior design, digital 3D stories, fine art or collectables). She has also deployed different music genres to guide her work — including tropical house, jungle bass, psytrance, EDM, dubstep and hip-hop.

She sees applying sound patterns to fashion as a good way to measure whether her viewers will connect with them as intensely as she has. “You can make rad art and nobody feels anything, but I like to make art that other people have experienced things from — [art] that has made them feel more connected or that has given them something in return.”

Veronica has painted for several DJs at major music festivals (such as Lightning in a Bottle and Envision) — work she cites as some of the greatest experiences in her career thus far. In the last year, virtual concerts have become all the rage — with artists like Ariana Grande, Travis Scott and most recently, the Foo Fighters taking to metaverse stages to reach wider and more diverse audiences. 

In light of this, one of Veronica’s next projects includes dressing DJs and other musicians on stage with clothing that will bear the patterns of their musical genius. However, she’d also like to achieve this within a metaverse space, driving visuals with her motion-capture suit and building 3D assets that can be digitised and live anywhere. Having recently consulted with Nike about carrying her concepts into virtual spaces, she’s learning both digital and practical pipelines — which make her comfortable with art-directing for both physical and digital realms.

Capturing movement

To Veronica, flow state is “harmony between mind, body and energy, as well as mental and physical health.” Her relationship with movement and sound has formed an ideal foundation for her upcoming NFT collection, which features stunning, almost-pearlescent visual stills of her movement in stasis.

To create these images, Veronica works with various motion capture partnerships — one being OptiTrack, a camera-based motion capture system, and the other Xsens, a wearable system equipped with inertial sensors. Her motion capture gloves, which capture stunningly nuanced hand movements through the support of machine learning, are produced by the team at Stretchsense.

She’s also partnered with HTC Vive and Faceware for all of her facial animation work. Additionally, she is a huge supporter of Wacom and Logitech — all while developing visuals with software such as Maya, Substance, Photoshop, ZBrush, Marvelous Designer and Clo3d.

Building a more inclusive Web3

On the topic of building a more inclusive internet, Veronica has stressed the importance of a digital workforce — particularly in a post-COVID climate. “If you can do anything web or digital-based, your survival rate is better. Especially for countries that don’t have drivable access to studios for sustainability,” she says. By continuing to form and hold new partnerships, she hopes to experience new tech and showcase new ways for it to be used. 

Like many women in tech, Veronica also speaks of being in the minority throughout the course of her career. Having worked in the games industry for over 15 years, she recounts being one of three women on a team of 500 asset developers at Sony, and one of 250 women at Electronic Arts. In recent years, she’s been pleased to see more women join the gaming space — an effort she’s always been eager to support.

Going forward, Veronica hopes to see a version of the internet that is open to everyone of all backgrounds, sexual orientations, ages and identities. Moreover, she notes that the very nature of the metaverse is all-inclusive — meaning that it should continue to bridge both practical and digital worlds. In Web3, “tech will continue to evolve and it is at the voice of the people.”

Her advice to other women? “If you want something to change or be different, apply to a team where you feel something is missing.”

What’s next?

Some of Veronica’s most recent work includes a project with leading design firm Gensler, where she’s providing visual content for buildings with built-in digital media, a large AT&T wall in Dallas’ Discovery District and other upcoming public installation projects.

She’s also accepted representation from the renowned Patrick Jones Gallery, also based in Dallas, whose notable contemporary artist offerings include Banksy, Andy Warhol, Invader, Arsham, Dicke and more. In addition, she’s designing practical art sculptures and wall art with downloadable digital NFT stories.

She’s also the mastermind behind a series of bespoke 3D renderings — including a character named Bunnii — which she’s brought to life in Unreal Engine’s MetaHuman Creator. She plans to animate her renders with her motion capture gear.

Currently, Veronica is seeking partnerships for public art installations, museums, events, conventions and stage performances — whether on a digital or a practical stage. She’s even in the midst of working on an XR stage in Unreal Engine, where she hopes to recreate the same energy she’s sparked at real-world gigs.

To find out more about Veronica’s next projects, be sure to follow her Twitter, YouTube, Instagram and LinkedIn pages for more updates. Her work can also be further explored on her official website.

Epic Games Awarded Grants to 31 XR Projects in 2021 Through the $100M MegaGrants Fund

Since 2019 Epic Games (well known as the creators of Unreal Engine & Fortnite) has run the Epic MegaGrants program, a $100 million fund to financially support projects built with Unreal Engine. In 2021 the program awarded grants to 31 XR projects.

Epic recently recapped the complete list of MegaGrant recipients in 2021, comprising a whopping 390 individual projects, each of which received a grant from the program of up to $500,000.

By our count, 31 of those were built with XR in mind. The projects range widely, from games to simulation to education and more. Here are a few that caught our eye, along with the complete list of XR recipients further below.

BRUNNER Elektronik – Unreal Engine Integration for NOVASIM Flight Simulator

HumanCodeable – Advanced VR Framework

Tribe XR – DJ in VR

VRSpeaking LLC. – Ovation

All Epic MegaGrant XR Recipients in 2021

  1. 6th Sense VR – Ayatana Concept (France)
  2. ALO VR – VigourVR (Singapore)
  3. Art Reality Studio – IVR 6 (United States)
  4. BRUNNER Elektronik AG – Unreal Engine Integration for BRUNNER NOVASIM VR/MR (Switzerland)
  5. b.ReX GmbH – Intelligent Cycling
  6. Byker Biotech Pty Ltd – VR Lab for 3D human specimens (China)
  7. Dmitro Tsalko – EnergoVR (Ukraine)
  8. Eternal Monke Games – Dragon.IK – Universal Inverse Kinematics Plugin
  9. HumanCodeable – Advanced VR Framework (Germany)
  10. IMP – Interactive Media Production – VR Fire Safety Simulator
  11. Lightscape VR (Germany)
  12. North Carolina State University (NCSU) – transVRse (United States)
  13. ONMOTIO – Immersive Technical Training in Inhospitable or Dangerous Zones Using VR (Canada)
  14. Research Center for Molecular Medicine (CeMM) – DataDiVR Interactive Data Analytics Platform (Austria)
  15. Tribe XR – DJ in VR (United States)
  16. University of North Carolina at Asheville – Interfacing Unreal with Physical Computing & New Media VR Pedagogy (United States)
  17. Vantari VR – Critical Care Procedural Training Suite (Australia)
  18. VRSpeaking LLC. – Ovation (United States)
  19. Western University – Using AR/VR to Improve Pediatric Surgical Patient’s Hospital Experience (Canada)
  20. Raytracer PTY Ltd. – Underwater Virtual Reality Simulation for Astronaut Training (Australia)
  21. Universitat Pompeu Fabra – Mixed Reality in Fetal Intervention Using Unreal Engine (Spain)
  22. VYV Corporation – Photon Augmented Reality Studio (Canada)
  23. Lemay – AR Workflow from Revit to Reality (Canada)
  24. New Reality Co. – Rainforest: A Multiplayer AR Experience (United States)
  25. Oakland University – Augmented Reality Center (ARC) for Industrial Applications (United States)
  26. Marquette University – Immersive and Augmented Media Design Fellowship (United States)
  27. Brainstorm Multimedia S.L. – EDISON – Unreal Template Based AR Solution for Education (Spain)
  28. Createxion – Draw-It AR (India)
  29. University of Michigan – AR System for Lunar EVAs (United States)
  30. Visometry GmbH – HQ-CAD-AR on a Driving Car (Germany)
  31. RealityArts Studio / Velarion – The Stranger (Turkey)

Epic says that MegaGrants awards are not investments or loans, and recipients can use the money to do “whatever will make their project successful,” with no oversight from the company. Similarly, recipients retain full rights to their IP and can choose to publish their projects however they want.  If you’re working on something related to Unreal Engine, you can apply for consideration too!


Manus Launches its Free Motion Capture Software Polygon


Manus specialises in building enterprise-level data gloves with precision finger tracking and haptic feedback for a range of use cases including virtual reality (VR). The company is moving beyond pure hardware solutions today by releasing Manus Polygon, motion capture software that’s SteamVR compatible and free to download.


Designed as an entry point for developers looking for a simple motion capture solution without the expense, Polygon Free enables live streaming of body data into Unity or Unreal Engine. When it comes to tracker support, Polygon can be used with any SteamVR compatible device, from the Vive controllers for a basic setup to Manus’ own SteamVR Pro Trackers or the Vive Trackers. And, of course, the software is compatible with the company’s own Prime X series gloves.

For a basic motion tracking setup beyond merely using controllers, developers need enough trackers to cover six points: hands, feet, waist and head. With a VR headset on, that means five extra trackers are required. Polygon can support more, though, adding further trackers to the upper arms to finesse that digital avatar movement.
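As a quick illustration of that tracker arithmetic, here is a tiny hypothetical sketch of the six-point setup described above (illustrative only, not Manus’ actual API):

```cpp
#include <cstdio>

// Hypothetical sketch of the six-point body-tracking setup described
// above; this is illustrative only, not Manus' actual API.
enum class TrackedPoint { Head, Waist, LeftHand, RightHand, LeftFoot, RightFoot };
constexpr int kTrackedPoints = 6;

// A worn VR headset already tracks the head, so only the remaining
// five points need dedicated SteamVR trackers (or controllers).
int ExtraTrackersNeeded(bool hmdWorn)
{
    return hmdWorn ? kTrackedPoints - 1 : kTrackedPoints;
}

int main()
{
    std::printf("Extra trackers with HMD worn: %d\n", ExtraTrackersNeeded(true));
    return 0;
}
```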

“At Manus, we believe in a future where content creation in the virtual world becomes as integrated as video is currently. Convincing full-body motion capture will play a large part in the adoption and creation of the metaverse,” says Bart Loosman, CEO at Manus, in a statement. “With this release, we invite developers and content creators to dive into full-body motion capture and explore the opportunities this offers for VR, animation, digital avatars, virtual production, and the coming metaverse.”


Manus Polygon Free provides all the software functionality developers might need to get started, with Polygon Pro and Polygon Elite offering further professional features. Polygon Pro features recording and editing tools within Manus Core, as well as FBX exporting, timesync and genlock. Pro users also get the Manus Strap Set to attach SteamVR compatible trackers. Taking that a step further, Polygon Elite includes the Pro bundle, a perpetual license, Manus SteamVR Pro Trackers and a charging station.

The Manus SteamVR Pro Trackers were announced earlier this year, with pre-orders taken for them individually. On the Manus website, however, they currently only seem to come in a 6-pack retailing for €1,999, available Q4 2021. By comparison, six Vive Trackers would set you back €834.

For continued updates from Manus, keep reading VRFocus.

Unreal Engine Gets OpenXR Improvements Just in Time for Oculus Development Shift

The latest version of Unreal Engine 4, version 4.27, brings “production-ready” support for OpenXR. The change comes just in time for Oculus developers, as the company recently announced it’s fully moving over to the OpenXR standard for VR development going forward.

You may recall news back in May that the early access version of Unreal Engine 5 includes improved support for OpenXR. And while developers can play with that version of the engine and its XR tools, the engine’s creator, Epic Games, doesn’t recommend the early access version of Unreal Engine 5 for anything more than experiments at this point. For developers building anything they intend to ship to the public, the company still recommends Unreal Engine 4.

Since Unreal Engine 5 itself isn’t ready for prime time, Epic Games is continuing to update the production-ready Unreal Engine 4. The latest release, version 4.27, is the first that the company says includes production-ready OpenXR capabilities.

OpenXR is a royalty-free standard that aims to standardize the development of VR and AR applications, making hardware and software more interoperable. The standard has been in development since April 2017 and is backed by virtually every major hardware, platform, and engine company in the VR industry, including key AR players.

OpenXR has seen slow but steady adoption since reaching version ‘1.0’ in 2019, and in the last 12 months it has significantly picked up pace, with SteamVR officially supporting it back in February and Oculus announcing last month that it’s going “all in” on the standard, saying that all new developer features will be built on top of OpenXR going forward.

That makes Unreal Engine 4.27 a timely release; when Oculus said it would be fully shifting to OpenXR development last month, it put XR developers in an odd spot because the two biggest game engines, Unity and Unreal Engine, didn’t yet claim to offer production-ready OpenXR support.

Unity developers will have to wait a while longer before they can confidently make the leap to OpenXR, as Oculus expects that the Unity OpenXR plugin won’t be “fully supported” until early 2022. That will be a bigger deal once it finally happens, because Unity is far and away the more widely used engine when it comes to building XR content.

But maybe developers should take another look… last we checked, Oculus still has a special deal for developers building VR apps with Unreal Engine; the company offers to cover engine royalties for a game’s first $5 million in revenue.

Epic says that the OpenXR implementation in Unreal Engine 4.27 supports extension plugins from the Unreal Marketplace, which means that developers can add extra OpenXR functionality through plugins rather than waiting for updates to the entire engine.
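For the curious, extension discovery at the raw OpenXR level looks roughly like the sketch below, using the standard C API’s two-call enumeration pattern (this is plain OpenXR, not Unreal code; Unreal’s extension plugins surface this same named-extension mechanism inside the engine):

```cpp
#include <cstdio>
#include <vector>
#include <openxr/openxr.h>

// List the OpenXR extensions the installed runtime exposes. Optional
// features (hand tracking, passthrough, etc.) appear here as named
// extensions that an application can query and enable.
int main()
{
    uint32_t count = 0;
    // First call: ask how many extension properties the runtime has.
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);

    // Second call: fill the array (each element must be pre-typed).
    std::vector<XrExtensionProperties> props(count, {XR_TYPE_EXTENSION_PROPERTIES});
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data());

    for (const XrExtensionProperties& p : props)
        std::printf("%s (v%u)\n", p.extensionName, p.extensionVersion);
    return 0;
}
```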

New VR and AR Development Templates

Unreal Engine 4.27 also adds an improved VR Template which Epic says is “designed to be a starting point for all your VR projects,” and comes with basic VR capabilities built-in, like teleport locomotion, snap rotation, object grabbing, a spectator camera, and a VR-capable menu system.

The VR Template offers support for Oculus Quest 1 & 2, Quest with Oculus Link, Rift S, Valve Index, HTC Vive, and Windows Mixed Reality. Thanks to OpenXR, Epic says that “the template’s logic works on multiple platforms and devices without any platform-specific checks or calls.”
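As an example of what one of those built-in capabilities amounts to, here is a minimal, hypothetical sketch of snap rotation; the real template implements this in Blueprint, so the names and thresholds below are illustrative only:

```cpp
#include <cmath>

// Hypothetical snap-turn logic: a deliberate thumbstick flick rotates
// the player's yaw by a fixed step instead of turning smoothly, which
// many users find more comfortable in VR.
struct Player { float yawDegrees = 0.0f; };

class SnapTurn {
public:
    void Update(Player& player, float thumbstickX)
    {
        const float kDeadzone = 0.7f;    // require a deliberate flick
        const float kStepDegrees = 45.0f;

        const bool engaged = std::fabs(thumbstickX) > kDeadzone;
        if (engaged && !wasEngaged_) {   // trigger once per flick, not per frame
            player.yawDegrees += (thumbstickX > 0.0f) ? kStepDegrees : -kStepDegrees;
            player.yawDegrees = std::fmod(player.yawDegrees + 360.0f, 360.0f);
        }
        wasEngaged_ = engaged;
    }

private:
    bool wasEngaged_ = false;
};
```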

For PC VR, Unreal Engine 4.27 also includes experimental support for fixed foveated rendering, a technique which reduces the quality of peripheral imagery in favor of higher quality at the center where the user can see most sharply through the lens. Fixed foveated rendering in Unreal Engine 4.27 is currently limited to Windows platforms with DX12 and a GPU supporting VRS Tier 2.
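To illustrate the idea (a conceptual sketch, not Unreal’s implementation), fixed foveation boils down to assigning coarser shading rates to screen tiles the farther they sit from the lens center:

```cpp
#include <cmath>

// Conceptual sketch of fixed foveated rendering with hardware Variable
// Rate Shading: tiles near the lens center get one shading sample per
// pixel, while peripheral tiles share one sample across a pixel block.
enum class ShadingRate { Rate1x1, Rate2x2, Rate4x4 };

// (u, v) is a tile center in [0,1]^2; the lens center is assumed at (0.5, 0.5).
ShadingRate PickShadingRate(float u, float v)
{
    const float du = u - 0.5f, dv = v - 0.5f;
    const float r = std::sqrt(du * du + dv * dv);  // distance from lens center

    if (r < 0.25f) return ShadingRate::Rate1x1;  // full quality in the fovea
    if (r < 0.45f) return ShadingRate::Rate2x2;  // mid periphery
    return ShadingRate::Rate4x4;                 // far periphery
}
```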

Unreal Engine 4.27 also includes a new template for handheld AR development which is designed as a starting point for developers building AR apps based on ARCore (Android) and ARKit (iOS). Similar to the VR Template, the AR Template includes basics like a built-in UI, a tool for users to take snapshots of the AR content, and the ability to move, rotate, and scale models placed into the world.

The new engine update further includes heaps of improvements to Unreal Engine’s virtual production tools which are designed to combine real-time CGI environments with live-action filmmaking. Check out the complete patch notes for Unreal Engine 4.27 here if you want to go in-depth.
