LIV iOS Open Beta Offers Easy Oculus Quest Mixed Reality Capture

LIV’s iOS app is in open beta, offering an easy way to capture mixed reality footage with select games using just an iPhone and an Oculus Quest headset.

An early version of the software was available back in March 2020, but was taken down and improved to make it more reliable and accessible. Now, the new open beta gives anyone with a Quest and an iOS device an accessible way to start capturing mixed reality footage.

The app is available in open beta through Apple’s beta testing platform, TestFlight. Users can install the app on their iOS device through TestFlight and will then also need to install the LIV Capture app for Oculus Quest, which is available for sideloading through SideQuest.

Once everything is installed, the LIV Capture app on Quest runs the user through a quick calibration process to align everything. Once that’s complete, the LIV app can show a mixed reality view of select Quest games, where the player is overlaid onto a third-person perspective view of the virtual world.

Mixed reality capture has been available in varying capacities across many VR systems for some time now, but the LIV iOS app makes it much more accessible for Quest users. Only a phone and a Quest headset are needed — no additional equipment is required, not even a green screen. The app is able to identify the user against any background and dynamically place them into a third-person perspective of the virtual world, with generally positive results. Users can record the mixed reality view by using iOS’s built-in screen recorder.
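
The compositing step LIV describes — segmenting the player out of the camera feed and overlaying them on the virtual view — boils down to an alpha blend driven by a person mask. Here's a minimal sketch of that idea in NumPy (the function name and inputs are hypothetical illustrations, not LIV's actual code):

```python
import numpy as np

def composite_player(virtual_frame, camera_frame, person_mask):
    """Overlay the segmented player onto a third-person render of the
    virtual world. `person_mask` is a float array in [0, 1] marking
    pixels that belong to the player (1.0) vs. background (0.0)."""
    mask = person_mask[..., None]  # add a channel axis to broadcast over RGB
    blended = mask * camera_frame + (1.0 - mask) * virtual_frame
    return blended.astype(virtual_frame.dtype)
```

In a real pipeline the mask would come from a machine-learning segmentation model running on the phone, and would be soft-edged rather than binary.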

For now, only select Quest games support mixed reality with the LIV app — Beat Saber, Crisis VRigade, Cubism, FitXR, Hyper Dash, OhShape, Real VR Fishing, Smash Drums, Space Pirate Trainer, Superhot and Synth Riders.

The LIV Beta app for iOS arrives at a time of increasing support for, and interest in, mixed reality capture. Facebook appears to be building its own answer to LIV’s mixed reality tools, but its features are segmented and don’t yet work together cohesively. Live Overlay lets you view a cut-out of the user playing VR (taken from the phone’s camera) on top of the first-person VR view cast to a phone. However, this misses the essential third-person perspective needed for mixed reality.

Meanwhile, its Spectator Camera feature will allow users to cast a Quest to a phone and reposition the camera to third-person positions (offering a different perspective to the VR user’s first-person view), but it lacks any mixed reality features. The building blocks are there, but they all remain separate and don’t come together cohesively as in LIV’s iOS app.

Likewise, Fabio Dela Antonio’s Reality Mixer app is another community project offering mixed reality capture on Quest, which Mark Zuckerberg seemingly used in a recent Facebook video showing him playing Beat Saber captured in mixed reality.

LIV’s iOS app is available in open beta now through TestFlight. You can read more on the LIV blog.

Kopin’s Latest XR Optical Solution is an All-Plastic Pancake Design

Kopin’s microdisplays have long been used in a range of augmented reality (AR) and virtual reality (VR) products such as the SKULLY FENIX AR motorcycle helmet and the Google Glass Enterprise Edition. Like many companies working within the XR field, Kopin is working towards that dream scenario of a small, lightweight headset not too dissimilar to a pair of glasses. Its latest reveal takes another step towards that future with its new all-plastic Pancake optics.

Kopin Pancake Lens

Pancake optics are nothing new; they allow for a thin form factor, ideal for XR applications. However, this optical solution has tended to require a glass lens to avoid image artefacts caused by birefringence when plastic materials are used. Kopin has managed to create an all-plastic, two-element solution which not only reduces weight but also provides “virtually no birefringence”, the company claims.

Currently, the company has applied for three patents on the design, with more to follow. Its first all-plastic Pancake optic is called the P95 as it provides a 95° field of view (FoV), whilst weighing 15g per lens set and measuring 16mm thick.

“This major breakthrough represents a foundational milestone for the emerging VR markets. Bulky, heavy headsets have been a major barrier to faster consumer adoption for many years. No more,” said Dr. John C.C. Fan, CEO and founder of Kopin Corporation in a statement. “After thorough review of the available optics designs, we believe all-plastic Pancake optics are the best for VR applications that require a very large field of view, excellent image performance, and a super compact size. The challenge of finding a suitable plastic material with the required optical properties has been solved.”

Google Glass Enterprise Edition 2 uses Kopin’s nHD Display.

“We believe our Pancake optics are perfect for VR headsets. Our first design, P95, is specifically optimized for use with our 2.6K x 2.6K OLED high-brightness microdisplays. This new technology would be ideal for VR, but we believe this unique optics would also be useful for MR and AR applications,” Fan continues. “With all-plastic Pancake optics and duo-stack OLED microdisplays, the dream of stylish, super lightweight (< 100 g), compact VR/MR/AR high-performance headset products is finally reachable. We plan on offering our new optics as either a stand-alone component or in a module with our OLED microdisplays. We are delighted that we already have the first design-in partner for our P95 optics.”

As patents are still being applied for, it might be a while before Kopin’s P95 Pancake optics start showing up in products. There are plenty of others, including Facebook and Apple, looking to enter the AR market over the next couple of years, whilst those already in the field, like Microsoft, are working towards consumer-friendly devices. As further details are released, VRFocus will let you know.

Facebook’s VR Spectator Tools Could Evolve Into Mixed Reality Casting

Facebook continues to evolve its spectator tools for virtual reality, but creating mixed reality footage remains an arduous process on the standalone Quest, even as the company continues to actively explore the approach.

The company recently added a “Live Overlay” feature in the most recent v29 software update for Quest which merges the first-person view in VR with a camera view of the person wearing the headset. The feature is starting to roll out to a limited number of users in the Oculus app “and it will work with any VR app that supports Casting and Recording,” Facebook explained over email.

In addition, a page in the Oculus Quest developer documentation explains “Spectator Camera is extending Mixed Reality Capture (MRC) to the Oculus app casting.” We contacted Facebook to ask for clarification about this feature because the language in this description isn’t clear. The “Spectator Camera” will allow casting the view from VR to the Oculus app while enabling control of the view from the phone using touch controls, Facebook explained. It was announced at Facebook’s last VR developer event but doesn’t include a view of the VR user wearing the headset.

“There aren’t any apps live with this today, but we are actively working with developers to integrate, so that’s why these developer tools are available now,” Facebook explained.

And while the “Live Overlay” feature does feature an external view of the person in VR, it doesn’t include a third-person angle or the careful alignment that’s key to “Mixed Reality Capture.”

Mixed Reality Capture is driven largely by community projects that make it possible to merge simulated content streaming from a VR app with an actual camera’s view of the scene. The two realities merge into a single seamless view that’s instantly understandable to spectators. Valve essentially popularized the idea in 2016 with its initial promotional video for SteamVR. Since then, community projects like LIV and Fabio Dela Antonio’s Reality Mixer app have enabled developers and broadcasters to more easily share what it feels like to play VR games without post-production work. Some of these projects, however, require an arduous alignment process to sync the views of the two realities, or require extra recording software like OBS.
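
Part of that arduous syncing is latency: the physical camera feed typically lags behind the game render, so capture tools delay the virtual frames by a measured offset before compositing. A minimal sketch of that idea (the class and its API are illustrative, not any specific tool’s code; the frame offset itself would come from calibration):

```python
from collections import deque

class FrameDelay:
    """Delay one video stream by a fixed number of frames so the virtual
    render lines up with the (typically slower) physical camera feed."""
    def __init__(self, delay_frames):
        self.delay = delay_frames
        self.buffer = deque()

    def push(self, frame):
        """Feed in the newest virtual frame; get back the frame aligned
        with the current camera frame, or None while the buffer fills."""
        self.buffer.append(frame)
        if len(self.buffer) > self.delay:
            return self.buffer.popleft()
        return None
```

With a two-frame delay, the first two pushes return `None` while the buffer fills; after that, each push returns the frame from two pushes earlier, ready to composite with the live camera image.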

Dela Antonio maintains an open source repository for his work on the Reality Mixer app and Facebook CEO Mark Zuckerberg this week posted a mixed reality video of himself playing Beat Saber using what appears to be the tool. We reached out to Dela Antonio for comment on the Zuckerberg video and to see what Facebook’s interest in mixed reality broadcasting means for his work. He wrote to us over direct message that he thinks Zuckerberg’s video and the developer page are evidence the company will eventually offer a solution similar to his app.

“I generally see this as a positive thing, it means that these videos will become even more accessible, and that they’ll start making some improvements to the existing Oculus MRC mechanism,” he wrote. “As for my app, its future compatibility with the Quest will depend on the changes they make to the MRC mechanism. I’m still working on the app as normal, I’ve recently added support for Avatars using ARKit’s body tracking and I’m also working on supporting physical green screens…there are some other things I could do to improve the calibration process and to allow the user to move the camera around the scene…but that’d require changes to how the Oculus MRC protocol works, it’d be interesting to know if Oculus has similar plans/ideas…One reason I was able to start working on my app last year was because the Oculus MRC plugin for OBS was open source, and that’s also why I’ve kept my app open source. It’d be nice if they could open source their new mobile Mixed Reality solution…”

Facebook also offers the Portal line of video calling devices, which include a wide-angle camera whose key feature is zooming in on people in the scene. But years after release, there’s still no Oculus Quest integration for Casting or any other functionality. Recent tweets from a Japanese group show how full body tracking might be possible with an external camera, and Oculus founder Palmer Luckey recently commented that Portal should be used for the same purpose.

“Overall, we are exploring a variety of ways to help people better capture, represent and share their VR experiences. We believe these kinds of tools help to make VR more fun, and are also good for the ecosystem since they enable people to share the experiences they’re having in VR,” Facebook shared in an email. “Moving forward, you will see us continue to experiment with different features and capabilities in the Oculus App that provide new ways of casting, recording and sharing the VR experience with various different views of reality, virtual reality, camera perspectives, etc. There are many ways that it’s possible to represent and share VR, and every VR app is different – That’s why we’re testing different approaches to learn from our community about what modes of capture best represent them and the VR apps they use. In the future, features like these could possibly evolve into a mobile ‘mixed reality’ casting/recording experience like you describe. We don’t have specific plans or timing to announce for that, but it’s something we are exploring – in fact, we showed a proof of concept for this at OC6, so it’s something we’ve been continuing to explore over time.”

‘Guardian Intrusion Detection’ Prompt Dormant In Oculus Quest

A dormant ‘Guardian Intrusion Detection’ interface prompt in the Oculus Quest system software was found by YouTuber Basti564.

Basti showed us how to find and launch the prompt. We verified it’s real, and exists in the current Quest firmware. There is no evidence of any functionality yet – it looks like this is just a dormant prompt for a potential future feature.

Based on the wording, it looks like this feature will let you know if any object bigger than a human hand enters your space while you’re in VR. That should mean less risk of accidentally punching your family or housemates in the face, and it may also protect cats and dogs that don’t understand you can’t see them.

Quest headsets have cameras on the front and top sides, so the system warns you it won’t be able to see behind you, and “works best in rooms with ample lighting”.

Basti564 tells us this prompt has existed since Quest software v24, launched in December. Reddit user Reggy04 claims strings referencing ‘Intrusion Alert’ existed as early as v20 back in September, which we were able to independently confirm.

So does this mean Intrusion Detection is launching some time soon? It’s unclear.

Guardian’s Couch Mode, just launched in February, was first spotted in the Quest software back in July, seven months earlier. Conversely, references to a shared-space colocation mode have existed as far back as August 2019, but that feature has yet to even be announced.

Since May 2020, Guardian has warned you about static obstacles in your playspace when you boot up. But that warning only shows up in passthrough mode, so Intrusion Detection could be the natural next step.

VR is sometimes criticized as isolating, but each year headsets get a little more aware of the real world around you, and we don’t expect that trend to stop any time soon.

GTA-Style VR Game ‘GangV Civil Battle Royale’ Coming This Year To PC VR

Raptor Lab, the creators of War Dust and Stand Out: VR Battle Royale, are back again with another in-progress big-battle PC VR game, titled GangV Civil Battle Royale, but this time it takes place in a modern city setting similar to Grand Theft Auto. GangV will also support non-VR players.

GangV, beyond being a VR battle royale game (50 players in total, plus NPCs, on a large 64-square-kilometer map) with tons of vehicles and weapons to choose from across a sprawling open city, is actually pretty unique. The clever concept here is that you’re not battling other players on an open, empty map. Instead, the city is bustling and full of NPCs, just like in Grand Theft Auto.

Your objective is to be the last player standing, but the game itself plays out like a big gang war across a metropolis. The footage on the Steam page shows that some viable tactics include blending into traffic to “hide” while driving, and reaching out the window to shoot at people chasing you while a friend drives the car.

There is a law enforcement system built into the game as well. So if you rob a gas station looking for cash to try and get better gear, the cops might get called on you. But if you rob a police station or military base, prepare for SWAT or the actual military itself to try and hunt you down.

GangV sounds and looks really ambitious and impressive, so I’m eager to see how it pans out. The studio’s two most popular previous games, Stand Out and War Dust, did a good job of nailing the sense of scale of big-battle games despite feeling a bit janky. Adding non-VR support to GangV could help the team reach a wider audience and bring in more revenue to keep improving the game for longer.

Check out the Steam page for more details. GangV Civil Battle Royale doesn’t have a release date, but it’s currently in alpha testing for PC VR with support for Rift, Vive, Index, and Windows MR. Early Access should start soon on Steam, where it will stay for “2-3 years” according to the developers. Within two months after Early Access launch, they’re planning to add deep modding support as well.

You can check the game’s Discord channel for more details.

Microsoft Mesh to Enable Shared Experiences Across XR Platforms


Today sees the start of Microsoft Ignite, the company’s online virtual event, which kicked off with an XR bang. Taking to AltspaceVR’s virtual stage, Microsoft Technical Fellow Alex Kipman announced Microsoft Mesh, a new mixed-reality (MR) platform which aims to make shared holographic experiences effortless across multiple devices.


Showcasing Mesh by hosting the keynote in the social app, Kipman welcomed various speakers including Microsoft CEO Satya Nadella, director James Cameron, Niantic CEO John Hanke and Cirque du Soleil co-founder Guy Laliberté as viewers tuned in from around the world, in both VR and via other devices.

The platform is powered by Microsoft Azure, Microsoft’s cloud-computing service, benefiting from its enterprise-grade security and privacy features. The core focus of Microsoft Mesh is to enable multi-user XR, where companies and consumers can take a device with a Mesh-enabled application and swap ideas, learn or simply socialise. It’ll support 3D models for users to interact with, whilst a full suite of AI-powered tools will enable avatar creation, spatial rendering and more.

“This has been the dream for mixed reality, the idea from the very beginning,” said Kipman in a blog post. “You can actually feel like you’re in the same place with someone sharing content or you can teleport from different mixed reality devices and be present with people even when you’re not physically together.”

Alex Kipman and John Hanke at Microsoft Ignite

“Our part of this is the work of stitching the digital and physical worlds together, connecting the bits and atoms so these experiences can be possible using the Niantic platform,” Hanke said. “But social connections are really at the heart of everything we do, and Microsoft Mesh innovations just enrich that.”

Microsoft Mesh will work on HoloLens 2, Windows Mixed Reality, Oculus headsets, PCs, Macs and smartphones, so it’s not restricted to one particular platform. While an official launch date has yet to be confirmed, a collaborative preview of the Microsoft Mesh app for HoloLens is available, and access can be requested for a new Mesh-enabled version of AltspaceVR. Eventually, Mesh will be integrated into Microsoft Teams and Microsoft Dynamics 365.

As further details are released for Microsoft Mesh, VRFocus will keep you updated.

Microsoft Ignite To Host Immersive ‘Mixed Reality Keynote’ Next Week

Microsoft will host a “mixed reality keynote” at its Ignite digital conference next week.

Alex Kipman, Technical Fellow and HoloLens/Mixed Reality figurehead at the company, confirmed as much on Twitter this week. In a video clip, Kipman promised an immersive keynote “the likes of which you have not experienced before”. Those joining inside a Mixed Reality headset will apparently experience “more immersion than you’ve ever seen before.” Now there’s a promise.

You’ll still be able to watch the conference via livestream if you don’t have a headset, though.

Microsoft labels both HoloLens and the PC VR headsets it produces with partners like HP and Dell as ‘Mixed Reality’. It’s not clear exactly which category Kipman is referring to here but he likely means that PC VR fans with an HP Reverb G2 or older device can watch the stream. It doesn’t appear that Oculus Rift, Quest, HTC Vive and Valve Index owners will be able to watch in VR based on this branding, but we’ve asked Microsoft.

You can register to attend Ignite here, though there aren’t details on how to attend in VR just yet. Kipman is confirmed to be speaking at the Day 1 Keynote, which kicks off at 8:30am PT on March 2nd. There are no details yet on exactly what he’ll be talking about but, given his role within the HoloLens and Mixed Reality teams, we’re hoping for some new reveals.

That said, Ignite is a largely business-focused conference, so don’t hold your breath for any big consumer-facing news. Either way, we’ll bring you all the latest on UploadVR.

Through-The-Lens Clip Shows The Supremacy Of Passthrough AR

A short through-the-lens clip of the upcoming Lynx-R1 mixed reality headset shows the advantages of passthrough AR over transparent optics.

Lynx-R1 was announced in February 2020, slated to be priced at $1499. It’s targeted at professionals. Since that announcement the design has changed, becoming significantly slimmer. It uses the same Snapdragon XR2 processor seen in Oculus Quest 2, but places it in the rear alongside a much larger battery.

There are currently two fundamental types of AR headsets: see-through and passthrough. Most AR headsets, such as Microsoft’s HoloLens 2, are see-through. You view the real world directly, with virtual objects superimposed onto the glass.

The technology behind see-through AR optics is still in the very early stages. The field of view is narrow and virtual objects cannot be fully opaque.

Passthrough headsets, like Lynx-R1, use the same kind of display system as VR headsets, except instead of rendering an entirely virtual world they show the real world via cameras. While the real world won’t necessarily look as good, this allows for AR across a much wider field of view, as well as full virtual object opacity and lower cost (HoloLens 2 is priced at $3500).

That’s visible in the short clip posted this week, which was filmed by placing two cameras in front of the lenses. Remember, what you’re seeing is not a transparent optic. It’s a view from the cameras on the front, synthesized into a perspective-correct view by the onboard Snapdragon processor and shown on the LCD panels behind the lenses.
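
That view synthesis step can be pictured as re-warping each camera frame into the eye’s perspective. Real headsets use full calibration and depth estimation; as a deliberately simplified stand-in for the idea (not Lynx’s actual pipeline), here is a planar homography warp in plain NumPy:

```python
import numpy as np

def warp_homography(image, H):
    """Warp a 2D (grayscale) image by homography H, which maps source
    pixel coordinates to output coordinates. Inverse-maps every output
    pixel, uses nearest-neighbour sampling; out-of-bounds pixels are 0."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out_pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])  # homogeneous coords
    src = np.linalg.inv(H) @ out_pts        # where each output pixel samples from
    sx = np.rint(src[0] / src[2]).astype(int)
    sy = np.rint(src[1] / src[2]).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(image)
    out[ys.ravel()[valid], xs.ravel()[valid]] = image[sy[valid], sx[valid]]
    return out
```

A homography is only exact for a planar scene; the headset’s job is harder precisely because the real world has depth, which is why the offset between the cameras and your eyes must be corrected per frame.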

Long before we get the sci-fi AR glasses expected from Apple and Facebook, VR headsets will deliver richer augmented reality via high quality cameras. As the field of view of VR increases, it will be harder yet for transparent optics to catch up.

Realworld Will Let You Explore The World In VR With Hand-Tracking On Quest, Also Coming To PC VR

Realworld is a newly announced in-development VR app from the creator of Cubic VR that will let users explore the actual world from inside a VR headset while connecting with friends.

Based on that description and the video embedded above, if you think that sounds a lot like Google Earth, you’re absolutely correct. However, like Microsoft Flight Simulator, Realworld uses Bing Maps, not Google Maps.

Realworld is coming natively to Oculus Quest with additional plans for support on PC VR, mobile AR, and mixed reality devices. The eventual goal is to make it so that if you visit a location in real life, you can see markups and notes that people left via Realworld, in addition to being able to use AR to look up and see VR users from around the world.

We haven’t gotten a chance to try Realworld, but it looks a bit like Google Earth was condensed down onto a tabletop to make rendering that sort of information manageable. Using a “pinch” type gesture with both hands you can zoom the view in and out very quickly.
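
That two-hand zoom gesture is commonly implemented by scaling the scene by the ratio of the distance between the hands now versus on the previous frame. A minimal sketch under that assumption (the function and point format are hypothetical, not Realworld’s API):

```python
def zoom_factor(prev_left, prev_right, cur_left, cur_right):
    """Two-hand pinch zoom: return the scale to apply this frame, i.e.
    the ratio of the current distance between the two pinch points to
    the previous distance. Points are (x, y, z) hand-tracking positions."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    prev = dist(prev_left, prev_right)
    cur = dist(cur_left, cur_right)
    return cur / prev if prev > 0 else 1.0  # avoid dividing by zero
```

Pulling the hands apart to twice their previous separation yields a factor of 2.0 (zoom in); bringing them together halves the scale.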

In the trailer we can even see the ability to “grab” one another, since this is multiplayer, and either shrink or grow each user to get a different perspective on the environment. Since the table is so small, you can start from a space-style continental view and then zoom all the way down to street level very smoothly. But the limited scope of the “table” format seems to rid the experience of the grand scale of things found in something like Google Earth.

Luckily, you can still instantly jump “inside” the street view perspective like a 360 photo, as you can in Wander. The table becomes something like a 3D map with to-scale models of locations, and you can teleport down to see it all around you if you’d like. Realworld will also let you sketch onto the world itself with 3D pens, and drag and drop your own 3D models directly into the world, which has some amazing possibilities. There’s much more besides, such as built-in streaming support, sticky notes, animation features, and lots of other tools the trailer only hints at for now.


You can go sign up on the official Realworld website to stay up-to-date on details and future information.

Video Series Shares the Secrets to Great Mixed Reality Capture for VR

There’s no better way to show off a VR game than with a mixed reality capture which shows a real player inside a virtual world; unfortunately, setting it all up is no easy task. Luckily, YouTube channel TogueVR is sharing the secrets to great mixed reality streaming and capture in a video series called Mixed Reality Masterclass.

Whether you’re a content creator looking to stream VR content or a game studio making a high quality trailer for your VR game, mixed reality is the ideal way to give players a sense of what it’s really like to be inside virtual reality. But mixing virtual reality and real life footage together into one seamless shot isn’t exactly straightforward. Between hardware, software, framing, and lighting, there’s a lot to learn to make it work, let alone make it work well. Luckily, help has arrived.

Image courtesy TogueVR

The TogueVR channel is home to some of the best mixed reality content you’ll find on YouTube, and now host David is sharing the secrets of his mixed reality productions in a free video series called Mixed Reality Masterclass.

A masterclass indeed… David’s delightfully straightforward presentation goes over the complete setup process, covering camera selection and framing, controller calibration, latency correction, chroma key best practices, and plenty more.
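
Of the topics the series covers, chroma keying is the most mechanical: classify each pixel as “green screen” or “subject”. A deliberately naive sketch of the idea follows (thresholds are illustrative assumptions; real keyers work in a chroma space and soften the mask edges, as the videos discuss):

```python
import numpy as np

def chroma_key_mask(frame, green_min=100, rb_max=80):
    """Naive green-screen mask for an RGB frame: a pixel is 'background'
    when its green channel is strong and red/blue are weak. Returns a
    boolean mask that is True where the subject (foreground) is."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    background = (g >= green_min) & (r <= rb_max) & (b <= rb_max)
    return ~background
```

This hard-threshold approach is exactly why lighting matters so much in the tutorials: an unevenly lit screen produces green values that straddle any fixed cutoff.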

While the first video in the series (above) will get you looking good with the basics, the second video introduces moving shots with a tracked camera, along with all the changes you’ll need to make to your filming space to accommodate such shots.

TogueVR is promising more episodes of Mixed Reality Masterclass to come, so be sure to stay tuned!

The post Video Series Shares the Secrets to Great Mixed Reality Capture for VR appeared first on Road to VR.