Meta to Host Quest Gaming Showcase Just Days Ahead of Rumored Apple Headset Announcement

Meta announced its third annual Quest Gaming Showcase is arriving next month, coming only a few days before Apple’s rumored XR headset announcement at Worldwide Developers Conference (WWDC).

Meta is livestreaming the Quest Gaming Showcase on June 1st, a bit unusual for the company, as it traditionally holds the annual event in late April.

Calling it their “biggest celebration of the depth and breadth of content across the Meta Quest Platform yet,” Meta is slated to share over 40 minutes of content, including a brand-new pre-show covering game updates and debut trailers, starting 15 minutes before the show begins.

Meta says to expect new game announcements, gameplay first-looks, updates to existing games, and more. There’s also set to be a post-show developer roundtable, which will feature conversation around upcoming games.

There could be at least one clue to what’s in store, as we get a brief glimpse at a horned helmet in the showcase’s promo video, which seems very much like Loki’s helmet from Rift exclusive Asgard’s Wrath (2019). Maybe Meta’s Sanzaru Games has slimmed down the Norse-inspired RPG?

Meanwhile, previous reports maintain Apple is finally set to unveil its long rumored mixed reality headset during the company’s WWDC keynote, taking place on Monday, June 5th.

Provided Apple indeed plans to announce its headset at WWDC, Meta could be looking to generate so-called 'strategic noise' to better manage market reactions and potentially offset any negative sentiment prior to Apple's expected announcement, which is undoubtedly slated to be a pivotal moment for the entire XR industry.

Meta recently released its Q1 2023 earnings report, showing a consistent investment of around $4 billion per quarter into its XR division, Reality Labs. With Apple rumored to be unveiling its own XR headset along with a host of apps, reportedly set to include everything from fitness to VR/AR gaming, Meta may want to showcase where some of that investment is going.

Who knows? We may even hear more about Meta's promised Quest 3 at the gaming showcase, a headset the company has confirmed will "fire up enthusiasts" when it's released at some point this year, notably targeting a higher price point than its Quest 2 headset.

To find out, tune into the Quest Gaming Showcase on June 1st at 10AM PT, livestreamed across the company's various channels, including Twitch, Facebook, YouTube, and Meta Horizon Worlds.

Apple’s RealityOS Trademarked For Just After WWDC – Is A Reveal Imminent?

Apple’s RealityOS has appeared in a trademark filing with a deadline two days after WWDC, Apple’s yearly developer conference.

The filing was spotted by Vox Media’s Parker Ortolani. The listed applicant is ‘Realityo Systems LLC’, a company with no other public presence. Apple has in the past used the shell company ‘Yosemite Research LLC’ to file macOS update names, 9to5Mac reports – and Realityo Systems LLC is registered at the same address.

The existence of realityOS, or rOS, was first reported by Bloomberg all the way back in 2017. In 2021 Bloomberg, The Information, and supply chain analyst Ming-Chi Kuo released reports claiming Apple is preparing to release a premium headset for VR and AR with high resolution color passthrough. Recent notes from Kuo claim this headset will weigh significantly less than Meta's Quest 2, feature dual 4K OLED microdisplays, and use a new chip with "similar computing power as the M1 for Mac".

In January iOS Developer Rens Verhoeven spotted a new platform “com.apple.platform.realityos” in the App Store app upload logs. Apple’s existing operating systems include iOS (com.apple.platform.iphoneos), iPadOS, watchOS (com.apple.platform.watchos), macOS, and tvOS.

In February, "award-winning git repository surgeon" Nicolás Álvarez spotted Apple committing code to its open source GitHub repository referencing 'TARGET_FEATURE_REALITYOS' and 'realityOS_simulator' – the latter likely a feature to allow developers without the headset to test building AR or VR applications. Álvarez said Apple quickly force-pushed the repo to try to hide the change, suggesting making this public was a mistake.

The week before last, Bloomberg's Mark Gurman reported Apple recently ramped up development of realityOS and previewed the headset to the board of directors. Gurman is sticking by his earlier reporting that the product will be announced this year or early next for a release in 2023.


WWDC 2022 is scheduled for June 6, one week from now. Given the filing deadline date, could Apple be planning to finally unveil the headset, or at least its operating system?

Valve Ends SteamVR Support For macOS

Valve just ended SteamVR’s support for macOS. ‘Legacy builds’ will continue to be available.

SteamVR is Valve’s PC-based virtual reality platform, supporting Windows and Linux. Mac owners can continue to use the latest versions of SteamVR by installing Windows as a secondary OS.

Facebook’s competing Oculus Rift platform hasn’t supported Macs since early development kits, which preceded the Rift. In 2016, founder Palmer Luckey claimed this was due to Apple’s lack of priority on GPU power.

Apple Went All-In

Support for macOS was announced by Apple itself during its annual developer conference in 2017. Craig Federighi, who reports directly to Tim Cook, revealed the support with excitement: VR would be a showcase of Apple's new commitment to high performance graphics.

At the time, Apple released updates to Metal, its equivalent of Vulkan/DX12, to make it suitable for high performance low latency VR rendering, including via external GPUs on MacBooks.


The company even worked with Unity and Unreal Engine to make this available to all developers.

Later in the conference, employees from ILM gave a live on-stage mixed reality demo of VR on macOS: a Star Wars scene powered by Unreal Engine.


In September 2018, Apple added support for HTC’s Vive Pro, including giving developers access to the onboard stereo cameras, presumably for AR development.

What Happened?

Less than two years later, Valve has announced it is ending support for macOS. Apple doesn't seem to have made a statement on this yet, but we've reached out to employees who worked on Metal's VR support.

According to Valve’s Hardware Survey, just under 4% of Steam users are using macOS, with over 95% using Microsoft’s Windows. Given that just over 1% own a VR headset at all, it’s easy to see just how niche VR on macOS likely was.

Valve states it will now “focus on Windows and Linux”. Despite being even less popular than macOS, Linux is preferred by developers, and required for certain advanced enterprise and government use cases.

Apple is reportedly working on a standalone mixed reality headset slated for 2022. The company may decide to support it in macOS in a similar way to Facebook's Oculus Link. Alternatively, it may have decided that the tethered market just isn't big enough to focus on.

We’ll keep a close eye on Apple this year for any further news about its support for virtual reality.

The post Valve Ends SteamVR Support For macOS appeared first on UploadVR.

Apple’s new ARKit 3 to Feature People Occlusion and Triple Face Tracking

Apple held its annual WWDC developers conference yesterday and revealed its latest plans for augmented reality (AR), more specifically ARKit. Originally released in 2017 alongside iOS 11, for 2019 Apple will be launching ARKit 3 with some major additions such as People Occlusion and Motion Capture.


If you’ve ever used basic AR apps and videogames like Pokemon GO, for example, you’ll have noticed how digital objects and characters are simply placed over the real world rather than fully integrated with it: you can walk right through a virtual object rather than behind or in front of it. With People Occlusion, AR can become even more realistic, enabling green-screen-like effects almost anywhere.

Thanks to computer vision, Apple was able to demonstrate its Motion Capture technology for ARKit 3. Using just one camera, users can capture someone’s motion in real-time, which can then be turned into input for an AR experience. While the demo did look a little rough, with the digital character drifting slightly without moving its feet, the software was still able to accurately track joints, like the bending of elbows or knees.

There were actually quite a few additions in ARKit 3 which Apple didn’t go into detail about during the keynote. Some of these were Multiple Face Tracking, which can recognise three faces at once; Collaborative Sessions, which are ideal for developers or AR multiplayer experiences; the simultaneous use of front and back cameras; and improved 3D object detection.

Working in unison with ARKit 3 to make things easier for developers are RealityKit and Reality Composer. Designed to help developers who don’t have much experience with the 3D modelling required for AR, RealityKit is said to offer features such as photo-realistic rendering, camera effects, animations and physics.

“RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality,” Apple explains.

Reality Composer, meanwhile, offers a library of assets, either stationary or animated, for developers to quickly and simply drag and drop into RealityKit. Reality Composer can also be used with Xcode, so developers can build, test, tune, and simulate AR experiences entirely on iPhone or iPad.

ARKit 3 is expected later this year; when further details arrive, VRFocus will let you know.

Apple Announces ARKit 3 with Body Tracking & Human Occlusion

At the company’s annual WWDC developer conference today, Apple revealed ARKit 3, its latest set of developer tools for creating AR applications on iOS. ARKit 3 now offers real-time body tracking of people in the scene as well as occlusion, allowing AR objects to be convincingly placed in front of and behind those people. Apple also introduced Reality Composer and RealityKit to make it easier for developers to build augmented reality apps.

Today during the opening keynote of WWDC in San Jose, Apple revealed ARKit 3. First introduced in 2017, ARKit is a suite of tools for building AR applications on iOS.

From the beginning, ARKit has offered computer vision tracking which allows modern iOS devices to track their location in space, as well as detect flat planes like the ground or a flat table which could be used to place virtual objects into the scene. With ARKit 3, the system now supports motion capture and occlusion of people.
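For a sense of how little code that baseline involves, here is a minimal sketch against Apple's public ARKit API (generic sample code, not anything shown at the keynote): a world-tracking session with horizontal plane detection enabled.

```swift
import ARKit

// Minimal ARKit session: world tracking plus horizontal plane
// detection, the baseline capability ARKit has offered since
// its first release.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal]

let session = ARSession()
session.run(configuration)
// Detected planes are delivered as ARPlaneAnchor objects through
// the session delegate's session(_:didAdd:) callback, where an app
// can use them to place virtual objects on real surfaces.
```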

Human Occlusion & Body Tracking

Using computer vision, ARKit 3 understands the position of people in the scene. Knowing where the person is allows the system to correctly composite virtual objects with regard to real people in the scene, rendering those objects in front of or behind the person depending upon which is closer to the camera. In prior versions of ARKit, virtual objects would always show ‘on top’ of anyone in the scene, no matter how close they were to the camera. This would break the illusion of augmented reality by showing conflicting depth cues.
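In API terms, people occlusion is an opt-in "frame semantics" flag on the session configuration. The sketch below is based on Apple's public ARKit 3 API; the availability check matters because the feature requires newer A-series chips.

```swift
import ARKit

// Sketch: opting in to ARKit 3's people occlusion.
// .personSegmentationWithDepth asks ARKit to estimate per-pixel
// depth for people in the camera feed, so virtual content can be
// rendered behind or in front of them correctly.
let configuration = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

ARSession().run(configuration)
```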

Similar tech is used for real-time body tracking in ARKit 3. By knowing where people are in the scene and how their body is moving, ARKit 3 tracks a virtual version of that person’s body which can in turn be used as input for the AR app. Body tracking could be used to translate a person’s movements into the animation of an avatar, or for interacting with objects in the scene, etc.
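Developers consume that body data as anchors with a skeleton attached. The following is a hedged sketch of the public ARKit 3 API; the delegate class and what you do with the joint transform are illustrative.

```swift
import ARKit

// Sketch: ARKit 3 body tracking. ARBodyTrackingConfiguration
// produces ARBodyAnchor updates whose skeleton exposes per-joint
// transforms an app can map onto an avatar rig.
class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Joint transform relative to the body anchor's origin.
            if let rightHand = bodyAnchor.skeleton.modelTransform(for: .rightHand) {
                // Drive an avatar's hand, treat it as input, etc.
                _ = rightHand
            }
        }
    }
}
```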

From the footage Apple showed of their body tracking tech, it looks pretty coarse at this stage. Even with minor camera movement the avatar’s feet don’t remain particularly still while the rest of the body is moving, and small leg motions aren’t well tracked. When waving, the avatar can be seen to tip forward in response to the motion even though the user doesn’t. In the demo footage, the user keeps their arms completely out to the sides and never moves them across their torso (which would present a more challenging motion capture scenario).

For now this could surely be useful for something simple like an app which lets kids puppeteer characters and record a story with AR avatars. But hopefully we’ll see it improve over time and become more accurate to enable more uses. It’s likely that this was a simple ‘hello world’ sort of demo using raw tracking information; a more complex avatar rig could smartly incorporate both motion input and physics to create a more realistic, procedurally generated animation.

SEE ALSO
Report: Apple Developing AR/VR Headset with 8K Resolution Per-eye Slated for 2020

Both human occlusion and body tracking will be important for the future of AR, especially with head-worn devices which will be ‘always on’ and need to constantly deal with occlusions to remain immersive throughout the day. This is an active area of R&D for many companies, and Apple is very likely deploying these features now to continue honing them before the expected debut of their upcoming AR headset.

Apple didn’t go into detail but listed a handful of other improvements in ARKit 3:

  • Simultaneous front and back camera
  • Motion capture
  • Faster reference image loading
  • Auto-detect image size
  • Visual coherence
  • More robust 3D object detection
  • People occlusion
  • Video recording in AR Quick Look
  • Apple Pay in AR Quick Look
  • Multiple-face tracking
  • Collaborative session
  • Audio support in AR Quick Look
  • Detect up to 100 images
  • HDR environment textures
  • Multiple-model support in AR Quick Look
  • AR Coaching UI

RealityKit

Image courtesy Apple

With ARKit 3, Apple also introduced RealityKit which is designed to make it easier for developers to build augmented reality apps on iOS.

Building AR apps requires a strong understanding of 3D app development, tools, and workflows—something that a big portion of iOS developers (who are usually building ‘flat’ apps) aren’t likely to have much experience with. This makes it less likely for developers to jump into something new like AR, and Apple is clearly trying to help smooth that transition.

From Apple’s description, RealityKit almost sounds like a miniature game engine, including “photo-realistic rendering, camera effects, animations, physics and more.” Rather than asking iOS developers to learn game engine tools like Unity or Unreal Engine, it seems that RealityKit will be an option that Apple hopes will be easier and more familiar to its developers.

SEE ALSO
Apple CEO on AR Headsets: 'We don't want to be first, we want to be the best'

With RealityKit, Apple is also promising top notch rendering. While we doubt it’ll qualify as “photo-realistic,” the company is tuning rendering to allow virtual objects to blend as convincingly as possible into the real world through the camera of an iOS device, layering effects onto virtual objects as if they were really captured through the camera.

“RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality,” Apple writes.

RealityKit, which uses a Swift API, also supports the creation of shared AR experiences on iOS by offering a network solution out of the box.
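That Swift API is deliberately compact. The sketch below shows the shape of it, based on RealityKit's public API: a primitive box placed on a detected horizontal plane. The box and material are illustrative; a real app would more likely load a USDZ asset via something like ModelEntity.loadModel(named:).

```swift
import ARKit
import RealityKit

// Sketch: placing a simple object with RealityKit. ARView manages
// the underlying ARSession; entities attached to an AnchorEntity
// appear once ARKit finds a matching real-world surface.
let arView = ARView(frame: .zero)

let box = ModelEntity(mesh: .generateBox(size: 0.1),
                      materials: [SimpleMaterial(color: .red, isMetallic: false)])

let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```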

Reality Composer

Image courtesy Apple

Just like RealityKit, Reality Composer aims to make things easier for developers not experienced with game engines, workflows, and assets. Apple says that Reality Composer offers a library of existing 3D models and animations with drag and drop ease, allowing the creation of simple AR experiences that can integrate into apps using Xcode or be exported to AR Quick Look (which allows built-in iOS apps like Safari, Messages, Mail, and more, to quickly visualize 3D objects at scale using augmented reality).

In addition to the built in object library, Reality Composer also allows importing 3D files in the USDZ format, and offers a spatial audio solution.

The post Apple Announces ARKit 3 with Body Tracking & Human Occlusion appeared first on Road to VR.

VR vs. AR vs. Creative Play

It was, I have decided, not the best week I’ve ever had. It started off on the Monday, where I woke up drenched in sweat and not entirely sure where I was or who I was. Which, I can assure you, is pretty frightening. Going in and out of consciousness to the point where you can no longer trust whether ‘here’ is reality or not – but that sounds like a topic for another occasion.

Tuesday I cannot recount that well, only that I spent most of it wracked with pain. Wednesdays I traditionally work the later shift, as both Peter and Rebecca have social things they do in the evening, and I was able to drag myself through that. But I had to wave the white flag midway through Thursday, consumed by a splitting headache, nausea, a stomach apparently practicing a gymnastics floor routine, and the fact that some bastard had, in the middle of the night, tilted the entire flat by twenty degrees or so, making standing up an issue. So, I signed off, keeled over and slept… somewhere, I’m not even sure where, for about 7 hours in total, waking up long enough to find I needed to go to bed again. All-in-all, not the best of preludes to Friday, where I was hosting friends before Saturday’s trip to the 2018 UK Games Expo.

I really was still not in the best of shape to go and deal with a vast people-filled space. My friends would have entirely understood if I chose not to go, but I wanted to for a couple of reasons. First: I’d spent money on the damn thing. Second however was a desire to see how augmented reality (AR) was being used within the realm of tabletop/board games. It was pretty amazing to see just how many different games there were out there. Long gone are the days where playing a board game basically just consisted of your Dad going to the cupboard and giving you the choice of Cluedo, Scrabble and Risk because the latter would be a great way to beat you.

We’ve seen board games turn to immersive technology, both virtual reality (VR) and augmented reality (AR) over the last few months to bring a new level of imagination to their play. There’s been the launch of Catan VR, something we’d been following on VRFocus for some time. There’s been Kickstarter projects such as Crime Shoots which have been in the news, whilst as recently as last week Ahoy Games revealed that Rummy, of all things, was getting the AR treatment. So at this big gathering of board game companies and aficionados it only made sense to have a look around and see just what was going on – and you know what I found?

Nothing.

Nope, not one solitary instance in all of that of immersive technology being implemented in some way. No virtual equivalents, no cards with fancy AR effects. Zip. Nada. Was it because it was a traditional audience? Was it because the imagination is still a far more powerful tool than immersive technology ever will be, and board/card games are inherently a little bit about immersion in that sense? It was disappointing, but I kind of understood why I didn’t find anything.

Contrast this with yesterday’s Day One at Apple’s WWDC event where, after Google’s stony silence on immersive technology at Google I/O, Apple seemed ready to grab you by the lapels and shake you while screaming in your face about how awesome it was. Actually, Apple were pretty frisky all over; there were shots at Facebook, and some serious shade flung at Google and Android. “It’s hard to say they really have a software update model,” scoffed Craig Federighi, a man who looks like a human version of Sam The Eagle from The Muppets – something I now entirely can’t unsee when looking at him.

But while Google’s and Apple’s takes on how important immersive technology is to them are, again, a topic for another day, I was struck by the LEGO presentation, which used ARKit 2 to bring a physical model to life.

Throughout the presentation LEGO were really keen to emphasise (i.e. repeat it to death) one phrase, that the use of AR was great because “It really opens up those creative play possibilities.”

No, it really doesn’t. Giving scripted stories and scenarios that happen with your toys is not opening up options; it narrows them down. The imagination isn’t running free if you’re guiding it in a certain direction. In that instance I almost think the board game people are right not to take it up immediately. Think of it this way: whilst I am not a player, I am aware one of the great joys for players of Dungeons and Dragons is that you are in command and can (essentially) try anything in a game. Yet that game still has a script of sorts. It felt very different in the LEGO example. It’s not like LEGO hasn’t had linear scripted experiences before, but this was different, as it was imposing them on the physical toy through AR. AR wasn’t used to ‘open up’ those “creative play possibilities” so much as hand out some approved ones. Yes, it gave a new viewpoint – and the way it used AR in this instance was impressive. But it also felt limiting: you can imagine whatever you like, before you do however, might we suggest imagining this?

LEGO’s ARKit 2 items are coming later this year and they look fun. I just hope the company remembers just why LEGO is fun in the first place. When you’re suggesting to kids how to imagine, repeatedly saying “it really opens up those creative play possibilities” doesn’t come across so well. In fact, it sounds pretty disconcerting. I seem to remember a movie about a children’s toy that had its bad guy tell everyone to “take extra care to follow the instructions” and make sure to have fun on Tuesdays at the designated time. The last thing LEGO wants is to come across a bit like that(!)

That movie was pretty cool though. Had a moral about freedom of play and freedom of imagination that seems pretty important under the circumstances. Seems like something that should be kept in mind.

Man, I wish I could remember the name of the toy the movie was about…


A Sequel With Sharing, ARKit 2 Is Officially Revealed By Apple At WWDC 18

Over in San Jose, California, Apple has once again been showing the world its progress at the 2018 edition of its annual Worldwide Developers Conference – aka WWDC. As usual it brought with it the next version of the company’s mobile operating system, this time revealing iOS 12. As well as improvements in areas such as performance, it also debuted a number of new features and upgrades to existing areas of interest within the operating system.

When it came to actually going through these at WWDC, it was immersive technology – in this instance augmented reality (AR) – that was front and centre. Senior Vice President of Software Engineering Craig Federighi kicked things off with AR and a number of announcements and reveals, including an ARKit-powered measuring app called, appropriately, Measure, and a new file format for AR created with some help from Disney’s Pixar that will see AR support coming to familiar tools such as Adobe Photoshop.

“ARKit opens up the world of AR to hundreds of millions of users. Making it the world’s largest AR platform. By far.” Federighi told the assembled crowd at the McEnery Convention Center. “We’re on a relentless pace of advancement with AR, and that continues today with ARKit 2. ARKit 2 delivers advances with improved face tracking, realistic rendering, and support for 3D object detection and persistence – which enables launching into AR experiences associated with a particular object or physical space. Like starting a game based around a physical toy, or having a physical classroom as the foundation for a lesson that was authored in AR.”

Federighi then revealed how ARKit 2 (which also has an updated logo) brings with it shared experiences, allowing multiple users to have their own personal viewpoint of a single AR experience within a common environment in real-time, be that a videogame or another kind of experience. This was showcased in two forms: first through captured video of an Apple-created app to be released today, a versus videogame featuring an AR slingshot, before Apple invited LEGO’s Director of Innovation Martin Sanders onto the stage to showcase how the popular toy firm had used ARKit 2 to create a virtual playset that can come to life at the touch of a button.
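Under the hood, ARKit 2's shared experiences rest on serializing one device's map of the space and handing it to a peer. The sketch below uses Apple's public ARWorldMap API to illustrate the mechanism; the transport (for example MultipeerConnectivity) is omitted, and the function names are our own.

```swift
import ARKit

// Sketch: one device captures and serializes its ARWorldMap…
func captureWorldMap(from session: ARSession,
                     send: @escaping (Data) -> Void) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map, requiringSecureCoding: true) else { return }
        send(data) // ship to the peer over your transport of choice
    }
}

// …and the peer relocalizes into the same coordinate space.
func join(with data: Data, session: ARSession) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARWorldMap.self, from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

Once both sessions share a world map, anchors placed by one user appear at the same physical location for the other, which is what makes a two-player slingshot game possible.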

“Creating and playing with physical LEGO sets brings great joy to millions of children and LEGO fans all over the world. And now, with ARKit 2, we get to expand those creative possibilities like never before and take things beyond the physical,” said Sanders. “Our LEGO sets are really the start for all of those children’s imaginations, and when we get the chance to embed ARKit 2 it really takes it to the next level.”

In the following demo, Sanders helped show how ARKit 2 could be used to scan a physical LEGO model, recognise what it was, and generate an expanded AR world around it with additional roads, buildings, and other locations – even combining models the system knew each player owned, allowing them to merge their sets in a mix of AR representations and physical toys. LEGO’s appearance at WWDC marks a period of frequent association with immersive technology: just last week, UK retail catalogue store Argos updated its AR app to view LEGO models in AR, and earlier last month Merlin Entertainment revealed a deal with VR technology company Immotion for a new VR experience at LEGOLAND Discovery Center in Boston.

LEGO will be bringing new experiences that utilise ARKit 2 to the App Store later this year. VRFocus will bring you news on those as we get them.


Apple Unveils New App Measure At WWDC 18

How long is a piece of string? Okay that’s somewhat obvious. How about… how deep is a suitcase? Well if you want to be precise about it, a new augmented reality (AR) app coming your way from Apple will be able to tell you.


As part of Apple’s presentation on the changes and developments implemented in the forthcoming iOS 12 – which also included USDZ, a brand-new file format for AR that Apple created with the help of Disney’s famous 3D animation studio Pixar – Apple’s Craig Federighi introduced on stage a new app called Measure, which does exactly as the name suggests.

Using Measure, an Apple iPhone or iPad user is able to drag a line between two specified points, as recognised within the three-dimensional space to give an approximate measurement. The app is, naturally enough, powered by Apple’s own AR developer platform ARKit and the lines and measurements stay ‘on’ the measured objects, allowing additional measurements to be taken. The first iteration of the app can also identify rectangles and show their dimensions.
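In principle, a Measure-style distance readout is a short ARKit routine: hit-test two screen points against detected feature points and take the distance between the resulting world positions. The sketch below is a generic illustration of that approach using Apple's public API, not Apple's actual implementation.

```swift
import ARKit

// Sketch: approximate real-world distance between two screen points.
func distance(in sceneView: ARSCNView,
              from pointA: CGPoint, to pointB: CGPoint) -> Float? {
    // Hit-test each tap location against ARKit's detected feature points.
    guard let hitA = sceneView.hitTest(pointA, types: .featurePoint).first,
          let hitB = sceneView.hitTest(pointB, types: .featurePoint).first else {
        return nil
    }
    // The world position sits in the last column of each hit's transform.
    let a = hitA.worldTransform.columns.3
    let b = hitB.worldTransform.columns.3
    return simd_distance(simd_float3(a.x, a.y, a.z),
                         simd_float3(b.x, b.y, b.z)) // metres
}
```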


It is, of course, not the first app to use the power of AR to gauge the length of objects or the dimensions of the space between them. Regular VRFocus readers might recall another ARKit-powered tool called Survey, developed by start-up company DigitalBridge, which came out in October last year. And as recently as last month, a new feature was added to iStaging’s VR Maker app which utilised ARKit and your mobile device’s camera to scan the interior of a property and create a set of floor plans, enabling people not only to get a better idea of the size of a property but – for, say, an events venue – to estimate its potential capacity. This again operates in essentially the same way.

The ARKit-powered Floor Plans addition to iStaging’s app, currently in beta at the time of writing.

Whatever the case, it seems would-be AR users will soon have an ‘official’ option from Apple when it comes to measuring things. VRFocus will bring you more news on the various AR-related developments at this year’s WWDC shortly.