Apple Reveals Improvements Coming in ARKit 6 for Developers

Earlier this month during Apple’s annual developer conference, WWDC 2022, the company gave developers the first look at improvements coming to Apple’s ARKit 6 toolkit for building AR apps on iOS devices.

Though Apple has yet to reveal (or even confirm) the existence of an AR headset, the clearest indication the company is absolutely serious about AR is ARKit, the developer toolkit for building AR apps on iOS devices which Apple has been advancing since 2017.

At WWDC 2022 Apple revealed the latest version, ARKit 6, which is bringing improvements to core capabilities so developers can build better AR apps for iPhones and iPads (and eventually headsets… probably).

Image courtesy Apple

During the ‘Discover ARKit 6’ developer session at WWDC 2022, Apple ARKit engineer Christian Lipski gave an overview of what’s next.

Better Motion Capture

ARKit includes a MotionCapture function which tracks people in the video frame, giving developers a ‘skeleton’ that estimates the position of the person’s head and limbs. This allows developers to create apps that overlay augmented content onto the person or move it relative to the person (it can also be used for occlusion, placing augmented content behind someone to more realistically embed it in the scene).
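As a rough illustration, here is a minimal sketch of how an app might read the estimated 3D skeleton using ARKit's body-tracking API (`ARBodyTrackingConfiguration` and `ARBodyAnchor`); the delegate class name is our own, and a real app would attach content rather than print.

```swift
import ARKit

// Minimal sketch: run body tracking and read joint transforms from the
// 3D skeleton ARKit estimates for a tracked person.
class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            let skeleton = bodyAnchor.skeleton
            // Transform of the head joint relative to the body anchor's root.
            if let headTransform = skeleton.modelTransform(for: .head) {
                // A real app would position augmented content (e.g. a hat) here.
                print("head position:", headTransform.columns.3)
            }
        }
    }
}

let session = ARSession()
let delegate = BodyTrackingDelegate()
session.delegate = delegate
if ARBodyTrackingConfiguration.isSupported {
    session.run(ARBodyTrackingConfiguration())
}
```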

In ARKit 6, Lipski says the function is getting a “whole suite of updates,” including improved tracking of 2D skeletons which now estimate the location of the subject’s left and right ears (which will surely be useful for face-filters, trying on glasses with AR, and similar functions involving the head).

Image courtesy Apple

As for 3D skeletons, which give a pose estimation with depth, Apple is promising better tracking with less jitter, more temporal consistency, and more robustness when the subject is occluded by the edge of the camera frame or by other objects (though some of these enhancements are only available on iPhone 12 and up).

Camera Access Improvements

Image courtesy Apple

ARKit 6 gives developers much more control over the device’s camera while it’s being used with an AR app for tracking.

Developers can now access incoming frames in real time at up to 4K resolution and 30FPS on the iPhone 11 and up and the latest iPad Pro (M1). The prior mode, which uses a lower resolution but a higher framerate (60FPS), is still available to developers. Lipski says developers should carefully consider which mode to use. The 4K mode might be better for apps focused on previewing or recording video (like a virtual production app), but the lower-resolution 60FPS mode might be better for apps that benefit from responsiveness, like games.
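Opting into the 4K mode is a one-line change in ARKit 6. A minimal sketch, assuming a device that supports the new format:

```swift
import ARKit

// Sketch: opt into the 4K video format when the device supports it,
// otherwise keep the default (lower-resolution, higher-framerate) format.
let config = ARWorldTrackingConfiguration()
if let fourK = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
    config.videoFormat = fourK  // 3840x2160 at 30FPS on supported devices
}
ARSession().run(config)
```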

In addition to higher video resolution, developers can now take full-resolution photos even while an AR app is actively using the camera. That means they can pluck out a 12MP image (on an iPhone 13, anyway) to be saved or used elsewhere. This could be great for an AR app where capturing photos is part of the experience. For instance, Lipski says, users could be guided through taking photos of an object that is later converted into a 3D model with photogrammetry.
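A sketch of that capture path, using ARKit 6's `captureHighResolutionFrame` API; the function wrapper is our own, and a real app would hand the buffer to photogrammetry or save it to disk rather than print its size:

```swift
import ARKit

// Sketch: grab a full-resolution still while the AR session keeps running.
// captureHighResolutionFrame delivers an ARFrame backed by the camera's
// full photo resolution (e.g. 12MP on iPhone 13).
func captureStill(from session: ARSession) {
    session.captureHighResolutionFrame { frame, error in
        guard let frame = frame else {
            print("capture failed:", error?.localizedDescription ?? "unknown")
            return
        }
        let pixelBuffer = frame.capturedImage
        print("captured \(CVPixelBufferGetWidth(pixelBuffer))x\(CVPixelBufferGetHeight(pixelBuffer))")
    }
}
```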

ARKit 6 also gives developers more control over the camera while it’s being used by an AR app. Developers can adjust things like white balance, brightness, and focus as needed, and can read EXIF data from every incoming frame.
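These controls work by exposing the underlying `AVCaptureDevice` while ARKit owns the camera. A minimal sketch (the function names `lockFocus` and `logExif` are ours; a real app should also check which settings the device supports before applying them):

```swift
import ARKit
import AVFoundation

// Sketch: adjust the underlying capture device while ARKit uses the camera.
// configurableCaptureDeviceForPrimaryCamera exposes the AVCaptureDevice
// backing the current ARWorldTrackingConfiguration.
func lockFocus() {
    guard let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera else { return }
    do {
        try device.lockForConfiguration()
        if device.isFocusModeSupported(.continuousAutoFocus) {
            device.focusMode = .continuousAutoFocus
        }
        device.setExposureTargetBias(0.5)  // nudge brightness up slightly
        device.unlockForConfiguration()
    } catch {
        print("could not configure camera:", error)
    }
}

// EXIF data rides along on every frame in ARKit 6:
func logExif(_ frame: ARFrame) {
    print(frame.exifData)  // exposure time, ISO, white balance tags, etc.
}
```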

More Location Anchor… Locations

Image courtesy Apple

ARKit includes LocationAnchors which can provide street-level tracking for AR in select cities (for instance, to do augmented reality turn-by-turn directions). Apple is expanding this functionality to more cities, now including Vancouver, Toronto, and Montreal in Canada; Fukuoka, Hiroshima, Osaka, Kyoto, Nagoya, Yokohama, and Tokyo in Japan; and Singapore.

Later this year the function will further expand to Auckland, New Zealand; Tel Aviv-Yafo, Israel; and Paris, France.
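Under the hood this is the `ARGeoTrackingConfiguration` / `ARGeoAnchor` API. A minimal sketch of pinning content to real-world coordinates (the coordinate below is a hypothetical point in Tokyo, purely for illustration):

```swift
import ARKit
import CoreLocation

// Sketch: check whether geo tracking is available at the user's location,
// then pin content to real-world coordinates with an ARGeoAnchor.
let session = ARSession()
ARGeoTrackingConfiguration.checkAvailability { available, _ in
    guard available else { return }  // only supported in select cities
    session.run(ARGeoTrackingConfiguration())
    // Hypothetical coordinate for illustration only.
    let coordinate = CLLocationCoordinate2D(latitude: 35.6586, longitude: 139.7454)
    session.add(anchor: ARGeoAnchor(coordinate: coordinate))
}
```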

Plane Anchors

Plane Anchors are a tool for tracking flat objects like tables, floors, and walls during an AR session. Prior to ARKit 6, the origin of a Plane Anchor would be updated as more of the plane was discovered (for instance, moving the device to reveal more of a table than the camera saw previously). This could make it difficult to keep augmented objects locked in place on a plane if the origin was rotated after first being placed. With ARKit 6, the origin’s rotation remains static no matter how the shape of the plane might change during the session.
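In API terms, ARKit 6 moves the rotation out of the anchor's transform and into the plane's extent (`ARPlaneExtent.rotationOnYAxis`), so content parented to the anchor no longer shifts as the plane grows. A small sketch of reading the new fields (the helper function is ours):

```swift
import ARKit

// Sketch: in ARKit 6, a plane anchor's extent carries its own rotation
// while the anchor's origin stays put as the detected plane grows.
func planeInfo(_ anchor: ARPlaneAnchor) {
    let extent = anchor.planeExtent
    print("size: \(extent.width) x \(extent.height) m, rotation: \(extent.rotationOnYAxis) rad")
}
```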

– – — – –

ARKit 6 will launch with the iOS 16 update, which is available now in beta for developers and is expected to be released to the public this fall.

The post Apple Reveals Improvements Coming in ARKit 6 for Developers appeared first on Road to VR.

Apple Quietly Released One of The Most Impressive AR Room-mapping Tools

Apple has barely mentioned augmented or virtual reality in its big keynotes lately, but at WWDC 2022 earlier this month the company quietly released one of the best 3D room-mapping tools for mobile AR yet.

Called RoomPlan, the ARKit Swift API uses the camera and LiDAR scanner on recent iPhones and iPads to create a 3D floor plan of a room, including key characteristics such as dimensions and types of furniture.

It’s not for consumers (yet), though. Apple says it’s aiming to appeal to professionals like architects and interior designers for conceptual exploration and planning, as well as developers of real estate, e-commerce, or hospitality apps; developers can integrate RoomPlan directly into their AR-capable apps.
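A minimal sketch of that integration, using RoomPlan's ready-made scanning UI (`RoomCaptureView`); the view controller is our own scaffolding, and the delegate callback fires once the user finishes walking the room:

```swift
import RoomPlan
import UIKit

// Sketch: embed RoomPlan's built-in scanning UI and receive the final
// CapturedRoom when the user finishes the scan.
class ScanViewController: UIViewController, RoomCaptureViewDelegate {
    var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.delegate = self
        view.addSubview(captureView)
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        // Walls, doors, windows, and furniture, each with dimensions.
        print("walls:", processedResult.walls.count, "objects:", processedResult.objects.count)
    }
}
```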

When it was released earlier this month, Jonathan Stephens, Chief Evangelist at spatial computing company EveryPoint, took RoomPlan for a test drive to see what it could do. The results are pretty surprising.

RoomPlan seems to be able to deal with a number of traditionally difficult situations, including the mirror seen above, but also messy spaces, open and closed doors, windows, and generally complex architecture. Still, Stephens’ house isn’t just a bunch of cube-shaped rooms, so there are a few bits that just didn’t match up.

Vaulted ceilings, wall openings, and multi-floor areas like those you might find in foyers were all a bit too difficult for RoomPlan to correctly digest. Although not perfect, it seems to autocorrect to some degree based on assumptions about how things might best fit together.

RoomPlan isn’t just for app integrations though. Apple says it outputs in USD or USDZ file formats which include dimensions of each component recognized in the room, such as walls or cabinets, as well as the type of furniture detected.

If you’re looking to fine-tune the scan, the dimensions and placement of each individual component can be adjusted once exported into various USDZ-compatible tools, such as Cinema 4D, Shapr3D, or AutoCAD, Apple says.
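Getting a scan into those tools is a single call on the finished `CapturedRoom`. A hedged sketch (the function wrapper and output filename are ours, assuming `capturedRoom` comes from a completed scan):

```swift
import RoomPlan
import Foundation

// Sketch: export a finished RoomPlan scan to USDZ for use in DCC tools.
func export(_ capturedRoom: CapturedRoom) {
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("Room.usdz")
    do {
        try capturedRoom.export(to: url)
        print("exported to", url.path)
    } catch {
        print("export failed:", error)
    }
}
```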

We’re still no closer to learning when the company plans to release its rumored mixed reality headset or its full-fledged AR glasses; however, any AR or MR headset would need extremely robust space-mapping capabilities. Seeing Apple make these sorts of strides on its existing platforms certainly suggests the company is on the right track.

If you haven’t been following along with the Apple rumor mill, check out some of the links below regarding the company’s mixed reality headset, codenamed N301:

What We (think we) Know About N301 Mixed Reality Headset


A special thanks to Hrafn Thorisson for pointing us to the news!


Report: Apple to Announce ARKit Updates at WWDC 2019 Including OS Support for AR Headsets

Apple has been continuously iterating on ARKit, its augmented reality development tool that lets creators make smartphone-based AR experiences. The company unveiled ARKit at its Worldwide Developers Conference (WWDC) in 2017, and the 2.0 version at the dev conference a year later. Now, a report from 9to5Mac holds that this year’s WWDC could see yet more new additions, including OS support for stereo AR headsets.

Citing sources familiar with the development of Apple’s new operating systems, the report maintains that ARKit will get a new Swift-only framework for AR and a companion app that lets developers create AR experiences visually. ARKit will also reportedly get the ability to detect human poses.

One of the biggest claims to come from the report is the supposed announcement surrounding OS support for controllers with touchpads as well as “stereo AR headsets.”

As with all unconfirmed reports, we’re taking this with a big grain of salt. However, it’s hardly conceivable that Apple would open its software ecosystem to third-party devices, so the claim raises the question of whether we’re close to a bona fide Apple AR headset tease.

SEE ALSO
Report: Apple Nearly Acquired Leap Motion but the Deal Fell Through

In any case, there have been several reports of an Apple AR headset in the making. Ming-Chi Kuo, someone Business Insider called “the most accurate Apple analyst in the world,” offered up his prediction for the fabled device last month, stating that Apple will likely begin production of its AR headset sometime between Q4 of 2019 and Q2 of 2020. Furthermore, it’s been reported that the upcoming Apple headset could rely on the iPhone for computing, rendering, internet connectivity, and location services.

This stands in stark contrast to one of the earliest reports we’ve seen, from late 2017, in which Bloomberg posited that an Apple AR headset would be a dedicated, standalone device, also slated for a 2020 release.

Whatever the case, we’ll have our eyes peeled from June 3rd to 7th when the hardcore Apple dev community descends upon San Jose, California for this year’s WWDC.


Adobe’s Project Aero Aims to Help Creators Build AR Content

Adobe recently unveiled a new project that aims to take the company further into the realm of augmented reality. Called Project Aero, the newly announced AR authoring tool and multi-platform system will soon deliver a way for developers to build simple AR scenes and experiences for Apple’s ARKit.

In collaboration with Apple and Pixar, Adobe is also adding ‘usdz’ support to Adobe Creative Cloud apps and services, a file format that is a zero compression, unencrypted zip archive for 3D content such as AR/VR objects. The integration of usdz support was first announced at Apple’s WWDC, which saw the release of ARKit 2.0.

Users, the company says, will be able to create AR content using industry standard tools such as Photoshop CC and Dimension CC, and then “convert assets into usdz that can be natively consumed in the Apple ecosystem,” writes CTO Abhay Parasnis in an Adobe blogpost.

Essentially, Adobe is taking one step further into a world still largely dominated by 3D game engines such as Unreal and Unity, which could foretell an interesting forward march from 2D creation to a decidedly more 3D-focused business. Project Aero is heading into early access soon and is available right now by request only.

To show off what’s possible with Project Aero, Adobe has partnered with 15 artists for The Festival of the Impossible, a three-day immersive art exhibition in San Francisco which is featured in the video above.

“This is just the beginning of our journey to extend digital experiences beyond the screen and I couldn’t be more excited about what’s ahead,” Parasnis says. “We’ll have much more to share at the Adobe MAX Creativity Conference later this fall.”


Apple Joins the Impending Avatar Wars With Memoji

At Worldwide Developer Conference today, Apple announced a new avatar system, dubbed ‘Memoji’, the company’s next step into the world of iPhone X animated emojis mapped to a user’s facial movements. Move over Samsung AR Emoji, because Apple looks to have gotten the cutesy Pixar-vibe down pat.

Apple Software Program Manager Kelsey Peterson took the stage to demonstrate the Memoji creator tool, which lets you select dozens of options including the shape and color of your eyes, face, and hair. The creator tool also includes sliders so you can pick precise shades of colors, and specific accessories like earrings, hats, sunglasses, etc.

Image courtesy Apple

During the live demo, Apple’s Memoji showed an impressive range of facial movements that seemed to match up fairly accurately with the user’s actual voice. The resultant Memoji that Peterson created was decidedly on the Pixar-side of the uncanny valley, making it approachable and actually really cute.

Apple is also integrating Animoji, as well as Memoji, into FaceTime group chat, which is cool if that’s what you’re into.

While Samsung released its own version, called ‘AR Emoji’, back at MWC 2018 a few months ago, the truly striking feature of Apple’s Memoji system is how solid it appears in comparison to Samsung’s AR Emoji, which not only proved to be surprisingly jittery, but offered overall strange-looking results to say the least—certainly less approachable and less cartoon-like.

Samsung AR Emoji – Image courtesy Samsung Central

For now, it’s not likely that users would choose a phone based on what personalized emojis look like, but as these systems grow and avatars become more intertwined with services and apps, these AR features, although admittedly minuscule to the overall task of selling a phone, are helping form specific divisions in these companies dedicated to creating virtual versions of you—something we wouldn’t have thought possible back in 2013 when the Oculus Rift DK1 first made its way to Kickstarter backers.

SEE ALSO
Apple Unveils ARKit 2.0, Putting Multiuser AR at Its Core

Teasing it out a bit, megalithic companies like Facebook, Apple, and Samsung are essentially shooting for the same goal: relatable avatars that people feel comfortable enough using and consider human enough for whatever task. Apple certainly took its time creating a cute and fun-looking avatar system, and you can bet its competitors have taken notice.

And as the smartphone generation inevitably takes a back seat in the future to wearable solutions such as AR headsets, we may look back at this time as the beginning of a long, protracted avatar war, as companies demarcate specific design languages and define how their users will look to each other and the world. For now, Apple has nailed its selfie avatars, and we can’t wait to see what the competition brings next.


Apple Unveils ARKit 2.0, Putting Multiuser AR at Its Core

At Apple’s Worldwide Developers Conference (WWDC) today, the company confirmed that ARKit 2.0 is coming with a list of new features, including multiuser support, and a bevy of updates that look to refine AR interactions on iOS devices.

Update (06/04/18): First leaked in the report cited below, a list of features is coming to ARKit 2.0: improved face tracking, more realistic rendering, better 3D object detection, persistent experiences, and multiuser support for shared experiences. One bit that wasn’t mentioned, however, was privacy, which we suspect was skipped over in favor of showing the catchier multiuser features.

The company made special mention of shared experiences coming to the next iteration of ARKit, showing a multiplayer slingshot game that lets two players duel on a real table filled with virtual blocks; besides support for two players using their own devices, the company also mentioned there can be up to one observer. The slingshot game is being released to developers now as an example of what’s possible with the update.

Image courtesy Apple

Apple also partnered with LEGO to build an experience that lets up to four players interact using miniature Lego avatars in a mix of real and virtual Lego buildings. More Lego AR experiences are expected in the App Store “later this year,” said Lego Director of Innovation Martin Sanders. It’s uncertain if the unnamed game will see the light of day, as the demo was very sparse on actual game mechanics, although seeing the little Lego characters walk around a real Lego set certainly made for a nice vertical slice of what could come down the line.

Apple also announced a new AR utility app called ‘Measure’, which lets you measure items such as boxes and photos, as well as a new AR format called USDZ. As a ‘zero compression’, unencrypted zip archive supported across iOS, it lets AR objects be shared and inserted into Apple ecosystem apps such as Safari and Mail, and it is even supported in Adobe’s Creative Cloud.

The original article follows below:

Original article (06/02/18): According to a recent Reuters report, Apple is set to unveil some new features in its supposed release of ARKit 2.0, the company’s augmented reality toolkit for iOS 11 devices. Reported changes coming to the platform include multiuser AR that runs on a local peer-to-peer network, ensuring the resultant data, such as room scans, remain private.

Citing people familiar with the subject, Apple is said to have designed its two-player system out of privacy concerns—a departure from how Google currently handles multiplayer AR.

Google ARCore, the company’s Android-based counterpart to Apple ARKit, requires scans of a player’s environment to be sent to, and stored in, the cloud in something the company dubs ‘Cloud Anchors’.

In Apple’s ARKit 2.0, Reuters reports, the company will avoid storing any raw mapping scans of a user’s environment in the cloud. Google says it discards raw mapping data after a week.

image courtesy TIME

Many developers have already created ad hoc multiplayer games for both ARKit and ARCore prior to official support, but with Apple’s built-in support for this very specific way of connecting with other users, it’s likely to push the future of apps in a more social direction.

While Apple has placed an ever-increasing emphasis on smartphone-driven AR, Apple CEO Tim Cook publicly said there are still plenty of challenges to consider before the company would release an AR headset, but ultimately, that’s where the technology is headed.

“The display technology required, as well as putting enough stuff around your face – there’s huge challenges with that. The field of view, the quality of the display itself, it’s not there yet,” he says. “We don’t give a rat’s about being first, we want to be the best, and give people a great experience. But now anything you would see on the market any time soon would not be something any of us would be satisfied with. Nor do I think the vast majority of people would be satisfied.”

Apple is expected to unveil ARKit 2.0 at the company’s yearly Worldwide Developers Conference (WWDC), which takes place June 4th-8th in San Jose, California.


‘Pokémon GO’ Now Uses Apple’s ARKit on iOS 11, Bringing Pokémon Closer to Reality

Pokémon GO, the massively successful location-based mobile game, just got an update on iOS 11 thanks to Apple’s ARKit that gives iPhone 6s and above more realistic Pokémon-catching encounters.

Since it was released in summer 2016, Pokémon GO has been hailed as an augmented reality game capable of immersing you in the world of Pokémon like never before. The only problem is it wasn’t really AR.

Entering a battle to catch one of the elusive pocket monsters left you with two options: a simple battle sequence in a virtual environment, or a pass-through ‘AR mode’ that let you see the Pokémon projected on top of the real world. These projections only allowed for the most basic of interactions though, and wouldn’t actively change position according to the user’s movement in 3D space, making it impossible to walk closer to a Pokémon or even look around it to get a different vantage point—effectively leaving you with little more than a novelty in contrast to the game’s true potential: catching Pokémon as if they were really capable of existing in the physical world.

To that effect, Niantic has pushed what it calls an ‘AR+’ update to its iOS 11 app, allowing Pokémon to be fixed to a point in space, meaning you can walk up close to Pikachu or Snorlax to see how they’ll look in the real world.

Now that Pokémon actually have a fixed point in space, physically moving closer to the little beasties makes it easier to throw Pokéballs. To balance this advantage, Pokémon will run away if you get too close. Niantic says in a blogpost announcing AR+ that you’ll have to sneak up close to earn an Expert Handler bonus, but you’ll need to be extra cautious so you don’t scare it away, as an awareness meter now indicates how spooked they are. If the meter fills up, you’re in danger of losing them.

“This is our first step toward making AR capabilities in Pokémon GO even more awesome, opening up the framework for greater AR experiences in the future,” says Niantic.

The company will likely update the Android app at some point, although it would only be capable of running on phones that support Google’s ARCore. Initially supporting the Pixel line and the Samsung Galaxy S8 line, ARCore is said to roll out to 100 million devices in the coming months, setting up Niantic’s next big IP, Harry Potter: Wizards Unite, to be a smash hit as it hopefully rolls out to both AR-supporting Android and iOS devices.


Amazon Brings Augmented Reality Product Previews to iOS App

‘Try it before you buy it’, the old adage goes. Well, an AR preview might not exactly be ‘trying it’, but Amazon’s new ‘AR View’ function aims to help you figure out if a crock-pot clashes with your marble counters, or if that chair can really fit in that corner, adding a little more depth to the online shopping experience than you’re used to.

Now, US-based iPhone users running iOS 11 can start placing thousands of true-to-scale virtual items in their home to see just what they’re getting before the package comes.

Simply tap the camera in the Amazon app and then tap ‘AR View’ to browse products in AR—containing everything from chairs to teapots.

Amazon’s ‘AR View’ is exclusive to iPhones for now. Android users with flagship-level phones could soon have access to a host of similar AR functions however thanks to Google’s ARCore, which was announced a few months after Apple’s ARKit. ARCore is soon to be available on Google Pixel devices and Samsung Galaxy S8 and above, but won’t be integrated into other Android phones until Google and its partners think it’s ready.

Retailing giants like IKEA, Target, and Wayfair have also added AR functions to their iOS apps, making it easier for you to buy physical items by giving you a reliable visual representation—but there’s an interesting side effect to all this that people still aren’t really talking about. Effectively, companies are now digitizing their products, and probably trying to figure out how to do it more quickly so they can eventually offer their entire catalogs virtually.

SEE ALSO
8 Cool AR Apps to Try Now That Apple iOS 11 is Here

While low-profile, high-function AR headsets are still somewhere in our near future (with a form factor that doesn’t make it look like you’re wearing a weird helmet), these first steps by retailers to offer virtual shopping, albeit with a limited catalog, will no doubt be considered integral features in the AR headsets of tomorrow. For now, the virtual items represented are little more than hollow props, but the level of articulation these items could take on in the future might actually let you ‘try it before you buy it’ in a certain sense. The possibility of browsing through AR portals stocked with virtual microwave ovens that let you microwave a burrito to hear what the ding sounds like when it’s done, or a flat-pack desk that shows you how to assemble it before you buy it, isn’t really that far out. Creating such a detailed 3D item is entirely possible now, although probably not feasible on a large scale. Not yet, anyway.

Healthy speculation aside, virtual reality is here now in consumers’ hands from a variety of established brands, and companies like Amazon haven’t plunged head-first into creating VR shopping apps for a reason—retailers know AR is instantly going to appeal to average consumers as a productivity device, and they’re starting now so that when the time comes, they’ll be at the forefront.


Some of our articles feature Amazon affiliate links. We’ve disabled them for this article.


Unreal Engine 4.18 Update Brings Native Support for ARKit and ARCore, SteamVR Support for Mac

Epic Games’ Unreal Engine is making it easier to create for augmented reality with the newest 4.18 update, which now includes official support for Apple’s ARKit and Google’s ARCore software dev kits, plus support for SteamVR on Mac.

‘Production-ready’ support for Apple’s ARKit on iOS 11 was initially announced during Apple’s iPhone 8 and iPhone X unveiling last month. Epic had, however, provided experimental support for ARKit in its game engine since Unreal Engine 4.17; the new 4.18 update represents what Epic calls “significant changes” since the prior version went live back in August.

Announced on the Unreal Engine blog, the company says they’ve “streamlined workflows [for ARKit projects] making use of existing framework components, added robust handling of the passthrough camera, and increased fidelity by improving performance and prediction.”

Unreal Engine 4.18 now contains official support for the ARCore developer preview too—Google’s answer to ARKit, which provides similar AR functions on Google’s new Pixel 2 smartphones and soon more Android phones running Android 7.0 Nougat and above, including the Samsung Galaxy S8 line.

In the 4.18 update, the game engine also includes native SteamVR support on Mac, making the same well-worn PC interfaces available on Mac and adding the ability to easily transfer projects between the two platforms.

Apple announced that Valve was bringing SteamVR support to Mac during the company’s Worldwide Developers Conference (WWDC) back in June, showing the audience the power of its new VR-ready 27-inch iMac. Apple featured a demo running on the HTC Vive that was created by Industrial Light & Magic. Using Epic’s Unreal Engine VR Editor, they showed how developers could build VR content inside of VR itself, using Star Wars assets.

Unreal’s support for SteamVR on Mac comes alongside support for Metal 2, Apple’s graphics API, which is getting the special VR treatment too. Apple says Metal 2 can bring up to a 10x increase in draw call throughput over the prior version, and it will include a VR-optimized display pipeline.

Check out full release notes here.
