Apple Reveals Improvements Coming in ARKit 6 for Developers

Earlier this month during Apple’s annual developer conference, WWDC 2022, the company gave developers the first look at improvements coming to Apple’s ARKit 6 toolkit for building AR apps on iOS devices.

Though Apple has yet to reveal (or even confirm) the existence of an AR headset, the clearest indication that the company is serious about AR is ARKit, the developer toolkit it has been advancing since 2017.

At WWDC 2022 Apple revealed the latest version, ARKit 6, which is bringing improvements to core capabilities so developers can build better AR apps for iPhones and iPads (and eventually headsets… probably).

Image courtesy Apple

During the ‘Discover ARKit 6’ developer session at WWDC 2022, Apple ARKit engineer Christian Lipski gave an overview of what’s next.

Better Motion Capture

ARKit includes a MotionCapture function which tracks people in the video frame, giving developers a ‘skeleton’ which estimates the position of a person’s head and limbs. This allows developers to create apps which overlay augmented content onto the person or move it relative to them (it can also be used for occlusion, placing augmented content behind someone to more realistically embed it into the scene).
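
For context, here's a minimal sketch of how developers consume this skeleton data today, using ARKit's existing body-tracking API (the new ear joints described below aren't shown here):

```swift
import ARKit

class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking requires a device with an A12 chip or later.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // The 3D skeleton exposes a model-space transform per joint.
            if let head = body.skeleton.modelTransform(for: .head) {
                // World-space head pose = body anchor pose * joint pose.
                let headWorld = body.transform * head
                _ = headWorld // e.g. attach augmented content here
            }
        }
    }
}
```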

In ARKit 6, Lipski says the function is getting a “whole suite of updates,” including improved tracking of 2D skeletons which now estimate the location of the subject’s left and right ears (which will surely be useful for face-filters, trying on glasses with AR, and similar functions involving the head).

Image courtesy Apple

As for 3D skeletons, which give pose estimates with depth, Apple is promising better tracking with less jitter, more temporal consistency, and more robustness when the subject is occluded by the edge of the camera frame or by other objects (though some of these enhancements are only available on iPhone 12 and up).

Camera Access Improvements

Image courtesy Apple

ARKit 6 gives developers much more control over the device’s camera while it’s being used with an AR app for tracking.

Developers can now access incoming frames in real time at up to 4K and 30FPS on the iPhone 11 and up and the latest iPad Pro (M1). The prior mode, which uses a lower resolution but a higher framerate (60FPS), is still available to developers. Lipski says developers should carefully consider which mode to use: the 4K mode might be better for apps focused on previewing or recording video (like a virtual production app), while the lower-resolution 60FPS mode might be better for apps that benefit from responsiveness, like games.
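
Opting in is a one-line change, going by the API Apple describes in the session; a minimal sketch:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()

// Prefer the new 4K@30 video format where the hardware supports it
// (iPhone 11 and up, iPad Pro with M1); otherwise ARKit keeps its
// default lower-resolution 60 FPS format.
if let fourK = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
    config.videoFormat = fourK
}

ARSession().run(config)
```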

Along with higher video resolution, developers can now take full-resolution photos even while an AR app is actively using the camera. That means they can pluck out a 12MP image (on an iPhone 13, anyway) to be saved or used elsewhere. This could be great for an AR app where capturing photos is part of the experience; for instance, Lipski says, an app where users are guided through taking photos of an object that is later converted into a 3D model with photogrammetry.
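
A minimal sketch of the new on-demand capture call (here `session` is assumed to be the app's running ARSession):

```swift
session.captureHighResolutionFrame { frame, error in
    guard let frame = frame else {
        print("High-res capture failed: \(String(describing: error))")
        return
    }
    // frame.capturedImage is a CVPixelBuffer at the camera's full photo
    // resolution (12MP on recent iPhones), independent of the session's
    // streaming video format.
    let photo = frame.capturedImage
    _ = photo // convert to CIImage/UIImage and save as needed
}
```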

Beyond capture modes, ARKit 6 gives developers direct control over camera settings while an AR app is running. They can adjust things like white balance, brightness, and focus as needed, and can read EXIF data from every incoming frame.
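
A sketch of both capabilities, based on the APIs named in the session (the EXIF key below is illustrative):

```swift
import ARKit
import AVFoundation

// ARKit 6 exposes the underlying AVCaptureDevice so its settings can
// be locked and adjusted mid-session.
if let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera {
    do {
        try device.lockForConfiguration()
        device.setExposureTargetBias(-0.5, completionHandler: nil) // darken slightly
        if device.isFocusModeSupported(.continuousAutoFocus) {
            device.focusMode = .continuousAutoFocus
        }
        device.unlockForConfiguration()
    } catch {
        print("Camera configuration failed: \(error)")
    }
}

// EXIF metadata now rides along with every frame (ARSessionDelegate callback):
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let exif = frame.exifData   // [String: Any]
    _ = exif["ExposureTime"]    // key name is illustrative
}
```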

More Location Anchor… Locations

Image courtesy Apple

ARKit includes LocationAnchors which can provide street-level tracking for AR in select cities (for instance, for augmented reality turn-by-turn directions). Apple is expanding this functionality to more cities, now including Vancouver, Toronto, and Montreal in Canada; Fukuoka, Hiroshima, Osaka, Kyoto, Nagoya, Yokohama, and Tokyo in Japan; and Singapore.

Later this year the function will further expand to Auckland, New Zealand; Tel Aviv-Yafo, Israel; and Paris, France.
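
Because coverage is city-by-city, apps are expected to check availability at runtime before starting a geo-tracking session; a minimal sketch (the coordinate is illustrative):

```swift
import ARKit
import CoreLocation

ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else { return } // no street-level coverage here
    let session = ARSession()       // retain the session in real code
    session.run(ARGeoTrackingConfiguration())

    // Pin content to a real-world coordinate (illustrative: Tokyo).
    let coordinate = CLLocationCoordinate2D(latitude: 35.6586, longitude: 139.7454)
    session.add(anchor: ARGeoAnchor(coordinate: coordinate))
}
```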

Plane Anchors

Plane Anchors are a tool for tracking flat objects like tables, floors, and walls during an AR session. Prior to ARKit 6, the origin of a Plane Anchor would be updated as more of the plane was discovered (for instance, moving the device to reveal more of a table than the camera saw previously). This could make it difficult to keep augmented objects locked in place on a plane if the origin was rotated after first being placed. With ARKit 6, the origin’s rotation remains static no matter how the shape of the plane might change during the session.
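
A minimal sketch of plane detection as it works today; the rotation-stability change in ARKit 6 requires no new code, but this is where developers observe it:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal, .vertical]

// ARKit keeps updating each ARPlaneAnchor as more of the surface comes
// into view; under ARKit 6 the anchor's rotation stays fixed even as
// its extent grows. (ARSessionDelegate callback.)
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let plane as ARPlaneAnchor in anchors {
        print("plane center: \(plane.center), extent: \(plane.extent)")
    }
}
```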

– – — – –

ARKit 6 will launch with the iOS 16 update, which is available now in beta for developers and is expected to be released to the public this Fall.


Apple Quietly Released One of The Most Impressive AR Room-mapping Tools

Apple has barely mentioned augmented or virtual reality in its big keynotes lately, but at WWDC 2022 earlier this month the company quietly released what is probably one of the best 3D room-mapping tools for mobile AR yet.

Called RoomPlan, the ARKit Swift API uses the camera and LiDAR scanner on recent iPhones and iPads to create a 3D floor plan of a room, including key characteristics such as dimensions and types of furniture.

It’s not for consumers (yet), though. Apple says it’s aiming to appeal to professionals like architects and interior designers for conceptual exploration and planning, as well as to developers of real estate, e-commerce, or hospitality apps; developers can integrate RoomPlan directly into their AR-capable apps.
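
A minimal sketch of that integration, assuming the RoomPlan API as documented (RoomCaptureView supplies the scanning UI and user coaching for you):

```swift
import UIKit
import RoomPlan

class RoomScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.delegate = self
        view.addSubview(captureView)
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Return true to let RoomPlan process the raw scan into a model.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData,
                     error: Error?) -> Bool { true }

    // Called with the final parametric model: walls, doors, windows,
    // and recognized furniture, each with dimensions.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        for object in processedResult.objects {
            print(object.category, object.dimensions)
        }
    }
}
```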

When it was released earlier this month, Jonathan Stephens, Chief Evangelist at spatial computing company EveryPoint, took RoomPlan for a test drive to see what it could do. The results are pretty surprising.

RoomPlan seems to be able to deal with a number of traditionally difficult situations, including the mirror seen above, but also messy spaces, open and closed doors, windows, and generally complex architecture. Still, Stephens’ house isn’t just a bunch of cube-shaped rooms, so there are a few bits that just didn’t match up.

Vaulted ceilings, wall openings, and multi-floor areas like you might find in foyers were all a bit too difficult for RoomPlan to correctly digest. Although not perfect, it seems to at least auto-correct to some degree based on assumptions about how things might best fit together.

RoomPlan isn’t just for app integrations, though. Apple says it outputs USD or USDZ files which include the dimensions of each component recognized in the room, such as walls or cabinets, as well as the type of furniture detected.

If you’re looking to fine-tune a scan, the dimensions and placement of each individual component can be adjusted when exported into various USDZ-compatible tools, such as Cinema 4D, Shapr3D, or AutoCAD, Apple says.
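
Getting from a finished scan to one of those tools looks like a one-call export; a short sketch (`capturedRoom` is the CapturedRoom produced in the sketch above):

```swift
let exportURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("room.usdz")
do {
    try capturedRoom.export(to: exportURL)
    // The resulting USDZ carries per-component dimensions and can be
    // opened in Cinema 4D, Shapr3D, AutoCAD, and similar tools.
} catch {
    print("Export failed: \(error)")
}
```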

We’re still no closer to learning when the company plans to release its rumored mixed reality headset or its full-fledged AR glasses; however, either an AR or MR headset would need extremely robust space-mapping capabilities. Seeing Apple make these sorts of strides using its existing platforms certainly suggests the company is on the right track.

If you haven’t been following along with the Apple rumor mill, check out some of the links below regarding the company’s mixed reality headset, codenamed N301:

What We (think we) Know About N301 Mixed Reality Headset


A special thanks to Hrafn Thorisson for pointing us to the news!


You Can Now Photoshop the World in Real-time With AR on an iPhone

WarpAR is a free iOS app from developer Matt Bierner which lets you use Photoshop-like liquify tools to dynamically modify the world through AR. More than just a cool tech demo, it also helps us imagine what the future might be like when our physical reality becomes increasingly subject to digital whims.

The app works on all iOS devices which support ARKit. On iOS devices with LiDAR, a bonus feature allows users to reach out with their hand and distort reality directly.

The app supports six different tools which will be familiar to Photoshop users (a rough sketch of the underlying math follows the list):

Push – Move around parts of the world’s textures by tapping and dragging.
Restore – Move the texture back to its original, undistorted state.
Bloat – Expand the texture outwards from the center.
Pucker – Collapse the texture inwards from the center.
Swirl Left – Swirl the texture to the left (counterclockwise) around the center.
Swirl Right – Swirl the texture to the right (clockwise) around the center.
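
WarpAR's source isn't public, but classic liquify brushes boil down to radial displacement of texture coordinates. A hypothetical sketch of how a bloat/pucker brush might compute that displacement:

```swift
import simd

// Hypothetical illustration; WarpAR's actual implementation is not public.
// Positive strength pushes texels outward (bloat); negative pulls them
// inward (pucker).
func radialWarp(_ uv: SIMD2<Float>, center: SIMD2<Float>,
                radius: Float, strength: Float) -> SIMD2<Float> {
    let offset = uv - center
    let dist = simd_length(offset)
    guard dist < radius, dist > 0 else { return uv }
    // Falloff goes to zero at the brush edge so the warp blends in.
    let falloff = 1 - (dist / radius)
    let scale = 1 + strength * falloff * falloff
    return center + offset * scale
}
```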

Once you’re done modifying the world, you can easily take photos and videos of the effect directly in the application for sharing.

Bierner has been experimenting with lots of reality-bending AR apps. A prior project, watAR, adds frighteningly convincing AR waves to the world:

And then there’s In The Walls, which projects the user’s face onto surfaces in a way that may or may not induce nightmares:

While largely playful tech demos, Bierner’s work is also a potent springboard for imagining what this sort of reality-manipulating capability might look like in the future.

Though these apps run in a handheld AR mode today, which offers only a small view into the modified reality, the future will bring always-on head-mounted AR devices with much larger fields of view that will make the experience more natural. While many AR applications focus on placing digital objects into the real world, apps like these show that modifying the world itself may be an equally compelling use of augmented reality.

Anything from whiting out parts of the skyline you don’t like, to digitally sculpting the world around you, to turning on digital shades in your home when the sun is shining in too brightly could be a practical future use-case for this sort of AR. On the downside, some uses of these techniques could involve ‘erasing’ parts of the world users don’t want to confront, like rundown streets or even homeless individuals, which could deflect much-needed attention away from disadvantaged communities. As ever, technology itself is rarely good or bad; it’s what we choose to do with it that will determine whether it is a net positive or negative for humanity.


Millions of Sketchfab Models are now Available in Apple’s AR Format

During Apple’s WWDC 2018 event, when iOS 12 made its official debut, the company also unveiled USDZ, a new file format built specifically for augmented reality (AR) and ARKit. Today, 3D model library Sketchfab has announced it fully supports the format, unlocking millions of models in the process.

At launch, 400,000 USDZ files are instantly available for download, spanning both free Creative Commons-licensed models and royalty-free models. Creators can also use Sketchfab to convert most 3D formats to USDZ, giving ARKit developers access to a massive range of 3D models to use within their applications.

One of the biggest benefits of adding USDZ support comes in the form of AR Quick Look. This feature allows iOS users viewing models on Sketchfab to quickly view them in AR. All they need to do is hit the download button on any downloadable model while logged into Sketchfab and select the USDZ option.
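
On the developer side, presenting a downloaded USDZ in AR Quick Look takes very little code; a minimal sketch using today's QuickLook API (the file path is illustrative, and ARQuickLookPreviewItem requires iOS 13 or later):

```swift
import UIKit
import QuickLook
import ARKit

class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {
    // Illustrative path; any local .usdz file works.
    let modelURL = Bundle.main.url(forResource: "model", withExtension: "usdz")!

    func showAR() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // ARQuickLookPreviewItem hints to iOS to open straight into the
        // AR Quick Look object/AR viewer.
        return ARQuickLookPreviewItem(fileAt: modelURL)
    }
}
```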

The platform is also working to bring the same functionality to the Sketchfab viewer in the near future.


“Our goal at Sketchfab has always been to make 3D content easily shareable and discoverable as widely as possible, and Apple’s AR platform – enabled thanks to USDZ – has become a key part of the ecosystem we play in. We are excited to offer a great new way to leverage the massive Sketchfab library,” said Alban Denoyel, Co-founder and CEO of Sketchfab in a statement.

There have been plenty of advancements in the AR field, for consumers as well as enterprise. Companies like Dr. Seuss Enterprises have created educational apps for mobile devices, while BBC Studios and Preloaded built BBC Earth – Micro Kingdoms: Senses for Magic Leap 1. For those wishing to create their own AR content, Psychic VR Lab’s STYLY platform will be adding that functionality later this year.

As further progress is made within the AR industry, VRFocus will keep you updated.

New iPad Pro Gets LiDAR Scanner for Improved AR

LiDAR, the light detection and ranging technology, is usually found in commercial and industrial equipment for things like 3D scanning buildings, land, and other objects. Now, Apple announced that its latest iPad Pro includes LiDAR hardware, making it what the company calls “the world’s best device for augmented reality.”

LiDAR is able to create a depth map by measuring how long it takes light to reach an object and reflect back. To that end, Apple’s new custom-designed LiDAR scanner is said to operate at “nanosecond speeds” and work up to five meters (~16.5 ft) away from an object, something the company says it can do both indoors and out. This essentially gives it a higher-fidelity way of mapping a room for more accurate AR scenarios.
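
Some quick arithmetic shows why nanosecond timing is the bar here; a back-of-the-envelope sketch:

```swift
// Light covers a 5-meter round trip (10 m total) very quickly indeed.
let c = 299_792_458.0              // speed of light, m/s
let roundTrip = 2.0 * 5.0          // meters, out and back
let seconds = roundTrip / c        // ≈ 3.34e-8 s
let nanoseconds = seconds * 1e9    // ≈ 33 ns
// Resolving depth to centimeters means timing photons to within tens
// of picoseconds — hence the custom scanner hardware.
```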

Image courtesy Apple

To do this, Apple says its new depth frameworks in iPadOS combine depth points measured by the LiDAR scanner, data from both cameras and motion sensors, and on-device computer vision algorithms running on its A12Z Bionic chip.

The company has also integrated the new scanner with its existing ARKit framework, giving all of the platform’s AR apps improved motion capture and people occlusion, Apple says. This close integration likely points to other Apple devices getting LiDAR for improved AR in the future, with the most likely suspect being the next flagship iPhone.


“Using the latest update to ARKit with a new Scene Geometry API, developers can harness the power of the new LiDAR Scanner to unleash scenarios never before possible,” the company says. “The LiDAR Scanner improves the Measure app, making it faster and easier to automatically calculate someone’s height, while helpful vertical and edge guides automatically appear to let users more quickly and accurately measure objects. The Measure app also now comes with Ruler View for more granular measurements and allows users to save a list of all measurements, complete with screenshots for future use.”
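
A minimal sketch of opting into that Scene Geometry API on a LiDAR-equipped device, using the ARKit 3.5 names Apple shipped alongside this iPad Pro:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()

// The LiDAR-backed Scene Geometry API surfaces the room as a live
// triangle mesh (ARMeshAnchor) on supported devices.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}

// ARSessionDelegate callback: mesh anchors arrive and keep refining.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let mesh as ARMeshAnchor in anchors {
        print("mesh faces: \(mesh.geometry.faces.count)")
    }
}
```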

Priced at $800 for the 11-inch version and $1,000 for the 12.9-inch, the new iPad Pro’s LiDAR scanner comes alongside a list of other “pro” features, including new cameras, motion sensors, “pro” performance & audio, and a Liquid Retina display. Can’t forget that new Magic Keyboard, which Apple hopes will entice more users to finally make the switch from PC laptops to iPads. Check out the new iPad Pro here.


New iPad Pro Adds LiDAR And ‘Instant’ AR Placement

Apple unveiled a new line of iPad Pros which include a LiDAR scanner and “new depth frameworks” to combine depth information from all the device’s sensors and cameras “for a more detailed understanding of a scene.”

The new iPads start at $800 and include the LiDAR scanner and two wide-angle cameras, with the wider of the two offering a 125-degree field of view.

According to Apple, “Every existing ARKit app automatically gets instant AR placement, improved motion capture and people occlusion. Using the latest update to ARKit with a new Scene Geometry API, developers can harness the power of the new LiDAR Scanner to unleash scenarios never before possible.”

Mixed reality startup LIV recently released to testers a version of its camera app for iOS. On devices with an A12 or newer processor, the app automatically recognizes the background of the scene. This can be used to composite a player wearing a VR headset with content from their virtual world without the need for a green screen. While the new iPad Pro features an A12Z Bionic chip, it is currently unknown whether it will work with an app like LIV.

Still, the new iPad Pro looks like it might be extremely useful for VR and AR. In 2018, Facebook showed an incredible tech demo at its OC5 developer conference that featured six people playing Dead and Buried on Oculus Quest at “arena” scale, with a tablet able to peer into the scene in real time. Check it out here:

Facebook’s Oculus app already supports casting the view from an Oculus Quest to an iOS device. If Facebook could take advantage of the new depth information provided by this latest iPad, one day it might be possible to simply point the device at your friend wearing an Oculus Quest and peer into their virtual world.

Of course, Facebook has made no announcements about support for this kind of capture directly on an Apple device. We’ll provide updates as we hear whether developers are able to take advantage of the 3D-sensing capabilities of the new iPad Pro.


Apple CEO Tim Cook Expects AR: ‘Will Pervade our Entire Lives’

When it comes to choosing between virtual reality (VR) and augmented reality (AR) technologies, Apple CEO Tim Cook has made his stance continually clear: AR is most certainly the future. Cook was in Dublin, Ireland this week to receive a Special Recognition Award for the company’s contributions to the country – 6,000 people work at its Cork office – and discussed the future of tech and AR’s role.

During a session chaired by IDA Ireland CEO Martin Shanahan, Cook was asked about his expectations for the next five to ten years: “I’m excited about AR. My view is it’s the next big thing, and it will pervade our entire lives,” reports Silicon Republic.

While Apple has yet to enter the AR headset market to compete against Microsoft HoloLens 2 or Magic Leap 1, the company has made great software inroads thanks to the launch of ARKit back in 2017 – which is now in its third iteration. This has helped developers create a wide range of apps and videogames for iOS devices. Besides entertainment, Cook sees plenty of useful applications of AR for home users: “You may be under the car changing the oil, and you’re not sure exactly how to do it. You can use AR,” he mentions.

He also seems to make a subtle nod to his dislike of VR and why AR is his tech of choice: “I think it’s something that doesn’t isolate people. We can use it to enhance our discussion, not substitute it for human connection, which I’ve always deeply worried about in some of the other technologies.”


During his visit to Ireland Cook managed to pop into Dublin-based developer War Ducks, the team behind VR titles like Sneaky Bears and RollerCoaster Legends II: Thor’s Hammer. Last year the company announced a $3.8 million USD investment which was going towards a location-based AR experience.

“Yesterday, I visited a development company called War Ducks … in Dublin – 15 people and they’re staffing up and using AR for games,” Cook mentioned. “You can imagine, for games it’s incredible but even for our discussion here. You and I might be talking about an article and using AR we can pull it up, and can both be looking at the same thing at the same time.”

As Apple continues to expand its AR development, VRFocus will keep you updated.

Google ARCore Update Brings More Robust Cloud Anchors for Improved Multiuser AR

ARCore, Google’s developer platform for building augmented reality experiences, is getting an update today that aims to make shared AR experiences quicker and more reliable. Additionally, Google is rolling out support for Augmented Faces on iOS, the company’s 3D face filter API.

Introduced last year, Google’s Cloud Anchors API essentially lets developers create a shared, cross-platform AR experience for Android and iOS, and then host the so-called anchors through Google’s Cloud services. Users can then add virtual objects to a scene and share them with others so everyone can view and interact with them simultaneously.

In today’s update, Google says it’s made improvements to the Cloud Anchors API that make hosting and resolving anchors more efficient and robust, something the company says is due to improved anchor creation and visual processing in the cloud.
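
For a sense of the developer-facing flow, here's a rough sketch of hosting and resolving a cloud anchor from the iOS side. Treat all names as assumptions: they approximate the ARCore iOS SDK of this era, and signatures have changed across releases.

```swift
import ARKit
import ARCore  // ARCore iOS SDK; names below are assumptions that
               // approximate this era's API — check the current docs.

func shareAnchor(garSession: GARSession, localAnchor: ARAnchor) {
    do {
        // Host: upload visual features around the local anchor so the
        // cloud can build a shareable 3D feature map of the spot.
        let hosted = try garSession.hostCloudAnchor(localAnchor)
        // Once hosting completes, send hosted.cloudIdentifier to peers
        // over the app's own networking layer.
        _ = hosted
    } catch {
        print("Hosting failed: \(error)")
    }
}

func joinAnchor(garSession: GARSession, cloudId: String) {
    do {
        // Resolve: turn the shared ID back into a locally tracked anchor.
        let resolved = try garSession.resolveCloudAnchor(withIdentifier: cloudId)
        _ = resolved // place shared content relative to this anchor
    } catch {
        print("Resolving failed: \(error)")
    }
}
```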


Google AR team product manager Christina Tong says in a blog post that developers will now have access to more angles across larger areas in the scene, making for what she calls a “more robust 3D feature map.”

This, Tong explains, will allow for multiple anchors in the scene to be resolved simultaneously, which she says reduces the app’s startup time.

Tong says that once a map is created from your physical surroundings, the visual data used to create the map is deleted, leaving only anchor IDs to be shared with other devices.


In the future, Google is also looking to further develop Persistent Cloud Anchors, which would allow users to map and anchor content over both a larger area and an extended period of time, something Tong calls a “save button” for AR.

This prospective ‘AR save button’ would, according to Tong, be an important method of bridging the digital and physical worlds, as users may one day be able to leave anchors anywhere they need to, attaching things like notes, video links, and 3D objects.

Mark AR, a graffiti-art app developed by Sybo and iDreamSky, already uses Persistent Cloud Anchors to link user-made creations to real-world locations.

If you’re a developer, check out Google’s guide to creating Cloud Anchor-enabled apps here.


Google Maps’ ‘Live View’ AR Feature Available in Beta, Makes Getting Lost Harder

Google may seem to be losing interest in its virtual reality (VR) ventures such as Daydream View, but on the augmented reality (AR) front the company is still pressing forward with gusto. Having released an AR feature for Google Maps earlier this year to Google Maps Local Guides and Google Pixel users, the company has today begun a wider rollout of Live View.


Currently still in beta, the feature is rolling out to compatible iOS and Android devices which support ARKit and ARCore respectively. While the launch happens today, you may not see the update just yet, as it may take Google several days or weeks to reach your region.

The whole purpose of the AR option is to make navigation with the highly popular Google Maps even easier and more straightforward. All you need to do is tap on a location in the app, hit the Directions button, then select the Walking option. After that, you should find the option to ‘Live View’ those directions towards the bottom of the screen. With Live View enabled, you’ll see gigantic, handy arrows appear in the middle of the street (or wherever you are) telling you the right direction to head.

Obviously, this sort of feature isn’t supposed to make you continually hold your phone up and look like a lost kitten. You can simply bring it up when required to let you know you’ve gone the wrong way – or are going the right way. It’s just one of a number of updates Google has added to the app, including being able to see all of your flight and hotel reservations in one place, or finding a nice restaurant and booking a reservation all without leaving the app.


While AR might be seen as the little brother to VR, it’s often thought of as having the greatest potential in the long run. Apart from apps like Google Maps, a lot of the AR content consumers are coming across at the moment are videogames such as Harry Potter: Wizards Unite and Minecraft Earth. VRFocus will continue its coverage of AR, reporting back with the latest updates.

Analysis: Did Apple Really Cancel Its AR Headset?

Taiwanese news outlet DigiTimes recently reported that Apple canceled its long-rumored AR hardware project.

The report cited issues making the device light enough as well as its high production cost. But is this really true? And if so, what exactly did Apple cancel?

What was Apple working on?

In late 2017, Bloomberg reported the company was working on AR glasses for release as early as 2020. The outlet didn’t provide any details on the product other than its project name: T288.

The following year, Apple acquired Akonia Holographics, a startup working on novel optics for AR based on holography. Akonia called the approach ‘HoloMirror’, claiming it offered a “dramatically higher” field of view with lower production cost.

Last year, CNET reported T288 would feature dual 8K displays and be capable of VR too. Presumably this would have been achieved with video passthrough. The report claimed the headset would be wireless, powered by an external “box” with a 5-nanometer processor.

Avi Bar-Zeev

In 2016 Microsoft launched HoloLens, the first true 6DoF AR headset available for purchase. The principal architect for the project was Avi Bar-Zeev, whose contributions included “assembling the very first AR prototypes, demos and UX concepts”. He left Microsoft in 2012, well before the headset shipped, and went to Amazon.

Bar-Zeev was also a co-founder of Keyhole, which became Google Earth. In the ’90s he worked on virtual reality for DisneyQuest, including Aladdin’s Magic Carpet Ride.

In 2016, Bar-Zeev moved to Apple, likely to work on an AR hardware project. A LinkedIn page in his name states he led “the Experience Prototyping (XP) team for a new effort”.

Earlier this year, however, he left Apple. No reason was given. We reached out and Bar-Zeev declined to comment on his work at Apple, except to say “the purpose of my team’s work was to inform leadership about what would be great user experiences, and sometimes show which things to avoid.”

More Than One Project?

CNET’s report of an 8K headset with both VR and AR capabilities does not fit with the idea of a lightweight pair of AR glasses.


The design goals for such a product would be radically different from those for a mixed reality headset with both VR and AR capabilities. T288, as described by CNET’s report, would likely have been a very expensive and heavy device. Oculus CTO John Carmack recently said that making a VR headset “with every hardware feature anyone asked for” would be “heavy and very expensive, so consumers wouldn’t buy it”.

The Case For Glasses

ARKit is already built into every iPhone released since 2015 and every iPad since 2017, an estimated 500 million devices in total. The platform has already attracted hundreds of AR developers, including big names like IKEA, Edmunds, and LEGO. The SDK is even used in Pokémon Go to more realistically place the digital creatures in a real-world scene.


Since Apple has full control over the iOS hardware and software, the company could use the iPhone to power such glasses, as described in a recent extensive patent application.

If Apple does use the iPhone to power AR glasses, adding support for the hardware to an existing ARKit app could potentially be trivial, or even automatic. Some question Apple’s intense investment in ARKit and how heavily it is pushed to developers; this could be the reason.

Summary

While we’ve seen no indication Apple will be competing with Facebook in the VR space anytime soon, the two companies will likely be fiercely competitive when it comes to AR glasses. If Google also enters the ring and Microsoft changes its enterprise-first strategy, the 2020s could see the four giants battle for control of the post-smartphone era.

While Apple cancels projects all the time — with T288 potentially being one of them — it is possible (probable even) that other AR and VR projects live on at Apple in parallel to this one.
