Apple’s new ARKit 3 to Feature People Occlusion and Triple Face Tracking

Apple held its annual WWDC developer conference yesterday and revealed its latest plans for augmented reality (AR), more specifically ARKit. Originally released in 2017 alongside iOS 11, the framework is being updated for 2019 as ARKit 3, with major additions such as People Occlusion and Motion Capture.

If you’ve ever used a basic AR app or videogame like Pokemon GO, for example, you’ll have noticed how digital objects and characters are simply placed over the real world rather than fully integrated with it: you walk right through them instead of in front of or behind them. With People Occlusion, virtual content can pass realistically behind people in the camera view, making AR even more convincing and enabling green-screen-like effects almost anywhere.
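
For developers, enabling this should come down to a small change in how an AR session is configured. Here is a minimal sketch in Swift, assuming ARKit 3’s person-segmentation frame semantics and a pre-existing ARSCNView named `sceneView`:

```swift
import ARKit

// Minimal sketch, assuming ARKit 3's people occlusion support: opting a
// standard world-tracking session into person segmentation (with depth)
// so rendered content can appear behind people seen by the camera.
let configuration = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// `sceneView` is an assumed, pre-existing ARSCNView hosting the session.
sceneView.session.run(configuration)
```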

Apple also demonstrated ARKit 3’s computer vision-based Motion Capture technology. Using just a single camera, the system captures a person’s movement in real time and turns it into input for an AR experience. While the demo looked a little rough, with the digital character drifting around without moving its feet, the software was still able to accurately track joints such as bending elbows and knees.
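
In code, that capability is expected to surface as a body-tracking session whose anchors expose a joint skeleton. A hedged sketch in Swift, assuming ARKit 3’s ARBodyTrackingConfiguration and a pre-existing ARSession named `session`:

```swift
import ARKit

// Hedged sketch, assuming ARKit 3's body-tracking API: run a body-tracking
// session and read joint transforms from each detected body anchor.
class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Transform of the left hand joint, relative to the body's root joint.
            if let leftHand = bodyAnchor.skeleton.modelTransform(for: .leftHand) {
                print("Left hand position: \(leftHand.columns.3)")
            }
        }
    }
}

// `session` is an assumed, pre-existing ARSession using the delegate above.
session.run(ARBodyTrackingConfiguration())
```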

There were actually quite a few additions to ARKit 3 which Apple didn’t go into detail about during the keynote. These include Multiple Face Tracking, which can recognise up to three faces at once; Collaborative Sessions, which are ideal for developers of AR multiplayer experiences; the simultaneous use of the front and back cameras; and improved 3D-object detection.
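
Several of those additions map onto straightforward configuration changes. A short sketch in Swift, assuming the properties Apple has documented for ARKit 3:

```swift
import ARKit

// Multiple face tracking: request up to three simultaneously tracked faces
// (requires a device with a TrueDepth camera).
let faceConfiguration = ARFaceTrackingConfiguration()
faceConfiguration.maximumNumberOfTrackedFaces = 3

// Collaborative sessions: opt a world-tracking session into sharing mapping
// data with nearby peers (the networking transport is left to the app).
let worldConfiguration = ARWorldTrackingConfiguration()
worldConfiguration.isCollaborationEnabled = true
```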

Working in unison with ARKit 3 to make things easier for developers are RealityKit and Reality Composer. Designed to help developers who don’t have much experience with the 3D modelling required for AR, RealityKit is said to offer features such as photo-realistic rendering, camera effects, animations and physics.

“RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality,” Apple explains.
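
In practice that rendering is driven by a scene of entities displayed in an ARView. A minimal sketch in Swift, assuming RealityKit’s entity and anchoring APIs, that places a simple metallic sphere on a detected horizontal surface:

```swift
import ARKit
import RealityKit
import UIKit

// Minimal sketch, assuming RealityKit's ARView and entity APIs.
let arView = ARView(frame: .zero)

// A small sphere with a physically-based metallic material.
let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [SimpleMaterial(color: .gray, isMetallic: true)]
)

// Anchor the sphere to the first horizontal plane RealityKit detects.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(sphere)
arView.scene.addAnchor(anchor)
```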

Reality Composer, meanwhile, offers a library of assets, both stationary and animated, which developers can quickly drag and drop into RealityKit scenes. Reality Composer also works with Xcode, so developers can build, test, tune, and simulate AR experiences entirely on iPhone or iPad.
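
When a Reality Composer project is added to an Xcode target, Xcode generates Swift loaders named after the project and its scenes. A hedged example, where `Experience` and `loadBox()` are hypothetical generated names used purely for illustration:

```swift
import RealityKit

// Hypothetical generated loader for a Reality Composer scene; the real names
// depend on the project and scene titles chosen in Reality Composer.
let boxScene = try! Experience.loadBox()

// `arView` is an assumed, pre-existing RealityKit ARView.
arView.scene.addAnchor(boxScene)
```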

ARKit 3 is expected later this year; when further details arrive, VRFocus will let you know.