Minecraft Earth Beta Coming To iOS This Month, Android ‘Soon’ After

Minecraft Earth, an all-new mobile AR version of the global sensation sandbox building and survival game, is coming to iOS later this month and Android “soon thereafter” according to the official blog.

We first heard about Minecraft Earth about two months ago in mid-May, when developer Mojang released an announcement trailer, and then last month the studio debuted actual gameplay for the first time. Now, they’ve got a brand new beta announcement video that goes over more details regarding the game’s mechanics and gameplay:

Basically, you’re presented with an overworld map that looks strikingly similar to Pokemon Go’s, complete with a Minecraft-style block avatar that supports skins. The game uses the actual world map to create the environment: you walk around and tap on items like animals and blocks to collect them. Naturally, they’re called “tappables” in Minecraft Earth.

Collecting enough tappables levels you up, and once you have enough resources you can build things that are placed into the real world from your phone screen. It’s described as a “living, breathing” Minecraft world, with seamlessly integrated multiplayer in which other people can help “or hinder” your creations. You can then scale creations to life-size to explore and see them in the world around you.


I haven’t tried it yet, but honestly, it looks impressive.

The limited iOS beta for Minecraft Earth is due out within the next two weeks, which means sometime before July 26th. Only a “limited number of players in a few select cities” will be chosen before a wider release this summer.

You can sign up for a chance to be selected for the closed beta right here and check out the main website for more details as they become available.

h/t: Engadget


First Minecraft Earth Gameplay Revealed, Uses ARKit 3 Body Occlusion

At Apple’s WWDC 2019 this week Microsoft showed off gameplay of Minecraft Earth for the first time:

Minecraft Earth is a smartphone augmented reality game. It’s loosely based on Minecraft, but it’s definitely not the same game. You make your own creations with blocks, much like creative mode, then place them in the real world with AR.

Through Microsoft’s Azure Spatial Anchors cloud service, everyone else will see your creation in the same real-world position in AR.
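
To give a rough sense of how that works for developers, here’s a sketch of anchoring a creation to a cloud anchor. The ASA* type and method names follow Microsoft’s Azure Spatial Anchors iOS SDK as documented, but treat the exact signatures (and the placeholder credentials) as assumptions rather than verified API:

```swift
import ARKit

// Rough sketch only: ASA* names follow Microsoft's Azure Spatial Anchors
// iOS SDK documentation; exact signatures are assumptions.
func shareAnchor(_ localAnchor: ARAnchor, arSession: ARSession) {
    let cloudSession = ASACloudSpatialAnchorSession()
    cloudSession.session = arSession                        // feed ARKit tracking into ASA
    cloudSession.configuration.accountId = "<account-id>"   // placeholder credentials
    cloudSession.configuration.accountKey = "<account-key>"
    cloudSession.start()

    let cloudAnchor = ASACloudSpatialAnchor()
    cloudAnchor.localAnchor = localAnchor                   // wrap the local ARKit anchor
    cloudSession.createAnchor(cloudAnchor, withCompletion: { error in
        // On success, the cloud anchor's identifier can be shared so other
        // devices resolve the same anchor at the same real-world spot.
        if let error = error { print("Anchor upload failed: \(error)") }
    })
}
```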

ARKit 3

Fittingly for WWDC, Microsoft showed off how Minecraft Earth will be enhanced on iOS compared to Android thanks to ARKit 3, which introduces human occlusion. This allows you to walk in front of blocks, or even over blocks on the ground.

Microsoft describes this feature as “only on iOS”, but it will be interesting to see whether Google’s ARCore adds a similar feature between now and the game’s eventual release.
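
For developers, opting into this in ARKit 3 is a small configuration change on supported devices. A minimal sketch in Swift, assuming an existing ARSession:

```swift
import ARKit

// Minimal sketch: opt an ARKit 3 world-tracking session into people occlusion.
func runWithPeopleOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // People occlusion needs recent (A12-class) hardware on iOS 13,
    // so check support before enabling it.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}
```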

HoloLens?

This same cloud anchors technology was shown off for HoloLens 2 earlier this year. One demo even showed a HoloLens 2 user sharing the same AR session with another user on an iPad. While Microsoft hasn’t said anything about Minecraft Earth coming to HoloLens, the technology to make it possible seems to be there if they wanted to.

Minecraft VR

While Earth brings the franchise into AR, many fans have been wondering about the VR version of the original game. It was released on Rift and Gear VR back in 2016, but hasn’t been ported to Oculus Go or the recently released Oculus Quest.

Oculus CTO John Carmack has been trying to get the game onto Oculus’ standalone headsets for over a year now, but still hasn’t been successful.



Apple’s new ARKit 3 to Feature People Occlusion and Triple Face Tracking

Apple held its annual WWDC developer conference yesterday and revealed its latest plans for augmented reality (AR), more specifically ARKit. Originally released in 2017 alongside iOS 11, ARKit will see a major update for 2019 as Apple launches ARKit 3, with additions such as People Occlusion and Motion Capture.


If you’ve ever used basic AR apps and videogames like Pokemon GO, for example, you’ll have noticed how digital objects and characters are simply placed over the real world rather than fully integrating with it; you can walk right through them rather than behind or in front of them. With People Occlusion, AR can become far more realistic, enabling green-screen-like effects almost anywhere.

Thanks to computer vision, Apple was able to demonstrate its Motion Capture technology for ARKit 3. Using just one camera, it captures someone’s motion in real time, which can then be turned into input for an AR experience. While the demo did look a little rough, with the digital character drifting around without moving its feet, the software was still able to accurately track joints, such as the bending of elbows or knees.
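
As a rough idea of what this looks like for developers, the sketch below runs a body-tracking session and reads a joint transform from the body anchor ARKit produces (ARBodyTrackingConfiguration and ARBodyAnchor are the relevant ARKit 3 types):

```swift
import ARKit

// Minimal sketch of ARKit 3 motion capture: run a body-tracking session
// and read joint transforms from the ARBodyAnchor it produces.
class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Joint transforms are relative to the body's root (hip) joint.
            if let leftHand = bodyAnchor.skeleton.modelTransform(for: .leftHand) {
                print("Left hand position: \(leftHand.columns.3)")
            }
        }
    }
}
```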

There were actually quite a few additions to ARKit 3 which Apple didn’t go into detail about during the keynote. Some of these were Multiple Face Tracking, which can recognise up to three faces at once; Collaborative Sessions, which are ideal for AR multiplayer experiences; the simultaneous use of front and back cameras; and improved 3D-object detection.
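
Of those, Multiple Face Tracking is a small configuration change in code. A brief sketch:

```swift
import ARKit

// Sketch: configure ARKit 3 face tracking for as many faces as the device
// supports (up to three on ARKit 3 hardware).
let session = ARSession()
let configuration = ARFaceTrackingConfiguration()
configuration.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
session.run(configuration)
```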

Working in unison with ARKit 3 to make things easier for developers are RealityKit and Reality Composer. Designed to help developers who don’t have much experience with the 3D modelling required for AR, RealityKit is said to offer features such as photo-realistic rendering, camera effects, animations and physics.

“RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality,” Apple explains.

Reality Composer, meanwhile, offers a library of assets, either stationary or animated, for developers to quickly and simply drag and drop into RealityKit. Reality Composer can also be used with Xcode, so developers can build, test, tune, and simulate AR experiences entirely on iPhone or iPad.

ARKit 3 is expected later this year; when further details arrive, VRFocus will let you know.

Apple Announces ARKit 3 with Body Tracking & Human Occlusion

At the company’s annual WWDC developer conference today, Apple revealed ARKit 3, its latest set of developer tools for creating AR applications on iOS. ARKit 3 now offers real-time body tracking of people in the scene as well as occlusion, allowing AR objects to be convincingly placed in front of and behind those people. Apple also introduced Reality Composer and RealityKit to make it easier for developers to build augmented reality apps.

Today during the opening keynote of WWDC in San Jose, Apple revealed ARKit 3. First introduced in 2017, ARKit is a suite of tools for building AR applications on iOS.

From the beginning, ARKit has offered computer vision tracking which allows modern iOS devices to track their location in space, as well as detect flat planes like the ground or a flat table which could be used to place virtual objects into the scene. With ARKit 3, the system now supports motion capture and occlusion of people.

Human Occlusion & Body Tracking

Using computer vision, ARKit 3 understands the position of people in the scene. Knowing where the person is allows the system to correctly composite virtual objects with regard to real people in the scene, rendering those objects in front of or behind the person depending upon which is closer to the camera. In prior versions of ARKit, virtual objects would always show ‘on top’ of anyone in the scene, no matter how close they were to the camera. This would break the illusion of augmented reality by showing conflicting depth cues.

Similar tech is used for real-time body tracking in ARKit 3. By knowing where people are in the scene and how their body is moving, ARKit 3 tracks a virtual version of that person’s body which can in turn be used as input for the AR app. Body tracking could be used to translate a person’s movements into the animation of an avatar, or for interacting with objects in the scene, etc.

From the footage Apple showed of their body tracking tech, it looks pretty coarse at this stage. Even with minor camera movement the avatar’s feet don’t remain particularly still while the rest of the body is moving, and small leg motions aren’t well tracked. When waving, the avatar can be seen to tip forward in response to the motion even though the user doesn’t. In the demo footage, the user keeps their arms completely out to the sides and never moves them across their torso (which would present a more challenging motion capture scenario).

For now this could surely be useful for something simple like an app which lets kids puppeteer characters and record a story with AR avatars. But hopefully we’ll see it improve over time and become more accurate to enable more uses. It’s likely that this was a simple ‘hello world’ sort of demo using raw tracking information; a more complex avatar rig could smartly incorporate both motion input and physics to create a more realistic, procedurally generated animation.

SEE ALSO
Report: Apple Developing AR/VR Headset with 8K Resolution Per-eye Slated for 2020

Both human occlusion and body tracking will be important for the future of AR, especially with head-worn devices which will be ‘always on’ and need to constantly deal with occlusions to remain immersive throughout the day. This is an active area of R&D for many companies, and Apple is very likely deploying these features now to continue honing them before the expected debut of their upcoming AR headset.

Apple didn’t go into detail but listed a handful of other improvements in ARKit 3:

  • Simultaneous front and back camera
  • Motion capture
  • Faster reference image loading
  • Auto-detect image size
  • Visual coherence
  • More robust 3D object detection
  • People occlusion
  • Video recording in AR Quick Look
  • Apple Pay in AR Quick Look
  • Multiple-face tracking
  • Collaborative session
  • Audio support in AR Quick Look
  • Detect up to 100 images
  • HDR environment textures
  • Multiple-model support in AR Quick Look
  • AR Coaching UI

RealityKit


With ARKit 3, Apple also introduced RealityKit which is designed to make it easier for developers to build augmented reality apps on iOS.

Building AR apps requires a strong understanding of 3D app development, tools, and workflows—something that a big portion of iOS developers (who are usually building ‘flat’ apps) aren’t likely to have much experience with. This makes it less likely for developers to jump into something new like AR, and Apple is clearly trying to help smooth that transition.

From Apple’s description, RealityKit almost sounds like a miniature game engine, including “photo-realistic rendering, camera effects, animations, physics and more.” Rather than asking iOS developers to learn game engine tools like Unity or Unreal Engine, it seems that RealityKit will be an option that Apple hopes will be easier and more familiar to its developers.

SEE ALSO
Apple CEO on AR Headsets: 'We don't want to be first, we want to be the best'

With RealityKit, Apple is also promising top-notch rendering. While we doubt it’ll qualify as “photo-realistic,” the company is tuning rendering to allow virtual objects to blend as convincingly as possible into the real world through the camera of an iOS device, layering effects onto virtual objects as if they were really captured through the camera.

“RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality,” Apple writes.

RealityKit, which uses a Swift API, also supports the creation of shared AR experiences on iOS by offering a network solution out of the box.
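
To give a feel for that Swift API, here’s a minimal sketch: create an ARView, generate a simple box entity, and anchor it to the first horizontal plane ARKit detects.

```swift
import RealityKit
import UIKit

// Minimal RealityKit sketch: a 10 cm blue cube anchored to a horizontal plane.
let arView = ARView(frame: UIScreen.main.bounds)

let box = ModelEntity(
    mesh: .generateBox(size: 0.1),  // 10 cm cube
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)

let anchor = AnchorEntity(plane: .horizontal)  // attaches to a detected surface
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```

For the networking side, the scene’s synchronization can be backed by Multipeer Connectivity (RealityKit’s MultipeerConnectivityService), so entity state replicates across nearby devices without custom networking code.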

Reality Composer


Just like RealityKit, Reality Composer aims to make things easier for developers not experienced with game engines, workflows, and assets. Apple says that Reality Composer offers a library of existing 3D models and animations with drag and drop ease, allowing the creation of simple AR experiences that can integrate into apps using Xcode or be exported to AR Quick Look (which allows built-in iOS apps like Safari, Messages, Mail, and more, to quickly visualize 3D objects at scale using augmented reality).

In addition to the built-in object library, Reality Composer also allows importing 3D files in the USDZ format, and offers a spatial audio solution.


Apple ARKit To Get People Occlusion, Body Tracking, High Level ‘RealityKit’ Framework


At its annual WWDC conference today Apple announced big new updates to ARKit, including people occlusion and body tracking.

The company also announced a high-level AR framework called RealityKit and an easy-to-use AR creation tool called Reality Composer.

People Occlusion, Body Tracking

With previous versions of ARKit, and with Google’s ARCore, virtual objects always render on top: if someone walks in front of an object, it still draws as if the person were behind it. This looks wrong and instantly breaks the illusion that the virtual object is really in the environment.

ARKit 3 introduces real-time human occlusion, which means that if a person walks in front of a virtual object, the object will correctly appear behind them.

This understanding of human movement can also be used for body tracking, enabling use cases such as animating a virtual character in real time from human movement.

RealityKit & Reality Composer

Until now, most ARKit experiences have been developed using engines like Unity. For some app developers looking to add AR elements, Unity has a relatively steep learning curve and a plethora of irrelevant user interface panels and configuration to deal with.

RealityKit is a new high level framework from Apple made specifically for AR. It handles all aspects of rendering including materials, shadows, reflections, and even camera motion blur. It also handles networking for multiplayer AR apps, meaning developers won’t need to be network engineers to develop shared AR experiences.

The framework’s rendering engine takes full advantage of hardware acceleration in Apple chips and performant Apple APIs.

Apple is also launching a new macOS tool called Reality Composer. This tool lets developers visually create AR scenes. Developers can add animations like movement, scaling, and spinning. These animations can be set to be triggered when a user taps on or comes close to an AR object.

Reality Composer scenes can be integrated directly into iOS apps using RealityKit. Alternatively, developers can use it simply as a prototyping tool.
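
As a sketch of that integration, a composed scene exported as a .reality file can be loaded into a RealityKit view at runtime; the resource name here is hypothetical, and Xcode can also generate typed Swift accessors for .rcproject files:

```swift
import RealityKit

// Sketch: load a Reality Composer scene exported as a .reality file.
// "ProductScene" is a hypothetical resource name.
func loadComposedScene(into arView: ARView) throws {
    guard let url = Bundle.main.url(forResource: "ProductScene",
                                    withExtension: "reality") else { return }
    let sceneAnchor = try Entity.loadAnchor(contentsOf: url)
    arView.scene.addAnchor(sceneAnchor)
}
```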

Simultaneous Cameras, Multiple Faces

ARKit 3 also adds minor new features that enable new use cases. The front and back cameras can now be used simultaneously, for example, so a user’s facial expressions could drive the AR experience seen through the rear camera.
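
In ARKit 3 terms, this is the new userFaceTracking option on a world-tracking configuration; a brief sketch:

```swift
import ARKit

// Sketch: run rear-camera world tracking while the front TrueDepth camera
// tracks the user's face, so expressions can drive the rear-camera scene.
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTracking = true  // ARFaceAnchor updates arrive alongside world tracking
}
session.run(configuration)
```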

Additionally, the selfie camera can now track multiple people, which could open up interactive facial augmentation experiences, similar to multi-person Snapchat filters.



Novelis Augmented Reality Sales Tool

Augmented Reality Sales Tool

Novelis recently engaged Zugara to develop an Augmented Reality mobile app to act as an interactive sales tool for Novelis’ product team. The AR experience needed to help communicate the product features and attributes of Novelis’ new aluminum battery enclosure for electric vehicles. AR was utilized within the mobile app to enhance product demonstrations on multiple fronts:

  • An augmented reality product demonstration showed the battery enclosure in both an enclosed and expanded view. Different sub-part features could be selected and viewed in a real-world environment.
  • The AR presentation of the product could be scaled up for a full-size product demonstration in a factory, or reduced for display on a conference room table during a board room presentation (see the sketch after this list).
  • Different product features (including chemical compositions) were selectable and viewable in AR mode. In addition, we developed 3D data views that let engineers examine data in AR alongside specific sub-part information.
  • A product assembly presentation was also developed in Augmented Reality view to show, step by step, how the battery enclosure is assembled.
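
As an illustration of the scaling behaviour described above (this is our own sketch, not Zugara’s actual code), a RealityKit-style toggle between table-top and full-size views might look like this, with the entity and function names purely hypothetical:

```swift
import RealityKit

// Illustrative only, not Zugara's implementation: toggle a model between a
// table-top preview and full size, assuming it was authored at real-world scale.
func setFullSize(_ enclosure: Entity, fullSize: Bool) {
    let scale: Float = fullSize ? 1.0 : 0.1  // shrink ~10x for a conference table
    enclosure.setScale(SIMD3<Float>(repeating: scale), relativeTo: nil)
}
```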

“Novelis consistently seeks new, innovative ways to engage with our customers when it comes to marketing our products,” said Nick Dzierzak, Electric Vehicle Business Development Manager, Novelis Global Automotive Team. “Augmented Reality helps improve the interaction and engagement with our customers by offering our sales & technical teams a new way to present our products and material data and can be used anywhere and anytime in true to life detail and form. The Zugara Team was great to work with and helped explain how both Augmented Reality and Mixed Reality could be utilized both with our mobile application and within our organization.”

You can view a demonstration of the Novelis Augmented Reality Sales Tool in the embedded video below or on YouTube here. You can also view an Image Gallery at the end of this blog post.

The mobile AR experience was developed for smartphone and tablet devices with corresponding ARKit (Apple) and ARCore (Android) functionality.

You can view other AR apps Zugara has developed for our clients in our Augmented Reality Projects section. We’re also happy to help you build your next Augmented Reality experience, so please feel free to Contact Us if you have any questions about Augmented Reality, Mixed Reality or Zugara.


AR Shooter Reality Clash Hits UK and Western Europe

After several years of development, the Reality Gaming Group began a phased global rollout of its augmented reality (AR) combat experience Reality Clash in January. Australia, New Zealand and Denmark were the first countries to see the title arrive, and now that support has widened to the UK and Western Europe.


Set in an underground world of cryptocurrencies and hackers, Reality Clash features the geo-location technology that’s become common in mobile AR videogames – with the aim of getting players out of the house. Built using the latest ARKit and ARCore technology from Apple and Google, the title lets players compete in real-time first-person shooter (FPS) battles with their friends or against bots.

Players have to navigate a 3D map to defend key parts of their own real-world town, city or village; these fights can be team-based, with as many as 200 people taking part. When not in combat they’ll need to mine for resources, build new weapons or customise the ones they already have.

“We’re thrilled to make Reality Clash available in the UK and Western Europe,” said Reality Gaming Group’s Co-Founder Tony Pearce in a statement. “Reality Clash offers gamers a whole new way to experience the FPS genre, with exciting geo-location and mining features and an AR interface that doesn’t require clunky add-ons or equipment. Let battle commence!”


While the core gameplay revolves around local battles between players, Reality Clash also features a Quick Battle mode if nobody can be found nearby. Players can instantly challenge anyone in different parts of the world and they’ll both be beamed into the same virtual arena to duel it out.

Apart from the previously mentioned territories, Reality Clash is also available in the Philippines, Vietnam, Russia and Brazil, with the US to follow (the studio hasn’t confirmed when). The combat title is free to download on AR-compatible Android and iOS devices.

In other AR news, those on Apple devices can now play Angry Birds AR: Isle of Pigs, which Rovio Entertainment and Resolution Games announced yesterday and which is also free of charge.

VRFocus will continue to report on the latest Reality Clash announcements as the title expands to more locations worldwide.

Report: Apple to Announce ARKit Updates at WWDC 2019 Including OS Support for AR Headsets

Apple has been continuously iterating on ARKit, its augmented reality development tool that lets creators make smartphone-based AR experiences. The company unveiled ARKit at its Worldwide Developers Conference (WWDC) in 2017, and the 2.0 version at the dev conference a year later. Now, a report from 9to5Mac holds that this year’s WWDC could see yet more new additions, including OS support for stereo AR headsets.

Citing sources familiar with the development of Apple’s new operating systems, the report maintains that ARKit will get a new Swift-only framework for AR and a companion app that lets developers create AR experiences visually. ARKit will also reportedly get the ability to detect human poses.

One of the biggest claims to come from the report is the supposed announcement surrounding OS support for controllers with touchpads as well as “stereo AR headsets.”

As with all unconfirmed reports, we’re taking this with a big grain of salt. However, it’s hardly conceivable that Apple would open its software ecosystem to third-party devices, so the claim raises the question of whether we’re close to a bona fide Apple AR headset tease.

SEE ALSO
Report: Apple Nearly Acquired Leap Motion but the Deal Fell Through

In any case, there have been several reports of an Apple AR headset in the making. Ming-Chi Kuo, whom Business Insider called “the most accurate Apple analyst in the world,” offered up his prediction for the fabled device last month, stating that Apple will likely begin production of its AR headset sometime between Q4 of 2019 and Q2 of 2020. Furthermore, it’s been reported that the upcoming Apple headset could rely on the iPhone for computing, rendering, internet connectivity and location services.

This stands in stark contrast to one of the earliest reports we’ve seen, from Bloomberg in late 2017, which posited that an Apple AR headset would be a dedicated, standalone device, also slated for a 2020 release.

Whatever the case, we’ll have our eyes peeled from June 3rd to 7th when the hardcore Apple dev community descends upon San Jose, California for this year’s WWDC.


Dent Reality Aims to Bring AR Indoor Directions to Malls, Airports & Retail Stores

Navigating indoors is still a pretty old school experience: you look for a map of shops, bathrooms, and accessibility ramps and follow signs to your desired destination, all the while knowing that the super powered computer in your pocket has lost a core functionality without GPS signal. Dent Reality, a UK-based studio, is creating its own augmented reality-based SDK for iOS to remedy this.

As first reported by 9to5Mac, Dent Reality says its SDK can integrate the map of an indoor space, figure out where the user is, and use virtual paths and arrows to help them find their destination.

Dent Reality isn’t tackling the monumental task of creating a single ‘everywhere’ app though—that’s something that will likely come from platform holders in a first-party solution at some point. As it is, the company provides its developer SDK and services to help places like malls, airports, and retail stores integrate these features into their own apps. The results do look promising though.

Since the SDK integrates with Apple’s ARKit, the company’s solution doesn’t require Apple’s iBeacons or other hardware, just a user’s phone to visually locate itself and display turn-by-turn directions.
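
Dent Reality hasn’t published its internals, but conceptually the approach maps onto standard ARKit building blocks: once the phone has localised itself against the indoor map, a route can be expressed as world-space anchors that the renderer decorates with arrow models. A hypothetical sketch, not the company’s actual SDK:

```swift
import ARKit
import simd

// Illustrative only — not Dent Reality's actual SDK. Lay turn-by-turn arrows
// along a precomputed indoor route by dropping an ARAnchor at each waypoint;
// the renderer can then attach an arrow model to every anchor.
func placeRouteAnchors(along waypoints: [simd_float3], in session: ARSession) {
    for (index, point) in waypoints.enumerated() {
        var transform = matrix_identity_float4x4
        transform.columns.3 = simd_float4(point, 1)  // position in world space
        let anchor = ARAnchor(name: "route-arrow-\(index)", transform: transform)
        session.add(anchor: anchor)
    }
}
```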

Does all of this look more than a bit familiar? There’s a good reason why. You might have seen a prototype back in 2017, shared by iOS developer Andrew Hart. That’s no mere coincidence; Hart is in fact the creator behind Dent Reality.

That said, Dent Reality isn’t the only company exploring AR-based indoor location services.

Google announced its own first-party system dubbed VPS, or visual positioning system, back at the company’s 2017 I/O dev conference. The company has yet to release VPS for Google Maps though, leaving the limelight to other creators for now.

SEE ALSO
HoloLens 2 Hits the Ground as the U.S. Army's Own AR "combat multiplier"

Scape Technologies, which recently raised an $8 million funding round, is also building a VPS for smartphones. Much like Google’s VPS, it’s said to marry GPS and AR visual data for more accurate and information-rich navigation.

Computer vision company Blippar has released a similar app for iOS called AR City, a free app that you can try now, letting you explore and navigate more than 300 cities worldwide using AR and GPS.

While the technology is poised to change how we navigate indoor spaces, and what information we learn about them along the way, it’s also a prescient look at how immersive head-mounted AR systems could serve up directions in the near future. Whatever the case, the prospect of finding the most direct route to your airport gate, or the nearest wheelchair-accessible ramp is sure to be a welcome change for many.


Make ARKit and ARCore Development Easier With Unity AR Foundation

When Apple and then Google launched their augmented reality (AR) development software, ARKit and ARCore respectively, they provided a perfect avenue for creators to build immersive AR apps and videogames for mobile devices. But just as the Khronos Group is trying to solve fragmentation with OpenXR, having both ARKit and ARCore means more work for developers trying to support both. During the recent Unity keynote at the Game Developers Conference (GDC) 2019, the videogame engine company presented its solution: AR Foundation.


While devices like Magic Leap and HoloLens might be pushing the upper reaches of AR technology, some of the most interesting work is being done at a consumer level on mobile devices. Yet creators generally have had to choose whether to focus on ARKit or ARCore. That’s why Unity created the AR Foundation framework specifically for AR content developers, allowing them to build an AR app once and then deploy it to both ARKit and ARCore.

AR Foundation also includes features to overcome common problems such as anchoring digital objects in the real world and the visual fidelity of digital objects. One of the options Unity focused on was AR Remote: “it significantly reduces iteration time by streaming sensor data from an AR-enabled mobile device directly into the Unity editor,” explains a blog posting. “This allows developers to see changes on their target device in real time without having to publish to the device.”

As AR Foundation is part of Unity, veterans of the software will feel right at home using its workflows and features to create AR content. They can even take assets built for non-AR titles and use them in their new AR projects.


Unity is one of the most popular engines for virtual reality (VR) and AR development, supporting the industry from an early stage. Unity CEO John Riccitiello has previously claimed that around two-thirds of all VR and AR apps on the market were built using Unity.

As Unity continues to expand and introduce more features for VR and AR development, VRFocus will keep you updated.