Drone Hero AR: Using ARCore

Hey there! Before we dive into ARCore, let me quickly introduce myself: My name is Jonas Johansson. I love tech and games, and especially the marriage of the two. The game studio Neuston, which I run, recently released the VR game Drone Hero, where you fly a drone through a set of difficult challenges.

Before Neuston, I was the lead programmer on Angry Birds 2 (Rovio), and before that I worked on the Crysis series (Crytek) and Just Cause 2 (Avalanche Studios).

Ideation

When Google announced ARCore, I downloaded the SDK along with the required beta version of Unity to give it a go.

Since there were already some interesting drone flying mechanics in Drone Hero, I figured it’d be cool to have the drone fly around in my living room. I copied the drone-related assets from Drone Hero over to a new project, and started out by simply spawning the drone into the world to get a sense of scale.

I deployed to a Google Pixel, and boom, there it was – a drone in my living room!

Lighting

The drone mesh didn’t blend in that well with the surroundings, mainly because of a mismatch in lighting between the real scene and the virtual scene. The drone seemed to be “pasted” on top of a camera image.

Luckily, the ARCore SDK provides a light estimation that can be used for tinting the rendered objects to better match the real scene.

In fact, the Unity version of the SDK contains a handy shader called “ARCore/DiffuseWithLightEstimateion” (yes, the typo is in the SDK). It’s a surface shader based on the Lambert lighting model, with the addition that it also adjusts the final color with the estimated light (a scalar value).
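The SDK’s sample code feeds that estimate into the shader through a global shader property once per frame. Here’s a minimal sketch of that wiring, assuming the preview SDK’s names (Frame.LightEstimate and the _GlobalLightEstimation uniform; later versions may name these differently):

    using UnityEngine;
    using GoogleARCore; // ARCore Unity SDK (preview) namespace

    public class LightEstimationFeeder : MonoBehaviour
    {
        void Update()
        {
            // Skip frames where ARCore has no valid estimate yet.
            if (Frame.LightEstimate.State != LightEstimateState.Valid)
                return;

            // The DiffuseWithLightEstimation shader reads this global
            // scalar and scales its final color by it.
            Shader.SetGlobalFloat("_GlobalLightEstimation",
                                  Frame.LightEstimate.PixelIntensity);
        }
    }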

Now, if I change the light in reality (e.g. flip a light switch), it affects the rendered drone. Aaaaw yeah, the drone blends into the scene much better now!

Collision

I wanted the drone to collide with the ground. That’s obviously cool.

The ARCore SDK can be polled for “trackable planes”. A trackable plane is essentially an identified flat surface such as the ground or a tabletop. ARCore keeps tracking the plane and reports any changes to it.

I then ask the SDK for a list of points in each trackable plane’s “boundary polygon” (retrieved in clockwise order). To create a mesh from these points, we need to produce triangles. There’s likely a very easy way to do this if you know the topology of the boundary polygon, but I create triangle indices by triangulating the points.
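For reference, here’s a minimal sketch of that triangulation. ARCore reports a convex boundary polygon, so a simple triangle fan from the first vertex is enough; the helper class and method names here are my own:

    using System.Collections.Generic;
    using UnityEngine;

    public static class PlaneMeshBuilder
    {
        // Builds a mesh from a plane's boundary polygon. Assumes the
        // polygon is convex, so a triangle fan from vertex 0 covers it.
        public static Mesh Build(List<Vector3> boundary)
        {
            var indices = new List<int>();
            for (int i = 1; i < boundary.Count - 1; i++)
            {
                // Fan triangle (0, i, i + 1). The boundary comes in
                // clockwise order, so every triangle keeps that winding.
                indices.Add(0);
                indices.Add(i);
                indices.Add(i + 1);
            }

            var mesh = new Mesh();
            mesh.SetVertices(boundary);
            mesh.SetTriangles(indices, 0);
            mesh.RecalculateNormals();
            mesh.RecalculateBounds();
            return mesh;
        }
    }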

Anyway, with the mesh ready, all we need to do is set up a GameObject and add a MeshCollider component that references it.
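Roughly like this, using the helper above (the plane polling calls — Frame.GetNewPlanes and GetBoundaryPolygon — are from the preview SDK and may differ in your version):

    // Inside some per-frame update. Preview-SDK API names, so treat
    // these calls as a sketch rather than gospel.
    var newPlanes = new List<TrackedPlane>();
    Frame.GetNewPlanes(ref newPlanes);

    foreach (var plane in newPlanes)
    {
        var boundary = new List<Vector3>();
        plane.GetBoundaryPolygon(ref boundary);

        var planeObject = new GameObject("TrackedPlane");
        var meshCollider = planeObject.AddComponent<MeshCollider>();
        meshCollider.sharedMesh = PlaneMeshBuilder.Build(boundary);
    }

Since ARCore keeps updating each plane, in practice you’d rebuild the mesh whenever the plane changes, and remove the GameObject when its plane is lost.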

Sweeeeet, the drone collides with the floor now!

Shadow

The drone still seemed a bit too detached from the environment. It was hard to get a sense of scale and how far away from the floor it was. I figured we needed to glue the drone to the scene better by casting shadows onto the floor.

The only problem is that there’s no floor in the virtual scene that can receive shadows from the drone. Hmm. Basically, we need to render an invisible floor that can receive shadows.

From what I know, there’s no suitable shader in Unity out of the box, so I decided to create a new one. I’ll get back to that shader in a bit, but first: we need a light source that casts shadows. I added a white directional light to the virtual scene, pointing downward. It’s not a perfect representation of the real scene lighting, of course, but most light comes from above, so it’ll do for now.

In code, I add a MeshRenderer and a MeshFilter for all trackable planes, using the same mesh we created for the collision previously. The MeshRenderer uses a material with the new shader.
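Continuing the per-plane sketch from the collision section (same planeObject and mesh as before; arSurfaceMaterial is a hypothetical reference to a material using the new shader):

    var meshFilter = planeObject.AddComponent<MeshFilter>();
    meshFilter.sharedMesh = meshCollider.sharedMesh;

    var meshRenderer = planeObject.AddComponent<MeshRenderer>();
    meshRenderer.material = arSurfaceMaterial; // uses the ARSurface shader
    meshRenderer.receiveShadows = true;
    // The invisible floor should only receive shadows, never cast them.
    meshRenderer.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.Off;

With that plumbing in place, let’s go through the shader.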

The new “ARSurface” shader I created has one pass (a full sketch follows the breakdown below):

  • Desired result:
      • Final pixel color = real world pixel color (from camera) * virtual light at this pixel (lit/shadowed).
  • Vertex shader:
      • Just pass through data.
  • Fragment shader:
      • Calculate the amount of virtual light hitting this fragment.
      • White when not in shadow.
      • Black when in shadow.
  • Blending:
      • Syntax in Unity is “Blend SrcFactor DstFactor”.
      • The generated color is multiplied by SrcFactor.
      • The color already on screen is multiplied by DstFactor.
      • The two are added together.
      • To accomplish our goal we can:
          • Multiply source (virtual floor surface color) with Zero.
          • Multiply destination (real world color) with SrcColor (the color of our virtual plane – white/black depending on lit/shadowed).
          • The two are added together. Since the first one is Zero, only the second part contributes.
      • The blending is therefore set up as “Blend Zero SrcColor”.
      • The best way to wrap your head around this is to turn blending off and create a plane in the scene. Inspect it, and imagine that the color of the plane will be multiplied with the background.
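Putting that together, here’s a minimal sketch of such a shader. It’s my reconstruction using Unity’s built-in shadow macros, not the exact ARSurface source (that’s in the repo linked at the end):

    Shader "Sketch/ShadowReceivingSurface"
    {
        SubShader
        {
            Tags { "RenderType" = "Opaque" }

            Pass
            {
                Tags { "LightMode" = "ForwardBase" }

                // Multiply whatever is already on screen (the camera feed)
                // by our output: white leaves it untouched, darker values
                // darken it where the drone's shadow falls.
                Blend Zero SrcColor

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma multi_compile_fwdbase
                #include "UnityCG.cginc"
                #include "AutoLight.cginc"

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    SHADOW_COORDS(0)
                };

                v2f vert(appdata_base v)
                {
                    // Just pass through data (plus the shadow coordinates).
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    TRANSFER_SHADOW(o);
                    return o;
                }

                fixed4 frag(v2f i) : SV_Target
                {
                    // 1 when fully lit, less when in shadow.
                    fixed atten = SHADOW_ATTENUATION(i);
                    return fixed4(atten, atten, atten, 1);
                }
                ENDCG
            }
        }

        // The fallback supplies a ShadowCaster pass so the plane shows up
        // in the depth texture Unity uses for directional shadows.
        Fallback "VertexLit"
    }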

Note that the shadow strength can be lowered to make the shadow a bit softer; the math still holds up.

Motion Blur

As you can see in the first image, the real world gets blurry when moving the camera around. The drone, however, remains unnaturally sharp. To make it blend in better, we add motion blur to the camera. I used the motion blur effect from Unity’s post-processing stack.

You can find that here: https://www.assetstore.unity3d.com/en/#!/content/83912

I set the sample count to 4 to minimize the performance impact. That’s enough for continuous motion like this.
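Normally you’d just set this in the profile asset’s Inspector; for completeness, here’s a sketch of doing it from code, assuming the v1 post-processing stack from the link above:

    using UnityEngine;
    using UnityEngine.PostProcessing; // Post Processing Stack v1

    public class MotionBlurSetup : MonoBehaviour
    {
        public PostProcessingProfile profile; // assign in the Inspector

        void Start()
        {
            // Settings is a struct, so read-modify-write the whole thing.
            var settings = profile.motionBlur.settings;
            settings.sampleCount = 4; // low sample count to keep GPU cost down
            profile.motionBlur.settings = settings;
            profile.motionBlur.enabled = true;
        }
    }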

It glues much better now.

Final Thoughts

Video showing both indoor and outdoor footage:

Source code for collision and shadow:
https://github.com/jonas-johansson/ARCoreUtils

It’s easy to get up and running with ARCore, and with a few additions it looks pretty nice. A good way to explore ARCore is to look at the “HelloAR” scene bundled with the SDK, and specifically at “HelloARController.cs”.

Happy hacking!