ManoMotion Unveils Hand Gesture Control for Apple’s ARKit

Back in June ManoMotion released a software development kit (SDK) that lets developers add hand gestures to any virtual reality (VR), augmented reality (AR) or mixed reality (MR) application. One of the biggest AR launches this year was Apple’s ARKit, and now the computer vision specialist has added ARKit support to its SDK.

ManoMotion’s gesture technology uses a standard 2D camera to recognise and track many of the 27 degrees of freedom (DOF) of motion in a hand, all in real time. ARKit developers will now be able to bring users’ hands into their projects, letting them pick up AR objects rather than just tapping on a screen.


The current version features a set of predefined gestures, such as point, push, pinch, swipe and grab, offering a range of interactive possibilities depending on what developers want to achieve, or want to allow their users to do.

“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in depth with augmented objects in 3D space,” said Daniel Carlman, co-founder and CEO of ManoMotion in a statement. “Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality!”

ManoMotion’s SDK will initially be made available for Unity iOS, followed by Native iOS in subsequent updates. Developers interested in using ManoMotion’s SDK with ARKit should visit: https://www.manomotion.com/get-started/.

In addition to ARKit, Google’s recently announced ARCore will also see ManoMotion integration, with a release expected in the near future.

VRFocus will continue its coverage of ManoMotion, reporting back with the latest updates.

ManoMotion Brings Hand Gesture Input to Apple’s ARKit

ManoMotion, a computer-vision and machine learning company, today announced they’ve integrated their smartphone-based gesture control with Apple’s augmented reality developer tool ARKit, making it possible to bring basic hand-tracking into AR using only the smartphone’s onboard processor and camera.

With Google and Apple gearing up for the augmented reality revolution with their respective software development kits, ARCore and ARKit, developers are fervently looking to see just how far smartphone-based AR can really go. We’ve seen plenty of new use cases for both, including inside-out positional tracking for mobile VR headsets and some pretty mind-blowing experiments, but this is the first time we’ve seen hand-tracking integrated into either AR platform.

image courtesy ManoMotion

VentureBeat got an early look at the company’s gesture input capabilities before ARKit support was added, with ManoMotion CEO Daniel Carlman telling them it tracked “many of the 27 degrees of freedom (DOF) of motion in a hand.” Just like the previous build, the new ARKit-integrated SDK can track depth and recognize familiar gestures like swipes, clicks, taps, grabs, and releases, all with what ManoMotion calls “an extremely small footprint on CPUs, memory, and battery consumption.”

In ManoMotion’s video, we can see the ARKit-driven app recognize the user’s hand and respond to a flicking motion, which sends a ping-pong ball into a cup, replete with all of the spatial mapping abilities of ARKit.

A simple game like beer pong may seem like a fairly banal use case, but being able to interact with the digital realm with your own two hands (or in this case, one hand) has much larger implications outside of games. AR devices like HoloLens and the Meta 2 rely upon gesture control to make their UI fully interactive, which opens up a world of possibilities, including productivity-related tasks like placing and resizing windows, or simply turning on Internet-connected lights in your house with the snap of a finger. While neither Google nor Apple has released word on future AR headsets, it’s these early experimental steps on today’s mobile platforms, which by necessity lack access to expensive custom parts, that will define the capabilities of AR headsets in the near future.

“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in-depth with augmented objects in 3D space,” said Carlman. “Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality!”

ManoMotion says ARKit integration will be made available in the upcoming SDK build, which will be available for download “in the coming weeks” on the company’s website. The integration will initially be made available for Unity iOS, followed by Native iOS in subsequent updates.


ARKit: ARZombi Brings the Zombie Apocalypse Into the Real World

The combination of zombies and VR keeps proving to be a winning formula. The zombie shooter Arizona Sunshine is one of the best-selling VR titles of all, and Resident Evil 7 also thrilled gamers and generated plenty of revenue. Now Crisly Manor has used ARKit to develop the zombie shooter ARZombi, which brings zombies into the real world via augmented reality. In various game modes you can take up the fight in your own home or the surrounding area.

ARZombi – The Zombie Apocalypse Up Close

The zombie shooter ARZombi by Crisly Manor was built with ARKit for iOS. Both the game and its concept are strongly reminiscent of the soon-to-be-released The Walking Dead: Our World, as both games use AR to make zombies appear in your familiar surroundings. Thanks to room-mapping technology the effect is immersive, with the zombies showing up in realistic form within your own four walls. The aim is to survive as long as possible before the army of the dead surrounds you.


Both a single-player campaign and a multiplayer mode are planned as part of the game. Within each mode you have various options at your disposal: you can, for example, board up your doors and windows to keep the living dead from getting in. Naturally, a range of weapons is also available to thin out the countless undead hordes. Hidden achievements and other goodies await as a reward for a long fight for survival.


In story mode you make decisions that affect your character. Each of these decisions influences not only the current game but also subsequent sessions, defining what kind of person you would be in a zombie apocalypse.

ARZombi is due to release for iOS in autumn 2017. The price has not yet been set, and it is equally unclear whether an Android version will follow. The fight for survival against the zombie hordes will be possible on all ARKit-compatible iPhones and iPads.

(Sources: UploadVR | ARZombi | Video: ARZombi YouTube)


SpatialStories: Create AR and VR Content With No Prior Experience

Apelab, in collaboration with Technicolor, has developed SpatialStories, a tool for VR enthusiasts who want to create content for VR without any coding knowledge. It enables both beginners and advanced users to create and customise AR and VR apps.

Your Own AR and VR Apps With SpatialStories

SpatialStories was created by the Swiss-American company Apelab in collaboration with Technicolor. The tool is compatible with Oculus Rift, HTC Vive, Samsung Gear VR and HoloLens, and it should soon work with Apple’s ARKit and Google’s ARCore as well.

SpatialStories makes it easy to create and edit AR and VR projects. Using a Unity add-on, ambitious developers and interested VR enthusiasts alike should be able to build interactive experiences.

The developers have made the individual steps accessible and intuitive. A user can, for example, select an object or a character and assign predetermined specifications to it. All objects in use are interconnected, allowing designers to run through sequences or quickly test interactions. Apelab’s next video game project, Break a Leg, was itself created with the toolkit within just a few months.


Michael Martin, CTO and co-founder of Apelab, said: “Our goal was to develop a flexible and intuitive tool for people without coding knowledge that is quick and easy to learn. Whenever we needed new features for our video game project Break a Leg, we designed them inside the tool. Finally, people who had nothing to do with the development tested our toolkit to see whether the process really is simple and hassle-free. That turned out to be entirely the case.”

The platform is also useful for developers: programmers can embed their own code in it, which designers can then use in their work. Further features are planned for the future, such as AI, voice recognition and much more.

In early 2018 Apelab also plans to release a standalone piece of software called SpatialStories Studio, which will let designers create interactive content directly within the AR or VR environment.

(Sources: VRFocus | SpatialStories)


ARZombi Is A Zombie Shooter Made Using ARKit


What’s popular in VR right now? That’s right: zombies and guns. Naturally it follows that as developers turn to AR games, they’ll bring over those zombies and guns and let you see them immersed in your actual world. That’s the idea behind ARZombi, an upcoming proof-of-concept zombie game for iOS made using ARKit. In some ways it also reminds us of The Walking Dead: Our World, the upcoming Pokemon Go-style AR game.

Thanks to some room-mapping techniques similar to what we’ve seen from this Minecraft AR prototype, and even the pre-ARKit/ARCore mobile game Night Terrors, the zombies will actually appear as if they were really in your room or surrounding you outside.

“ARZombi allows you to live out an 80’s inspired zombie apocalypse in your very own living room,” writes Sean Evans, a developer on the project, in an email to UploadVR. “The game will map your doors and windows, board them up and your job is to protect your home from becoming a zombie all you can eat buffet! Grab the closest weapons to you, keep your boards intact and save yourself and those around you.”

You can see some footage of the game in the video above and find out more information on the official website. The game is slated to initially release this fall, with a Story Mode update and multiplayer coming at a later time.

Let us know what you think of ARZombi so far down in the comments below!


Create VR and AR with No Experience in SpatialStories

apelab, a Swiss-American company, partnered with Technicolor to deliver SpatialStories earlier this year. A tool for building virtual reality (VR) and augmented reality (AR) applications without coding, SpatialStories currently works with HTC Vive, Oculus Rift, Samsung Gear VR and HoloLens, with support for Apple ARKit and Google ARCore coming soon.

SpatialStories is designed as an easy editing tool for creating native VR and AR projects. With the Unity plugin, SpatialStories allows teams to create interactive experiences in a simple and accessible manner. The design and prototyping process is dramatically shortened, as can be seen in Break a Leg, apelab’s next videogame release, which was developed in only a couple of months using the toolkit.

SpatialStories allows users to select an object or a character and assign to it predetermined specifications. Everything is interdependent, enabling designers to craft a flow of actions and to quickly test interactions.

“We wanted a flexible and intuitive tool people without coding knowledge could master within a few days,” says Michaël Martin, CTO and co-founder of apelab. “It is really how we developed it. Whenever we needed a new function in Break a Leg, our game project, we created the appropriate tool. Everything was tested by non-developers, so we really worked hard to make the process as simple and smooth as possible for them. And we are improving it everyday!”

SpatialStories also allows for a gap to be bridged between team members without coding knowledge and experienced developers. The interactive interface helps creatives focus on the creative work while programmers can build their own code embedded into SpatialStories to build custom functionalities for the creative team to use.

apelab plan on adding new features to SpatialStories in the coming months. As stated above, support for ARKit and ARCore is on the way, and AI, voice recognition and machine learning are also being researched for incorporation.

“We’re building a future where computers seamlessly fit in our lives, designing digital worlds that fit seamlessly over the physical world. SpatialStories is an important tool that will let today’s storytellers build the stories, movies and games of the future,” said Timoni West, Head of Authoring Tools at Unity Labs, who recently joined apelab’s advisory board.

Early next year, apelab will release SpatialStories Studio, a standalone software through which designers and creatives will conceive and build interactive content directly within a VR or AR environment. VRFocus will continue to keep you updated with all the latest details on SpatialStories.

How good is Apple’s ARKit? A technical explanation

Matt Miesnieks from Super Ventures provided a thorough technical explanation of Apple’s recently released ARKit.

I’ve been working in AR for 9 years now, and have built technology identical to ARKit in the past (sadly before the hardware could support it well enough). I’ve got an insider’s view on how these systems are built and why they are built the way they are.

This blog post is an attempt to explain the technology for people who are a bit technical, but not Computer Vision engineers. I know some simplifications that I’ve made aren’t 100% scientifically perfect, but I hope that it helps people understand at least one level deeper than they may have already.

Continue reading the full article here: https://medium.com/super-ventures-blog/why-is-arkit-better-than-the-alternatives-af8871889d6a

Neon: Find Friends With an ARKit App for iPhone

The first teasers for Neon are promising: the social app, built on Apple’s ARKit, points the way and, on request, overlays a neon sign marking the location you are looking for. That can be handy at festivals, for example, or in any large crowd when you are trying to find a friend. A party can also be easier to find with the app, advertised with big neon signs.

Neon: Making Your Mark in Augmented Reality With ARKit

The developer train for AR apps built on Apple’s ARKit is steaming ahead at full speed. When iOS 11 arrives this autumn, plenty of solutions should be available right from the start, from playful gimmicks to genuinely practical augmented reality applications. Neon could turn into one of the practical kind, and it makes a good first impression in its videos. The app for iPhone and iPad makes it easier to find places and people by overlaying a large neon sign onto the “reality” shown on the screen. Neon can, for example, mark a friend’s whereabouts in a crowd. The app also shows how far away the target is, and comments can apparently be sent through the app as well.

In a second example video, a user types in “House party here tonight” and likewise gets a glowing neon sign signalling: this is where the action is tonight. Owners of the app can invite other Neon users this way and promote their party. Whether that is really such a good idea remains to be seen; the idea of marking friends at festivals strikes us as far more sensible. Following the unveiling of ARCore for Android smartphones, a version for Google’s platform seems very likely. Anyone who wants to be notified as soon as the app launches can register on the developer’s website, which can also be found without AR assistance via this link.

(Source: UploadVR)


Google Releases ARCore for Android, The Company’s Answer to Apple ARKit

In an answer to Apple’s recently released ARKit, a developer tool used for making augmented reality apps and games that run on newer iPads and iPhones, Google today released a preview of a new Android-compatible software development kit (SDK) called ARCore.

Google prides itself on its stewardship of Android, the largest mobile platform in the world with over 80 percent of the mobile market share in 2016, according to research and advisory firm Gartner. Drawing from their work developing Tango, the company’s older AR platform that works with only two publicly available devices (the Lenovo Phab 2 Pro and Asus Zenfone), Google is opening up their new ARCore SDK to run on “millions of devices,” initially supporting both the Pixel line and the Samsung Galaxy S8 line, running 7.0 Nougat and above.

Google is currently working with Samsung, Huawei, LG, ASUS and unnamed others, saying they’ll be targeting “100 million devices at the end of the preview.” The company hasn’t said specifically when the preview period will end, however.

The company outlines ARCore’s three main abilities (a brief code sketch follows the list):

  • Motion tracking: Using the phone’s camera to observe feature points in the room and IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves. Virtual objects remain accurately placed.
  • Environmental understanding: It’s common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
  • Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
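
To give a sense of how the last two of those capabilities surface to developers, here is a minimal, hedged sketch against ARCore’s Java API (the Session, Frame, Plane and LightEstimate classes; method names follow later public SDK releases and may differ slightly from the early preview):

```java
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;
import com.google.ar.core.Plane;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;

public class ArCoreFrameProbe {
    // Called once per rendered frame, after the Session has been configured and resumed.
    public static void inspectFrame(Session session, Frame frame) {
        // Environmental understanding: list the horizontal surfaces ARCore has found so far.
        for (Plane plane : session.getAllTrackables(Plane.class)) {
            if (plane.getTrackingState() == TrackingState.TRACKING) {
                System.out.printf("Detected plane, roughly %.2fm x %.2fm%n",
                        plane.getExtentX(), plane.getExtentZ());
            }
        }

        // Light estimation: a single ambient intensity value that apps can feed into their shaders.
        LightEstimate light = frame.getLightEstimate();
        if (light.getState() == LightEstimate.State.VALID) {
            System.out.printf("Ambient pixel intensity: %.2f%n", light.getPixelIntensity());
        }
    }
}
```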

In addition to supporting projects created in Java/OpenGL, Unity and Unreal Engine, Google is also releasing prototype AR web browsers, which allow developers to create AR-enhanced websites that can run on both Android’s ARCore and Apple’s iOS/ARKit.

Google calls ARCore their “next step in bringing AR to everyone,” but says more information will be coming out later this year.

ARCore, combined with what Google has learned from building Tango and with the recently teased Visual Positioning Service (VPS), the inside-out tracking system that hooks into Google Maps, could mean some wild things for current and future Android devices. We’ll be keeping our eyes out for all things Google, so check back for more updates on what could become one of the world’s largest AR platforms in short order.

In the meantime, check out some of these cool projects created with ARCore.


ARCore Is Google’s Answer To Apple’s ARKit, Now Available For Pixel And Galaxy S8


Today, Google is announcing the debut of ARCore, an initiative to bring mobile-powered AR experiences to the masses like never before. Previously Google’s Tango was the best way to see powerful AR projects in action, but that required extra cameras and sensors on high-end smartphones to work. Now, ARCore is aiming to democratize augmented reality for the Android ecosystem by offering a software-only solution.

“[Today] we’ll be announcing a preview of something we call ‘ARCore,'” said Clay Bavor, VP of Augmented and Virtual Reality at Google, during an interview with UploadVR. “It’s an SDK for Android developers to build AR experiences for Android phones — a software only solution for doing stuff. So basically we’re bringing much of the goodness of Tango onto a very broad range of AR devices.”

Starting right now, Google is making the ARCore SDK available to owners of the Google Pixel (running Oreo) and Samsung’s Galaxy S8 (running at least 7.0 Nougat), with a target of running on millions of devices by “this Winter,” according to Bavor. Other Android devices from Samsung, as well as smartphones from LG, Huawei, and ASUS, are all expected to get support over time.

In a bid to make it as easy as possible to develop AR applications, ARCore will work with Java/OpenGL, Unity, and Unreal from day one. It’s aiming to leverage three core principles, according to a prepared statement from the company (see the sketch after this list):

  • Motion tracking: Using the phone’s camera to observe feature points in the room and IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves. Virtual objects remain accurately placed.
  • Environmental understanding: It is common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
  • Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
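
The motion tracking bullet is just as compact in code. A rough sketch, again assuming ARCore’s Java API (Frame, Camera and Pose classes, with names taken from later SDK releases), reads back the phone’s pose each frame:

```java
import com.google.ar.core.Camera;
import com.google.ar.core.Frame;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;

public class PoseLogger {
    // Reads the device pose that ARCore computes by fusing camera feature points with IMU data.
    public static void logDevicePose(Session session) throws Exception {
        Frame frame = session.update();   // in the default blocking mode this waits for a new camera frame
        Camera camera = frame.getCamera();

        if (camera.getTrackingState() == TrackingState.TRACKING) {
            Pose pose = camera.getDisplayOrientedPose();
            // Translation in metres relative to where tracking started.
            System.out.printf("Device at (%.2f, %.2f, %.2f)%n", pose.tx(), pose.ty(), pose.tz());
        }
    }
}
```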

Hands-On Impressions

During a visit to Google’s San Francisco office last week I got the chance to see three different ARCore demos showing off what the SDK can do. Everything has been built on the foundation previously laid by Tango and is proof of the scalability of the technologies Google is creating. At the meeting I got to speak with Jon Wiley, Director of Immersive Design, and Nikhil Chandhok, Director of Product for Google AR, as well as the aforementioned Clay Bavor.

The first demo was a relatively standard AR proof of concept that let me place 3D models on a tabletop, move them around, and resize them. I could play around with a tree, a house, a mountain, and a little Android robot mascot. As great as that was, the most impressive thing about the whole demo was that every model had been created inside Blocks, the latest VR 3D modeling program from Google. That platform, combined with Tilt Brush, is dramatically lowering the barrier to entry for intrepid designers, and platforms like ARCore only serve as a means to further expand access.

One of my other favorite bits of the demo was how the little Android robots wobbled around and walked across the table. If I leaned down and put the phone close enough they’d even look at me and wave. Everything persisted if I moved the phone away and pointed at the ground, and the camera was even able to track the location and plane of flat surfaces such as the table and floor. This meant I could move models from one surface to the next and they’d retain their scale and size relative to the rest of the environment.
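
That “models stay where you put them” behaviour maps onto ARCore’s hit-test-and-anchor pattern. Here is a minimal sketch assuming the ARCore Java API; the tap coordinates and the rendering code that draws a model at the anchor’s pose every frame are left out:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;

public class TapToPlace {
    // Given a screen tap, attach an anchor to the first detected plane under the finger.
    // A renderer would then draw the model at anchor.getPose() each frame, so the object
    // keeps its real-world position and scale as the phone moves around.
    public static Anchor placeOnPlane(Frame frame, float tapX, float tapY) {
        for (HitResult hit : frame.hitTest(tapX, tapY)) {
            if (hit.getTrackable() instanceof Plane) {
                Plane plane = (Plane) hit.getTrackable();
                if (plane.isPoseInPolygon(hit.getHitPose())) {
                    return hit.createAnchor();   // ARCore keeps this pose updated over time
                }
            }
        }
        return null; // no plane detected under the tap yet
    }
}
```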

All of that without any depth sensors or extra cameras on the phones. It was running on a Google Pixel.

The second demo Google showed me was one focused on large, life-sized 3D modeled characters. The characters on display (a lion, a tin man, and a scarecrow) were all themed after The Wizard of Oz, because why not? They took me to another corner of the room and placed the lion next to a chair with a light source behind him. He stood there and the light cast shadows across his torso in a surprisingly realistic manner.

Then, Jon Wiley stepped into the frame and stood next to the lion as it towered over him, similar to how I’m standing in the image above. The lion recognized his presence, looked down at him, and flexed its muscles to try and display its superiority. Then Elizabeth Markman, a Communications Manager at Google, turned off the lights. The lion grabbed its tail, looked up at the ceiling, and quivered in fear. It was a remarkable series of events and it all played out flawlessly right before my eyes.

The final demo I saw during my meeting was the most practical. Using a plugin on the Wayfair website, Nikhil Chandhok measured a corner of our meeting room with his finger on the phone’s screen. He dragged a cursor to mark out the length, width, and height of the type of chair he wanted, and the Wayfair website then displayed results only for chairs that would fit in that space. I can see this type of technology being used to buy furniture as shown, to buy paint for walls, sheets and blankets for beds, pillows for couches, and so much more. It’s exciting to think about.
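
Under the hood, that measuring interaction boils down to two hit tests and a little arithmetic. This is not Wayfair’s actual plugin code, just a sketch of the geometric idea using the translation components of ARCore’s Pose class:

```java
import com.google.ar.core.Pose;

public class ArRuler {
    // Straight-line distance in metres between two hit-test poses,
    // e.g. the two floor points the user dragged the cursor between.
    public static float distanceMeters(Pose a, Pose b) {
        float dx = a.tx() - b.tx();
        float dy = a.ty() - b.ty();
        float dz = a.tz() - b.tz();
        return (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
    }
}
```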

Interestingly, this latest example on the Wayfair website is the first I’ve seen of what Clay Bavor described as “WebAR”, wherein the user doesn’t actually need to have a special application installed on their phone for it to work. Instead, just by visiting a website that implements the ARCore code in a compatible browser, the phone can deliver an AR experience directly from the web.


In a world where Apple already has ARKit, it was only a matter of time before Google unveiled something similar. With support for both Pixel and Galaxy S8 devices starting today, and even more Android phones in the near future, the number of AR-capable smartphones in the world is starting to increase dramatically. You can read more about Google’s plans for ARCore and what it means for immersive computing right here.

What do you think of Google’s ARCore? Do you have plans to develop for it? Let us know down in the comments below!
