ManoMotion Launches Second Generation Hand-Tracking SDK for AR/VR Applications

Having launched its gesture technology for Apple’s ARKit last year, ManoMotion recently announced the rollout of the second generation software development kit (SDK), providing developers with even more tools for incorporating hand tracking into virtual reality (VR), augmented reality (AR), mixed reality (MR), and embedded IoT applications.

The ManoMotion SDK 2.0 has a range of new core additions: depth sensor support, meaning the SDK now understands 3D space and can offer gesture control with different depth sensors; two-hand support; rotation support for portrait and landscape modes; skeleton tracking, enabling the new version to capture and track joint information; and layering support, so version 2.0 understands where objects sit in space relative to the tracked hand.
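
The layering feature described above boils down to comparing depth estimates for the hand and each virtual object. The sketch below is purely illustrative; the class and field names are invented for demonstration and are not ManoMotion's actual API.

```python
# Illustrative sketch only: ManoMotion's real SDK 2.0 interface is not shown
# in this article, so all names below are hypothetical.

from dataclasses import dataclass

@dataclass
class TrackedHand:
    depth_m: float  # estimated camera-to-hand distance, in metres

@dataclass
class VirtualObject:
    name: str
    depth_m: float  # camera-to-object distance for the rendered object

def layer_order(hand: TrackedHand, obj: VirtualObject) -> str:
    """Decide whether an object should be drawn behind or in front of the hand."""
    if obj.depth_m > hand.depth_m:
        return "behind_hand"    # object is farther away: the hand occludes it
    return "in_front_of_hand"   # object is closer: draw it over the hand

hand = TrackedHand(depth_m=0.4)
cup = VirtualObject("cup", depth_m=1.2)
print(layer_order(hand, cup))  # -> behind_hand
```

A renderer would use this per-frame decision to mask the virtual object with the hand's silhouette when it sits behind the hand.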

“Over 2,500 developers have applied to use our SDK to incorporate hand gestures into everything from video games to UI control to control of appliances such as lighting, and so much more,” said Daniel Carlman, CEO of ManoMotion in a statement. “Due to our team size we have been limited in how many customers that we initially could handle. We are now better staffed and more able to meet the demand for the latest version of the SDK.”

In addition to the new SDK, ManoMotion also unveiled several hand-tracking applications, including a new remote guidance application, all of which were demoed during AWE 2018 last week.

Also on display were the SDK Application, showcasing all the new tracking and analysis features of SDK 2.0; an ARKit drawing application, essentially a controller-free Tilt Brush lite; and Magestro, a mobile experience in which players control Nanoleaf lights using hand gestures.

Supporting both native iOS and Android, as well as ARKit and ARCore, the ManoMotion SDK also comes with a Unity engine plugin for both iOS and Android for videogame developers. Interested developers can sign up on ManoMotion’s website to get priority access. For any further updates, keep reading VRFocus.

Future AR Games and Apps for the new iOS 11

At its special event, Apple revealed some exciting apps and videogames that users will be able to download with the arrival of iOS 11. When Apple launched ARKit, developers flocked to the opportunity to bring augmented reality (AR) to future Apple devices.

Atli Mar from Directive Games demonstrated the competitive multiplayer AR game The Machines live on stage, whilst Apple showcased other AR apps such as Warhammer 40K: Freeblade, Major League Baseball’s At Bat app, which lets you learn about the players on the field, and Sky Guide, which helps you find star constellations in the sky.

Watch the video below to see how you can chase pigeons in Pigeon Panic, get remote support from experts through the Remote AR app, interact with AR objects using your hands instead of tapping the phone with ManoMotion, or play God in the god-simulation game ARrived.

ManoMotion Provides Hand Gesture Control For ARKit Apps

ManoMotion demo

ARKit is almost here, and we see more demos of Apple’s augmented reality toolset every day. The latest, from Swedish developer ManoMotion, is an additional SDK that lets developers quickly integrate hand gesture controls into their AR apps.

Available within a few weeks, ManoMotion’s tech uses the iPhone’s camera to provide accurate hand tracking with 27 degrees of freedom. It recognizes a variety of gestures including swipes, clicking, tapping, grabbing, and releasing. The video included with the announcement features an AR adaptation of beer pong, with the developer pinching to summon a ball from the ether into his hand, and releasing to throw it into a set of computer-generated red Solo cups.
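
The pinch-to-summon, release-to-throw interaction in that beer pong demo is essentially a small gesture state machine. The sketch below is a hypothetical illustration of that pattern; the gesture names and class are assumptions, not ManoMotion's actual API.

```python
# Hypothetical sketch of the pinch-to-grab, release-to-throw interaction
# described above; gesture names are invented for illustration.

class BallInteraction:
    def __init__(self):
        self.holding = False  # is a ball currently pinched in the hand?
        self.throws = 0       # how many balls have been thrown so far

    def on_gesture(self, gesture: str) -> str:
        # Pinching with an empty hand summons a ball into it.
        if gesture == "pinch" and not self.holding:
            self.holding = True
            return "ball_summoned"
        # Releasing while holding throws the ball toward the cups.
        if gesture == "release" and self.holding:
            self.holding = False
            self.throws += 1
            return "ball_thrown"
        return "ignored"  # e.g. releasing an empty hand does nothing

game = BallInteraction()
print(game.on_gesture("pinch"))    # -> ball_summoned
print(game.on_gesture("release"))  # -> ball_thrown
```

In a real app the same callbacks would drive physics: the release event would hand the ball's velocity off to the engine to simulate the throw.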

The company says that the tech demands a minimal amount of battery, CPU, and memory overhead. It also provides tracking for both left and right hands, and a set of pre-defined gestures for developers to integrate into their apps.

For now, the gesture technology remains locked to Apple’s platform, but ManoMotion promises that ARCore integration for Android will arrive in the future. The SDK will roll out for Unity on iOS first, followed by native iOS 11 support at a later date. Interested developers can sign up to take a look at ManoMotion’s documentation, and the company will also support the new SDK via email and its own forums.


ManoMotion Unveils Hand Gesture Control for Apple’s ARKit

Back in June ManoMotion released a software development kit (SDK) to allow developers to add hand gestures to any virtual reality (VR), augmented reality (AR) or mixed reality (MR) application. One of the biggest AR launches this year was Apple’s ARKit, and now the computer vision specialist has added support for it to its SDK.

ManoMotion’s gesture technology uses a standard 2D camera to recognise and track many of the 27 degrees of freedom (DOF) of motion in a hand, all in real-time. Now ARKit developers will be able to bring users’ hands into their projects, letting them pick up AR objects rather than just tapping on a screen.


The current version features a set of predefined gestures, such as point, push, pinch, swipe and grab, offering a range of interactive possibilities depending on what developers want to achieve, or what they want to allow users to do.
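
An app consuming those predefined gestures would typically map each one to an application-level action. The sketch below shows that dispatch pattern with the gestures named above; the registration style and action names are assumptions for illustration, not ManoMotion's actual interface.

```python
# Illustrative only: maps the predefined gestures named in the article
# (point, push, pinch, swipe, grab) to hypothetical app-level actions.

def select_object() -> str:
    return "selected"

def push_object() -> str:
    return "pushed"

def pick_up() -> str:
    return "picked_up"

def next_item() -> str:
    return "advanced"

def hold() -> str:
    return "held"

# Dispatch table: gesture name -> handler
GESTURE_ACTIONS = {
    "point": select_object,
    "push": push_object,
    "pinch": pick_up,
    "swipe": next_item,
    "grab": hold,
}

def handle_gesture(name: str) -> str:
    action = GESTURE_ACTIONS.get(name)
    return action() if action else "unrecognized"

print(handle_gesture("pinch"))  # -> picked_up
```

Keeping the mapping in one table makes it easy to rebind gestures per scene, which matches the article's point that the same gesture set can serve very different interactions.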

“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in depth with augmented objects in 3D space,” said Daniel Carlman, co-founder and CEO of ManoMotion in a statement. “Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality!”

ManoMotion’s SDK will initially be made available for Unity iOS, followed by native iOS in subsequent updates. Developers interested in using ManoMotion’s SDK with ARKit should visit: https://www.manomotion.com/get-started/.

In addition to ARKit, Google’s recently announced ARCore will also see ManoMotion integration, with a release date coming in the near future.

VRFocus will continue its coverage of ManoMotion, reporting back with the latest updates.

ManoMotion Brings Hand Gesture Input to Apple’s ARKit

ManoMotion, a computer-vision and machine learning company, today announced they’ve integrated their smartphone-based gesture control with Apple’s augmented reality developer tool ARKit, making it possible to bring basic hand-tracking into AR using only the smartphone’s onboard processors and camera.

With Google and Apple gearing up for the augmented reality revolution with their respective software developer kits, ARCore and ARKit, developers are fervently looking to see just how far smartphone-based AR can really go. We’ve seen plenty of new use cases for both, including inside-out positional tracking for mobile VR headsets and some pretty mind-blowing experiments too, but this is the first time we’ve seen hand-tracking integrated into either AR platform.

image courtesy ManoMotion

VentureBeat got an early look at the company’s gesture input capabilities before it integrated support for ARKit, with ManoMotion CEO Daniel Carlman telling them it tracked “many of the 27 degrees of freedom (DOF) of motion in a hand.” Just like their previous build, the new ARKit-integrated SDK can track depth and recognize familiar gestures like swipes, clicks, taps, grabs, and releases, all with what ManoMotion calls “an extremely small footprint on CPUs, memory, and battery consumption.”

In ManoMotion’s video, we can see the ARKit-driven app recognize the user’s hand and respond to a flicking motion, which sends a ping-pong ball into a cup, replete with all of the spatial mapping abilities of ARKit.

A simple game like beer pong may seem like a fairly banal use case, but being able to interact with the digital realm with your own two hands (or in this case, one hand) has much larger implications outside of games. AR devices like HoloLens and the Meta 2 rely upon gesture control to make their UIs fully interactive, which opens up a world of possibilities, including productivity-related tasks like placing and resizing windows, or simply turning on Internet-connected lights in your house with a snap of the fingers. While neither Google nor Apple has released word on future AR headsets, it’s these early experimental steps on today’s mobile platforms, which don’t necessarily have access to expensive custom parts, that will define the capabilities of AR headsets in the near future.

“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in-depth with augmented objects in 3D space,” said Carlman. “Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality!”

ManoMotion says ARKit integration will be made available in the upcoming SDK build, which will be available for download “in the coming weeks” on the company’s website. The integration will initially be made available for Unity iOS, followed by Native iOS in subsequent updates.

The post ManoMotion Brings Hand Gesture Input to Apple’s ARKit appeared first on Road to VR.

ManoMotion Releases SDK for Developers to Incorporate Hand Gestures Into VR

Computer vision specialist ManoMotion has announced the launch of its software development kit (SDK) for developers, enabling them to add hand gestures to any virtual reality (VR), augmented reality (AR) or mixed reality (MR) applications they create.

Up until this point, ManoMotion has been working with customers on a purely one-on-one basis, but the SDK’s release lets far more developers get their hands on the company’s technology.

It’ll be offering its SDK in a freemium model, tiered to fit different customer needs. The SDK will allow people to see their hands and move objects in a VR/AR/MR space, using either the left or right hand. Dynamic gestures, such as swipes and clicks, can be added for manipulating menus and displays, while predefined gestures, such as point, push, pinch, swipe and grab, can also be included.


“The launch of our SDK is a significant milestone in our company’s history,” said Daniel Carlman, co-founder and CEO of ManoMotion in a statement. “It marks the start of a new community and knowledge base around gesture technology, to which ManoMotion will show undying commitment and contribution. We can’t wait to see what developers create!”

ManoMotion’s 3D real-time gesture recognition technology uses a standard 2D camera to recognise and track many of the 27 degrees of freedom (DOF) of motion in a hand. It also tracks depth and handles dynamic gestures (such as swipes, clicking, tapping, grab and release, etc), whilst taking up a small footprint on CPUs, memory, and battery consumption.

The SDK supports both Native iOS and Android and comes with a Unity game engine plugin for both iOS and Android. Head to the ManoMotion website to apply.

For any further updates on ManoMotion, keep reading VRFocus.

ManoMotion’s SDK Welcomes Hand Gestures Development for AR/VR

Mobile VR’s high accessibility and convenience make it look more and more like the path of least resistance to virtual reality having a common presence in households. Price and form factor play a big part in this, but those benefits come with a hindered experience when it comes to degrees of movement and interaction in virtual experiences on these mobile platforms. Creators are regularly attacking the limitations of mobile VR, so we’re likely to have more robust entry-level VR in the coming years. ManoMotion is one such company, and they’ve released a new SDK that specifically tackles opportunities around input and interaction for mobile VR platforms.

ManoMotion’s new SDK, which is available on their website today, provides developers with the tools to create experiences centered on hand gestures for VR, AR, and MR applications. ManoMotion’s tech uses the phone’s camera to pick up hand gestures so users can grab, hit, move, and tap digital objects.

“The launch of our SDK is a significant milestone in our company’s history,” said co-founder and CEO of ManoMotion Daniel Carlman in a prepared statement. “It marks the start of a new community and knowledge base around gesture technology, to which ManoMotion will show undying commitment and contribution. We can’t wait to see what developers create!”

Google Daydream took an important step forward with its controller for VR input, but it is still very limited compared to tethered headsets. If ManoMotion manages to provide truly robust interaction without hogging smart devices’ already limited power, it could be an incredible leap for the platform. And while this is immediately more impactful for mobile platforms that have limited interaction, technology like this could impact more advanced VR headsets in the future as well.
