Having launched its gesture technology for Apple’s ARKit last year, ManoMotion recently announced the rollout of its second-generation software development kit (SDK), providing developers with even more tools for incorporating hand tracking into virtual reality (VR), augmented reality (AR), mixed reality (MR), and embedded IoT applications.
The ManoMotion SDK 2.0 introduces a range of new core additions:

- Depth sensor support – the SDK now understands 3D space and can offer gesture control for different depth sensors
- Two-hand support
- Rotation support for portrait and landscape mode
- Skeleton tracking – enabling the new version to capture and track joint information (see the sketch below)
- Layering support – version 2.0 understands where objects are in space in relation to the hand being tracked
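To give a sense of what joint-level skeleton tracking enables, here is a minimal sketch of inferring a pinch gesture from fingertip positions. ManoMotion has not published the SDK 2.0 API in this article, so the type and property names below are illustrative placeholders, not the actual ManoMotion interface.

```swift
import simd

// Hypothetical joint data of the kind a skeleton-tracking SDK might expose.
// These names are illustrative, not ManoMotion's actual API.
struct HandSkeleton {
    var thumbTip: SIMD3<Float>   // joint positions in metres, camera space
    var indexTip: SIMD3<Float>
}

/// Infers a pinch from joint positions: if the thumb and index fingertips
/// are closer than a small threshold, treat it as a pinch gesture.
func isPinching(_ hand: HandSkeleton, threshold: Float = 0.02) -> Bool {
    simd_distance(hand.thumbTip, hand.indexTip) < threshold
}

// Example: fingertips roughly 1 cm apart register as a pinch.
let hand = HandSkeleton(thumbTip: SIMD3(0.00, 0.00, 0.30),
                        indexTip: SIMD3(0.01, 0.00, 0.30))
print(isPinching(hand)) // true
```

Once an SDK reports per-joint positions rather than a single hand position, gestures like this become simple geometric tests, which is what makes joint tracking a meaningful upgrade over coarse hand detection.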
“Over 2,500 developers have applied to use our SDK to incorporate hand gestures into everything from video games to UI control to control of appliances such as lighting, and so much more,” said Daniel Carlman, CEO of ManoMotion, in a statement. “Due to our team size, we were initially limited in how many customers we could handle. We are now better staffed and more able to meet the demand for the latest version of the SDK.”
In addition to the new SDK, ManoMotion unveiled several hand-tracking applications, including a new remote guidance application, all of which were demoed at AWE 2018 last week.
Also on display were the SDK Application, which showcases the new tracking and analysis features of SDK 2.0; the ARKit Drawing Application, essentially a controller-free, lightweight take on Tilt Brush; and Magestro, a mobile experience in which players control Nanoleaf lights using hand gestures.
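Gesture-to-device control of the kind Magestro demonstrates typically reduces to mapping recognised gestures onto device commands. Below is a minimal sketch assuming the Nanoleaf Open API’s local /state endpoint; the Gesture enum, host address, and auth token are placeholders, and Magestro’s actual implementation is not public.

```swift
import Foundation

// Placeholder gesture type; a real SDK would deliver these per frame.
enum Gesture { case openHand, closedFist }

/// Maps a gesture to a Nanoleaf on/off command via the Nanoleaf Open API.
/// Assumes a panel reachable at `host` and a pre-authorised `token`.
func apply(_ gesture: Gesture, host: String, token: String) {
    let on = (gesture == .openHand)  // open hand = lights on, fist = off
    var request = URLRequest(url: URL(string: "http://\(host):16021/api/v1/\(token)/state")!)
    request.httpMethod = "PUT"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try! JSONSerialization.data(withJSONObject: ["on": ["value": on]])
    URLSession.shared.dataTask(with: request).resume()
}

// Example: an open-hand gesture turns the lights on.
apply(.openHand, host: "192.168.1.50", token: "YOUR_AUTH_TOKEN")
```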
The ManoMotion SDK supports native iOS and Android, as well as ARKit and ARCore, and also ships with a Unity engine plugin for videogame developers on both platforms. Interested developers can sign up on ManoMotion’s website to get priority access. For any further updates, keep reading VRFocus.