ResearchVR Episode 29: Did Somebody Say Hand Interaction? Round-table With Alex Colgan

This week on ResearchVR we dig deep into Leap Motion with Alex Colgan, lead writer at the startup.

This week’s episode is a little unusual. Considering Upload’s recent hands-on session with TPCAST and the ResearchVR episode on the challenges of wireless VR, we start our discussion with a quick recap of that topic. Then we move on to our guest, the person with an overview of what Leap Motion is cooking up, the company’s experience, and its best practices to date.

Episode Preview

The most exciting technology this hand-tracking company is working on is embedded sensors for mobile VR. You may be skeptical, but everything points toward a useful solution with a field of view as big as your real-world one, low latency, and high tracking flexibility – all confirmed by Azad’s hands-on experience with the prototype.

Because Leap Motion’s tracking is based purely on video input, the company also has a lot of experience with the embodiment of hands driven only by visual feedback. We had a heated discussion about wow-effect bias, replay value, anticipation, and whether or not it all leads to diminished experience fidelity.

Last but not least, we also dig neck-deep into VR GUI topics. In his recent article on the subject and in our discussion on the podcast, Alex compares where we are with user experience and expectations towards GUIs on the desktop versus in VR. It turns out that on the desktop we have evolved beyond skeuomorphism (physical metaphors), while in VR we still need more direct cues. Extrapolating from the desktop learning curve, however, VR interfaces should soon be far more advanced.

Learn more in Episode 29 – Did Somebody Say Hand Interaction? Round-table with Alex Colgan.

Do you have any questions for us? Ideas for future guests or future discussion topics? Let us know on Twitter and down in the comments below!

Leap Motion: Mobile VR Hand Tracking With a 180-Degree Field of View

Leap Motion has been active in the virtual reality space for several years now. This year it gave the Leap Motion Controller an important update that significantly improved its hand tracking. The controller has one problem, though: it was never designed specifically for virtual reality and therefore does not meet VR’s requirements. Now the company has unveiled a new variant that could be groundbreaking for hand tracking and that works with mobile VR headsets.

Mobile VR Hand Tracking With a 180-Degree Field of View

Until now, hand tracking was confined to too small an area, which quickly made the displayed world feel unreal. The new sensor is meant to consign that problem to the past: it has a 180-degree field of view and can therefore detect your hands well before you can see them in the headset. The new tracker is also said to run at ten times the speed of its predecessor while drawing considerably less power, which is what makes it possible to combine it with a smartphone-based virtual reality headset. Leap Motion is currently using the Samsung Gear VR, and the company plans to present the new product at an event later this month.

The post Leap Motion: Mobile VR Hand Tracking With a 180-Degree Field of View was first seen on VR∙Nerds. VR·Nerds at work!

Leap Motion’s New Mobile Hand-tracking Sensor Brings 180-degree Field of View

The new Leap Motion Mobile Platform consists of hardware and software optimised for VR and AR hand tracking on mobile devices. Building on the success of the original Leap Motion device, the brand new hardware aims to be tightly integrated into future mobile VR headsets.

Designed as a natural motion interface for PC and Mac, the original Leap Motion Controller began shipping in volume in July 2013, for $80. This attractive price was largely achieved by a breakthrough in software; the hardware itself was fairly simple, containing two cameras and three infrared LEDs. Early hand tracking applications which interfaced with traditional displays seemed somewhat abstract, but the compact dimensions and light weight meant that Leap Motion Controllers soon found themselves attached to the front of Oculus Rift development kits. This allowed users to finally ‘see’ their hands in VR, with fully-tracked fingers, marking the beginning of a fruitful relationship between Leap Motion and VR.

Since then, Leap Motion has improved its VR support significantly, and the importance of software was further illustrated by the huge jump in technology delivered by ‘Project Orion’, which began in February as a major software update designed specifically for VR. While still using the original hardware, Orion delivered massive improvements to tracking speed and accuracy. However, the Leap Motion Controller’s hardware was finalised in 2012, and we’ve been waiting patiently for the second phase of Orion: a brand new tracking system designed for VR and built into headsets.

That day is almost here, with Leap Motion’s chief technology officer David Holz revealing the Leap Motion Mobile Platform on the company’s blog. The improvements address the field of view, the ‘biggest request from the VR community’: the brand new sensor, which is designed for low-power mobile devices, delivers a 180×180 degree field of view (up from 140×120 degrees on the original Leap Motion Controller) and is said to run at 10 times the speed of the original hardware while using “much lower power”. The increased field of view means the user’s hands can continue to be tracked even when held in a more natural location rather than directly in front of them, a major pain point for VR use with the original device.
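To put those numbers in perspective, here is a minimal sketch (Python, purely illustrative and not Leap Motion code; the simple frustum model and the 75-degree hand pose are assumptions) of why a 180×180 degree frustum catches hands that a 140×120 degree one misses:

```python
import math

def in_fov(direction, h_fov_deg, v_fov_deg):
    """True if a direction (x, y, z) from the sensor, with +z pointing
    straight out of it, lies inside an h_fov x v_fov degree frustum."""
    x, y, z = direction
    h_angle = math.degrees(math.atan2(abs(x), z))  # horizontal off-axis angle
    v_angle = math.degrees(math.atan2(abs(y), z))  # vertical off-axis angle
    return h_angle <= h_fov_deg / 2 and v_angle <= v_fov_deg / 2

# A hand held about 75 degrees off-axis: resting naturally at your side
# rather than raised directly in front of the headset.
hand = (math.sin(math.radians(75)), 0.0, math.cos(math.radians(75)))

print(in_fov(hand, 140, 120))  # False: outside the original controller's frustum
print(in_fov(hand, 180, 180))  # True:  inside the new sensor's hemisphere
```

Since a typical mobile headset displays only around 100 degrees, a 180-degree tracking frustum means hands are already tracked well before they become visible in the headset, which is exactly the behaviour described above.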

As clarified in an answer in the blog’s comments, this is intended to be embedded technology, and the company has no plans for a new standalone peripheral. No further information was provided in terms of timeframe or pricing, but the company says we can expect to see Leap Motion technology in multiple mobile headsets in the near future.

Holz said in the announcement that the company has created a reference platform built on the Gear VR headset using the new Leap Motion Mobile Platform. The company plans to demo the reference device at upcoming events starting this month.

The post Leap Motion’s New Mobile Hand-tracking Sensor Brings 180-degree Field of View appeared first on Road to VR.

We Tried Leap Motion’s New Hand Tracking Module On Gear VR

Leap Motion is a young technology company that is perhaps best known at this point for what it sells at your local Best Buy: a $79.99 virtual reality peripheral that can be mounted onto an Oculus Rift headset. The device uses infrared scanning technology to detect the position of your hands in 3D space and communicates that information as input to a VR experience. It can even track individual fingers and has been well received by the immersive community so far. However, an $80 peripheral is not what the Leap team has in mind for the future.

This week the company announced a brand new product, one that will not be sold in stores. The new device is a hacked-together hand-tracking module that does two very important things for the Leap brand: it expands the trackable range of the previous iteration, and it makes the technology work on mobile VR devices.

The improved tracking was the easier of the two for the team to accomplish. During an in-person demonstration at UploadVR’s San Francisco office, the Leap team explained that the new module is built from simple, “off the shelf” components. User feedback on the previous sensor indicated a significant desire for an expanded field of view, so the new module can now track across 180×180 degrees.

Essentially, this means the device begins tracking your hand before it even comes into view on the screen of the VR headset itself. This makes for a more immersive and less restrictive experience, and during our hands-on time with the product, which was running on a Samsung Gear VR, the expanded tracking worked flawlessly.

The presence of a Gear VR in the demo room is significant for Leap Motion. Previously, the company’s tech could only work well with the higher processing power provided by PC VR headsets like the Rift. The new module, however, runs on just a smartphone.

“Your smartphone is meant for checking email and browsing the web,” one company representative explained, “it was not necessarily meant to be a graphical powerhouse capable of running something as intense as virtual reality.”

For Leap, this meant that heavy optimization would be required to get IR hand tracking to work on mobile devices. The company has therefore created proprietary software that wrings every last bit of performance out of a smartphone and gets the tech running on mobile without any significant dip in performance.

According to Leap, the goal for the new module is not to package it up and sell it at Best Buy as the “Leap Motion 2.” Instead, the company wants to work with VR headset manufacturers to integrate the hand-tracking sensor directly into their devices.

The company declined to state who those manufacturers might be at this time, but did say that VR enthusiasts would “definitely recognize” the names of these potential partners.

Leap Motion’s ‘Interaction Engine’ Unlocks Natural Human Input for VR

Leap Motion has announced early access to the beta of its Interaction Engine, a set of systems designed to help developers implement compelling, realistic, and hopefully frustration-free input with just their hands.

If you’ve ever spent time with a physics-sandbox title, you’ll know that a large part of the appeal is the satisfaction and freedom of playing within a virtual world that behaves somewhat like reality – with none of the real-world restrictions applied. But this presents myriad problems, not least of which is that those real-world-modelled physics break down when physicality is removed. With no physical boundaries in place, objects in the virtual world behave according to the digital physics model right up to the point you accidentally put your digital self through said objects – at which point things kind of break down.

These issues are particularly acute when it comes to integrating naturalistic hand interaction with a digital space and its objects, for example in VR. Bridging the “gray area” between accuracy and what ‘feels good’ to a human being is part of that elusive magic you sense when you encounter an input interface that just works. More specifically, in the case of VR, that bridging involves implementing an alternative set of rules when a player touches and grasps a virtual object in 3D space, bending reality’s rules in favour of a visual experience that more closely matches our expectations of what should happen.

These are all issues that Leap Motion, the company best known for its depth-sensing peripheral of the same name, has been grappling with for many months now, and its Interaction Engine aims to remove a lot of that pain for developers by providing a framework that “exists between the Unity game engine and real-world hand physics.”

The last time we encountered Leap Motion, the company showed us the first glimpses of its work to boil down this enormously complex set of problems into something that developers can interface with easily. At CES in January, the Leap Motion team let us get our hands on Orion with an early version of the Interaction Engine – a significant milestone for the company’s overall tracking framework, with impressive leaps in lowered tracking latency and in the system’s ability to handle hand-tracking issues.

Leap Motion’s release of the Interaction Engine beta completes another piece of the peripheral-free VR input puzzle that the company has dedicated itself to over the last couple of years.

“The Interaction Engine is designed to handle object behaviors as well as detect whether an object is being grasped,” reads a recent blog post introducing the Interaction Engine. “This makes it possible to pick things up and hold them in a way that feels truly solid. It also uses a secondary real-time physics representation of the hands, opening up more subtle interactions.”
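Leap Motion hasn’t published the engine’s internals, but the rule-bending it describes can be sketched in a few lines. The following Python snippet is a minimal, hypothetical illustration (the `Hand` class, the `grab_strength` field, and the thresholds are assumptions for this sketch, not Leap Motion’s actual API): while a grasp is detected, the object is pinned to the palm instead of being left to raw rigid-body physics:

```python
from dataclasses import dataclass

GRAB_THRESHOLD = 0.8   # how closed the hand must be to count as a grasp
TOUCH_RADIUS = 0.05    # metres: palm must be this close to the object

@dataclass
class Hand:
    palm_position: tuple   # (x, y, z) in metres
    grab_strength: float   # 0.0 = fully open, 1.0 = fully closed

def is_grasping(hand: Hand, obj_position: tuple) -> bool:
    """A grasp = hand closed enough AND near enough to the object."""
    dist = sum((a - b) ** 2 for a, b in zip(hand.palm_position, obj_position)) ** 0.5
    return hand.grab_strength >= GRAB_THRESHOLD and dist <= TOUCH_RADIUS

def update_object(hand: Hand, obj_position: tuple, physics_step):
    """While grasped, pin the object to the palm (bending reality's rules);
    otherwise hand it back to the normal physics simulation."""
    if is_grasping(hand, obj_position):
        return hand.palm_position        # held: the object follows the hand
    return physics_step(obj_position)    # free: regular rigid-body update

# A nearly closed hand right next to a ball: the ball sticks to the palm
# instead of being squirted away by contact forces.
hand = Hand(palm_position=(0.0, 1.2, 0.3), grab_strength=0.9)
ball = (0.01, 1.21, 0.31)
print(update_object(hand, ball, physics_step=lambda p: p))  # -> (0.0, 1.2, 0.3)
```

The “secondary real-time physics representation” mentioned in the quote presumably serves a related purpose: letting the physics-facing hand deviate from the rendered one, so objects feel solid even when the real hand would have clipped through them.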

Leap Motion has always had a knack for presenting the complex ideas involved in its work in a visual way that is immediately graspable by the viewer. These latest demos illustrate the user-friendly fuzzy logic that Leap Motion believes strikes a neat balance between believable virtual reality and frustration-free human-digital interaction.

The Interaction Engine represents another milestone for Leap Motion on its quest to achieve peripheral-free, truly human input. And if you’re a developer, it’s something you can get your hands on right now: the beta is available for download here, you can read all about it here, and you can chat with others doing the same.

The post Leap Motion’s ‘Interaction Engine’ Unlocks Natural Human Input for VR appeared first on Road to VR.

Leap Motion: New Interaction Engine Unveiled

Leap Motion – a controller that relies on optical detection of the hands and fingers – existed before the big VR hype. Unfortunately, the product caught on only slowly back then, because its usefulness was often questionable: why, for example, would you operate Windows with your hands in front of a monitor when mouse and keyboard input is much faster? In virtual reality, however, things are different. Leap Motion has therefore since reoriented its product fully towards VR and wants its share of the virtual reality pie.

A New Interaction Engine for the Leap Motion Controller

Even though the Leap Motion Controller was initially of only limited use for virtual reality, the most recent updates have turned it into a fairly solid tracking system. One problem remained when grasping objects, however: they could slide or roll away as you tried to close your fingers around them. Leap Motion is now tackling this problem with the new Interaction Engine.

As the image shows, the Interaction Engine ensures that an object is held once you touch it and close your hand. That is not realistic in every case, but it is a sensible compromise and should noticeably improve the experience. If you want to try it out yourself, you can grab the relevant files here.

The post Leap Motion: New Interaction Engine Unveiled was first seen on VR∙Nerds. VR·Nerds at work!