VRgineers Unveils Improved XTAL VR Headset With Integrated Hand Tracking

Czech VR start-up VRgineers has presented its new XTAL VR headset, an optimized successor to the VRHero 5K. The improved headset is said to offer a smaller form factor and lower weight, as well as integrated hand tracking from Leap Motion.

VRgineers – New XTAL VR Headset Offers Lower Weight and Integrated Hand Tracking

The cooperation between VRgineers and Leap Motion began some time ago. Now the joint work is bearing fruit: the company has presented its new XTAL VR headset. The new model succeeds the VRHero 5K and is meant to iron out its predecessor's weaknesses.

The VRHero 5K had two displays with a resolution of 2560 × 1440 pixels each and a 170-degree field of view. The image quality was convincing, but the prototypes shown still had several weaknesses. Weight in particular was a sore point: at over 1 kg, the headset was twice as heavy as the Rift or Vive. Its 70 Hz refresh rate also fell short of the standard.

With the new XTAL headset, the VR start-up aims to address exactly these points. The current version is said to be a full 12% lighter than its predecessor, weighing in at 770 grams. That is still heavier than the established competition, but considerably slimmer than the company's own predecessor. With two OLED displays at a resolution of 2560 × 1440 pixels and a 170-degree field of view, the company stays true to its line.

Left: the XTAL headset; right: the VRHero 5K – Image courtesy: VRgineers

As a special feature, the headset includes an automatic function for determining interpupillary distance: thanks to AutoEye, the distance between the lenses is said to adjust precisely to the wearer's face. The developers also integrated Leap Motion's hand-tracking module directly into the hardware, allowing the XTAL to capture hand movements within a 180 × 180 degree area.

The price has also dropped significantly. The headset is still aimed at the business sector and correspondingly expensive, but while VRgineers charged $9,000 for the VRHero 5K, the new XTAL costs $5,800. First deliveries are set to begin in September 2018.

(Sources: VRgineers | Road to VR | Video: VRgineers YouTube)

The post VRgineers Unveils Improved VR Headset XTAL With Integrated Hand Tracking was first seen on VR∙Nerds.

Leap Motion Announces New Project North Star Demo

Leap Motion attracted a great deal of attention when it announced the prototype of a new augmented reality (AR) headset which offered an increased field of view beyond what currently available hardware, such as the Microsoft HoloLens, can offer.

The headset was called Project North Star, and the company has now announced a new demo that shows off some of the capabilities of the headset prototype and what would be possible for its commercial use.

Leap Motion previously released the schematics for the reference design of Project North Star, along with a short guide on how to build a North Star headset, giving creators and businesses the opportunity to create their own version of the headset.

The new demo uses Table Tennis to show how the Project North Star hardware can work with Leap Motion hand tracking combined with a handheld paddle controller. By using the paddle, users can cause the virtual ball to bounce across the table, with an AI opponent to act as a challenge to the user.

The AR Table Tennis demo is designed to show how the AR headset can be used to train skills in a way that can be transferred into real life. The team at Leap Motion hope that Project North Star can be used as part of a system to optimise a task or behaviour.

The Table Tennis demo uses physics simulations to keep the AR experience accurate to the real world, so any hand-eye coordination and muscle memory built up using the simulation can be transferred into the real world.

Leap Motion said as part of its press release: “The realism and physical reproducibility of this demo were built with the intent that the user should grow in their understanding of the system by interacting with it. As a medium, AR has the potential to improve how we learn about and interact with the real world. Simulations like this have the unique ability to adjust their difficulty downward to accommodate novices and upward to challenge experts in a whole new way – appealing to players at all skill levels.”

Further information can be found on the Leap Motion blog, along with YouTube videos of the demo. Further news on Leap Motion will be covered here on VRFocus.

Leap Motion’s New North Star Demo Brings Table Tennis To AR

Leap Motion has found a new use for its open source AR headset, North Star: table tennis.

The hand-tracking specialist today revealed a new demo for its headset, which was announced last April. Whereas we’ve seen North Star handle advanced AR UI, table tennis showcases some other features of the headset. For starters, it allows the player to jump into a virtual game of the sport using a real table and a specialized paddle; the headset creates both the ball and an AI opponent for you to challenge.

Using a hand gesture the player summons a virtual ball and then serves it just like they would a real one. The opponent, meanwhile, is designed to make only humanly possible returns, giving you a real game of table tennis without anyone there to practice with. Check it out in the video below.

But what may first appear as a simple game has deeper implications about the future of AR training and beyond.

“Eventually, as AR systems become more advanced and lifelike, we will be able to practice against “impossibly difficult” artificial opponents and use that intuition in the real world like never before,” Leap Motion’s Johnathon Selstad said in a blog post. “Current and near-future professions may be aided by advanced AR training systems that allow us to casually achieve levels of skill that previously required months of determined practice.”

For now this demo is purely experimental, just like North Star itself.


The post Leap Motion’s New North Star Demo Brings Table Tennis To AR appeared first on UploadVR.

Leap Motion Reveals First Extended Demo Shot Through North Star Headset

Back in April Leap Motion first revealed North Star, a prototype AR headset that’s designed to replicate the features of a future high-end AR headset, as a platform for experimentation. Today the company revealed the first extended look of a demo shot through the headset, offering a glimpse of its capabilities.

Leap Motion is best known for its markerless hand-tracking technology, but in the last few quarters the company has been increasingly showing off its design chops in both software and now hardware. Project North Star, which was recently open-sourced, cares not about form-factor; the headset was built purely to push the limitations of the end AR experience, and thus to serve as an experimental platform for what might be achieved with features that will hopefully one day fit into a compact and affordable headset.

Image courtesy Leap Motion

The company claims North Star has “best-in-class field-of-view, refresh rate, and resolution,” and today revealed a new demo designed to show it off, including the company’s hand-tracking tech. Below you’ll see through-the-headset footage of Leap Motion’s table tennis demo on North Star, which has the player facing off against an AI opponent. In the video, the table and the player’s paddle are real, while the ball and the opponent’s paddle are virtual.

The demo appears to use a professional motion capture system for tracking the headset and paddle, while the user’s free hand is tracked with a Leap Motion sensor on the headset. The user uses a pinch gesture to spawn a ball for each volley.


Leap Motion has no plans to manufacture the North Star themselves, but believes the device could be produced at $100 per headset at scale, which could make for an excellent AR development kit.


Disclosure: Leap Motion’s Barrett Fox and Martin Schubert have recently published a series of guest articles on Road to VR highlighting their experiments in AR/VR interface design. The latest piece is here: Validating an Experimental Shortcut Interface with Flaming Arrows & Paper Planes

The post Leap Motion Reveals First Extended Demo Shot Through North Star Headset appeared first on Road to VR.

E3 2018 Pre-show Roundup – ‘Beat Saber’ on PSVR, Xbox Mum on VR, New Game Announcements & More

While E3 2018 didn’t technically start until Tuesday this week, much of the big news comes during the pre-show period from Saturday to Monday. Here’s a roundup of our coverage of E3 2018 before the official start of the event.

Beat Saber Coming to PSVR

Image courtesy Hyperbolic Magnetism

Indie studio Hyperbolic Magnetism announced ahead of the Sony conference that their VR rhythm game is on its way to PlayStation VR later this year. The game sold over 100,000 copies in its first month of availability on PC VR devices, and is set to launch on Sony’s platform with a new song, and potentially more content.

Read More

Microsoft Stays Mum on Xbox VR

Image courtesy Microsoft

During Microsoft’s main E3 2018 presentation, Head of Xbox Phil Spencer talked briefly on stage about the future of the brand, mentioning future “consoles” in development, but there was once again no sign of VR support for Xbox One. This follows the surprising no-show last year at the launch of Xbox One X. The hardware is more than capable of taking the fight to Sony, but for now it continues to be one-way traffic.

Read More

Wolfenstein: Cyberpilot Announced

Image courtesy Bethesda

Set two decades after the events of Wolfenstein II: The New Colossus (2017), Wolfenstein: Cyberpilot is a standalone VR game set to launch next year. Bethesda announced the title at their E3 2018 showcase; little is known about the game so far, but we’ll report our findings once we go hands-on this week.

Read More

Prey (2017) DLC to Include VR Modes

Image courtesy Bethesda

Bethesda also announced their new Prey: Mooncrash DLC will soon include two VR-compatible game modes. This consists of a single player escape room game and a multiplayer game called ‘Typhon Hunter’ which, according to Bethesda, pits the series’ protagonist Morgan Yu against mimics that will stalk, hunt and hide in plain sight as they try to take Yu down.

Read More

Elder Scrolls: Blades is Bethesda’s First Mobile VR Game, Also Coming to High-end Headsets with Crossplay

Image courtesy Bethesda

Introduced by director Todd Howard as a smartphone-first title, Elder Scrolls: Blades was demonstrated to have touch-friendly controls, and is set to launch on as many platforms as possible later this year, including mobile VR all the way up to high-end VR. The game promises to include both handcrafted and procedurally generated dungeons, character customisation and levelling, and several game modes including an infinite dungeon.

Read More

Major Tracking Update for Leap Motion

Image courtesy Leap Motion

Leap Motion has released another major software update to their markerless hand tracking hardware – a product which hasn’t changed specification since it launched in 2012, but has seen dramatic improvements to the quality of tracking over the years. The new updates claim to bring major improvements “across the board.”

Read More

Transference Comes to PSVR, Vive, and Rift in Fall 2018

Photo courtesy Ubisoft

The psychological thriller from Elijah Wood’s studio SpectreVision was shown again at Ubisoft’s E3 conference, this time with a new launch trailer. The game blends live action and rendered environments in a perspective-shifting narrative. While it is built with VR in mind, the title is also launching in traditional formats sometime in Fall 2018, on PSVR, Rift, Vive, PC, Xbox One, and PS4.

Read More

Space Junkies Beta Announced for Late June

Image courtesy Ubisoft

Zero-G multiplayer shooter Space Junkies was another VR game announced last year to make a second appearance at Ubisoft’s E3 2018 presentation. The new trailer showed some new gameplay and revealed the dates for the open beta: June 28th to July 2nd. Signups for the closed beta are already open on the official Space Junkies website.

Read More

Insomniac Reveals More Stormland Info

Image courtesy Insomniac Games

Insomniac’s CCO Chad Dezern appeared on stage at the E3 2018 PC Gaming Show to talk about upcoming open-world adventure game Stormland, which was announced just before the event. Dezern detailed some of the key movement mechanics optimised for VR motion controllers, as well as some of your character’s scavenging abilities.

Read More

Trover Saves the Universe Announced for PS4 and PSVR from Squanch Games

Image courtesy Squanch Games

Rick and Morty creator Justin Roiland’s game studio Squanch Games enjoyed some time on the big screen at Sony’s main E3 2018 presentation with their new game Trover Saves the Universe, an action-platformer of sorts. In typical style, the haphazard trailer gave very little information, but it is targeting an early 2019 release for PS4 and PSVR.

Read More

PSVR Exclusive Ghost Giant Announced

Image courtesy Zoink

Detailed in a post on the PlayStation Blog just before E3, Ghost Giant is an adorable puzzle adventure from Zoink, creators of Fe (2018) and Flipping Death (TBA), coming exclusively to PSVR. The player assumes the role of the ghost, and can interact with the environment by lifting “furniture, vehicles and trees,” or even opening up “entire buildings to reveal the stories going on inside.”

Read More

Déraciné Announced for PSVR From Developer Behind Dark Souls

Image courtesy From Software

From Software, creators of popular RPG series Dark Souls, has revealed a game that is very different in tone compared to the studio’s most famous works. According to the description of the teaser trailer, which was shown just after the main Sony press conference, Déraciné is about a young girl in a secluded boarding school who summons a spirit. The game is said to task the player with proving the spirit’s existence and building “a unique bond with the students through clever interactions.” The game is set to arrive sometime in 2018 exclusively on PSVR.

Read More

The post E3 2018 Pre-show Roundup – ‘Beat Saber’ on PSVR, Xbox Mum on VR, New Game Announcements & More appeared first on Road to VR.

Leap Motion Releases Major Update For Tracking

Leap Motion began as a company that created a markerless tracking device for desktop PC applications. Since then, the company has become far better known for its work in augmented reality (AR) and virtual reality (VR), but has now released a new update for its hand-tracking unit.

Leap Motion has announced an update which brings in major improvements across the board, and it comes along with three new demos for users to test out its capabilities.

The company has previously introduced software updates implementing various improvements, and says the latest update is a big step forward for the technology. Leap Motion are referring to it as the ‘fourth generation of our core software’.

The new update introduces the following improvements:

  • Better finger dexterity and fidelity
  • Significantly smoother hand and finger tracking, with motions that look and feel more natural
  • Faster and more consistent hand initialization
  • Better hand pose stability and reliability
  • Improved tracking fidelity against complex backgrounds and extreme lighting
  • More accurate shape and scale for hands

Along with the beta version of the new software for Windows, three demo applications are also being made available. The first is simply called Paint, which lets users draw using a paint palette and a pinching gesture (as if holding a paintbrush). The second is called Particles, which lets users interact with lots of simulated particles. The third is called Cat Explorer, which lets you take apart a cartoon cat to see how it is put together, a cruel fate for a cute-looking creature.

Leap Motion are also releasing improvements to the development tools, including newly updated integrations for Unity and Unreal Engine. Further information can be found on the Leap Motion blog.

For future coverage of updates from the VR and AR industry, keep checking back with VRFocus.

Leap Motion Improves Its Hand Tracking (Again)

Leap Motion continues to improve its hand tracking technology with the latest update today.

Software updates issued over the last few years for Leap Motion’s hand-tracking sensors have steadily improved the hardware’s functionality. Mounted facing outward on a VR headset, Leap Motion’s $80 controller allows developers to build VR software with complex interactions without using hand-held controllers.

The fourth generation of Leap Motion’s software is said to include, among other improvements, better “finger dexterity and fidelity” as well as “significantly smoother hand and finger tracking, with motions that look and feel more natural.” The latest “Orion” software update is available as a testing release on Windows.

Check out some demos Leap Motion prepared to showcase the update alongside a variety of interactions:


The post Leap Motion Improves Its Hand Tracking (Again) appeared first on UploadVR.

Leap Motion Releases Major Tracking Update and New Demos to Show It Off

Leap Motion builds the leading markerless hand-tracking technology, and today the company revealed an update which they claim brings major improvements “across the board.” The upgraded tracking and improved developer tools are available in beta today on Windows, alongside three new demos to try it out for yourself.

Founded in 2010, Leap Motion released its initial product (today called the ‘Leap Motion Controller’) in 2012, a desktop-focused peripheral which offered markerless hand-tracking. Though an interesting and functional device, the Leap Motion Controller had trouble finding a niche—seemingly a solution in search of a problem as far as desktop input was concerned. As virtual reality began to heat up, it became clear that there were major opportunities for novel input in the sector, and over time Leap Motion has pivoted its focus from desktop to VR, offering a bespoke mount to allow developers to attach the device to VR headsets for hand-tracking in VR experiences.

Updated Tracking

While Leap Motion hasn’t publicly released new hardware since the original 2012 device (lately focusing instead on building newer hardware into future VR and AR headsets), the company is adamant that their ‘secret sauce’ is actually in their software—which is why they’ve been able to significantly improve the unit’s hand-tracking performance over time, like when they introduced the ‘Orion’ update back in 2016.

Image courtesy Leap Motion

Having announced a $50 million Series C investment last year, the company today says its hand-tracking tech is taking another big step forward with a major update to Orion. The company notes the following improvements in what they’re calling the “fourth generation of our core software”:

  • Better finger dexterity and fidelity
  • Significantly smoother hand and finger tracking, with motions that look and feel more natural
  • Faster and more consistent hand initialization
  • Better hand pose stability and reliability
  • Improved tracking fidelity against complex backgrounds and extreme lighting
  • More accurate shape and scale for hands

New Demos

Along with the updated Leap Motion tracking software, the company is releasing three demo applications: Paint, where you can use a pinching gesture and palette to draw; Particles, where you can play with lots of simulated particles; and Cat Explorer, where you’ll dissect a cartoon cat that’s entirely too cute to deserve such treatment.

Improved Developer Tools

Along with the updates to their tracking technology, Leap Motion is also releasing improvements to their developer tools, including newly updated Unity and Unreal Engine integrations, and deprecating some older APIs. The company details the developer-level changes on their blog here.


Disclosure: Leap Motion’s Barrett Fox and Martin Schubert have recently published a series of guest articles on Road to VR highlighting their experiments in AR/VR interface design. The latest piece is here: Validating an Experimental Shortcut Interface with Flaming Arrows & Paper Planes

The post Leap Motion Releases Major Tracking Update and New Demos to Show It Off appeared first on Road to VR.

Exclusive: Validating an Experimental Shortcut Interface with Flaming Arrows & Paper Planes

Last time, we detailed our initial explorations of single-handed shortcuts systems. After some experimentation, we converged on a palm-up pinch to open a four-way rail system. Today we’re excited to share the second half of our design exploration along with a downloadable demo on the Leap Motion Gallery.

Guest Article by Barrett Fox & Martin Schubert

Barrett is the Lead VR Interactive Engineer for Leap Motion. Through a mix of prototyping, tools and workflow building with a user driven feedback loop, Barrett has been pushing, prodding, lunging, and poking at the boundaries of computer interaction.

Martin is Lead Virtual Reality Designer and Evangelist for Leap Motion. He has created multiple experiences such as Weightless, Geometric, and Mirrors, and is currently exploring how to make the virtual feel more tangible.

Barrett and Martin are part of the elite Leap Motion team presenting substantive work in VR/AR UX in innovative and engaging ways.

We found the shortcuts system comfortable, reliable, and fast to use. It also felt embodied and spatial since the system didn’t require users to look at it to use it. Next it was time to put it to the test in a real-world setting. How would it hold up when we were actually trying to do something else with our hands?

We discussed a few types of potential use cases:

#1. Direct abstract commands. In this scenario, the system could be used to directly trigger abstract commands. For example, in a drawing application either hand could summon the shortcut system – left to undo, right to redo, forward to zoom in, or backwards to zoom out.

#2. Direct contextual commands. What if one hand could choose an action to take upon an object being held by the other hand? For example, picking up an object with your left hand and using your right hand to summon the shortcut system – forward to duplicate the object in place, backward to delete it, or left/right to change its material.

#3. Tool adjustments. The system could also be used to adjust various parameters of a currently active tool or ability. For example, in the same drawing application your dominant hand might have the ability to pinch to draw in space. The same hand could summon the shortcut system and translate left/right to decrease/increase brush size.

#4. Mode switching. Finally, the system could be used to switch between different modes or tools. Again in a drawing application, each hand could use the shortcut system to switch between free hand direct manipulation, a brush tool, an eraser tool, etc. Moreover, by independently tool-switching with each hand, we could quickly equip interesting combinations of tools.

Of these options, we felt that mode switching would test our system the most thoroughly. By designing a set of modes or abilities that required diverse hand movements, we could validate that the shortcuts system wouldn’t get in the way while still being quickly and easily accessible.
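The per-hand mode switching described above can be sketched as a small state machine, one instance per hand. This is an illustrative sketch only, not Leap Motion's actual API; the rail-direction names and which power maps to which direction are assumptions:

```python
from enum import Enum

class Mode(Enum):
    FREE_HAND = "free hand"  # direct manipulation, all pinch powers off
    PLANE = "plane"
    BOW = "bow"
    CLOCK = "clock"

# Hypothetical mapping of the four rail directions to pinch powers.
RAIL_TO_MODE = {
    "left": Mode.PLANE,
    "right": Mode.BOW,
    "forward": Mode.CLOCK,
    "backward": Mode.FREE_HAND,  # the free direction disables pinch powers
}

class HandState:
    """Tracks the currently equipped mode for a single hand."""

    def __init__(self):
        self.mode = Mode.FREE_HAND

    def select(self, rail_direction: str) -> Mode:
        """Switch this hand's mode when a shortcut rail is committed."""
        self.mode = RAIL_TO_MODE[rail_direction]
        return self.mode

# Each hand switches independently, allowing combinations of powers.
left, right = HandState(), HandState()
left.select("left")    # left hand equips the plane power
right.select("right")  # right hand equips the bow power
```

Because each hand holds its own `HandState`, interesting tool pairs (say, bow in one hand and time control in the other) fall out of the design for free.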

Mode Switching and Pinch Interactions

In thinking about possible abilities we’d like to be able to switch between, we kept returning to pinch-based interactions. Pinching, as we discussed in our last blog post, is a very powerful bare handed interaction for a few reasons:

  • It’s a gesture that most people are familiar with and can do with minimal ambiguity, making it simple to successfully execute for new users.
  • It’s a low-effort action, requiring only movement of your thumb and index fingers. As a result, it’s suitable for high-frequency interactions.
  • Its success is very well-defined for the user who gets self-haptic feedback when their finger and thumb make contact.

However, having an ability triggered by pinching does have drawbacks, as false triggers are common. For this reason, having a quick and easy system to enable, disable, and switch between pinch abilities turned out to be very valuable. This led us to design a set of pinch powers to test our shortcut system.

Pinch Powers!

We designed three pinch powers, leaving one shortcut direction free as an option to disable all pinch abilities and use free hands for regular direct manipulation. Each pinch power would encourage a different type of hand movement to test whether the shortcut system would still function as intended. We wanted to create powers that were interesting to use individually but could also be combined to create interesting pairs, taking advantage of each hand’s ability to switch modes independently.

The Plane Hand

For our first power, we used pinching to drive a very common action: throwing. Looking to the physical world for inspiration, we found that paper plane throwing was a very expressive action with an almost identical base motion. By pinching and holding to spawn a new paper plane, then moving your hand and releasing, we could calculate the average velocity of your pinched fingers over a certain number of frames prior to release and feed that into the plane as a launch velocity.
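The release-velocity calculation might look like the following sketch. The window size, frame rate, and position format here are illustrative assumptions, not Leap Motion's actual implementation:

```python
def launch_velocity(pinch_positions, dt, window=5):
    """Average velocity of the pinch point over the last `window`
    frame-to-frame steps before release.

    `pinch_positions` is a list of (x, y, z) tuples sampled once per
    frame; `dt` is the frame time. `window=5` is an illustrative choice.
    """
    # Keep only the most recent `window` steps (window + 1 positions).
    pts = pinch_positions[-(window + 1):]
    steps = len(pts) - 1
    if steps < 1:
        return (0.0, 0.0, 0.0)
    # Average velocity = total displacement / total elapsed time.
    return tuple((b - a) / (steps * dt) for a, b in zip(pts[0], pts[-1]))

# A hand moving 2 cm forward per frame at 90 Hz throws at ~1.8 m/s.
positions = [(0.0, 0.0, 0.02 * i) for i in range(10)]
v = launch_velocity(positions, dt=1 / 90)
```

Averaging over several frames rather than using the single last frame smooths out tracking jitter at the moment of release, which is exactly when the fingers move fastest.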

Using this first ability together with the shortcuts system revealed a few conflicts. A common way to hold your hand while pinching a paper plane is with your palm facing up and slightly inwards with your pinky furthest away from you. This fell into the gray area between the palm direction angles defined as ‘facing away from the user’ and ‘facing toward the user’. To avoid false positives, we adjusted the thresholds slightly until the system was not triggered accidentally.
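The threshold adjustment can be pictured by reducing palm direction to a single angle against the view direction, with a dead zone between the two classifications. The angle values here are illustrative, not the thresholds Leap Motion actually used:

```python
def classify_palm(palm_angle_deg, away_max=60.0, toward_min=120.0):
    """Classify palm direction from the angle (degrees) between the palm
    normal and the user's view direction.

    Widening the dead zone between `away_max` and `toward_min` is one way
    to stop the gray-area pose from triggering the shortcut system.
    """
    if palm_angle_deg <= away_max:
        return "away"     # palm facing away from the user
    if palm_angle_deg >= toward_min:
        return "toward"   # palm facing the user: may open shortcuts
    return "neutral"      # gray area: trigger nothing

# A palm tilted 100 degrees from the view direction lands in the dead zone.
state = classify_palm(100.0)
```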

To recreate the aerodynamics of a paper plane, we used two different forces. The first added force is upwards, relative to the plane, determined by the magnitude of the plane’s current velocity. This means a faster throw produces a stronger lifting force.

The other force is a little less realistic but helps make for more seamless throws. It takes the current velocity of a plane and adds torque to bring its forward direction, or nose, inline with that velocity. This means a plane thrown sideways will correct its forward heading to match its movement direction.
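The two aerodynamic effects can be approximated in a few lines of vector math. The coefficients, and the simple directional blend standing in for the corrective torque, are illustrative assumptions rather than the demo's actual physics code:

```python
def norm(v):
    """Euclidean length of a 3-vector (tuple)."""
    return sum(c * c for c in v) ** 0.5

def scale(v, s):
    return tuple(c * s for c in v)

def lift_force(velocity, plane_up, k_lift=0.5):
    """Force along the plane's up vector, proportional to its speed:
    a faster throw produces a stronger lifting force. `k_lift` is an
    illustrative coefficient."""
    return scale(plane_up, k_lift * norm(velocity))

def steer_toward_velocity(forward, velocity, rate=0.2):
    """Blend the plane's nose direction toward its velocity direction,
    standing in for the torque that aligns the nose with the heading."""
    speed = norm(velocity)
    if speed == 0:
        return forward
    vel_dir = scale(velocity, 1.0 / speed)
    blended = tuple(f + rate * (d - f) for f, d in zip(forward, vel_dir))
    return scale(blended, 1.0 / norm(blended))  # keep it unit length

# A plane thrown sideways: nose pointing +x, but moving along +z.
lf = lift_force((0.0, 0.0, 2.0), (0.0, 1.0, 0.0))
new_nose = steer_toward_velocity((1.0, 0.0, 0.0), (0.0, 0.0, 2.0))
```

Applied every frame, the blend gradually swings the nose into the movement direction, which is why a sideways throw curves instead of tumbling.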

With these aerodynamic forces in play, even small variations in throwing angle and direction resulted in a wide variety of plane trajectories. Planes would curve and arc in surprising ways, encouraging users to try overhanded, underhanded, and side-angled throws.

In testing, we found that during these expressive throws, users often rotated their palms into poses which would unintentionally trigger the shortcut system. To solve this we simply disabled the ability to open the shortcut system while pinching.

Besides these fixes for palm direction conflicts, we also wanted to test a few solutions to minimize accidental pinches. We experimented with putting an object in a user’s pinch point whenever they had a pinch power enabled. The intention was to signal to the user that the pinch power was ‘always on.’ When combined with glowing fingertips and audio feedback driven by pinch strength, this seemed successful in reducing the likelihood of accidental pinches.

We also added a short scaling animation to planes as they spawned. If a user released their pinch before the plane was fully scaled up the plane would scale back down and disappear. This meant that short unintentional pinches wouldn’t spawn unwanted planes, further reducing the accidental pinch issue.

The Bow Hand

For our second ability we looked at the movement of pinching, pulling back, and releasing. This movement was used most famously on touchscreens as the central mechanic of Angry Birds and more recently adapted to three dimensions in Valve’s The Lab: Slingshot.

Virtual slingshots have a great sense of physicality. Pulling back on a sling and seeing it lengthen while hearing the elastic creak gives a visceral sense of the potential energy of the projectile, satisfyingly realized when launched. For our purposes, since we could pinch anywhere in space and pull back, we decided to use something a little more lightweight than a slingshot: a tiny retractable bow.

Pinching expands the bow and attaches the bowstring to your pinched fingers. Pulling away from the original pinch position in any direction stretches the bowstring and notches an arrow. The longer the stretch, the greater the launch velocity on release. Again we found that users rotated their hands while using the bow into poses where their palm direction would accidentally trigger the shortcut system. Once again, we simply disabled the ability to open the shortcut system, this time while the bow was expanded.

To minimize accidental arrows spawning from unintentional pinches, we again employed a slight delay after pinching before notching a new arrow. However, rather than being time-based like the plane spawning animation, this time we defined a minimum distance from the original pinch. Once reached, this spawns and notches a new arrow.
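The distance-based gating and the draw-to-speed mapping reduce to a couple of comparisons. The threshold and speed constants below are illustrative assumptions, not Leap Motion's values:

```python
def dist(a, b):
    """Euclidean distance between two 3D points (tuples)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

NOTCH_DISTANCE = 0.05   # metres of draw before an arrow notches (illustrative)
SPEED_PER_METRE = 40.0  # draw-length-to-launch-speed mapping (illustrative)

def bow_state(pinch_start, pinch_now):
    """Distance-based gating: no arrow until the draw passes the
    threshold; beyond it, launch speed grows with draw length."""
    draw = dist(pinch_start, pinch_now)
    if draw < NOTCH_DISTANCE:
        return None  # accidental or shallow pinch: nothing notched
    return SPEED_PER_METRE * draw

# A 2 cm twitch notches nothing; a 20 cm draw launches at 8 m/s.
short_draw = bow_state((0.0, 0.0, 0.0), (0.02, 0.0, 0.0))
full_draw = bow_state((0.0, 0.0, 0.0), (0.2, 0.0, 0.0))
```

Gating on distance rather than time fits the bow better than the plane's timer: a deliberate draw always travels, while an accidental pinch rarely does.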

The Time Hand

For our last ability, we initially looked at the movement of pinching and rotating as a means of controlling time. The idea was to pinch to spawn a clock and then rotate the pinch to turn a clock hand, dialing the time scale down or back up. In testing, however, we found that this kind of pinch rotation actually only had a small range of motion before becoming uncomfortable.

Since there wasn’t much value in having a very small range of time-scale adjustment, we decided to simply make it a toggle instead. For this ability, we replaced the pinch egg with a clock that sits in the user’s pinch point. At normal speed the clock ticks along quite quickly, with the longer hand completing a full rotation each second. Upon pinching, the clock time is slowed to one-third normal speed, the clock changes color, and the longer hand slows to complete a full rotation in one minute. Pinching the clock again restores time to normal speed.
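The toggle behaviour boils down to flipping a time-scale value between 1 and 1/3 on each pinch. This sketch is illustrative, not the demo's actual code:

```python
class TimeHand:
    """Clock that toggles the simulation's time scale on each pinch."""

    NORMAL = 1.0
    SLOW = 1.0 / 3.0  # one-third normal speed, as in the demo

    def __init__(self):
        self.time_scale = self.NORMAL

    def pinch(self):
        """Each pinch on the clock flips between normal and slow time."""
        self.time_scale = (
            self.SLOW if self.time_scale == self.NORMAL else self.NORMAL
        )
        return self.time_scale

t = TimeHand()
```

A toggle also sidesteps the comfort problem entirely: no sustained wrist rotation is ever required.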


The post Exclusive: Validating an Experimental Shortcut Interface with Flaming Arrows & Paper Planes appeared first on Road to VR.

Leap Motion Open-sources Project North Star, An AR Headset Prototype With Impressive Specs

Earlier this year Leap Motion had been teasing some very compelling AR interface prototypes, demonstrated on an unknown headset. The company revealed that the headset is a prototype dev kit, designed in-house, offering a combined 100 degree field of view, low latency, and high resolution. Leap Motion has begun open-sourcing the design of the device, which they’re calling Project North Star.

Update (6/6/18): Leap Motion has begun the process of open-sourcing the North Star headset. Today the company published a new hub page for Project North Star and detailed documentation showing how to construct the headset. Mechanical and electronic schematics and designs have been released on GitHub, and everything falls under an Apache 2.0 license.

Image courtesy Leap Motion

“Our goal is for the reference design to be accessible and inexpensive to build, using off-the-shelf components and 3D-printed parts. At the same time, these are still early days and we’re looking forward to your feedback on this initial release. The mechanical parts and most of the software are ready for primetime, while other areas are less developed,” the company notes on its blog. “The reflectors and display driver board are custom-made and expensive to produce in single units, but become cost-effective at scale. We’re also exploring how the custom components might be made more accessible to everyone.”

The company promises additional details and updates to the open-sourced information “in the coming weeks.”

Original Article (4/9/18): Founded in 2010, Leap Motion develops leading hand-tracking hardware and software. Though their first piece of hardware was designed for desktop input, the company pivoted into VR, and more recently the AR space, exploring how their hand-tracking tech can enable new and intuitive means of interacting with virtual and augmented information.

With AR hardware still in its infancy, the company sought to build their own in-house prototype AR headset, targeting specifications far beyond what’s available to consumers today. This was so they could design AR interfaces, based on their hand tracking tech, targeting the capabilities of future AR headsets. They’re calling this work Project North Star, and plan to open-source the design next week, saying that such a headset could cost “under $100 dollars to produce at scale.”

Image courtesy Leap Motion

The prototype headset uses side-mounted displays with large ‘bird bath’ style optics (similar to the Meta 2 approach), which afford the device a 1,600 × 1,440 per-eye resolution at 120 FPS, with over 100 degrees of combined field of view, and hand-tracking from the company’s latest hardware, which tracks at 150Hz over a 180 × 180 degree area.

The version of Project North Star which Leap Motion plans to open-source is actually a pared-back version of an earlier prototype, which boasted greater specs but was quite a burden to wear. The team at Leap Motion constructed this earlier version as a baseline of what could be achieved.

An earlier Project North Star prototype aimed to embody top specs, but wasn’t very concerned with form factor. | Image courtesy Leap Motion

“[…] we wanted to create something with the highest possible technical specifications, and then work our way down until we had something that struck a balance between performance and form-factor,” the company shared on its blog today. “[…] The vertical field of view struck us most of all; we could now look down with our eyes, put our hands at our chests and still see augmented information overlaid on top of our hands. This was not the minimal functionality required for a compelling experience, this was luxury.”

With a good look at what could be achieved, the team used masking tape over the lenses to crop down the field of view to get a feel for how much they could reduce the size of the lenses before losing some of the essential experience due to the lower field of view. Once they found that balance they began the process of cutting smaller optics and shrinking the headset, moving from cell phone displays to a custom display system using a pair of 3.5″ fast-switching LCD displays.

“We ended up with something roughly the size of a virtual reality headset. In whole it has fewer parts and preserves most of our natural field of view. The combination of the open air design and the transparency generally made it feel immediately more comfortable than virtual reality systems (which was actually a bit surprising to everyone who used it),” the company writes. “[…] Putting this headset on, the resolution, latency, and field of view limitations of today’s [AR] systems melt away and you’re suddenly confronted with the question that lies at the heart of this endeavor: What shall we build?”

Indeed, with hardware in hand, the company has been focusing on that question; it was through the Project North Star prototype, paired with a wearable camera, that Leap Motion’s VP of Design, Keiichi Matsuda, shot those tantalizing ‘virtual wearable’ prototype videos which we recently called “a potent glimpse at the future of your smartphone.”

Image courtesy Leap Motion

Leap Motion says they have a few tweaks left to do before open-sourcing the Project North Star design next week, including “room for enclosed sensors and electronics, better cable management, cleaner ergonomics and better curves […] and support for off the shelf head-gear mounting systems.”

There are also a number of areas where Leap Motion says that Project North Star is ripe for further development:

  • Inward-facing embedded cameras for automatic, precise alignment of the augmented image with the user’s eyes, as well as eye and face tracking
  • Head-mounted ambient light sensors for 360-degree lighting estimation
  • Directional speakers near the ears for discreet, localized audio feedback
  • Electrochromic coatings on the reflectors for electrically controllable variable transparency
  • Micro-actuators that move the displays by fractions of a millimeter to allow for variable and dynamic depth of field based on eye convergence

With the open-sourcing of the Project North Star hardware and software, Leap Motion hopes that the design will “spawn further endeavors that will become available to the rest of the world.”
