HTC Reveals Passthrough AR Faceplate For Vive Cosmos

A new faceplate for HTC’s modular Vive Cosmos headset will introduce AR capabilities to the device — at least for developers.

The Cosmos XR Faceplate, as it’s called, is a developer product planned for launch in Q2 2020. The kit features two “high-end” front-facing cameras used to pass a view of the real world through to the wearer. Cosmos XR offers a 100-degree field of view (FOV) when using AR. Four of the Cosmos’ inside-out tracking cameras remain in place for six degrees of freedom (6DOF) tracking.


The Cosmos should be able to provide an augmented reality experience with the faceplate — allowing for a view of the real world with digital objects inserted into the scene. HTC Vive Americas GM Dan O’Brien told us the kit supports hand-tracking input as well as a single Vive Cosmos controller in experiences built by Vive’s own internal team.

Whereas headsets like HoloLens and Magic Leap provide a dedicated AR solution made possible with transparent optics to see the real world, a traditional VR display is opaque and powers a view of a fully virtual environment. The XR Faceplate, then, brings in the view of the outside real world to allow a “passthrough” way to view AR. This should unlock different types of experiences that mix elements of both AR and VR.

Crucially, though, the XR Faceplate is not an add-on intended for gamers and other consumers who bought a Vive Cosmos, at least not yet. O’Brien explained that the device would eventually be “for enterprise and professionals”, but it’ll need a developer test run first.

“The way it works on the enterprise side is you go proof-of-concept, to pilot, to full distribution,” O’Brien said of the kit’s rollout. “It’s a different adoption cycle, much different than game and consumer.”

Developers will be getting a first look at the XR Faceplate at GDC 2020 in mid-March. HTC will then ship it out to studios in Q2, though a price wasn’t mentioned. Given its enterprise focus, though, we wouldn’t expect it to be cheap.

Vive Cosmos isn’t the only headset bringing passthrough AR to market. Earlier this month we took a look at the LYNX-R1, another enterprise-level device, albeit without the modular features.

The post HTC Reveals Passthrough AR Faceplate For Vive Cosmos appeared first on UploadVR.

LYNX-R1 Is A $1500 Standalone Passthrough AR Headset

LYNX-R1, from France-based startup Lynx, is the first 6DoF standalone passthrough video AR headset. It is priced at $1499.

Passthrough AR

There are currently two fundamental types of AR headsets: see-through and passthrough. Most AR headsets, such as Microsoft’s HoloLens 2, are see-through. The user views the real world through the glass directly, with virtual objects superimposed onto that glass.

The technology behind see-through AR optics is still in the very early stages. The field of view is narrow and virtual objects cannot be fully opaque.

Passthrough headsets, like LYNX-R1, use the same kind of display system as VR headsets, except instead of rendering an entirely virtual world they display the real world via cameras. While the real world won’t necessarily look as good, this allows for AR across a much wider field of view, as well as full virtual object opacity and lower cost (HoloLens 2 is priced at $3500).
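The opacity difference can be sketched numerically. The short Python illustration below is a simplification for clarity, not any headset's actual display pipeline: a see-through display can only add light on top of the real world, so a fully black virtual object vanishes, while passthrough compositing treats the camera feed as just another image and can replace it outright.

```python
def additive_blend(real, virtual):
    # See-through optics: the display ADDS light to the real scene,
    # so virtual pixels can brighten but never darken or occlude it.
    return tuple(min(r + v, 1.0) for r, v in zip(real, virtual))

def alpha_over(real, virtual, alpha):
    # Passthrough video: the camera feed is just an image, so a
    # virtual pixel can fully replace the real one when alpha = 1.
    return tuple(alpha * v + (1 - alpha) * r for r, v in zip(real, virtual))

bright_wall = (0.9, 0.9, 0.9)   # a bright real-world pixel
black_cube = (0.0, 0.0, 0.0)    # a fully black virtual object

print(additive_blend(bright_wall, black_cube))   # (0.9, 0.9, 0.9) -- cube invisible
print(alpha_over(bright_wall, black_cube, 1.0))  # (0.0, 0.0, 0.0) -- cube fully opaque
```

This is exactly why see-through headsets render virtual objects as translucent ghosts over bright backgrounds, while passthrough headsets can show solid, fully occluding ones.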

Some, including Lynx, call this kind of product a “mixed reality headset”.

Fully Standalone

LYNX-R1 is not the first passthrough AR headset. Varjo XR-1 and XTAL both promote similar capabilities, as did the Vrvana Totem acquired by Apple.

But other 6DoF passthrough headsets have to be tethered to a PC in order to function. Like the Oculus Quest, LYNX-R1 is standalone and wireless — it has the computing hardware, battery, and storage onboard.

Lynx is using Qualcomm’s new Snapdragon XR2 chipset. That’s an XR-focused variant of the Snapdragon 865, roughly twice as powerful as the Oculus Quest’s Snapdragon 835. This is paired with 6GB of RAM and 128GB of onboard storage.

Other Specifications

The headset uses dual 1600×1600 LCD panels running at 90Hz. LCD provides better sharpness than PenTile OLED, but with less rich colors. The lenses are unique “4-fold catadioptric freeform prisms” with a claimed circular field of view of 90°.

There are four cameras on the exterior. Two black & white cameras provide positional tracking, and two color cameras are used for the passthrough and computer vision tasks such as occlusion mapping and hand tracking, including gesture recognition.

Inside the headset are eye tracking cameras. The headset also has two speakers and two microphones, enabling positional audio and voice communications.

Charging is done via the USB Type-C port, and Lynx claims the battery lasts for two hours of “active use”.

Gap In The Market?

LYNX-R1 could fill a gap in the enterprise market for a wireless, standalone AR headset which offers a relatively wide field of view for a relatively affordable price. For consumers, the idea of video passthrough may be unappealing, but professionals might not care whether they’re looking through glass or a camera.

LYNX-R1 is slated to ship in summer of this year. Preorders are available now from the company’s website.


Join Us LIVE On YouTube To Discuss The Week’s News & The XR/VR/AR/MR Debate

UploadVR’s weekly podcast, The VR Download, is LIVE on YouTube today at 10:30am PST (18:30 UTC)!

Unlike regular video podcasts, The VR Download is broadcast from virtual reality! Our team are together in a virtual space, giving us many of the benefits of a studio even though we live on different continents.

This week’s Hot Topic: What do the terms VR, AR, MR & XR actually mean? Does it matter?

If you want to know more about The VR Download, head on over to our new webpage for the show!

As always, we’ll also be making it available for audio-only listening on Apple, Google, Spotify, TuneIn/Alexa, Stitcher, and more within a couple of hours of airing.

Watch In VR With Bigscreen!

Every episode, you can watch The VR Download LIVE in virtual reality with an audience of other VR users on any major VR headset (including Oculus Quest and Oculus Go!), via the Bigscreen platform.



MIT ‘Reality Hack’ XR Hackathon Runs From January 16-20

MIT is hosting the MIT ‘Reality Hack’ XR Hackathon from January 16-20. The event brings together people who work in, study, or are simply interested in VR and AR, who collaborate and compete in workshops over a three-day hackathon, after which their work is judged at a public expo.

The hackathon is held every January and brings “thought leaders, brand mentors and creators, participants, students, and technology lovers” together to “attend tech workshops, talks, discussions, fireside chats, collaborations, hacking, and more.” The first three days of the event are workshops where the teams produce their work; the final two days comprise the expo, where that work is judged, with the second expo day open for public viewing.

The event also ran in 2019, with 400 participants (40 percent of whom were women or non-binary) split into 100 teams, supported by 60 mentors and 25 judges. The hackathon has some pretty high-profile sponsors this year too, including Microsoft, Nreal, Magic Leap, HTC Vive, HP and more.

For more information on the hackathon, or to register interest in competing, visit the MIT Reality Hack site. If you’re in Massachusetts, you can see all the hard work that comes out of the hackathon for yourself: the public expo is open on January 20 at the MIT Ray and Maria Stata Center in Cambridge, from 2:00-4:30pm, and you can register for a free ticket on Eventbrite.


Microsoft’s SharePoint Spaces Collaborative Tool Is Coming This Year, Including Quest Support

Microsoft has revealed that its file sharing and collaboration platform, SharePoint, is getting a VR maker tool later this year, dubbed SharePoint Spaces.

The tool was originally revealed back at the SharePoint Virtual Summit in 2018, but is now nearing general release across a range of widely-used headsets, including the Oculus Quest and Microsoft’s own HoloLens AR headset. It’s currently in private preview, but is expected to expand beyond that soon and launch in the first half of 2020, as spotted by Virtual Reality Times.

According to Microsoft’s own official description, SharePoint Spaces allows you to:

“Build immersive experiences with point-and-click simplicity; start with smart templates that have beautiful surroundings, ambient sounds, rich textures, and lighting. Then add content, which can include 2D files and documents or images you may already have in SharePoint. Immerse yourself in mixed reality, focus your attention, engage your senses, and spark your curiosity and imagination. Unlock new scenarios for communication, learning, and collaboration.”

Reportedly, it will be seamlessly and intuitively integrated with SharePoint proper. For example, once something is uploaded into a folder it will be immediately viewable in a VR space. SharePoint Spaces will also support immersive content like 360 photos and videos, as well as 3D models and objects. You can link pieces of information together to build courses and information flows without any programming knowledge — it’s all point-and-click inside SharePoint Spaces itself.

Would your company benefit from having a VR workspace like SharePoint Spaces? Let us know if this is appealing to you down in the comments below!


Unity Adds Toolkit For Common VR/AR Interactions

The Unity game engine has released a preview (beta) ‘XR Interaction Toolkit’, which handles some core interactions for VR and AR.

Like most optional Unity features, the XR Interaction Toolkit is downloaded and activated from the Package Manager.

Unlike the VRTK toolkit, Unity favors laser pointer selection over direct manipulation. This works well across a wider variety of platforms, but can feel less immersive on higher-end systems.

Unity’s XR Interaction Toolkit currently provides the following four features:

Object Selection & Manipulation (AR & VR)

This lets the user point a laser at objects, select them, and grab them either directly or from a distance. With an object grabbed, the user can rotate it or throw it. This behaviour is configurable.
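The throw behaviour in particular usually relies on estimating the hand's velocity at the moment of release. A common technique, sketched here in Python purely as an illustration (not the toolkit's actual C# implementation), is to average position deltas over the last few frames to smooth out tracking noise:

```python
def release_velocity(positions, dt, window=4):
    """Estimate throw velocity from the last few tracked hand positions.

    A single frame's delta is too noisy for a satisfying throw, so we
    average the per-frame velocity over the last `window` frames.
    """
    recent = positions[-(window + 1):]
    deltas = [tuple((b - a) / dt for a, b in zip(p0, p1))
              for p0, p1 in zip(recent, recent[1:])]
    return tuple(sum(d[i] for d in deltas) / len(deltas) for i in range(3))

# Hand moving steadily +0.01 m per frame along x, tracked at 90 fps:
frames = [(0.01 * i, 1.2, 0.0) for i in range(6)]
vx, vy, vz = release_velocity(frames, dt=1 / 90)
print(round(vx, 3))  # 0.9 m/s
```

The window size trades responsiveness against smoothness, which is one reason such behaviour is typically exposed as a configurable parameter.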

UI Interaction (VR)

The same kind of laser pointer used for object selection can also be used for UI interactions. This means the built-in Unity UI system that developers are already used to can also be used in VR.

Teleportation & Snap Turning (VR)

This allows developers to quickly add teleportation and snap turning to their apps. “Smooth” locomotion isn’t built in, but that’s much easier for a developer to add.
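The difference between the two turning styles can be sketched language-agnostically (Python here for brevity, since the toolkit itself is C#; the 30° increment and turn rate below are illustrative values, not the toolkit's defaults): snap turning applies a fixed rotation per thumbstick flick, while smooth turning integrates a turn rate every frame.

```python
def snap_turn(yaw_degrees, direction, increment=30.0):
    # One discrete rotation per thumbstick flick; the abrupt jump
    # avoids the continuous optic flow that makes many users sick.
    return (yaw_degrees + direction * increment) % 360.0

def smooth_turn(yaw_degrees, stick_x, degrees_per_second, dt):
    # Continuous rotation integrated from stick deflection each frame.
    return (yaw_degrees + stick_x * degrees_per_second * dt) % 360.0

yaw = snap_turn(0.0, direction=+1)   # one flick right
print(yaw)  # 30.0

# Full right deflection at 90 deg/s over one 90 Hz frame:
print(smooth_turn(0.0, 1.0, 90.0, 1 / 90))  # 1.0
```

Smooth turning is "much easier to add" precisely because it is one line of per-frame integration like the above; the comfort-sensitive part is the snap behaviour the toolkit ships with.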

Object Placement (AR)

This feature, for Apple’s ARKit and Google’s ARCore, allows smartphone users to swipe to place virtual objects on real world planes.
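Under the hood, this kind of placement typically unprojects the screen tap into a ray and intersects it with a detected real-world plane. A minimal Python sketch of that math follows; it illustrates the general technique, not ARKit's or ARCore's actual implementation:

```python
def ray_plane_hit(ray_origin, ray_dir, plane_point, plane_normal):
    """Return where a ray (e.g. an unprojected screen tap) hits a
    detected plane, or None if the ray misses it."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    t = dot([p - o for p, o in zip(plane_point, ray_origin)], plane_normal) / denom
    if t < 0:
        return None  # plane is behind the ray origin
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Camera 1.5 m above a floor plane (y = 0), looking straight down:
hit = ray_plane_hit((0.0, 1.5, 0.0), (0.0, -1.0, 0.0),
                    (0.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(hit)  # (0.0, 0.0, 0.0)
```

The returned point is where the virtual object gets anchored, which is why placement only works once the AR framework has detected a plane to intersect.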

No Need To Reinvent The Wheel

The goal of the XR Interaction Toolkit seems to be to make it so that developers don’t need to “reinvent the wheel” for simple VR & AR interactions.

For new developers this means less time spent on the basics, time that can instead be used to craft the actual experience. Of course, developers on larger projects may use a more advanced framework such as VRTK, although VRTK has no UI features yet.

For end users, since Unity powers the majority of XR apps, this could result in more consistency and standardization for fundamental VR & AR interactions. If you know how to select, grab, and locomote in one VR app, that should transfer over to others — similar to how control schemes for console games eventually standardized over time.


Eden Snacker Is A Low Friction VR System Aimed At Enterprise

Eden Immersive unveiled a new VR headset kit for enterprise applications called the ‘Snacker’.

The Snacker is a 3DOF headset without straps or a head mount. Instead, it uses a grip that lets users hold the headset to their face during use. The grip system was developed specifically with the enterprise market in mind, so that companies can offer users and customers a lower-friction experience that boosts engagement, while also ensuring users don’t mess up their hair or makeup during use. The idea, per its name, is to make VR content ‘snackable’.


The headset claims a field of view of 105 degrees and a refresh rate of 72Hz, with a 3840 x 2160 LCD panel.

The headset and grip rest together on a stand, allowing anyone to easily pick up and use the Snacker. The stand also wirelessly charges the headset and can support 4G or 5G connectivity to keep the headset up to date without a wired or Wi-Fi connection.


The headset is accompanied by a tablet that lets enterprises control and guide users through the VR experience with ease. There is also a ‘self-serve’ version that lets users control the experience themselves, selecting options on the tablet before raising the headset to their face.

The Eden Snacker is scheduled to launch in 2020. Any businesses or other parties interested in the headset can sign up for early access.


Survey: Enterprise VR growing twice as fast as consumer use

(Image courtesy rawpixel via Pixabay.)

VR and AR vendors are seeing a lot more growth on the enterprise side of the market, with 46 percent saying that growth has been strong or very strong for enterprise VR, and 47 percent saying the same about enterprise AR and MR.

By comparison, only 24 percent are reporting strong growth for consumer virtual reality, and only 31 percent say the same for consumer augmented reality and mixed reality technology, according to a survey of hundreds of businesses released this month by VR Intelligence.

How companies describe VR and AR business growth over the past 12 months, according to the 2019 XR Industry Survey by VR Intelligence. (Image courtesy VR Intelligence.)

As a result, two-thirds of VR and AR technology companies are focusing on the enterprise market, compared to a third who are prioritizing the consumer side.

In fact, most respondents said that they don’t expect to see widespread consumer adoption of extended reality technologies — virtual reality, augmented reality, or mixed reality — for at least three years, with about a fifth saying that this won’t happen for at least five years.

Obstacles to adoption include the price of the headsets, lack of content, usability problems, and motion sickness.

However, report authors said that 2020 could be a breakout year due to a series of recent announcements from companies such as Google, Apple, Facebook, YouTube, and Huawei.

The report also included responses from enterprise users of XR technology.

The results were overwhelmingly positive — 93 percent of enterprises said that VR had a positive impact, and 88 percent said that AR and MR had a positive impact.

Business impact of VR and AR, according to the 2019 XR Industry Survey by VR Intelligence. (Image courtesy VR Intelligence.)

The XR Industry Insight Report was researched and produced in association with VR Intelligence‘s upcoming VRX conference and expo, which takes place from December 12-13, 2019 in San Francisco.

Why XR needs diversity

(Image courtesy Draw & Code.)

Immersive technology means you are able to place yourself inside a world, not just to peer inside it. Like all of the tech industry, extended reality features a diversity imbalance — is this burgeoning XR sector a chance for us to finally build a diverse corner of technology? And is its inherently first-person nature making it all the more important that diversity is pursued?

Today I am thinking about stories. Why? Because this morning I was reminded of the eternal Isaac Newton quote that “If I have seen further than others, it is by standing upon the shoulders of giants.” And I began to wonder who and what these giants may be in our world. Our urgent, cutting edge, immersive world.

The legacy of giants is very much alive in the everyday for us. Ada lives in each line of code, Steve in every swipe and Tom in every VR headset. Their work is alive in the hardware and the software that is used by nearly every single person both in our studio and on this planet. A commendable success, a feat that most certainly makes them giants amongst us mere mortals. To put it simply, their work altered our world and ourselves forever.

And with time this legacy of theirs has morphed into something far greater and most certainly far more human than I suppose they ever would have anticipated. Machines and solutions once deemed robotic and devoid of human emotion — and human error — are being reimagined into something that really is very human. Their legacy has begun to remold the art of storytelling.

Storytelling is one of the most deep-rooted and elemental aspects of being human. We are all storytellers: in our minds as we think about the everyday, in the boardroom where we discuss future sales and in our homes as we raise our children, build lasting relationships and gorge on the works of J.K. Rowling, Stephen King, Charles Dickens, Jane Austen. Even our neighbor’s series of tweets about the big match tell a tale — at the core, these things that we do are all stories. And these stories encompass our visions of the future, opinions on the past and our interaction with the present.

Look at the applications we have devoured; Snapchat tells a story through quickly captured footage, Facebook tells a story through a status, Instagram through an image, Twitter through a microblog. We are all authors in those worlds.

But here in the corner of the tech industry that Draw & Code inhabits, we boast something quite magical; immersive technology. It is the wholehearted belief of many that immersive technology is one of the greatest gifts we have created as storytellers. Immersive experiences, particularly via VR, are about embodiment — you are in the shoes, or at least the eyes, of the protagonist. It’s not distant, there is no cinematographer — this is as close to living the story for yourself as it gets.

Yet, we are serving it an injustice. It is our job as the purveyors of this magic, to tell the story correctly. However, we cannot tell stories properly, fully, unless we have the experiences and the opinions of many. We cannot tell stories properly until we understand the many angles and focal points it’s viewed through. And so we bang on our drum that we need more diversity, but I often wonder if shouting about it is enough?

Surely if people knew how incredible this corner of technology was, they would come and join it? Society is clouded by old judgments and stereotypes, all too quick to assume that technical work is filled with only the things that a man can love, rather than the inspirational and incredible things that actually exist in our day-to-day life in this studio. At Draw & Code, I’ve found myself surrounded by and contributing to projects that take audiences back in time to the world inhabited by the Terracotta Army, make toys burst into life in the palm of your hand, and explore interior design solutions for leading retail brands. Immersive technology is in demand right now and it’s a passport to adventures across a multitude of sectors. And it’s the opportunity to work in teams filled with exceptional talent and exceptionally warm hearts — people love what they do in XR.

So we, like much of the XR sector, continue to work on our magic with our male-dominated teams and our solutions that enchant and excite. But how much could the output of the immersive industry be improved by looking beyond the current workforce? And while we will always enthrall the tech-savvy, could the products produced by this evolving industry have an even wider appeal if they were coming from teams that represented a bigger audience? Our immersive technology corner has a duty to encompass all and everyone, if at all possible. To awaken the minds and memories of the old, to excite the generation of tomorrow and to alter the thinking of those who can shape the world. Our corner has the potential to make more stories — in the home, in the workplace, on the bus. And, it will.

In immersive technology, our work is far from robotic and dull. We’re not just machines and code, we’re the people who will tell the stories of tomorrow and we want our stories to be whole.

This column reprinted with permission from Draw & Code.

OpenXR: Version 0.9 Of The XR Standard Released; Final Version Still Coming In 2019

Since 2017, the Khronos Group has united the largest companies in the AR and VR industry to define a shared XR standard. The group has now released a provisional 0.9 version for VR developers to test. Based on feedback from those experts, the final 1.0 version is slated to appear later this year.


The Khronos Group, the consortium of leading hardware and software companies, has officially released OpenXR 0.9, publishing the first provisional specification for the future XR standard.

OpenXR is intended to provide a common, unified, royalty-free standard that gives developers the ability to build software for different VR devices, or port it between them, thanks to a cross-platform API. The project’s goal is to permanently reduce the fragmentation of the XR industry.


Image courtesy: Khronos Group | via: Road to VR

Brent Insko, lead VR architect at Intel and chair of the OpenXR working group, writes:

“OpenXR aims to simplify AR/VR software development by enabling applications to reach a larger number of hardware platforms without having to port or rewrite their code, and by giving platform vendors that support OpenXR access to more applications. The provisional OpenXR specification, together with runtimes publicly available at launch and more coming in the next few weeks, will enable practical, cross-platform testing by app and engine developers. The working group welcomes developer feedback to ensure an OpenXR 1.0 specification that truly meets the needs of the XR industry.”

The provisional 0.9 release is now intended for XR developers to test, gathering feedback for the final 1.0 version. Based on implementers’ responses, the specification will then receive its final polish. The finished version of OpenXR is still expected to arrive in 2019.

To establish the OpenXR standard for AR and VR, well-known industry players such as Oculus, Valve, Unity, Epic, Samsung, Sony, Google, Intel, Nvidia, Microsoft, Magic Leap and HTC are backing the project.


Image courtesy: Khronos Group

Alongside the provisional release, Microsoft is shipping an OpenXR runtime to make the new standard compatible with Windows VR headsets. Support for the upcoming HoloLens 2 is expected to follow shortly, and Oculus plans to provide matching runtime support for Oculus Rift and Oculus Quest later this year. Collabora has additionally announced an open-source SDK for Linux.

The 0.9 release, along with further information, can be found on the Khronos Group’s official website.

(Sources: Road to VR | Khronos Group)

The post OpenXR: Version 0.9 Of The XR Standard Released; Final Version Still Coming In 2019 appeared first on VR∙Nerds.