Batman “Mixed Reality” Demo Shows VR Cloud Gaming Still Has a Long Way to Go

At this year’s Mobile World Congress (MWC) it seemed every mobile network operator on the planet was hawking the next hotness in data connectivity, the magical alphanumeric ‘5G’. It’s true 5G is slated to pave the way for plenty of changes in how users consume mobile content thanks to a dramatic increase in bandwidth and lowered latency, but if you’re salivating over the possibilities for what that means for VR gaming in the near future, you may want to step back a bit.

Cloud gaming isn’t a new concept, at least in the world of traditional flatscreen games. Nvidia has GeForce Now, Sony has PlayStation Now, and both Google & Microsoft have their own future cloud gaming projects in the works too. While the infrastructure around gaming-focused edge computing is still very much in its early stages, requiring companies to maintain servers as close to the end-user as humanly possible, the hypothetical benefit to gamers is obvious: extremely low-powered computers can stream games previously available only on the best of the best rigs.

Mobile network operators like Verizon, Sprint, AT&T, T-Mobile, Vodafone and many others are signing on to paint the world with 5G starting this year. An increasing number of users will soon have access to data faster than even at-home fiber optic connections can provide, making the migration from wired to wireless almost a foregone conclusion in the minds of many. To wit, some of these companies publicly showed off VR cloud gaming actually working for the first time—one of the most difficult problems due to the inherent need to keep VR games chugging at or below the ~20ms motion-to-photon latency threshold, widely considered the maximum before users start to notice lag.
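To put that threshold in context, here’s a rough, purely illustrative motion-to-photon budget for a cloud-rendered VR frame. Every number below is an assumption for the sake of example, not a measurement from any of the demos discussed here:

```python
# Hypothetical motion-to-photon budget for cloud-rendered VR
# (illustrative assumptions only; none of these figures come from the MWC demos).
budget_ms = {
    "head tracking + pose upload": 2.0,
    "5G uplink to edge server": 2.0,
    "render frame on edge GPU": 7.0,
    "encode video frame": 3.0,
    "5G downlink to headset": 2.0,
    "decode + reproject on headset": 3.0,
    "display scanout": 5.0,
}

total = sum(budget_ms.values())
print(f"Total: {total:.0f} ms against a ~20 ms target")  # 24 ms: already over budget
```

The point being: even when network transport only costs a few milliseconds, the rest of the pipeline leaves precious little room to spare.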

Stepping into Ericsson’s enormous MWC booth, second only to Huawei’s at over 6,000 m² (~65,000 ft²), I cautiously ambled over to an abandoned station outfitted with a Vive Pro and Vive Wireless Adapter. The booth attendant claimed the “mixed reality” Batman experience, which was built in partnership by AT&T, Ericsson, Warner Bros. and Intel, was delivering a total latency of 4–6 ms through their mock-up 5G network. That’s about as low as you could reasonably hope for, so I was excited to pop in and see for myself.

Image courtesy Ericsson

Although the real-time rendered experience didn’t suffer any discernible latency, it was an absolute failure at demonstrating why VR users want cloud gaming in the first place. In short: it was hot garbage.

With video from Vive Pro’s passthrough cameras placed as a backdrop behind my head (that’s totally “mixed reality,” right?), what I experienced was essentially a 180-degree mess. I was treated to extremely low-poly graphics that looked about on par with what can be accomplished on a mobile VR headset like Gear VR or Oculus Go. Adding insult to injury, the two-minute experience, which featured Batman stopping the Scarecrow from—no joke—using 5G for evil, was presented to me in 3DOF rather than the full positional tracking the Vive Pro is capable of. I was also told there was an interactive bit using a single Vive controller, but the booth attendants removed it because “nobody understood what to do.”

After seeing it in 3DOF and without any level of interactivity, I was pretty skeptical whether it was actually a real-time rendered experience or just a 180-degree stereoscopic video. I was assured it was all real-time.

Image courtesy Ericsson

I imagine this was done for the singular reason of showing the setup’s lowest possible latency. It’s not an insubstantial achievement from a technical standpoint either, but low latency is as good as useless if this is the sort of toothless VR content AT&T, Intel, Ericsson and Warner Bros. think will fit into a real-world use case. Cutting literally every possible corner on content to get latency down to something you can proudly advertise as ostensibly solved borders on willful deception.

There was a company at MWC pushing a more realistic version of VR cloud gaming though, warts and all. Two days earlier I got a chance to visit HTC’s booth where they were showing a similar setup streaming Superhot VR (2017) to a Vive Focus Plus over a mock-up 5G network. Although the implementation was far from perfect, it at least showed real SteamVR content running in the cloud, and delivered in 6DOF like you’d expect.

Image courtesy HTC

HTC’s streaming latency was well above 20ms, and it seemed to be heavily relying on time warp to keep things smooth. To me, it further drives home the fact that even in controlled environments with purpose-built networks completely dedicated to the task of remotely rendering VR games, there’s still a long way to go before we get plug-and-play VR cloud gaming.

While Ericsson’s demo failed to accurately sell the core idea behind the technology, it did manage to unwittingly reveal that VR cloud gaming is going to be an extreme balancing act when it arrives at some point in the future.


Hands-on: Vive Focus Plus Brings 6DOF Controllers, New Lenses & Better Comfort

Announced just last week, Vive Focus Plus is HTC’s next iteration of its standalone VR headset, adding a pretty big missing puzzle piece by replacing its predecessor’s single 3DOF (rotation-only) controller with a pair of new 6DOF controllers. But that’s not all there is to Vive Focus Plus.

We’ve seen Vive Focus using 6DOF controllers a few times before the new version’s public debut at MWC 2019—it was only last month at CES that the company’s hardware partner Chirp Microsystems (now acquired by TDK) was still giving demos of the 6DOF controller dev kit in action, which included two controllers and an external snap-on faceplate studded with the little pinhole-size emitters and receivers for its ultrasonic controller tracking.

Vive 6DOF controller dev kit, Photo by Road to VR

A few things have changed about Vive Focus Plus from its predecessor—the new headset boasts better comfort, a clearer image delivered by new lenses, and of course the 6DOF controllers—but outside of that, not much else separates Focus from Focus Plus.

It still contains the same Qualcomm Snapdragon 835 SoC, the same dual 1,600 × 1,440 OLED displays, and the same optical headset tracking as before.

Vive Focus Plus, Photo by Road to VR

I’ll save you some time if you’ve been following along already by not rehashing the entire product; if not, check out our in-depth hands-on with the original Vive Focus from last year’s MWC to learn more about what fundamentally makes both of the headsets tick.

Now let’s dive into what’s new.

6DOF Controllers

Predictably the controllers handle nearly the same as Chirp’s, and we’ve written plenty about them over the past few months too. A quick refresher: it’s a pretty acceptable controller tracking implementation that offers ‘good enough’ latency and a wide enough tracking volume to keep the controllers spatially positioned even when they’re well outside of the user’s field of view, meaning you’ll rarely ever notice your controllers leaving their operational range (e.g. putting your hands behind your back, or behind your head).

Photo by Road to VR

The headset and controllers send and receive ultrasonic sound at a frequency well out of the range of what even a dog can hear, so there seems to be little that can throw them off in terms of interference. I found the controller tracking to work as advertised even on the noisy expo floor.

That said, the final production controller isn’t the best design out there, but certainly not the worst. This mostly comes down to button layout and its ambidextrous design—both left and right controllers are interchangeable. While a bespoke left and right controller molded to each hand would invariably be a better fit, it’s decidedly much more natural-feeling than the wands that come with the other Vive varieties on offer.

Photo by Road to VR

On the controller you’ll find a touchpad, an index finger trigger, a secondary grip button on the underside, and two awkwardly placed menu and home buttons that require you to adjust your grip to press.

In contrast to the dev kit, the new controllers offer only one real hardware improvement I could detect just by handling them: button and trigger presses have an improved analogue feel. In short, the controllers get the job done and appear to do so without taxing the Snapdragon 835 too much.


Improved Ergonomics & New Lenses

The new headset makes one obvious change from the original Vive Focus in the ergonomics department: a larger forehead piece that provides better weight distribution. And while it’s not an insubstantial change, I’d really need a much longer session than an expo floor demo to figure out just how much of an effect it has. That said, it did feel more comfortable than Vive Focus, which I last tried at CES in January.

Now that the ultrasonic controller tracking system is embedded in the headset itself, the new Focus Plus boasts an obvious benefit over the dev kit by delivering better balance, as the front-heavy tracking tech is offset internally.

The one change that is pretty significant is the new optics, which are remarkably clearer than those of the previous Vive Focus. While I wasn’t able to get confirmation on what specifically changed from Focus to Focus Plus, the difference is thankfully fairly easy to see with the naked eye.

Vive Focus Plus lenses, Photo by Road to VR

It appears the company has taken a few design cues from the Samsung HMD Odyssey, which has a similar Fresnel ridge layout. All other Vive products, the original Vive Focus included, feature concentric Fresnel ridges that terminate at a much smaller circle in the middle of the lens. On Vive Focus Plus you can see the ridges dissipate much earlier, well before reaching the center.

Here’s a look at Samsung Odyssey for comparison:

Samsung Odyssey lenses, Image courtesy Tom’s Hardware

And now for the original Vive Focus.

Vive Focus lenses, Photo by Road to VR

The upside to the new lens design is a reduced perception of an optical artifact unique to Fresnel lenses known as ‘god rays’: wispy streaks of light that appear to emanate from the center of the lens and jut outwards. The only app on display with high enough contrast to test this was the company’s conceptual 5G Hub cloud-rendering setup that streamed the PC VR shooter Superhot VR (2017) to Vive Focus Plus. The reduction in god rays was notable, and I’m hoping HTC moves further toward this lens style in future products.

Continued on Page 2: Enterprise? Consumer? »


Hands-on: LetinAR Brings a Larger FOV & Depth of Field to AR with ‘Pinhole Effect’ Optics

LetinAR, a Korea-based startup developing optical solutions for AR headsets and smartglasses, showed off a new implementation of their ‘pin mirror’ optical tech at Mobile World Congress (MWC) this week, bringing a sharp image quality, a large range of focal depth, and a field of view (FOV) that rivals many of the bigger players who instead use waveguide-based optics.

Dubbing their optics ‘PinMR’, the company develops a number of lens configurations featuring small pinhole-sized mirrors oriented to reflect images from displays (both micro and larger formats) to the user’s eye.

The company’s latest product demo, which they call PinMR “8K”, sources imagery from two 5-inch 4K LCD displays held within a fixed viewing frame. Questionable branding aside (you only see a 4K stereoscopic image), LetinAR told me the “8K” demo was created specifically to show off the boundaries of what their optical tech was capable of.

Photo by Road to VR

The image was bright, very sharp, and more importantly allowed me to focus on 3D images at varying distances, from 50cm (~20 inches) to infinity—something that’s done without the use of lightfields. The concept at play here is similar to pinhole camera photography, which creates images with a near-infinite depth of field, making everything appear in focus by limiting the amount of light to a small, more concentrated beam. In contrast to HoloLens 2 or Magic Leap One, PinMR doesn’t use waveguides, which the company says cuts down manufacturing complexity and cost.
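To make that pinhole intuition concrete, here’s a minimal back-of-the-envelope sketch (my own illustration, not LetinAR’s actual optical model), using the standard approximation that the angular defocus blur equals the aperture diameter times the dioptric focus error:

```python
import math

# Back-of-the-envelope illustration (assumed numbers, not LetinAR's optics):
# the angular defocus blur of a point at distance d_obj, viewed through an
# aperture of diameter a while the eye is accommodated to distance d_acc, is
# roughly  blur ≈ a * |1/d_obj - 1/d_acc|  (in radians). A smaller aperture
# shrinks the blur proportionally, which is the pinhole effect at work.

def defocus_blur_arcmin(aperture_m: float, d_obj_m: float, d_acc_m: float) -> float:
    blur_rad = aperture_m * abs(1.0 / d_obj_m - 1.0 / d_acc_m)
    return math.degrees(blur_rad) * 60.0   # convert to arcminutes

full_pupil = 4e-3     # ~4 mm pupil diameter (assumed)
pin_mirror = 0.5e-3   # sub-pupil effective aperture (assumed, for illustration)

# An object at 50 cm while the eye is focused at infinity:
print(defocus_blur_arcmin(full_pupil, 0.5, math.inf))   # ~27 arcmin: clearly blurred
print(defocus_blur_arcmin(pin_mirror, 0.5, math.inf))   # ~3.4 arcmin: near the eye's acuity limit
```

With a sub-pupil aperture the residual blur sits close to the eye’s resolution limit, which is why content from roughly 50cm out to infinity can appear acceptably sharp without lightfields.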

Side note: Karl Guttag, chief science officer of AR company RAVN and all-around AR expert, noted after a previous demo at CES 2018 that LetinAR appears to be following in the footsteps of research done at the University of North Carolina that produced a similar ‘pinlight’ optic for wide FOV augmented reality.


With an embedded Leap Motion hand tracker on the opposite side of the fixed frame, I was able to reach out to the 3D objects and bring them right up to the point where the vergence-accommodation conflict took over and I went a bit cross-eyed trying to resolve the image. That point seemed to be a little under an arm’s length away, although the company is able to produce similar configurations that go down to 25cm in focal depth.

I had to get fairly close to the optics themselves, which means glasses wearers would need a special prescription lens add-on, although the setup’s 80-degree FOV is seemingly rivaled only by larger ‘bird bath’ style optics such as Leap Motion’s Project North Star. This, I was told, was a result of the 50 tiny pin mirrors used in each of their ‘E99g’ optics.

Photo by Road to VR

The demo wasn’t without its downsides though. While the company claims the pin mirrors themselves aren’t detectable by the eye because they’re technically smaller than a human pupil, I could see a distinct blur in the extreme near-field.

This, I was told, was the result of creating slightly larger pin mirrors to ensure that the eyebox (or ‘sweet spot’) was large enough to accommodate a wide range of users. My time with a pair of smartglasses calibrated to my unique interpupillary distance seemed to diminish the issue, although I could still see a faint, shadow-like blur when individual pin mirrors were unlit.

Photo by Road to VR

The smartphone-tethered smartglasses were built to demonstrate just how small a form factor the company’s tech can achieve. The resolution of the single OLED microdisplay housed within the pretty normal-looking pair of ’50s-style frames wasn’t anything to write home about—only 640 × 400 pixels—although reading small-sized text was still possible. What’s more, the smartglasses didn’t feature a chunky protrusion like Google Glass, or super wide temple pieces as with standalone smartglasses like Vuzix Blade.

The smartglasses demo was pretty standard fare: a small map for supposed turn-by-turn directions, a mock real-time translation from English to Spanish—the sort of conceptual things meant to get manufacturers excited about creating their own smartglasses using LetinAR’s optical tech.

As a smartglasses solution, LetinAR shows some real promise since the pin mirrors only sit in the periphery of one eye. As a full AR headset, we’ll have to wait and see. At this point, LetinAR is in talks with manufacturers to supply engineering samples, which consist of a PinMR lens and a microdisplay from external partners.

LetinAR is coming to Augmented World Expo (AWE) this summer, so we’ll definitely be checking back in to see how the company progresses with what so far appears to be an intriguing and novel optical solution on the rise.


Microsoft Significantly Misrepresented HoloLens 2’s Field of View at Reveal

Amid significant anticipation, Microsoft revealed HoloLens 2 earlier this week at MWC 2019. By all accounts it looks like a beautiful and functional piece of technology and a big step forward for Microsoft’s AR initiative. All of which makes it unfortunate that the company didn’t strive to be clearer when illustrating one of the three key areas in which the headset is said to be improved over its predecessor.

On stage this week at MWC 2019, HoloLens visionary Alex Kipman was the one to officially reveal HoloLens 2. The headset, he said, delivers improvements in three key areas that customers of the original headset had consistently asked for: field of view, comfort, and business ROI right out of the box.

For field of view—how much of your view is covered by the headset’s display—Kipman said that HoloLens 2 delivers “more than double” the field of view of the original HoloLens.

“More Than Double” …What, Exactly?

Image courtesy Microsoft

Within the AR and VR markets, the de facto descriptor used when talking about a headset’s field of view is an angle specified to be the horizontal, vertical, or diagonal extent of the device’s display from the perspective of the viewer. When I hear that one headset has “more than double” the field of view of another, it says to me that one of those angles has increased by a factor of ~2. It isn’t perfect by any means, but it’s how the industry has come to define field of view.

It turns out that’s not what Kipman meant when he said “more than double.” I reached out to Microsoft for clarity and found that what he was actually referring to was not a field of view angle, but rather the field of view area. That wasn’t explained in the presentation at all, just (seemingly intentionally) vague statements of “more than twice the field of view.”

But ok… I get it. Field of view area isn’t a bad way to compare headsets by any means, if somewhat peculiar compared to how other companies relay this information. Not the end of the world.

But then Kipman moved on to a part of the presentation which visually showed the difference between the field of view of HoloLens 1 and HoloLens 2, and that’s when things really became misleading.

From 2x to 5.2x

In the center of the image Microsoft showed the HoloLens 1 field of view, and then drew out the edges of the image to show how much larger the HoloLens 2 field of view was by comparison. Except it was hugely exaggerated.

I pulled the below photo right from the presentation and used a perspective-correct transformation to make sure the shapes are correct relative to the camera angle. Then I overlaid the actual HoloLens 2 field of view (since confirmed to me by Microsoft) as it should have been sized in the comparison:

Photo by Road to VR, based on image courtesy Microsoft

The difference between the actual field of view and what was presented is not just a little bit wrong… it’s like, way wrong. Microsoft made a big deal about the “more than double” increase (in area) compared to HoloLens 1, but managed to misrepresent the difference by an even larger margin, showing a 5.2 times area increase over HoloLens 1 in the visual.

What makes this even more unfortunate is that they attempted to show how the wider field of view would let you see more of a virtual object, but the actual HoloLens 2 field of view in this case would have truncated the example on all sides.

When I asked Microsoft about this, I was told that what was shown was just an “illustration,” and only a brief part of the presentation. But it seems surprising that they didn’t take more care to faithfully represent one of the headset’s most important improvements, especially considering this is the only means of comparison short of actually having a HoloLens 1 and 2 in front of you to try.

Initially, when I went asking through official Microsoft press channels for specifics on the HoloLens 2 field of view, I was told that “more than double” HoloLens 1 is all the information that would be shared for the time being. When I took to Twitter to air my frustration at the lack of clarity (especially considering that the stated “more than double” was relative to the HoloLens 1 FOV, which Microsoft had also been coy about sharing), Alex Kipman responded by pointing to a huge Wired pre-reveal feature article published this week in which he confirmed a 34-degree diagonal field of view (16:9) for the original headset, and a 52-degree diagonal (3:2) for HoloLens 2.

But even in an in-depth pre-reveal briefing, Microsoft still didn’t manage to accurately convey the field of view message; the Wired piece reports that the “diagonal field of view has more than doubled” on HoloLens 2—the same thing most people reasonably thought when the company said it on stage this week.
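For anyone who wants to check the math, here’s a quick sketch that splits those published diagonal figures into approximate horizontal and vertical angles using a flat-projection approximation. This is my own calculation from the cited numbers, not anything Microsoft has published:

```python
import math

def fov_h_v(diag_deg: float, aspect_w: float, aspect_h: float):
    """Split a diagonal FOV angle into horizontal/vertical components,
    assuming a flat projection plane (a common approximation)."""
    half_diag_tan = math.tan(math.radians(diag_deg / 2))
    diag = math.hypot(aspect_w, aspect_h)
    h = 2 * math.degrees(math.atan(half_diag_tan * aspect_w / diag))
    v = 2 * math.degrees(math.atan(half_diag_tan * aspect_h / diag))
    return h, v

# Figures cited above: HoloLens 1 = 34° diagonal at 16:9,
# HoloLens 2 = 52° diagonal at 3:2.
h1, v1 = fov_h_v(34, 16, 9)   # roughly 30° x 17°
h2, v2 = fov_h_v(52, 3, 2)    # roughly 44° x 30°

print(f"HoloLens 1: {h1:.1f} x {v1:.1f} degrees")
print(f"HoloLens 2: {h2:.1f} x {v2:.1f} degrees")
print(f"Diagonal ratio: {52 / 34:.2f}x")                    # about 1.5x
print(f"Angular area ratio: {(h2 * v2) / (h1 * v1):.2f}x")  # about 2.6x with this approximation
```

Depending on exactly how you measure “area,” the increase lands in the 2 to 2.6× range, which is consistent with “more than double,” while the linear increase along the diagonal is only about 1.5×, and neither comes anywhere near the ~5.2× depicted in the on-stage visual.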

Fool Me Once

This isn’t the first time that Microsoft has been called out for doing a poor job of representing the HoloLens field of view. When the original headset was revealed back in 2015, the company used (admittedly very cool) mixed reality compositing to show a third-person perspective of all the virtual objects floating around the user. This was billed as ‘what the user sees,’ but left out the important detail that with such a small field of view there is significant truncation that is not shown. Microsoft later began adjusting many of its HoloLens marketing visualizations to be clearer about how the user’s perspective actually looked and was impacted by the field of view limitations.

– – — – –

To be clear, this isn’t the end of the world. Companies try to present their products in the best light, and oftentimes that involves stretching the truth. It isn’t clear if Microsoft was being intentionally misleading, or if someone just made a bad mockup, but it does mean that if the only thing you saw was the HoloLens 2 reveal presentation, the field of view is actually significantly smaller than what was shown on stage (and for anyone curious, nearly identical in size to that of Magic Leap).


This will be forgotten in time; HoloLens 2 will go on to do great things. But misrepresenting the field of view doesn’t make it bigger, and people are still going to be asking for even more field of view from HoloLens 3 and beyond. So in the meantime, Microsoft, why not just aim for accuracy and let the device speak for itself?


Hands-on: HoloLens 2 is More Than Just a Larger Field of View

Yesterday at Mobile World Congress, Microsoft unveiled HoloLens 2, the company’s next iteration of its enterprise-focused standalone AR headset. Microsoft is coming out of the gate strong with its fleet of partners as well as a number of in-house developed apps that they say will make it easier for companies to connect, collaborate, and do things like learn on-the-job skills and troubleshoot work-related tasks. That’s all well and good, but is the HoloLens 2 hardware truly a ‘2.0’ step forward? That’s the question that ran through my mind during my half-hour session strapped into the AR headset. The short answer: yes.

Stepping into the closed off demo space at Microsoft’s MWC booth, I was greeted by a pretty familiar mock-up of a few tables and some art on the wall to make it feel like a tastefully decorated home office. Lighting in the room was pretty muted, but was bright enough to feel like a natural indoor setting.

Image by Road to VR

I actually got the chance to run through two HoloLens demos back-to-back; one in the home office, and another in a much brighter space with more direct lighting dedicated to showcasing a patently enterprise-focused demo built by Bentley, a company that deals in construction and infrastructure solutions. The bulk of my impressions come from my first demo where I got to run through a number of the basic interactions introduced at the HoloLens 2 reveal here at MWC.

Fit & Comfort

Putting the headset on like a baseball cap, I tightened it snug to my skull with the ratcheting knob in the back. Overtightening it slightly, I moved the knob in the opposite direction, eliciting a different click.

Although I’m not sure precisely how Microsoft arrived at the claim that it’s “three times more comfortable” than the original HoloLens (they say comfort has been “measured statistically over population”), it presents a remarkably good fit, sporting a low-density replaceable foam cushion that comfortably rests the front weight on the top of the forehead. The strap, which is a firm but flexible material, wraps around to fit snugly under what Wikipedia scholars refer to as the occipital bone. Giving it a few good shakes, I was confident that it was firmly stuck to my head even without the need of the optional top strap. In the 30-odd minutes wearing HoloLens 2 over the course of two demos, it seemed like a comfortable fit that could probably be worn for the advertised two to three hours of active-use battery life without issue.


Once the device was on and comfy, I was prompted with a quick eye-tracking calibration scene that displayed a number of pinkish-purple gems that popped in and out of the scene when I looked at them, then I was set and ready to start HoloLensing.

Hand-tracking & Interactions

To my left, sitting on the table, was a little 3D model of a cartoony miniature city with a weather information display. Like at the on-stage reveal, moving my hand closer to the virtual object showed a white wireframe that offered a few convenient hand holds to grab so I could reposition, turn, and resize the virtual object. HoloLens 2 tracks your hands and individual fingers, so I tried to throw it for a loop with a few different hand holds like an index finger & thumb grip and a full-handed claw, but the headset was unfazed by the attempt; however, I found more exaggerated grasping poses (more clearly discernible to the tracking) to be the easiest way to manipulate the room’s various objects.

 

Brightness

Suffice it to say that HoloLens 2’s optics work best when a room isn’t flooded with light; the better lit space predictably washed out some of the image’s detail and solidity, but it’s clear that the headset has bright enough optics to be acceptably usable in a variety of indoor environments. Of course, I never got the chance to step outside in the Barcelona sun to see how it worked in the worst possible condition—the true test of any AR display system.

Both demos had the headset at max brightness, which can be changed via a rocker switch on the left side. A similar rocker on the right side let me change the audio volume.

Image by Road to VR

A Fitting Hummingbird

With my object interaction skills down, I then got a chance to meet the little hummingbird prominently featured in yesterday’s unveiling. Materializing out of the wall, the intricate little bird twittered about until I was told to put out my open hand, beckoning it to fly over and hover just above my palm. While the demo was created with the primary purpose of showing off the robustness of HoloLens 2’s hand tracking, I couldn’t help but feel that the little bird brought more to the table. As it flew to my open palm, I found myself paying closer attention to the way my hand felt as the bird hovered over it, subconsciously expecting to feel the wind coming off its tiny wings. For a split second my attention was drawn to a slight breeze in the room.

Microsoft’s Julia Schwarz demoing on-stage, Image courtesy Microsoft

It wasn’t a staged ‘4D’ effect either. I later noticed that the whole convention floor had a soft breeze from the building’s HVAC system tasked with slowly fighting against thousands of human-shaped heaters milling about the show floor. For the briefest of moments that little hummingbird lit up whatever part of my brain is tasked with categorizing objects as a potentially physical thing.

Haptics aren’t something HoloLens 2 can do; there isn’t a controller, or included haptic glove, so immersion is driven entirely by the headset’s visuals and positional audio. Talking to Microsoft senior researcher Julia Schwarz, I learned the designers behind the hummingbird portion of the demo loaded it with everything they had in the immersion department, making it arguably a more potent demonstration than the vibration of a haptic motor could produce (or ruin) on its own. It was a perfect storm of positional audio from a moving object, visually captivating movements from an articulated asset, and the prior expectation that a hummingbird wouldn’t actually land on my hand like it would with a Disney princess (revealing it for the digital object that it was). Needless to say, the bird was small enough—and commanded enough attention—to stay entirely in my field of view (FOV) the whole time, which helped drive home the idea that it was really there above my hand. More on FOV in a bit.

Both the hummingbird and general object interaction demos show that HoloLens 2 has made definite strides in delivering a more natural input system that’s looking to shed the coarse ‘bloom’ and ‘pinch’ gestures developed for the original HoloLens. What I saw today still relies on some bits that need tutorializing to fully grasp, but being able to physically click a button, or manipulate a switch like you think you should is moving the interaction-design to where it needs to be—the ultimate ‘anyone can do it’ phase in the future when the hardware will eventually step out of the way.

Eye-tracking & Voice Input

With my bird buddy eventually dematerialized, I moved on to a short demo created specifically to show that the headset can marry eye-tracking and voice recognition into a singular task.

I was told to look at a group of rotating gems, each of which popped after I looked directly at it and said the word “pop!” My brief time with the eye-tracking in HoloLens 2 left me with a good impression; though I didn’t have a way to measure it, I’ve tried nearly every in-headset eye-tracking implementation (spanning the 2015-era Fove headset up to Tobii’s new integration with HTC Vive Pro Eye).

Image courtesy Microsoft

The last portion of the demo had one of the most plainly practical uses I’ve seen for eye-tracking thus far: reactive text scrolling. A window appeared containing some basic informational text, and as I naturally got to the bottom of the window it slowly started to scroll to reveal more. The faster I read, the faster it would scroll. Looking to the top of the window, I automatically scrolled back up. It was simple, but extremely effective.
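As a rough illustration of how such a feature might work (my own sketch based on the behavior described above, not Microsoft’s implementation), the core logic only needs a normalized gaze height within the window and a speed ramp near its edges:

```python
# A rough sketch of gaze-driven scrolling: speed ramps up as the gaze nears
# the bottom of the window, reverses near the top, and stays at zero in the
# middle "reading zone". Function names and thresholds here are assumptions.

def update_scroll(gaze_y: float, scroll_pos: float, content_height: float,
                  max_speed: float = 200.0, dt: float = 1 / 60) -> float:
    """gaze_y: vertical gaze position within the window (0 = top, 1 = bottom).
    Returns the new scroll offset in the same units as content_height."""
    if gaze_y > 0.8:                      # reading near the bottom edge
        speed = max_speed * (gaze_y - 0.8) / 0.2
    elif gaze_y < 0.2:                    # glancing back toward the top
        speed = -max_speed * (0.2 - gaze_y) / 0.2
    else:
        speed = 0.0                       # dwell zone: text holds still while reading
    return max(0.0, min(content_height, scroll_pos + speed * dt))

# Example: a reader whose gaze sits near the bottom scrolls the text forward.
pos = 0.0
for _ in range(60):                        # one second at 60 Hz
    pos = update_scroll(gaze_y=0.95, scroll_pos=pos, content_height=1000.0)
print(round(pos, 1))                       # ~150 units scrolled
```

The dwell zone in the middle is what keeps the text still while you’re actually reading, so the scrolling only kicks in when your gaze drifts toward an edge.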

Continued on Page 2: Field of View »


NextVR and Qualcomm to Demo 5G 6DoF VR Streaming at MWC19

Love it, hate it, understand it or don’t, VRFocus is talking about the next step in wireless communication: 5G. You may have only just got used to 4G, but its bigger brother has been in development for quite a while, and Mobile World Congress 2019 is overflowing with 5G talk. That’s because companies like Qualcomm have been making some serious investments in the technology, with the Snapdragon 855 Mobile Platform being the big reveal today. Also getting in on the action is immersive sports and entertainment broadcaster NextVR, which has announced broad support for the emerging 5G ecosystem and a partnership with Qualcomm.

NextVR

For the event in Spain, NextVR will be demonstrating a new stereoscopic, ultra-high resolution video experience called Fearless on Qualcomm’s booth. Fearless is designed to play on a 5G-enabled handset powered by the Qualcomm Snapdragon 855 Mobile Platform and features six-degrees-of-freedom (6DoF) streaming.

The video follows three professional cliff divers as they jump from an 85-foot-high cliff near Koko Head on the island of Oahu, Hawaii. Viewers will be able to move about within the experience, with the ability to step to the edge of the cliff face and look down at the ocean below.

“The Snapdragon 855 Mobile Platform with 5G will enable XR viewers to transform how the world connects and communicates,” said Patrick Costello, Senior Director, Business Development, Qualcomm Technologies, Inc., in a statement. “By leveraging 5G to offer premium immersive experiences, NextVR is evolving their content platform to deliver realistic immersive experiences for consumers with XR viewers.”

Qualcomm

“NextVR is completely committed to supporting 5G and the new product category of AR and VR devices known as XR viewers. The increased resolution, high bandwidth, and streamlined form-factor of XR viewers connected to 5G handsets allow us to deliver an ultra-realistic immersive experience,” said David Cole, NextVR CEO.

In addition to Fearless, NextVR will demonstrate a prototype of its new augmented reality (AR) portal on the nReal Light, a pair of ready-to-wear mixed reality (MR) glasses. Whilst being able to see the show floor, guests will also see a stereoscopic portal in front of them which opens into content experiences (e.g. a basketball court or concert venue). For further updates about what NextVR is up to, keep reading VRFocus.

Qualcomm’s new Snapdragon 855 Mobile Platform Will Provide AR/VR Experiences Over USB Type-C

It has already been an interesting start for immersive technologies at Mobile World Congress (MWC) 2019 this week, with Microsoft announcing the HoloLens 2 and HTC revealing plans for wireless VR streaming with a new 5G home hub. Continuing that 5G theme is Qualcomm Technologies, announcing its strategy to deliver the next generation of mobile VR experiences to USB Type-C connected 5G smartphones.

Qualcomm

This will be achieved via the new Qualcomm Snapdragon 855 Mobile Platform, with the next generation of 5G connected smartphones able to provide even richer augmented reality (AR) and virtual reality (VR) experiences through XR headsets.

While companies like Oculus and HTC are going down the standalone headset route with the likes of Oculus Go and HTC Vive Focus, Qualcomm still envisions a world powered by smartphones offering high-resolution displays and inside-out Six Degrees of Freedom (6DoF) tracking.

Combining 5G’s high data rates and low latency with USB Type-C, devices like the Acer OJO head-mounted display (HMD) or the nreal light AR glasses will help to expand the ecosystem whilst providing further bundle deals for consumers.

And to help in this endeavour Qualcomm has also announced an expansion of the HMD Accelerator Program (HAP) to include XR viewers and to help pre-validate components and performance between smartphones and XR viewers.

Qualcomm

“Our HMD Accelerator Program has been a critical catalyst for ecosystem partners ranging from component suppliers and ODMs, to bring quality standalone XR headsets to consumers,” said Hugo Swart, senior director, Product Management, Qualcomm Technologies, Inc. “Building upon the momentum of this program, we will extend it to XR viewers and compatible smartphones, starting with smartphones enabled by the Snapdragon 855 Mobile Platform. In collaboration with ecosystem stakeholders, we are working towards the common goal of transforming how the world connects and communicates by offering premium, immersive experiences over 5G.”

“Acer is enabling a XR headset with high-resolution displays and 6 degrees-of-freedom positional tracking for Snapdragon 855-based smartphones. We’re ready to work with smartphone OEMs and operators worldwide to deliver the 5G + VR experiences,” said Andrew Chuang, General Manager, Presence Computing, IT Product Business, Acer Inc.

2019 may be the year 5G devices start to appear, but the wireless technology isn’t available just yet. Actual coverage isn’t expected until 2020 at the earliest. For further updates, keep reading VRFocus.

HTC Unveils 5G Mobile Smart Hub That Could Stream VR Content to a Vive Focus

It’s the first day of the Mobile World Congress 2019 show in Spain, with a great deal of the news focusing on the upcoming launch of 5G and plenty of companies making preparations for it. That includes HTC, which has unveiled its new 5G mobile smart hub for both home and business use. The company has also teased that the hub would be capable of streaming virtual reality (VR) content to standalone headsets like the Vive Focus in the future.

HTC 5G Hub

Powered by the Qualcomm Snapdragon 855 Mobile Platform with the Snapdragon X50 5G Modem and antenna modules, the HTC 5G Hub features a 5-inch HD touchscreen for ease of use, smooth 4K video streaming, low-latency gaming, and 5G mobile hotspot features for up to 20 users.

Thanks to the speed of 5G, HTC plans in the future to stream VR content from the cloud directly to headsets via the HTC 5G Hub, for a mobile, high-end VR experience in real time.

“HTC is proud to bring to market the world’s first 5G mobile smart hub,” said Cher Wang, Chairwoman and CEO of HTC in a statement. “5G will be the game-changer for VR and AR, and the new HTC 5G Hub will seamlessly deliver the great bandwidth of 5G to our devices, driving our vision of Vive Reality—a boundless, immersive environment where human experiences will come to the forefront.”

There is one small caveat, which HTC notes on its website: “The above scenario will depend on development of MEC technology and 5G infrastructure.”

HTC Vive Focus Plus

“With the HTC 5G Hub, many consumers enjoy for first time the transformative experiences that 5G and advanced Wi-Fi capabilities can bring to their lives,” said Durga Malladi, senior vice president and general manager, 4G/5G, Qualcomm Technologies, Inc. “Our long-standing collaboration with HTC has a proven track record in delivering mobile innovation. Our efforts helped deliver the first Android smartphone, we worked closely together to accelerate the transition to 4G, and we are now ushering in the age of 5G with one of the first mobile devices to take advantage of 5G and next generation Wi-Fi capabilities of the Snapdragon 855 Mobile Platform.”

The HTC 5G Hub will be available through select retailers beginning in Q2 of 2019. For the latest updates from MWC19, keep reading VRFocus.

HTC Says ‘5G Hub’ Will Stream VR ‘Directly to Vive Headsets’, One Day… Probably

HTC introduced its ‘5G Hub’ product back in February—a 5G mobile hotspot mashed up with a media streaming device and a digital assistant. And still “coming soon” is a “cloud-based virtual reality” feature which promises to stream VR content directly to the Vive Focus, ‘no PC required’. Sounds great, and all you’ll need to wait for is the launch of a consumer version of the headset and hopeful deployment of massive new connectivity and cloud infrastructures.

Image courtesy HTC

Update (May 16th, 2019): HTC today announced that the 5G Hub will launch on Sprint on May 31st, with pre-orders starting tomorrow. In addition to the cost of the device, Sprint will charge $60 for 100GB of data, capping speeds at 2G thereafter.

HTC is still touting the VR streaming feature as “coming soon,” but Sprint appears not to mention it at all—likely because the necessary network infrastructures aren’t remotely close to being in place and the headset it relies on (Vive Focus) still isn’t available in the US as a consumer product.

The original article continues below.

Original Article (February 25th, 2019): HTC hopes that its 5G Hub, which is due out later this year in the US on the Sprint network, will be the center of your connected life. In addition to being a 5G hotspot, media player, digital assistant, and gaming device, the 5G Hub will, “in the future,” stream Viveport content from the cloud to the Vive Focus headset.

But the official page for the 5G Hub uses some teeny tiny fine print to point out two massive caveats:

“*The above scenario will depend on development of MEC technology and 5G infrastructure.”

Oh that’s all?

Multi-access Edge Computing (MEC) and 5G are hugely complex infrastructure and cloud technologies which are still very much in their infancy. Not only are these technologies just beginning to roll out to some portions of select markets, they aren’t intrinsically linked, which means you might have 5G data access in your area eventually, but if there’s not the right MEC infrastructure in town, you’re SOL.

Which is to say… most of what needs to be in place for this cloud streaming feature to happen is not there yet, and HTC has no firm timeline for when it will be.

And then there’s the fact that they’re marketing this 5G Hub VR cloud streaming feature in the US as being compatible with the Vive Focus, a headset which is currently not available to consumers and presently has no consumer release date.

Image courtesy HTC

Don’t get me wrong, I would love to see a 5G Hub in every home streaming cloud-rendered VR content to affordable headsets across the globe. But it seems just a tad early to be marketing a feature which is unlikely to be available to most potential 5G Hub customers before a newer version of the 5G Hub is launched.
