There have been a lot of talks and great articles in the past on web-based augmented reality (AR) and virtual reality (VR). From a content and user experience point of view, it makes great sense to host VR content in the Cloud rather than download it onto a PC, especially as standalone VR headsets are being released. But how do you do that and still deliver experiences as good as the ones downloaded on a PC?
Last year, HTC Vive partnered with Dalian Television and Beijing Cyber Cloud in China to launch the world’s first Cloud VR service for a commercial trial in Dalian; rather than plugging a Vive VR system into a PC, it was hooked up to a set-top box with access to the carrier’s VR Cloud content store, and customers were offered a broadband package in the process.
Although there was not much feedback from those tests, it seemed that any drop in connection caused latency spikes and reduced image definition, impacting VR experiences where reliable high quality is a must.
More recently, Verizon ran trials streaming a Super Bowl VR experience over 5G at 50 megabits per second per HTC Vive headset, which is at the relatively low end of 5G but still much higher than 4G.
This month, at the Winter Olympics, KT Corporation and Intel are also showcasing the technology: high-definition pictures and 360-degree VR video from the Games in Pyeongchang are streamed over 5G from multiple cameras.
It is said that VR will flourish thanks to 5G: headsets can work free of wires, and because latency drops to almost zero, debilitating motion sickness goes away.
But as I found out recently at a pre-Mobile World Congress 2018 (MWC) briefing by Huawei in London, it is not that simple. First of all, the connection drops and variations in bandwidth experienced over fibre broadband in the Cloud-based Chinese trial could just as easily happen over 5G: signal drops and switching from 3G to 4G and back to 5G could seriously impact the experience.
So, in a world where fibre is very likely to be overtaken by mobile telecoms, and therefore by 5G, how can we make sure that we get a compelling Cloud-based experience for VR? The same question applies to 360-degree video and traditional video streaming, as those will also most probably deliver all our media in the future; think terabyte monthly plans for 24K video feeds, especially as 24K has been theoretically identified in the past as one of the important elements needed to deliver truly realistic VR experiences.
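To put that terabyte-plan claim in perspective, here is a minimal back-of-envelope sketch in Python. It assumes the 50 megabits per second per headset figure from the Verizon trial above and a hypothetical two hours of daily viewing; both numbers are illustrative assumptions, not figures from any carrier plan.

```python
# Back-of-envelope estimate of monthly data consumed by one Cloud VR stream.
# Assumptions (illustrative only): 50 Mbps per headset, as in the Verizon
# Super Bowl trial, and 2 hours of viewing per day over a 30-day month.

BITRATE_MBPS = 50          # megabits per second per headset (assumed)
HOURS_PER_DAY = 2          # assumed daily viewing time
DAYS_PER_MONTH = 30

seconds_per_month = HOURS_PER_DAY * 3600 * DAYS_PER_MONTH
megabits_per_month = BITRATE_MBPS * seconds_per_month
gigabytes_per_month = megabits_per_month / 8 / 1000   # bits -> bytes, MB -> GB

print(f"~{gigabytes_per_month:,.0f} GB per month")     # roughly 1,350 GB, i.e. over 1 TB
```

Even at today’s trial bitrates, casual daily use already lands above a terabyte a month, before any move towards 8K, 16K or 24K feeds.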

To solve those problems, it was interesting to first learn at the Huawei briefing about Massive MIMO (Massive Multiple Input, Multiple Output) technology: mobile data masts with large arrays of very clever antennas that work together, following algorithms, to ensure that 5G signals do not get dropped. These have already been used in 4G, and they just keep getting better, with more security, smaller form factors and greater power efficiency.
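To give a rough feel for how those antennas "work together following algorithms", here is a minimal sketch of maximum-ratio transmission (MRT) beamforming, one of the textbook Massive MIMO precoding schemes. The antenna count and channel values are made up for the example, and nothing here reflects Huawei's actual implementation.

```python
import numpy as np

# Minimal sketch of maximum-ratio transmission (MRT) beamforming: the mast
# weights each antenna so that its signals add up coherently at one user's
# position. All values are illustrative; this is not Huawei's implementation.

rng = np.random.default_rng(0)

num_antennas = 64                                  # assumed array size
# Complex channel gains from each antenna to one user (Rayleigh fading model).
h = (rng.standard_normal(num_antennas)
     + 1j * rng.standard_normal(num_antennas)) / np.sqrt(2)

# MRT precoding: transmit along the conjugate of the channel, normalised.
w = np.conj(h) / np.linalg.norm(h)

single_antenna_gain = np.abs(h[0]) ** 2            # what one antenna alone delivers
beamformed_gain = np.abs(h @ w) ** 2               # coherent combining of all antennas

print(f"single-antenna gain : {single_antenna_gain:.2f}")
print(f"beamformed gain     : {beamformed_gain:.2f}  (~{num_antennas}x array gain on average)")
```

The point of the sketch is simply that more antennas plus the right per-antenna weights means a much stronger, more stable signal at the headset without any extra transmit power.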
The second important aspect of ensuring that the signal is stable and the VR content is of high quality is an AI Cloud capable of managing GPUs (the Graphics Processing Units running high-quality video) and content in such a way that latency and resolution stay at a high enough quality. There is more and more competition in AI Clouds, but what seemed really interesting about the Huawei AI Cloud (called Atlas) is that it feeds directly on data from the rest of the network infrastructure in order to optimise itself using machine learning. It looks at the connection quality of the Massive MIMO 5G clients as well as the video GPU throughput from the Cloud to make sure that content delivery is optimised.
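To make the idea concrete, here is a minimal sketch of the kind of feedback loop such a system might run: read latency and GPU telemetry, then step the stream's quality up or down. The telemetry fields, thresholds and quality ladder are all hypothetical; Huawei has not published Atlas internals, and a real AI Cloud would use learned models rather than fixed rules.

```python
from dataclasses import dataclass

# Hypothetical sketch of a Cloud VR quality-control loop: adapt the stream to
# measured latency and GPU headroom. Field names, thresholds and the quality
# ladder are illustrative assumptions, not the Atlas design.

@dataclass
class Telemetry:
    latency_ms: float        # measured end-to-end latency to the headset
    gpu_utilisation: float   # 0.0 - 1.0, load on the Cloud rendering GPU
    throughput_mbps: float   # delivered bandwidth to the headset

QUALITY_LADDER = ["1080p@30", "1440p@60", "4K@60", "8K@60"]   # assumed rungs

def next_quality(current: str, t: Telemetry) -> str:
    """Step the stream quality up or down based on the latest telemetry."""
    idx = QUALITY_LADDER.index(current)
    overloaded = t.latency_ms > 20 or t.gpu_utilisation > 0.9 or t.throughput_mbps < 40
    comfortable = t.latency_ms < 10 and t.gpu_utilisation < 0.7 and t.throughput_mbps > 80
    if overloaded and idx > 0:
        return QUALITY_LADDER[idx - 1]          # back off to protect latency
    if comfortable and idx < len(QUALITY_LADDER) - 1:
        return QUALITY_LADDER[idx + 1]          # headroom available, step up
    return current

# Example: a latency spike forces a step down from 4K@60.
print(next_quality("4K@60", Telemetry(latency_ms=35, gpu_utilisation=0.8, throughput_mbps=60)))
```

What makes an AI Cloud like Atlas interesting is that, instead of hand-tuned thresholds like these, it can learn the trade-offs directly from the live radio and GPU data flowing through the network.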

This end-to-end hardware and software optimisation using AI and machine learning is surely the way forward to make sure we get compelling VR experiences in our future standalone, high-quality VR headsets, free of wires and computers, and I cannot wait to experience more of it. MWC 2018 will showcase more demonstrations of those technologies, and it will be exciting to see them start reaching the end user. Some partnerships, such as the TPCast and Huawei X Labs Cloud VR rendering solution partnership announced last year, are exciting developments to watch, as is the Open Lab cooperation plan launched last year to focus on innovation around Cloud VR. These will be game changers, offering far less friction for the user to experience VR with stable high quality and therefore making adoption much easier for applications such as social VR, eSports, live broadcast, business remote collaboration, and much more.