Meta announced and launched pre-orders for its Meta 2 AR headset dev kit more than a year ago, expecting to begin shipping the device at the end of 2016. While the company says a small number of headsets have indeed gone out the door, Meta acknowledged in May that manufacturing delays had prevented the bulk of pre-order customers from getting their headsets in a timely fashion. A recent update from the company suggests that shipping quantities will pick up in August.
Update (7/30/17): After offering only a vague “soon” timeframe for when shipping of the Meta 2 AR dev kit would pick up beyond the “limited quantities” mentioned back in May, the company has updated the FAQ on its pre-order page to note that “Volume shipping will occur in mid Q3 2017,” which would fall in the month of August.
“The number one topic my customer success team has been receiving over the past couple months is ‘when will my headset ship?’,” says Meta’s Senior Director of Customer Success, Gary Garcia. Not surprising, I would say, as I’m not sure what other questions you might ask prior to actually having the hardware.
Photo courtesy Meta
Garcia says that things started off well but the company eventually ran into issues as they tried to scale up their manufacturing and processing procedures. He says the company has doubled the size of their manufacturing floor as they have tackled the issues.
Aside from acknowledging the delays, the company doesn’t offer any specifics about how many headsets have actually shipped at this point, and offers only a vague update on when developers can expect their headsets.
“Soon headsets will be rolling off the manufacturing line in greater volume.”
The Meta 2 augmented reality dev kit began pre-orders in March of 2016 and has apparently seen quite a bit of interest. By the end of 2016, the company says they began shipping the first Meta 2 dev kits, and by the end of 2017 they expect to ship some 10,000 headsets.
“There’s more Meta 2 pre-orders than there are HoloLens headsets in the world,” Ryan Pamplin, VP of Sales & Partnerships at Meta, told me during a recent visit to check out the latest changes to the company’s AR headset. The Meta 2 uniquely features an impressive 90 degree field of view that’s a huge step up in immersion from the company’s prior Meta 1 device and other AR headsets in its class.
Actually, the Meta 2 was not originally going to be the headset to follow the Meta 1. Back in 2013 the company had announced and taken pre-orders for a $3,000 headset called the Meta Pro. But Pamplin says the company heard resounding feedback from those working with the original Meta 1 that its 36 degree field of view was simply not enough. The Meta Pro was only set to bump it up to 40 degrees; at some point the company realized that the Meta Pro was not the right move, and Pamplin told me that they never charged any pre-orders.
Then in March 2016 the company revealed the Meta 2, this time with a huge increase in field of view (and a more digestible $950 price tag), one that makes the headset the most visually immersive of AR devices in its category.
Photo by Road to VR
The big difference in field of view came from a total overhaul of the headset’s optics, which now use a large dual-combiner approach rather than the waveguide approach of the Meta 1 and (would-be) Meta Pro. Waveguide optics are great because they’re thin and enable very small form-factor designs, but they have their drawbacks.
“Waveguide optics cap out around 50 degrees field of view,” Pamplin said. And while the hope is that one day we’ll be able to have AR glasses with an immersive field of view while managing to be no bigger than a pair of sunglasses, Meta realized that developers needed a wide field of view AR headset to begin building applications on today, in parallel with continued optical R&D; hence the bulky but functional Meta 2.
And it seems developers agree with the approach: the company is on track to ship some 10,000 Meta 2 dev kits in 2017, according to Pamplin, who says that all the initial pre-orders will ship in the next few months.
That’s a strong start when you consider not only the $950 price tag, but that the buyers are ostensibly all developers purchasing the device so they can build experiences for others to use. This traction reminds me of the early excitement around the first Rift DK1 development kit, which would go on to sell around 56,000 units over roughly two years before being replaced by the Rift DK2, which would eventually double that figure.
Photo by Road to VR
Good developer traction at the outset is good not just for Meta, but for the entire AR space. VR in general would not be where it is today if Oculus hadn’t been able to generate early excitement and ship more than 175,000 DK1 & DK2 units to developers and early enthusiasts who were figuring out what works well in VR and what doesn’t. Having a hardware platform that’s good enough to stir the imaginations of developers seeds a foundation of conceptual and technical knowledge that boosts other AR headsets and platforms too.
Indeed, even though it may be a few years until we see widely available consumer-ready AR devices, the AR headset space is rapidly heating up. While Meta has raised a considerable $73 million in venture capital to fund their ongoing development, giants like Microsoft, Magic Leap, and perhaps even Apple loom.
Despite the competition, the benefit of having so much activity happening in the augmented reality space is not lost on Pamplin. “Bring it on,” he says, welcoming the developments and contributions that are sure to come from many different players.
Outside of a few demonstrations at tradeshows, Meta has only given glimpses of the content on its impressive AR headset dev kit in short clips. During a recent visit to the company’s Silicon Valley headquarters to catch the latest improvements to the headset, I got to see a selection of demo content which Meta has allowed Road to VR to capture and share in full for the first time.
Having initially taken pre-orders for the $3,000 Meta Pro glasses back in 2013, the company rebooted the headset and revealed it in early 2016 as the Meta 2, this time poised as a more affordable (at $949) development kit with a much wider and more immersive field of view at 90 degrees (compared to 36 degrees on the original). After taking pre-orders throughout 2016, Meta tells Road to VR that the Meta 2 dev kit is shipping in limited quantities, and the company expects to ramp up shipments in the months ahead.
Many people right now think that the VR and AR development timelines are right on top of each other—and it’s hard to blame them because superficially the two are conceptually similar—but the reality is that AR is, at best, where VR was in 2013 (the year the DK1 launched). That is to say, it will likely be three more years until we see the ‘Rifts’ and ‘Vives’ of the AR world shipping to consumers.
Without such recognition, you might look at Meta’s teaser video and call it overhyped. But with it, you can see that it might just be spot on, albeit on a delayed timescale compared to where many people think AR is today.
Like Rift DK1 in 2013, Meta 2 isn’t perfect in 2016, but it could play an equally important role in the development of consumer augmented reality.
At the time we weren’t able to share an in-depth look at what it was actually like to use the headset, save for a few short clips taken from promotional videos released by the company. Now, running on the shipping version of the development kit, Meta has exclusively allowed us to capture and share the full length of several of the headset’s demo experiences, blemishes and all.
Photo by Road to VR
A few notes to make sense of the videos below:
The footage is captured with the Meta 2’s front-facing camera, and then overlaid with the AR content, so this isn’t precisely what it looks like through the headset’s lens itself, but it’s pretty darn close.
Through the headset, the image actually looks a lot sharper than what you’re seeing here; all the text in the demos was quite readable. For comparison, the captures are done at 1280×720 (and not at a very good bit-rate), while the Meta 2 provides a 1280×1440 per-eye resolution.
The AR content is rendered to be correct for my perspective, not the camera’s. That means that when I reach out to touch things it sometimes looks like my hand is way off from the camera’s vantage point, but to me it actually looks like I’m touching the objects directly (which feels surprisingly cool when you reach out and it’s right where you think it is).
Occlusion is not rendered in these videos, though it is present in the view through the headset. So while it looks like I’m reaching behind some objects (also a symptom of the point above), there’s some (rough) occlusion happening in my view which, combined with stereoscopy, maintains the illusion that the objects are actually behind my hands and out in the real world.
The tracking fidelity and latency seen in the videos are a fair representation of what’s currently seen through the lens itself.
You will see some moments when I clip through the AR content, though the clipping doesn’t align with my own view through the headset due to the camera’s different perspective.
The picture-in-picture video from the outside view is hand-synchronized and isn’t representative of actual latency.
The field of view seen here is technically representative of what it looks like in the headset, except that the field of view of the human eye is much wider than what the camera captures; thus while the AR content can be seen here stretching across the entire width of the camera’s captured field of view, it doesn’t stretch across the entire human field of view. That said, it’s quite wide and much more compelling than that of many other AR headsets out there.
There’s some occasional jumpiness in the framerate of the capture (especially during the brain scene), likely due to the capture process struggling to keep up; through the headset the scenes were perfectly smooth.
Meta 2 Evolving Brain Demo
Meta 2 Cubes, Earth, and Car Demo
Take Away
Meta 2’s biggest strength is its wide field of view and its biggest weakness is tracking fidelity (both latency and jitter). I’ve often said that if you mashed together HoloLens’ tracking and Meta’s field of view, you’d have something truly compelling. Meta said they would ship with “excellent” inside-out tracking, but as of today they aren’t there yet.
Photo by Road to VR
That said, this is a development kit. The company tells me that there’s still optimization to be done along the render pipeline, including the use of timewarp—an important latency-reducing technique—which is not currently implemented.
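Conceptually, rotational timewarp re-projects an already-rendered frame using the freshest head pose just before the frame hits the display, hiding much of the render latency. A minimal NumPy sketch of the idea, assuming a pinhole camera model and a yaw-only rotation (the function names, focal length, and sampling scheme are illustrative assumptions, not Meta’s actual implementation):

```python
import numpy as np

def rotational_timewarp(frame, yaw_rad, focal_px):
    """Re-project a rendered (grayscale) frame by a small head yaw,
    approximated as the homography H = K @ R @ K^-1 with nearest sampling."""
    h, w = frame.shape
    # Simple pinhole intrinsics with the principal point at the image center
    K = np.array([[focal_px, 0.0, w / 2],
                  [0.0, focal_px, h / 2],
                  [0.0, 0.0, 1.0]])
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])  # rotation about the vertical axis
    H = K @ R @ np.linalg.inv(K)
    # For each output pixel, look up its source pixel under H^-1
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = np.linalg.inv(H) @ pts
    sx = np.round(src[0] / src[2]).astype(int)
    sy = np.round(src[1] / src[2]).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(frame)
    out[ys.ravel()[valid], xs.ravel()[valid]] = frame[sy[valid], sx[valid]]
    return out
```

Real implementations run this per-eye on the GPU as part of the compositor, but the core is the same: a cheap 2D re-projection stands in for a full re-render.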
Like the Rift DK1 back in 2013, the Meta 2 doesn’t have perfect head tracking or low enough latency, but it’s still enough to stir the imaginations of developers and get the ball rolling on AR app development. Meta knows that its tracking isn’t consumer ready yet, but the company intends to get there (and seems committed to doing it in-house). Meanwhile, the Meta 2 could serve as that kick-in-the-imagination that gets a core community of developers excited about the possibilities of AR and actually building for the platform.
For a deeper dive into our hands-on time with Meta 2, be sure to see our prior writeup.
Holo-this, holo-that. Holograms are so bamboozling that the term often gets used colloquially to mean ‘fancy-looking 3D image’, but holograms are actually a very specific and interesting method for capturing light field scenes which have some real advantages over other methods of displaying 3D imagery. RealView claims to be using real holography to solve a major problem inherent to AR and VR headsets of today, the vergence-accommodation conflict. Our favorite holo-skeptic, Oliver Kreylos, examines what we know about the company’s approach so far.
Guest Article by Dr. Oliver Kreylos
Oliver is a researcher with the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES). He has been developing virtual reality as a tool for scientific discovery since 1998, and is the creator of the open-source Vrui VR toolkit. He frequents reddit as /u/Doc_Ok, tweets as @okreylos, and blogs about VR-related topics at Doc-Ok.org.
RealView recently announced plans to turn their previous desktop holographic display tech into the HOLOSCOPE augmented reality headset. This new headset is similar to Magic Leap‘s AR efforts in two big ways: one, it aims to address the issue of vergence-accommodation conflict inherent in current VR headsets such as Oculus Rift or Vive, and AR headsets such as Microsoft’s HoloLens; and two, we know almost no details about it. Here they explain vergence-accommodation conflict:
Note that there is a mistake around the 1:00 minute mark: while it is true that the image will be blurry, it will only split if the headset is not configured correctly. Specifically, that will not happen with HoloLens when the viewer’s inter-pupillary distance is dialed in correctly.
The remaining questions are how exactly RealView creates these holograms, and how well a display based on holograms will work in practice. Unfortunately, due to the lack of known details, we can only speculate. And speculate I will. As a starting point, here is a demo video, allegedly shot through the display and without any special effects:
I say allegedly, but I do believe this to be true. The resolution is surprisingly high and quality is surprisingly good, but the degree of transparency in the virtual object (note the fingers shining through) is consistent with real holograms (which only add to the light from the real environment shining through the display’s visor).
There is one peculiar thing I noticed on RealView’s web site and videos: the phrase “multiple or dynamic focal planes.” This seems odd in the context of real holograms, which, being real three-dimensional images, don’t really have focal planes. Digging a little deeper, there is a possible explanation. According to the Wikipedia entry for computer-generated holography, one of the simpler algorithms to generate the required interference patterns, Fourier transform, is only able to create holograms of 2D images. Another method, point source holograms, can create holograms of arbitrary 3D objects, but has much higher computational complexity. Maybe RealView does not directly create 3D holograms, but instead projects slices of virtual 3D objects onto a set of image planes at different depths, creates interference patterns for the resulting 2D images using Fourier transform, and then composes the partial holograms into a multi-plane hologram. I want to reiterate that this is mere speculation.
This would literally create multiple focal planes, and allow the creation of dynamic focal planes depending on application or interaction needs, and could potentially explain both the odd language and the high quality of the holograms in the above video. The primary downside of slice-based holograms would be motion parallax: in a desktop system, the illusion of a solid object would break down as the viewer moves laterally relative to the holographic screen. Fortunately, in head-mounted displays the screen is bolted to the viewer’s head, solving the problem.
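If RealView is indeed composing per-depth slices, the pipeline might look roughly like the following NumPy sketch. To be clear, this is pure speculation to illustrate the idea: the function names, wavelength, pixel pitch, and the Fresnel transfer function (standing in for whatever propagation model they actually use) are all my own assumptions.

```python
import numpy as np

def slice_hologram(image_slice, depth, wavelength=532e-9, pixel_pitch=8e-6):
    """Complex hologram field of one 2-D image slice placed at 'depth',
    computed as an FFT plus a Fresnel quadratic-phase transfer function."""
    n, m = image_slice.shape
    # A random initial phase spreads the object light across the hologram plane
    field = image_slice * np.exp(2j * np.pi * np.random.rand(n, m))
    # Spatial-frequency grids matching the display's pixel pitch
    fx = np.fft.fftfreq(m, d=pixel_pitch)
    fy = np.fft.fftfreq(n, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Fresnel transfer function propagates the slice to its depth plane
    H = np.exp(-1j * np.pi * wavelength * depth * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_hologram(slices, depths):
    """Sum the complex fields of all depth slices into one hologram, then
    keep only the phase, as a phase-only spatial light modulator would show."""
    total = sum(slice_hologram(s, z) for s, z in zip(slices, depths))
    return np.angle(total)
```

The key property this toy model captures is that each slice focuses at its own depth, so the eye can accommodate to different parts of the scene, which is exactly what sidesteps the vergence-accommodation conflict.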
So while RealView’s underlying technology appears legit, it is unknown how close they are to a real product. The device used to shoot the above video is never shown or seen, and a picture from the web site’s medical section shows a large apparatus that is decidedly not head-mounted. I believe all other product pictures on the web site to be concept renders, some of them appearing to be (poorly) ‘shopped stock photos. There are no details on resolution, frame rate, brightness or other image specs, and any mention of head tracking is suspiciously absent. Even real holograms need head tracking to work if the holographic screen is moving in space by virtue of being attached to a person’s head. Also, the web site provides no details on the special scanners that are required for real-time direct in-your-hand interaction.
In conclusion, while we know next to nothing definitive about this potential product, computer-generated holography is a thing that really exists, and AR displays based on it could be contenders. Details remain to be seen, but any advancements to computer-generated holography would be highly welcome.