Hands-on With The Santa Cruz Inside Out Position Tracking Oculus Prototype

We went hands-on with the Santa Cruz prototype from Oculus, the current name for an early standalone mobile VR headset the company is working on that includes inside-out position tracking.

No photos or videos were allowed, and the hands-on was brief. The device looked and felt much like an Oculus Rift. Demo providers declined to say whether the demo ran at 60, 75 or 90 frames per second, nor did they reveal the resolution of the headset. Inside the headset, I walked from one end of the room to the other and back, jumped and crouched, and the self-contained unit didn’t lose tracking. The headset displayed a sparse environment with no clear interactivity beyond the ability to move freely around the room. The front face featured four cameras, two pointed toward the ceiling and two more pointed toward the floor.

When I approached a wall, a blue grid appeared warning me of my proximity. Tracking did have one small hiccup when I reached a corner of the room and turned quickly, but otherwise it was rock solid, and I felt no discomfort quickly taking several steps across the room. Overall, the experience felt more like a Rift that went wireless than a Gear VR that gained position tracking.
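That proximity grid is conceptually simple: fade in a warning as the tracked head position nears the edge of the play area. Purely as a toy sketch of the idea (nothing here is Oculus' actual implementation; the rectangular bounds and 0.5 m threshold are made-up numbers):

```python
def boundary_warning_alpha(pos, bounds, threshold=0.5):
    """Return grid opacity in [0, 1]: 0 far from the walls, 1 at a wall.

    pos    -- (x, z) head position in meters
    bounds -- ((xmin, xmax), (zmin, zmax)) play-area rectangle
    """
    x, z = pos
    (xmin, xmax), (zmin, zmax) = bounds
    # Distance to the nearest wall along either axis.
    dist = min(x - xmin, xmax - x, z - zmin, zmax - z)
    if dist >= threshold:
        return 0.0
    return 1.0 - max(dist, 0.0) / threshold

# Center of a 4 m x 4 m room: no warning.
print(boundary_warning_alpha((2.0, 2.0), ((0, 4), (0, 4))))   # 0.0
# 0.25 m from a wall: grid at half opacity.
print(boundary_warning_alpha((0.25, 2.0), ((0, 4), (0, 4))))  # 0.5
```

A shipping system would use the real traced room shape rather than a rectangle, but the fade-with-distance behavior I saw matches this basic pattern.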

The headset was in a very early state, with a battery hanging off the back that got disconnected as they carefully placed the unit on my head.

Intel and Qualcomm are among a number of tech companies attempting to figure out how to do inside-out position tracking in a way that doesn’t drain the battery too quickly. With Facebook CEO Mark Zuckerberg giving ‘Santa Cruz’ a big reveal at Oculus Connect, it is clear a self-contained mobile unit is a high priority for the company.

4K Headsets, ‘Perfect’ Eye-Tracking, and ‘Augmented VR’: Oculus’ Abrash Predicts VR in 2021

Oculus Chief Scientist Michael Abrash predicts dramatic improvements to the field of view and resolution of VR headsets over the next five years, among many other areas. Save the image above, because come 2021 we can check in and see whether Abrash painted an accurate picture of the improvements we can expect.

Practically everything Abrash said whilst “sick as a dog” at his Oculus Connect 3 talk today could have been a major headline in a normal week, but because it’s OC3 we’ll have to make do with cramming it all into one post.

Screen resolutions are a hot topic in VR and the tech industry as a whole right now, especially with the introduction of new 4K-ready games consoles like the PlayStation 4 Pro and Project Scorpio. High-end VR headsets currently utilize either 1080p or 2K displays, but Abrash predicts they’ll have shifted to 4K by 2021. That’s good news for those of us who find VR’s screen door effect distracting, though Abrash has previously stated that we’ll have to reach resolutions of 16K for VR to match our eyes.

What complicates resolution, however, is pixel density, or rather how it is spent. As Abrash explained, we could choose to use 4K screens to boost either resolution or field of view (FOV) within a headset, though one would come at the expense of the other. With this in mind, the industry may side with widening the FOV once larger viewing angles are made available.
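The tradeoff is easy to see with back-of-the-envelope arithmetic: what matters perceptually is pixels per degree, roughly horizontal pixels divided by horizontal FOV. The numbers below are illustrative, not actual headset specifications:

```python
def pixels_per_degree(h_pixels, h_fov_deg):
    """Average horizontal pixels per degree of field of view.

    A crude flat average; real lenses spread pixels non-uniformly.
    """
    return h_pixels / h_fov_deg

# ~1080 horizontal pixels per eye over ~100 degrees (today-ish)...
today = pixels_per_degree(1080, 100)
# ...versus doubling the pixels but spending them on a 140-degree FOV.
wider = pixels_per_degree(2160, 140)

print(round(today, 1), round(wider, 1))  # 10.8 15.4
```

Doubling the pixel count while widening the view still sharpens the image, but by far less than double, which is exactly the tension Abrash describes.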

In fact, Abrash predicted that headsets may have reached a 140-degree FOV by 2021. Ultimately, he expects the visual capabilities of VR headsets to be “good enough to pass a driver’s license test” by that time, though the industry will need a breakthrough in optics that he’s “confident” will be made soon.

Finally, the legendary developer talked about depth of focus. This is a much more complicated topic, referring to our eyes’ ability to adjust focus on objects at different distances. With current VR, everything appears focused as if you were staring at it from 2 meters away. It’s not a big problem in Abrash’s eyes (quite literally, given that he described his eyes as being as limited as the Rift’s display), but it would be “good to fix”.

To do so, we might need to turn to new types of displays including holographic, light field, multifocal, and varifocal types. If you’ve never heard of any of those don’t worry; Abrash noted that none of them were good enough to tackle the issue yet, especially for a headset. He wasn’t sure which would provide the solution, but was confident that at least one would provide a “good” depth of field in the next five years.

Elsewhere, Abrash went far beyond the headset’s optics in predicting the next five years of VR. Headsets will get lighter and more comfortable, though wireless PC-based kits may still be further out. Audio didn’t seem to pose the same challenges, with significantly improved 3D audio propagation predicted in five years. He seemed bullish that Oculus Touch-like controllers will be a primary input device for a long time yet. It’s “quite possible”, in Abrash’s opinion, that Touch and its contemporaries could be seen as the “mouse” of VR. Despite this, he also stated that hand tracking should be important to headsets five years from now.

He also suggested, though, that if eye-tracking doesn’t improve quickly enough it could throw off his predictions. This tech needs to be “virtually perfect” for “core VR technology” like foveated rendering, a potentially essential technique in which hardware needs to fully render only the area of the display you’re looking at. It’s a tricky subject, but Abrash described it as “so central” to the future of VR that he believed it would be solved in five years, though it was the “single greatest risk factor” in his predictions.
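As a toy illustration of the principle (the falloff shape and constants below are invented for illustration, not taken from any shipping renderer): render at full detail near the gaze point, and at progressively reduced detail with angular distance from it.

```python
def foveated_scale(eccentricity_deg, fovea_deg=5.0, falloff=0.05, floor=0.1):
    """Resolution scale for a screen region at a given angular distance
    (eccentricity) from where the eye is looking."""
    if eccentricity_deg <= fovea_deg:
        return 1.0  # inside the fovea: full resolution
    # Linear falloff outside the fovea, clamped to a minimum scale.
    return max(floor, 1.0 - falloff * (eccentricity_deg - fovea_deg))

print(foveated_scale(0))   # 1.0  (at the gaze point)
print(foveated_scale(15))  # 0.5  (mid periphery)
print(foveated_scale(60))  # 0.1  (far periphery, clamped)
```

The sketch also shows why eye-tracking must be near-perfect: if the gaze estimate is off by even a few degrees, the full-resolution region lands next to, rather than on, what the user is actually looking at.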

Another big part of Abrash’s predictions was what he described as ‘augmented VR’, referring to headsets that expand into the real world with virtual images. It essentially sounded like he was talking about improved versions of mixed reality headsets like HoloLens. Somewhat surprisingly, Abrash thought this would be “so important” to the tech’s future that issues with real world capture and other areas would be overcome in the next five years and the lines between virtual and real realities will “blur”.

Augmented VR will also necessitate the creation of virtual humans, which Abrash described as an essential social VR experience. This may be, however, “the single most challenging aspect” of VR as a whole given the incredible complexity of accurately replicating them in real-time. However, the developer predicts big strides in facial animation and body tracking among other areas over the next five years.

That’s a lot of fascinating predictions, each with a lot to dig into. And if Abrash is right? He wants to be using these new systems and technologies for work collaboration apps. “Done well enough, that’s the most productive solo working environment I can imagine,” he told the crowd.

The five-year future of VR looks bright in Abrash’s eyes, but take a look at the picture below. This was a slide shown at the start of the presentation, depicting the capabilities of the human visual system. It far outstrips even his five-year prediction.

Like Abrash said at the beginning of his speech, we’ll always be hoping that current VR could be just a little bit better, but his vision of the future gives us plenty to look forward to for now.

How Facebook’s ‘Quill’ Art Tool Differs From Medium And Tilt Brush [Hands-on]

I was standing inside an illustration made with the art tool Quill from Facebook’s Oculus. My right hand was for painting and my left hand showed a tool panel with a nested list of objects representing everything in the scene. Using this list, I could select all or part of the illustration, with control over the scale of whatever is selected.

What this means is that if my illustration had a road with a car driving on it and a person in the driver’s seat, I could use the same intuitive stretching gesture to resize all of these elements together, or just the car and the person driving in it, or just the person. If I want a life-size road, I can do that. If I want a tiny little car, I can do that too. If I want to remove the driver — it’s as easy as grabbing him. This layering system also lets artists modify — for coloring, say — only a single part of the overall illustration.
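The nested list behaves like a classic scene graph, where each object's scale is relative to its parent. A hypothetical sketch of that structure (class and method names are mine, not Quill's actual code):

```python
class SceneNode:
    def __init__(self, name, scale=1.0, children=None):
        self.name = name
        self.scale = scale            # scale relative to the parent node
        self.children = children or []

    def world_scales(self, parent_scale=1.0):
        """Effective scale of this node and all of its descendants:
        local scales multiply down the tree."""
        s = parent_scale * self.scale
        scales = {self.name: s}
        for child in self.children:
            scales.update(child.world_scales(s))
        return scales

driver = SceneNode("driver")
car = SceneNode("car", children=[driver])
road = SceneNode("road", children=[car])

# Shrinking the car shrinks the driver with it, but leaves the road alone.
car.scale = 0.5
print(road.world_scales())  # {'road': 1.0, 'car': 0.5, 'driver': 0.5}
```

Grabbing a node higher in the hierarchy (the road) would rescale everything; grabbing a leaf (the driver) affects only that one element, which matches the behavior described above.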

We spoke this week with Oculus Story Studio creative director Saschka Unseld about how Quill differs from Google’s Tilt Brush, as well as from Medium, the creation software also from Oculus that has been in development for a longer period of time.

While Medium is used to sculpt and mold virtual objects into shape, according to Unseld, “the most successful things in Quill are loose, like there is the illusion of volume but it is still an illustration.” The goal with Quill is to offer a tool that lets artists express themselves without feeling constrained.

Google’s Tilt Brush is perhaps the most well-known HTC Vive app, and Unseld noted “what makes Tilt Brush so incredibly fun is you have a line with sparkles, or it glows or it’s neon. But what if I don’t want my line to glow?” According to Unseld, Quill differs from Tilt Brush in that its brushes lack these extra “effects.”

“I never ever ideally want people to be like, oh, this was made in Quill,” he said. “This doesn’t have a look, you can do whatever you want in there.”

Quill has been in development since late last year and Oculus built it in tandem with the newest Dear Angelica project from Oculus Story Studio. Both Quill and Dear Angelica should be releasing in early 2017, around the Sundance Film Festival.

Mark Zuckerberg On Standalone Inside Out VR Headset: ‘We Have A Demo, But We Don’t Have A Product Yet’

CEO Mark Zuckerberg made clear Facebook is planning a standalone mobile VR headset capable of inside-out position tracking.

“We’re working on this now,” Zuckerberg said. “It’s still early.”

Like Microsoft’s upcoming Scorpio Xbox console, the announcement from Zuckerberg gives developers a long-term view of what to expect from Oculus. We hope Oculus CTO John Carmack has more information to reveal about the system in his keynote on Friday.

“We have a demo,” Zuckerberg said. “But we don’t have a product yet.”

Inside-out position tracking is one of the biggest problems yet to be solved for an inexpensive VR headset. With the technology, the headset knows where it is in a room without any external cameras or base stations. We’ve been asking Carmack about it for more than a year, and the issue to be solved is the amount of power the approach consumes. Nobody wants a VR headset that lasts only a few minutes before needing a recharge, and that’s the challenge Carmack will face in bringing the technology to mobile devices.

Updates to come.

Apple Should Be Very Worried About Google’s Pixel

Pixel looks like an iPhone, minus the “unsightly” camera bump; takes amazing photos and videos like an iPhone; offers unlimited storage for those images; and appears to do good VR.

These facts have me seriously considering jumping ship from Apple for the first time since 2007.

I’ve been an iPhone owner since the first generation, but over the last few years I’ve also purchased Samsung phones for VR. While I’m deeply embedded in the Apple ecosystem, Pixel tempts me more than ever to consider leaving Apple behind completely. Just how embedded am I? I basically didn’t talk to my family for a week during a recent outage, caused by a needed iPhone repair, because our iMessage chat group didn’t port over to Android. If I ever want to jump ship, I have to convince my family to use a different app to chat with one another.

And it’s a jump I’m thinking about making, made more tempting by the promise that if I do decide to “switch,” even my iMessages will come over to Pixel.

Overall, the big draw for me is two-fold.

  1. Google is claiming Pixel has a better camera than the iPhone 7 (though they didn’t mention the 7+) and it is pairing that camera with unlimited full resolution storage of photos and videos shot with the device.
  2. Google is releasing Pixel alongside the Daydream View reference VR headset that will offer experiences to compete with Samsung’s Gear VR.

Google knows running out of storage on an iPhone is one of the most annoying moments for an Apple owner, and it already solved that problem with Google Photos. You get unlimited storage for free when you use the app on an iPhone, and the photos are easily searchable, so it is easy to find your images again. Unfortunately, the photos aren’t kept at their full resolution. Instead, Google stores “high quality” versions said to be “good for typical printing and sharing.” Not so with the Google Pixel: Google says it will store all videos and photos shot with the phone at full resolution, including 4K video.

Overall, with this one device and its supporting services, Google has completely eliminated concern about how I’m going to store, or find, my photos and videos. Simultaneously, the company is reclaiming that storage space for apps and VR experiences — the things that make phones so alluring. By giving my phone an unlimited camera roll, Google has also given me a blank slate to install all the software I want on a single device.

I’m not saying I’m jumping ship immediately — but Google Photos has been far and away the better product for storing photos compared with Apple’s own “Optimize iPhone Storage” option, which moves full-resolution images off the device to save space. When I’ve wanted to view a photo on an iPhone that had been moved to Apple’s cloud, it took too long to load the full-resolution image again. Google’s cloud is simply faster, and that matters when you’re trying to quickly find a photo buried in the archives.

In Pixel, Google has taken the most exciting things about phones — the images they capture and the apps they run — and made them complementary. You don’t have to choose between taking a photo or installing an app anymore, and if you’re getting the $80 Daydream headset for free, that means you’ll have that much more space for storage-eating VR apps.

Pixel appears to hit Apple squarely where it counts most today — photo quality. The phone will succeed or fail on this feature alone. But Pixel is also the leading device in a coming wave of Android VR phones that could draw developer attention to the new medium. For years people have been drawn to iOS for the quality of its apps alongside the quality of the camera, but if developers start investing time and money in Daydream VR apps instead of iOS because they see a promising new market, it could mark the beginning of a reversal of that trend.

Eventually, more impressive Google-powered VR devices than Daydream will arrive, like an all-in-one headset that doesn’t require a phone or an HMD capable of inside-out position tracking, and when that happens these early efforts with Cardboard and Daydream could pay off handsomely. If that shift occurs, Google would own a major head start over Apple in offering a library of VR content for the new medium. That’s why Apple CEO Tim Cook needs to worry. The company’s focus on the long game might be running out of time.

Of course, Facebook and Samsung are also playing this game and we still have yet to see what they will announce at Oculus Connect 3 later this week.

Google’s Tilt Brush Is Going Places You (and Google) Might Not Expect

It’s Saturday morning. I’ve both cooked and successfully eaten french toast. Wife and kids are away for the morning. Cats are fed.

Check, check, check. The core essentials are in place.

Let’s go.

Google’s Tilt Brush team gave us a peek into what’s coming down the pipe last night and my mind has been buzzing ever since. If you’ve not seen their teaser video, I strongly recommend you take a peek:

There are a bunch of things here that I’m going to try to unpack. I might be writing this more for me than for you, dear reader, so buckle in or jump out as you see fit.

Before we talk about the future, let’s talk about the past, and about something that’s been clawing for attention in my mind since May of this year.

Tucked away in a corner of the internet is a YouTube user who immediately stood out like a blazing sun to me for two reasons:

  1. He was posting HTC Vive Tilt Brush content.
  2. He has the same last name as me.

Both VR and the McGregor clan have taken more than their fair share of battles over the past 30–500 years, so it is a rare thing to catch them together in the wild.

John Sterling McGregor is a student.

John Sterling McGregor is a student learning calculus.

John Sterling McGregor is a student learning calculus and he’s learning it using Tilt Brush in VR with the HTC Vive.

If I were talking to you directly, I’d now turn to you fully and start leaning forward while saying, “No really, he’s doing this, I think this is a big, important thing, very much worth your time and attention. YOU NEED TO LOOK AT THIS. LOOK AT IT!”

(You would then take a step back and glance at the watch you stopped wearing 9 years ago. As one does.)

Night after night he’s been getting together with friends/tutors to learn math while immersed in VR.

Why has this got me so very excited?

It’s because John does not appear to be out to “prove anything” to the world and I just happened across him.

The scientist places technology amongst the fauna for a year, steps back and watches what nibbles on it. -Me, just now.

John isn’t a VR researcher. He isn’t an educator trying to blaze an experimental path or prove a paper. He’s just a student trying to learn something and get through his next exam, just like all the rest of students around the world…

Let me just write that for dramatic effect “…just like the rest of students around the world.” Um… Did it work? Did you feel the drama?

There is a big difference between talking theory about using VR for learning and actually seeing it applied in the wild by the learner. — me, just now

More impressive, this doesn’t appear to be just a casual “hey, let’s try this out” type of one-shot experiment.

No…

John has spent roughly 16 hours (so far) studying calculus in VR.

A lot of virtual reality game developers would be VERY happy to have a user spend that much time in their game.

Something here is clearly working…

Here are a few examples from several sessions, some in excess of 3 hours [below].

I find it to be a fascinating first look into what might be the beginning of something that could redefine a lot of our traditional learning over the next decade. I encourage you to browse through the lot. You won’t find any drama, just lots and lots of very focused mathematics learning.

In VR…

With Tilt Brush…

I feel like I could just stop writing now, that pointing at these videos is enough to make at least a few people out there [hello Google] pause and say “Huh. That’s interesting.” Frankly, I feel study of the videos themselves is a hell of a lot more interesting than anything I might be able to write…

But I’ll press on and try to elaborate a bit.

In case you are reading this and have missed the last few years of VR, here are a few building blocks to play with at home:

Here are the tools he is using to make this happen:

While most people reading this will recognize Skype from the above list, if you aren’t familiar with the HTC Vive or Tilt Brush, the below clip does a pretty good job at illustrating (har, har) what it does:

 

“Put on the VR gear, grab the controllers and Tilt Brush allows you to walk around and paint in a 3D virtual space.”

Now, a very key, clever thing about what John is doing here, is that he is streaming his calculus session in real time with his friends/tutors via Skype. John isn’t floating alone in the void, he has plenty of company. They see everything he is doing and whisper help and advice in his ear any time he needs it.

“Only what you take with you…” — Yoda, not me

When John puts on his headset and earphones, he finds himself surrounded by the inky blackness of the VR / Tilt Brush environment. It is as close to a pure, 100% distraction free environment as I would ever wish on a person. It is a place where John is free to noodle on equations and notes anywhere he likes in the space around him. It’s like taking notes in class and then having them rise up and swallow your mind whole.

 

Image stolen from the depths of Pinterest. I don’t know who to credit, but “Hey you, nice notes! I mean, wow. Look at ‘em!”

I struggle a bit to find the proper words to describe how conducive to learning this kind of pure environment might be to the right person.

They see everything he is doing and whisper help and advice in his ear any time he needs it.

You have all the advantages of discussion, collaboration, gesturing, active body movement (the list goes on…) and none of the distraction that comes with the real world. An unfettered exploration of concept.

Now, I have to stress that, as forward-thinking as the designers are, I’m fairly certain the Tilt Brush team didn’t set out to build a tool for learning math… and yet here we are.

Tilt Brush does contain a surprising number of the essential tools needed…

For instance, a user can bring any image they like into the environment. Math notes, assignments and reference materials can all be added to the space:

And the straight edge / mirroring tools allow you to handily sketch out graphs:

You can bring all the reference materials you need and place them exactly where you need them in this space.

You end up surrounding yourself with the subject you are studying and your own thoughts. Nothing else exists beyond the subject at hand.

_ _ _

I’m going to stop right here and throw out the idea that maybe, in the long run, Tilt Brush isn’t in the business of providing tools for “painting” but rather in the business of giving VR users a place and the tools to express ideas. This seems about right for the level of scope that Google is in the habit of taking on.

“Um, yes, what did you think this was? Did you think we were simply gunning for MSPaint?”- Google

OK. So let’s skip forward back to last night and Google’s preview of the upcoming Tilt Brush features. Here are the biggies:

  1. Multiple Users sharing the same environment.

So, one thing missing from John’s studies above is a way for his tutors to write things down in front of him, or to gesture at a particular point in an equation. You can put your hand down if you were planning on pointing that out.

Google will soon be rolling out the ability for users to join together and do just this. They could all waltz around the equations, working out calculations together in real time while happily residing in different countries.

 

Just like this, minus the ladders.

 

 

Google uses a volcano in their multi-user example, which is neat… but I feel we are rapidly leaping past the “let’s build a sand castle together” stage.

2. Painting on models and then… manipulating both model and paint.

 

I’m honestly still trying to wrap my mind around this addition. I’m not going to do a good job of articulating how impressive this is. I won’t even try. I’m not trying.

This seemingly opens the door to a time where you could add annotations to a 3D model in space and have the individual parts of that model respect those notes / designs as they are re-positioned in space.

I’m pretty sure the Tilt Brush team might step in here and say “Wait, no, just the wood mannequin… um, that’s… well, wait, what you are saying here… that’s a lot of work.”

But, too late, the potential has been shown and I fully believe we will will it into existence. (+10 points awarded for using “will will” in a sentence.)

Imagine the kinds of notes and diagrams you might add to a book or paper [above], and now imagine being able to do the same thing to a 3D architectural model of the house you are building, the parts of an engine design, or a battlefield map.

 

 

So if you were working on THIS kind of horribly complex thing (the RS-25 Space Shuttle Main Engine), you could step around it and make notes on each part, and make sure your engineering team understood what’s what and what goes where. Better yet, pull the model apart in VR and annotate each piece in real time. (“Go straight to hell” — Tilt Brush Team)

3. Animation

Animation is coming. They are hinting at it here and I feel it is only a matter of time before we see further support.

You can then imagine working with animated source materials and models, video references used for rotoscoping in 3D (and by 3D I mean you are walking around the thing you are working on), animated scene lighting and color. The mind goes a bit nuts contemplating what could be done. All of this would be collaborative, worked on in a single shared space, and subject to the annotation and communication made possible by the “painting on models” feature above.

So, enough chatter, time to wrap this up.

Summary:

  • I think there’s an inkling of something here that represents something important not only for VR but for our education system as well. Something that has tendrils that might extend into core business and design. Something tremendously important.
  • Keep your eyes closely on the VR tools Google is making but keep an even closer eye on how users are applying those tools out in the field.
  • We are still at the starting line for all of this…

If you found one of the many typos that most surely exists above, give me a shout on Twitter @ID_R_McGregor. This post originally appeared on Medium and is reprinted here with permission from the author.