How I Modded Oculus Touch So They Are Similar To Valve’s Knuckles Prototypes

Editor’s Note: If you try this at home, do so at your own risk. 

We’ve known for some time now that Valve has been experimenting with motion controllers that strap to your palms, a design that presents interesting possibilities for a more natural grabbing and throwing experience. Oculus experimented with similar designs, but ultimately went a different route for its consumer version. Nonetheless, I thought the physical design of the Touch controllers looked ideal for a very simple mod that would attach them to your hands. I’ve actually come up with several approaches: two that work decently, and one so simple that I think everyone, even non-enthusiasts, could at least try it. All it really takes is a single rubber band and some foam or cloth padding. The harder ones use gloves and minimal sewing. Each mod has advantages and disadvantages, but first, here are the instructions for the simplest.

Step 1: Get an Oculus Touch controller and a rubber band that’s around 9 cm long, relatively “wide”, and high-friction. Then tie the rubber band to the controller wrist strap.

Step 2: Loop the connected rubber band under and over (not over and under) the tracking ring, then around the wrist strap.

Step 3: Hook the rubber band onto the bottom of the controller. The high friction of the rubber band should keep it in place.

Step 4: Adjust things so that the rubber band is what is hooked onto the tracking ring, not the wrist strap. This is critical because the friction of the rubber band keeps it there reliably during play.

Step 5: Add some padding to the strap. I used cloth, and it managed to stay in place by itself, but you may use something you’re more comfortable with.

Step 6: Wear the controller. If it feels too loose, add more padding. If it feels too tight, use a longer rubber band. Adjusting tightness this way lets you preserve the high-friction contact between the rubber band and the tracking ring.

Step 7: Be aware that if you go without padding, the strap may be too floppy, or it may be so tight that it digs into your skin. You can try without padding if you want to get a quick impression, but it’s not recommended.

Step 8: Try it out, and if you like the experience, read on for an even better solution.

The advantages of this method are that it’s very simple and you gain the ability to let go of the controller. It now feels more natural to grab and throw objects in VR. You also stress your fingers less, although Oculus Touch is already a very light controller. In fact, if the controllers were heavy, this mod probably wouldn’t work well. As you’d expect from something this simple, though, the implementation isn’t perfect, and it has some big problems.

One is that the controller can still fly out of your hand if you throw hard enough, so you should still be careful. The other is that the strap isn’t positioned ideally relative to the controller: it forces your hand to grip the controller a bit differently from how you normally would. You’ll probably notice when you try it.

Touch was designed so that it fits into your hand a certain way, but my simple solution forces a slightly different position, which means your virtual hands can feel less accurate, or less real. And among various gameplay impacts, it could even mean less accurate aiming in VR, because the angle at which you hold guns would be off. Differences in people’s hand size and shape could add to the problem.

Before I came up with this mod, I thought of, and tried, some other implementations that weren’t acceptable: they felt too awkward, or they made some sort of compromise, like putting glue on the controller or straining the battery compartment lid. In the end, there are still problems with this method, but I mostly solved them with some additional work in the next two. If you try the single rubber band mod and find the experience worthwhile, I highly recommend moving on to the implementation described next.

Going Deeper With Oculus Touch Gloves

This one just adds a glove and some sewing, so you do need more materials, but not too many, and the experience is greatly improved. What I did was get a fingerless glove with a velcro strap, put it on, put on the modded controller, and then use my free hand to force the controller strap into a position where I would be holding the controller the correct way.

Then I sewed the strap to the glove using “guides” through which the strap could run. Ideally I would put one guide near the index finger knuckle, and one toward the bottom corner of the palm, right where the strap begins to make contact with the hand, but the fake leather on my cheap gloves seemed quite frail, so I ended up sewing the guides in slightly different places, as you can see in the red circles.

It’s not optimal because it leaves more room for the controller to slide around in your hand, so I really recommend sticking as close as possible to the ends, around where the green circles are.

The gloves shouldn’t be expensive or hard to find, and the sewing is very minimal as well, because you’re just making a few loops with string. This is the most efficient mod I thought of and tried. So now you’ve seen my best result, and you can stop reading, but if you want to know about more challenges and alternatives, continue.

Challenges and Alternatives With Oculus Touch Gloves

In any case, this implementation solves the problem of inaccurate positioning, and also lets you throw as hard as you can without letting the controller fly. As it is now, holding the controller feels as natural as it did without the mod. But things could still be better. The controller is now attached to the hand, but the connection is not rigid. When you fully release your grip, the controller can twist and rotate somewhat on one axis, because only a single strap holds it in place. The instability this causes may not be very consequential at all, but my next addition does solve the problem.

This time, I’ve added in a sort of holder, which keeps the controller stable in relation to the hand. I basically cut out a piece of thick rubber material and sewed it onto the glove. Then I looped the strap through an opening in the bottom, and did the usual steps from before, except this time I didn’t sew any guides, as the holder already does the same thing. Now it essentially feels perfect.

But of course, it was only perfect for my hands. Demoing it to other people, I observed some problems for those with small hands. They weren’t able to reach the farthest button, and couldn’t reposition the controller in their hand to reach it. So the floppier but more flexible previous implementation actually worked better. This solution also took a lot more work: I probably spent an hour cutting, positioning, and sewing. While it provides a great experience, I don’t recommend anyone do this. The previous one is arguably good enough, or better.

Before I close, there is one problem inherent to this type of controller that should be mentioned, other than sweat build-up, and other than the tracking ring somewhat blocking your hand from opening comfortably: you now need to wear the controllers.

The Wearing Ritual

The wearing ritual gets more complicated. Try wearing these and then putting on the headset, or taking them off. Or try grabbing a drink in the middle of playing. It adds quite a bit of fumbling. You put on one controller (using both hands), then the other (using a hand that now has a controller attached to it), and then the headset, all while having objects strapped to your hands. I personally found that wearing the controllers first and then slipping on the headset was a bit easier than the other way around, which has you blindfolded while putting the controllers on.

This problem unfortunately only gets worse as you use different attachment methods that are more stable and safe (which you would want for a consumer product), and that’s actually one reason why Oculus likely didn’t do it for the consumer version of Touch.

From a Wired article about the creation of the Oculus Touch controllers quoting Peter Bristol:

Should the device be held, or worn? Wearing something lets you open your hand fully without dropping it, but getting the second one on—while wearing a headset, no less—made that a nonstarter. “At some point,” Bristol says, “it started feeling like a wearable thing was not going to provide as much value as it was a hindrance.”

Because of this, I do sometimes find using VR slightly more of a hassle and end up using the controllers without the mod, but the experience is still worth it. In my opinion, the trade-offs roughly balance out.

So here is where my modding stops, but there are many ways, if you want to get more complex or dirty with things, to make a better mod to achieve the goal of attaching the controllers to your hands. You could, for instance, 3D print or mold a sort of holder that clamps to your hand and has a strap, essentially mimicking Valve’s own prototype. In fact, Oculus has provided CAD files for accessory makers, so you could make use of them if you decide to 3D print a holder. You could even develop your own battery compartment lid that integrates with such a clamping system.

I still recommend that everyone with Touch at least try the single rubber band solution and see if it interests them further. The most stable version of my mod does, in my opinion, take things to another level, but it’s harder to do. Again, if the simple version makes you more interested in permanently using the mod, go get yourself some fingerless gloves with velcro straps and sew some guides on there. It shouldn’t cost much, shouldn’t be too hard to do, and isn’t permanent. In addition, there might be solutions out there even simpler and more efficient than mine, yet to be discovered. So go and get modding.

Frank He is a student at UCLA who plans to study neuroscience, but is currently focusing on VR out of a life-long desire to experience digital worlds.

Adobe Tech Could Give Cheap 360 Cameras 6 DOF Upgrades

Virtual reality cameras have been hot hardware in the industry this past month and now Adobe is bringing innovative new software to the table as well.

Earlier this week, Variety reported that Adobe’s head of research, Gavin Miller, is claiming a new process for 360 video post-processing.

Adobe’s system can apparently convert standard monoscopic 360 video into a much more compelling format that includes three-dimensional visuals. This process can reportedly also bring 6 DOF (degrees-of-freedom, the ability to lean in toward objects and take steps in any direction) explorability to 360 videos inside a VR headset.

These are both features highlighted by Facebook’s new x24 and x6 VR cameras, which the social media giant unveiled last week. With Adobe’s approach, however, you won’t need to shell out big time for a bleeding edge rig. You’ll simply be able to upgrade the footage from a less expensive 360 cam using what Miller refers to as a “structure-from-motion” algorithm.

In a research paper published by the Adobe Research team, more details are provided about how this algorithm works. An excerpt from the paper’s introduction reads:

“We present an algorithm that enhances monoscopic 360-videos with a 6-DOF and stereoscopic VR viewing experience. Given an input monoscopic 360-video, in an offline stage we infer the camera path and the 3D scene geometry by adapting standard structure-from-motion techniques to work with 360 videos.

We then playback the input video in a VR-headset where we track the 6-DOF motion of the headset and synthesize novel views that correspond to this motion. We synthesize a new view for each eye in parallel to achieve the stereoscopic viewing experience. Our main contribution is a novel warping algorithm that synthesizes novel views on the fly by warping the original content.

Unlike any other previous method, this warping technique directly works on the unit sphere and therefore minimizes distortions when warping spherical panoramas. Moreover, we optimize our warping solution for GPUs and achieve VR frame rates.”
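The paper’s key claim is that warping directly on the unit sphere minimizes the distortions you get when warping a flat spherical panorama. As a rough illustration of the geometry involved (this is not Adobe’s code, just the standard mapping such a warp would operate on), here is a minimal sketch that converts equirectangular pixel coordinates into unit-sphere directions:

```python
import numpy as np

def equirect_to_sphere(u, v, width, height):
    """Map equirectangular pixel coordinates (u = column, v = row)
    in a width x height panorama to a 3D direction on the unit sphere.
    """
    # Longitude spans [-pi, pi) across the image width;
    # latitude spans [pi/2, -pi/2] from the top row to the bottom row.
    lon = (u / width) * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (v / height) * np.pi
    # Convert spherical angles to a Cartesian unit vector.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.array([x, y, z])
```

Warping "on the unit sphere" means moving these 3D directions around (based on the recovered scene geometry and the headset’s 6-DOF pose) rather than shifting pixels in the flat equirectangular image, where the poles are heavily stretched.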

The line for cutting edge features moves forward quickly in immersive tech. Last year, the HTC Vive proved that once consumers get wind of a compelling feature like hand controls or room-scale, other companies are on the clock to create parity and stay competitive.

Now that 3D 6 DOF has been proven possible for 360 video, anything else will likely seem sub-par. Miller officially presented the algorithm this week on a panel at NAB 2017.

HTC Vive Is Getting A $220 Plug-And-Play Eye Tracking Peripheral Next Month

Eye tracking is one feature that could benefit the performance and affordability of high-end virtual reality headsets. The HTC Vive should become the first mainstream headset to put that theory to the test.

A Chinese startup known as 7invensun (pronounced seven-in-ven-sun) is announcing it will release a new eye tracking module for the Vive next month. The module is called the aGlass, and it will be available for “limited pre-order sales” next month, according to HTC. The company is referring to this first roll-out as a developer kit, but pre-orders are open to anyone. The system will cost about $220 USD.

Unlike other eye tracking solutions that require hardware to be installed at the manufacturer level, the 7invensun devices are modular in nature. The thin plastic overlays can be placed manually inside the Vive headset by the average VR user, according to the company. The eye trackers are designed to be wired directly to the headset over USB: a separate USB cord connects to each of the aGlass devices, and the two cords are then joined by a USB combiner and fed into the Vive’s single port.

The aGlass consists of two separate trackers built specifically to fit alongside the lenses of the Vive. Each tracker has a halo of IR lights combined with sensors that can track the movements of each of your eyes and eyelids. It is said to support customized lenses depending on the specific vision concerns of the individual customer.

This type of tech can have a variety of use cases but the most immediate is foveated rendering.

Foveated rendering is a process that combines eye tracking and software to adjust the way a VR experience is rendered in real time. With foveated rendering, the PC running your Vive only has to render the greatest detail in the small area on which your eyes are directly focused. This dramatically lowers the cost of the hardware required to successfully show a convincing VR experience. According to 7invensun spokespeople, this tech could allow Vive to run on older generation graphics hardware.
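To see why this lowers hardware requirements, consider a back-of-the-envelope estimate of shading work. The parameters below are illustrative assumptions on my part, not 7invensun’s actual numbers; only the Vive’s 1080 x 1200 per-eye panel resolution comes from the hardware spec:

```python
def foveated_pixel_cost(width, height, fovea_fraction, peripheral_scale):
    """Estimate shaded pixels per eye per frame, full-res vs. foveated.

    fovea_fraction:   fraction of the screen area rendered at full resolution
                      (the region the eyes are focused on).
    peripheral_scale: per-axis resolution scale for the periphery,
                      e.g. 0.5 means half resolution in each dimension.
    Returns (full_cost, foveated_cost).
    """
    full_cost = width * height
    fovea_cost = full_cost * fovea_fraction
    # The periphery covers the remaining area, shaded at a reduced
    # resolution in both dimensions, hence the squared scale factor.
    periphery_cost = full_cost * (1 - fovea_fraction) * peripheral_scale ** 2
    return full_cost, fovea_cost + periphery_cost

# Hypothetical example: a 10% fovea at full res, periphery at half res.
full, fov = foveated_pixel_cost(1080, 1200, 0.1, 0.5)
```

With those assumed parameters, the foveated path shades roughly a third of the pixels of the full-resolution path per eye, which is the kind of saving that would let older GPUs hit VR frame rates.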

Currently, VR demands graphics cards and CPUs that are among the most powerful that the various manufacturers can provide. With foveated rendering, however, users can lower the workload demanded by their Vives and run VR on older, cheaper hardware from NVIDIA, AMD, Intel, etc.

The aGlass comes with custom software that lets you manually apply foveated rendering to any HTC Vive experience and adjust the strength of the effect. In a demonstration, we saw the device running NVIDIA’s VR Funhouse experience with a performance jump from 45 frames-per-second to 90 with foveated rendering applied. This functionality will only be available with NVIDIA graphics cards at first, according to the company.

According to a spokesperson for Vive, the release of aGlass ties into the team’s stated goal for 2017 which is to “expand the ecosystem” for the headset by providing cutting edge peripherals like this, the TPCast wireless VR system and the Vive Tracker. To that end, Vive is officially referring to the aGlass system as an “upgrade kit” for the Vive.

7invensun is a member of the Vive X accelerator’s second class. This is Vive’s in-house startup incubator that previously gave rise to TPCast and other VR-specific startups.

The aGlass will only work with the Vive upon release. HTC emphasizes that they are not making that a requirement for 7invensun, which has full freedom to develop this hardware for other headsets in the future.

Update: after publishing, HTC confirmed that the price for this system will be around $220 USD. 

TPCast Business Edition Runs Six Wireless Vives At Once

The TPCast wireless solution for the HTC Vive virtual reality headset is expanding its slate of products to include a multi-user device.

Dubbed the “TPCast: Business Edition,” this system will build on the company’s existing hardware and allow up to six headsets to run wirelessly in a single space.

TPCast BE is targeting a variety of use cases but will specifically aim for “numerous VR vertical markets including arcade, theme park, education, real estate, travel, furniture, automobile, etc.”

The TPCast receivers mounted to the ceiling.

Each of the headsets sends a wireless feed through a TPCast unit attached to the Vive headset. This feed is sent to a receiver that relays the data into a PC and then back to the headset across the same channels. All of this two-way communication has to happen many times each second in order to prevent any lost frames or image latency — each of which is catastrophic for VR experiences.

The TPCast consumer version is able to keep latency to a respectable minimum, but that’s with one headset at a time. TPCast CEO Michael Liu explained to UploadVR that the BE system comes with a transmitter and receiver for each of the six headsets that the system can run. It also ships with a software package that can “dynamically manage” the frequency channels being used in real time in order to prevent latency issues when using the multiple headsets. Liu says that the TPCast BE has no additional latency when compared to the consumer version.

We had the chance to try the new system for ourselves during a demo at Vive’s San Francisco offices. I was able to play a game of ping pong in VR while standing across from my opponent. Both of us were wearing Vives and neither of us was tethered; it was beautiful. There was a noticeably lower frame rate during the first few moments of gameplay, but that quickly resolved, and performance returned to an optimal state within the first 30 seconds or so of playing. After that, the experience was seamless and impressive.

Vive arcades are becoming more popular, especially in China, and a system like this should be able to deliver quality multiplayer experiences to those spaces sooner rather than later.

The TPCast BE will be available “later this year” according to the company. Vive spokespeople clarified that this will likely mean sometime in Q3.

The business edition TPCast units are distinguished by blue logos.

According to Vive, the TPCast consumer version is also targeting a Q3 release in the United States and is still working to achieve FCC compliance.

Hands On: Tracking The Progress With Acer’s Development Edition Windows VR Headset

At an event in New York City this morning Acer showed its latest tech offerings. This included a VR headset that is part of Microsoft’s Windows Mixed Reality platform, simply called the Acer Windows Mixed Reality headset. After some hands-on time with the Development Edition of the Acer headset, how does it stack up compared to the consumer Rift and Vive? Here’s what we saw.

Tracking

At a demo of an internal prototype Microsoft VR headset at the Game Developers Conference in late February, we reported that the headset’s biggest innovation, inside-out tracking, worked well — albeit tested with a very short cord.

This time we got a little more room to move around, and the tracking for the new Acer dev kit remains as robust as it was in our previous demo. With the new demo I had roughly four feet in either direction to walk around. Despite jumping around, lying down on the floor, and quickly stepping sideways, the headset didn’t completely lose tracking. It stuttered a bit when coming up from the floor, just a few inches off the ground, but otherwise performed well. My demo station was only about two feet from a wall and the initial calibration of the room took a few tries, but once that succeeded tracking was solid.

The demo was set inside a building called the “cliffhouse.” As I walked around the building trying out apps the tracking never failed. Inside-out tracking continues to be a promising solution to make VR more convenient.

Display And Optics

Beyond the new tracking tech, the biggest question with any new headset is the quality of the visuals. The Acer headset features a 2880-by-1440 LCD screen (1440 x 1440 per eye). With that resolution, the picture was sharp and there was little sense of a screen door. Visually, the screen texture felt similar to the Oculus Rift. The fresnel lenses have the circular imprints similar to the Vive, but they seemed less pronounced than the Vive’s. There were also some God-rays around white letters floating in black, similar to the Oculus Rift.

One issue we saw in our demo prototype at GDC was that the screen was only running at 60 FPS. The result was the picture would display motion-blur when you moved around — creating a fuzziness to the graphics. This development kit for Acer’s headset runs at 90 FPS and no longer suffers from this as badly. There is still a slight blur during motion, though. Overall, this makes it not as crisp during movement as the Vive or Rift, but not too bad.

Interface

The Windows interface in virtual reality continues to impress. Windows were set up on different walls, each with different apps. They performed as they should as I walked or teleported around trying out the apps — I used an Xbox controller to interact with the system. The Edge browser and the video player ran smoothly. The interface feels like a full computer operating system in VR — because it is. And it provides quite a contrast to the simpler interfaces of Oculus or Vive.

Ergonomics

The Acer headset felt good. It is light, only 360 grams (about 0.8 pounds) not counting the cable. It fit well, with a back strap that ratchets for different sizes, a foam face appliance, and soft rubber gaskets around the nose.

As in our previous demo, the use of a flip-up screen is a huge convenience so that you can bounce back to the real world to talk to other people, answer your phone, whatever. It makes VR that much more amenable to everyday life.

One problem we had with that prototype from GDC was the length of the cord to the computer. The Acer headset’s cord is much longer — 4 meters or about 13 feet long, allowing plenty of room for movement.

The headset is said to use software to adjust for different IPDs, but there is no focus wheel like the Gear VR’s. The space inside the facial foam is large enough to support glasses, though, and was not a problem for me. There aren’t built-in headphones, but there is a built-in mic.

Conclusion

Overall, the Acer Mixed Reality Development Edition headset is a major improvement over the previous Windows prototype we tried. With accurate tracking using inside-out tech, a quality screen, and comfortable ergonomics, this is shaping up to be a solid release. Though we will have to wait for future details and developments to give it a proper comparison to the Rift and Vive — with the lack of an OLED screen being the most notable difference between those systems and Acer’s.

Microsoft’s Build developer conference is coming up in a couple weeks, and we hope to hear more about the many Windows-compatible headsets being developed.

Samsung’s Facesense Recognizes Facial Expressions For VR Navigation

With body language, humans are able to express a range of feelings with only the subtlest shifts in demeanor. Our facial expressions alone can show off a huge range of responses. Interaction with VR is something various companies continue to try to nail down, attempting to enable users to manipulate these worlds while breaking immersion as little as possible. Samsung’s experimental Facesense is a new attempt at changing the game, harnessing the power of our facial expressions for hands-free VR interaction.

April 14th–15th at the VRLA Expo, Samsung showed a new creation from its C-Lab division. C-Lab cultivates ideas that are more experimental, and this particular one provides a new way to navigate within VR. Detailed in an announcement, Facesense tracks the electric signals that are created any time we speak, change our expression, or shift our gaze. Those signals are then used for navigation input along with a few spoken commands.

We tried the technology at VRLA briefly and it was a very early concept. It likely won’t be something that dominates as a primary means of interaction within VR, but it could be a complement to VR controllers. We’ll have to see if Facesense can find a degree of consistency with its input across all users. It could also serve as an option for those that cannot use VR controllers, opening up accessibility to virtual technology in a big way.

Antilatency Is An Intriguing Mobile Positional Tracking Solution

At VRLA we got a look at an impressive tech demo from a startup out of Russia that is moving to San Francisco soon.

The startup is called Antilatency.

The system features a small on-board camera on the bottom of the headset facing down and outward. It detects rapidly pulsing infrared light markers on strips at the outer edge of the tracked space. For the demo we saw there were four strips placed in a square. Jumping, crouching quickly, moving back and forth and even stepping outside the square itself all worked well as long as the camera could spot one of the markers. Each marker strip was powered by a portable USB battery pack.

The unit on the headset also includes additional inertial movement hardware, like a gyroscope and accelerometer, to hone the tracking and provide a few seconds of redundancy for any moments when the camera loses sight of the strips. This could be useful for multi-user setups when other people block the view of the markers — Antilatency says the tracking solution should work outdoors as well. During the few very brief moments when the headset did seem to lose tracking my view of the world seemed to “swim” or float as if I was bobbing around a pool. This was not a new feeling — it happened during a brief tracking hiccup in Facebook’s Santa Cruz prototype last year. Overall, though, the tracking lapses in Antilatency were barely noticeable.

“It should be fixed in one month, it’s just a question of factory calibration,” a company representative said. “It’s a known issue and the solution is known as well.”

That said, the demo shown was an extremely sparse virtual area with a TV and a couple chairs that looked like it was inspired by the empty white room scene from The Matrix. So I wasn’t able to test the hardware with any fully interactive software that would’ve painted a fuller picture of how the system could perform.

Antilatency clearly put a lot of thought into the design of its hardware — with the strips easily rolling into a tiny can for shipping and its tracking module being extensible to a thin wand-like hand controller with the camera on the tip. The controller was merely a concept at this point. The light strips can be placed overhead, too, according to the company, though the camera would need to be moved to be facing upward. Overhead light strips would alleviate worry about accidentally stepping on them.

The company hinted they have announcements planned regarding next steps, and we look forward to hearing more from them.

NVIDIA Offers Three Free VR Games With Select New Hardware

In the grand scheme of things, we’re still in the early stages of virtual reality adoption. While the price of admission may still be a bit on the higher end for some, the software ecosystem is inspiring people to invest and get hardware that can support these immersive experiences. Yesterday we reported on an Oculus bundle that nets you a VR-ready PC, Rift, Touch controllers, and four games for $1300. Today, NVIDIA has revealed a bundle of their own.

Starting today and for a limited time, anyone who grabs the VR-ready GeForce GTX 1080 Ti, GTX 1080, GTX 1070, or GTX 1060 graphics cards, or a PC/laptop equipped with one of those cards, along with an Oculus Rift & Touch Set, will receive 3 VR games entirely for free: The Unspoken, SUPERHOT VR, and Wilson’s Heart.

The Unspoken is basically a magic fight club with addictive gameplay, the recently released Wilson’s Heart provides one of the platform’s best narrative experiences, and SUPERHOT VR is an adrenaline rush that’s likely the most well known of the three across the gaming community. The three games are certainly nothing to sneeze at; they all arguably represent the best the Oculus Rift ecosystem has to offer and, coincidentally, all received 9 out of 10 from the UploadVR editorial staff.

If you’re in the market for a VR headset but don’t quite think your PC can carry the load, now’s a good time to grab a bundle that brings you into the fold and gives you some of the best content you can experience to boot.

Nokia’s OZO Will Be Used For Star Wars: The Last Jedi VR Content, More Disney Projects

The Star Wars films are always a visual feast, with top-tier visual effects constantly on display. The creative team behind Rogue One found a way to utilize VR while putting the movie together, dropping the director into space so he could find the best shots for the film. They even went the extra mile with VR promotion, a tool many different films have taken advantage of, by crafting a teaser called Recon that served as a prequel to the film’s main events. Rogue One branched off from the main Star Wars storyline, tying up some loose ends revolving around the Death Star, but now Nokia and Lucasfilm plan to deliver behind-the-scenes VR content for the coming film The Last Jedi.

Nokia and Lucasfilm are in a multi-year deal to create VR content for Star Wars: The Last Jedi and more Disney properties. They’re supplying Disney marketers and filmmakers with the OZO 360-degree camera and accompanying software.

“We want nothing more than to share the Star Wars universe with fans around the world and fans tell us they love to have the opportunity to learn more about the process of filmmaking,” says Senior Vice President, Franchise Creative & Strategy at Lucasfilm Brian Miller. “Utilizing the Nokia OZO to capture our behind the scenes material allows viewers to be transported to fantastical locations and virtually visit the incredible sets where their favorite Star Wars scenes were captured.”

A look behind the curtain for Star Wars and other Disney films is certainly a great opportunity for fans. So many paths are being blazed across the VR industry and companies are trying to put their hardware in the best positions to succeed. Having Nokia’s 360-degree camera attached to a media powerhouse such as Disney is going to be a big win for the camera maker, and won’t escape the gaze of content creators already involved in VR or those that will be as the industry grows.

Google Has A New VR Camera And A ‘Roadmap’ For Answering Facebook

Google’s JUMP team is announcing a new virtual reality camera today that is smaller and lighter than its predecessors. The new unit is being positioned as an affordable, portable, easy-to-use system that will allow creators to make high quality 360 videos more easily.

This new camera was built in partnership with YI technologies. It is known as the YI Halo and Google is referring to it as the “next generation” for its JUMP platform.

The Halo combines 17 4K YI Action cameras into a ring. One of these, called the “Up” camera, faces the sky from the center of the circle to capture what’s above the viewer. The entire rig weighs less than eight pounds.

In addition to the hardware, the Halo will also be able to take advantage of Google’s cloud stitching technology.

The meticulous process of interweaving the feeds of so many high-end cameras is time consuming, and third-party stitching companies can charge thousands of dollars per minute of completed footage. According to Google, however, the JUMP Assembler can complete the stitching automatically using massive server farms and deliver “seamless, artifact-free stitches” to creators in just a few hours.

Google is also announcing Jump Start — a program that will provide qualified creators with free units to use on their 360 projects.

According to Google:

“To get Jump cameras into the hands of more filmmakers than ever before, today we’re also announcing …”

Meanwhile, Facebook’s new x24 and x6 VR cameras are capable of creating three-dimensional footage that a user can actually walk around in and explore with six degrees of freedom.

When asked if Google would be making an effort to match this sort of technology, a spokesperson responded that the JUMP team “has a roadmap” for new products, but won’t be making any announcements until they have something finalized.
