Community Download: What’s The Most Exciting Future Use For VR To You?

Community Download is a weekly discussion-focused article series, published (usually) every Monday, in which we pose a single core question to you all, our readers, in the spirit of fostering discussion and debate. For today’s Community Download, we want to know: what are you most excited about for the future of VR?


Here at UploadVR we do our best not only to report on the current state of the VR industry as it relates to consumers, but also to report on what’s coming next. We need to keep an eye on the future, especially with a technology that’s still so early in its life and changing so rapidly.

One example is the rise of hand and finger tracking, thanks to the Valve Index controllers and the Oculus Quest’s embedded tracking cameras. People are using it to connect online and communicate in sign language, and to control their desktops without any controllers, mice, or keyboards; surgeons are using VR for training; and soon we’ll have realistic, lifelike avatars too. There’s a lot to look forward to, especially when you consider foveated rendering, the growing momentum behind standalone headsets, and much more.

This leads us to the key discussion question of the week: What are you most excited about for the future of VR? Do you want a big generational leap forward in visual fidelity, are you hoping for a groundbreaking social VR app, or do you look forward to creative use cases more than anything?

Let us know down in the comments below!

The post Community Download: What’s The Most Exciting Future Use For VR To You? appeared first on UploadVR.

Sign Language In VR ‘Worth Exploring’ As Hand Tracking Improves

Prototype software using a new generation of hand tracking technology is showing promise for facilitating some forms of sign language learning and communication, regardless of physical distance.

Methods to track the movements of hands and fingers have been pursued for decades, and the technology is seen as one of the most challenging problems to solve in consumer VR. The barriers to reliably and affordably translating these movements are staggering, given the number of potential configurations of your digits and the environmental conditions that can wreak havoc on hardware systems.

Recent implementations like Ultraleap (previously known as Leap Motion), HoloLens, and the Valve Index controllers recognize some of the movement of all five fingers, with hardware priced to appeal to businesses or consumers. There’s also a supportive community of signers in VRChat who’ve adapted various handshapes to work around the grip of a held controller.

Oculus Quest Hand Tracking ASL Test

The latest tests come by way of Facebook’s Oculus Quest, which started accepting store submissions for hand tracking apps at the end of May 2020. The standalone headset doesn’t require any external hardware to work, though that wasn’t entirely true when it originally launched in May 2019, because input then required a pair of Oculus Touch controllers, one held in each hand. Now the headset can be operated with gestures like pinching and pointing, and a handful of apps using the bare-handed input system are available for purchase. A free app was also just released on SideQuest, designed to teach up to 23 handshapes from German Sign Language’s manual alphabet.
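For readers curious how bare-handed gestures like pinching are usually detected, a common approach is to threshold the distance between the tracked thumb and index fingertips. The sketch below is purely illustrative and not Oculus’s actual implementation; the function names and the 5 cm distance scale are assumptions:

```python
import math

def pinch_strength(thumb_tip, index_tip, max_dist=0.05):
    """Map the distance between thumb and index fingertip positions
    (x, y, z in meters) to a 0..1 pinch strength; 1.0 = fully pinched."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip)))
    return max(0.0, min(1.0, 1.0 - dist / max_dist))

def is_pinching(thumb_tip, index_tip, threshold=0.8):
    """Report a pinch once the fingertips are close enough together."""
    return pinch_strength(thumb_tip, index_tip) >= threshold
```

In practice, runtimes also smooth this signal over several frames and add hysteresis so the gesture doesn’t flicker on and off right at the threshold.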

Over the last two weeks I also used an experimental piece of software made by Daniel Beauchamp (aka @pushmatrix) to play rock paper scissors and explore the basics of social expression possible with my hands represented in VR. As soon as the video showing our initial interview hit the Internet, a number of people started asking about whether it might be useful for signed language conversations. My network of contacts put me in touch with a number of helpful folks, including three who met up in VR this week to test and offer feedback on the current state and future potential of the feature.

All three participants know American Sign Language and one is deaf. The deaf participant, Christopher Roe, is represented as a green sphere in the video below. He explained that he hadn’t been in close contact with other deaf people for many years, so his ASL was out of practice.

Based on the test, a number of severe limitations in the current state of the technology on Oculus Quest are clear. Hands held in front of one another can block the view of the headset cameras used for tracking, resulting in tracking loss or misrepresentation. Ambient lighting can affect tracking quality, fingers on one hand can’t cross one another, and a number of fundamental handshapes used to represent letters like P, Q, K, M, N, and E were hard to sign or distinguish.

Roe expressed some of his feelings following the virtual meetup. He sent me the following comment:

“Even as rough as it was, it’s awesome. Sure, you could have some kind of group video chat or FaceTime dealie instead, but what VR brings is a sense of actual proximity and presence. That’s something I remember missing a lot in my youth, because I went to a residential school for the deaf in Riverside, CA.

“The student body there came from all over South California, and frequently, students are the only deaf people from their respective hometowns or only know a few deaf people in their hometowns. So, while you’re at school, you’ve got all these friends you see 5 days a week for 16 hours a day, and then you go home…and it’s mostly just family/hearing people/the couple other deaf people you may or may not know in your hometown. I hated weekends, they were boring and lonely and I couldn’t wait for the Sunday bus back to school so I could once again be around people I could actually communicate with and who understood me as a fellow deaf person rather than some broken handicapped object of pity.

“When I graduated, I had a moment of severe anxiety because where do you go from there? There weren’t any prospects for me in my hometown of Desert Hot Springs, and I certainly didn’t know any other deaf people there. I ended up moving to Manhattan Beach and later Torrance to be closer to the friends I’d made in school. This was in the ’90s, cellphones and FaceTime weren’t a thing.

“A fully functioning VR sign language chat system would make the world much smaller and far more comfortable for a lot of deaf people who grew up under similar circumstances. They’d get the feeling of being WITH people, not just signing at a Brady Bunch grid of choppy webcam streams on a tiny screen. Throw in customizable environments and stuff like that, and you’ve got a virtual party venue where deaf people can actually communicate as first citizens rather than struggling with awkward text inputs or being completely left out of spoken conversations because nobody else wants to mess with crappy virtual keyboards either.

“VRChat and Altspace make my communication anxiety peg HARD because I’m trapped in a world full of ambiguous avatars who I can’t lip read or sign to or easily communicate with, so it’s literally a personalized hell for me, and bringing sign language or even a better virtual keyboard that uses hand tracking would be a massive QOL improvement for deaf people in social VR apps.”

Another participant in the test, Shannon Putnam, was able to recognize that the third participant, Cy Wise, had learned to sign in Austin, Texas, spotting what was described as a kind of regional accent distinctive to the Austin signing community.

Over the course of their time in the software — spending less than an hour total — the participants started recognizing where the accuracy of the Quest’s outward-facing cameras would fall off, and staying within those bounds led to better expression. They also learned that slowing down their signing could help.

State of Research

The super simple avatar system in the informal test could convey some basic movements of the head — like nodding or shaking — with a shape for mouth movements driven by the volume of the voice speaking into the headset’s microphone. With no representation of the body, though, and no actual face tracking, there are severe limits on the large parts of signed language expression that rely on those movements, “such as constructed actions, important grammar information in facial expressions and body movements, and the most unique aspect of signed languages, the spatial grammar,” explained PhD researcher Athena Willis.
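Volume-driven mouth animation of this kind typically maps the microphone signal’s RMS amplitude onto an openness value between closed and fully open. A minimal sketch of the idea, assuming normalized audio samples in the range -1..1 (the function name and the floor/ceiling values are invented for illustration):

```python
import math

def mouth_openness(samples, floor=0.01, ceiling=0.3):
    """Map a frame of audio samples to 0..1 mouth openness via RMS
    amplitude, with a noise floor and a saturation ceiling."""
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return max(0.0, min(1.0, (rms - floor) / (ceiling - floor)))
```

This conveys that someone is speaking, but nothing about what the mouth is actually doing, which is why face tracking is described above as the missing piece for signed-language grammar.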

A paper Willis shared with me from 2019’s ACM SIGACCESS Conference on Computers and Accessibility provides an interdisciplinary perspective on the challenges of sign language recognition, generation, and translation, pointing out that “in some cases the movement of the sign has a grammatical function. In particular, the direction of movement in verbs can indicate the subject and object of the sentence. Classifiers represent classes of nouns and verbs – e.g., one handshape in ASL is used for vehicles, another for flat objects, and others for grabbing objects of particular shapes. The vehicle handshape could be combined with a swerving upward movement to mean a vehicle swerving uphill, or a jittery straight movement for driving over gravel. Replacing the handshape could indicate a person walking instead. These handshapes, movements, and locations are not reserved exclusively for classifiers, and can appear in other signs. Recognition software must differentiate between such usages.”

When it comes to the consumerization of VR and AR technology, major platform companies have deep pockets, with well-paid research teams working on a wide range of ideas: predictive keyboards that work by touching your thumb to different fingers, facial expression-sensing headset liners or cameras directed at the face, and hyper-realistic full-body avatars.

Sony, for example, recently showed a hand tracking system with a lot of subtle expression seemingly conveyed.

Facebook also employs some of the world’s leaders in machine learning, which could be critical to filling in gaps in tracking data and allowing for better communication via hands even in the face of strict battery constraints and varying environmental conditions.

In particular, Facebook recently leaked the existence of a prototype headset codenamed “Del Mar” that’s likely a next generation Quest. Indications found in code suggest it might have cameras that sample its surroundings at a higher rate than the current generation hardware. This could lead to better hand tracking. Infrared light emitted in a pattern toward an area directly in front of a headset wearer in future hardware might also enable more robust hand tracking absent visible light.

Tech giants are also competing to lower the cost, weight and power consumption of future VR and AR systems. Facebook, for example, shows reluctance to deploy features like eye tracking that could enable huge leaps in comfort and visual detail in VR, but which might not yet be reliable or fast enough to work all the time. At the same time, some of Facebook’s VR headsets like Oculus Go and Rift S feature lenses set at a fixed width that fit the average distance between people’s eyes, a limitation that may make the image blurry or less comfortable to people outside the average range.

So there are competing priorities weighing on researchers and designers building next-generation VR (and AR) systems, and it is an open question whether robust expression for signing languages like ASL is near the top of those lists.

Is Sign Language A Priority At Major Tech Companies?

I reached out to Facebook to ask whether it is a goal to robustly support signing languages like ASL (there are hundreds of other signing languages according to the World Federation of the Deaf) in future headsets, or in upcoming software updates to the current Quest. I also asked whether they have any deaf employees working on the hand tracking team. Lastly, I asked if the team responsible for this feature is consulting with members of that community to consider better support for signing in VR and in Facebook’s forthcoming VR-based social network, Horizon.

Facebook sent over the following prepared statement in response:

“We believe that by enabling people to use their real hands in VR, hand tracking opens up many new capabilities and use cases. We’ve been glad to see how the community has been experimenting with hand tracking to prove out these new possibilities, including several interesting projects related to ASL or sign language…We have an Accessibility Task Force within AR/VR at Facebook that was created by individuals across the business who want to improve the accessibility of our products, software, and content while making them more inclusive of our community, including those who are hearing impaired. There are people with disabilities who participate in this taskforce. No particular product plans to share right now related to sign language, but as we’ve said, we’re very invested in driving new modes of input and social interaction in VR, so we think it’s an area worth exploring.”

While that statement seems generally supportive of these efforts, and is understandably non-specific so as not to reveal future product plans, the phrase “worth exploring” doesn’t establish how highly Facebook prioritizes sign language in its future designs.

“When the movies made the switch from silent films to talkies, deaf community lost broad and convenient access to movie theaters until the recent decade,” Willis explained in a direct message. “Now a new generation of technology, especially virtual agents and smart home agents, are threatening to leave signers behind again.”

Beauchamp plans to make the software he built for this test part of a sample scene for Normcore, the networking software development kit he used to create the project for Oculus Quest.

The post Sign Language In VR ‘Worth Exploring’ As Hand Tracking Improves appeared first on UploadVR.

Skyfront VR Reveals Major Update That Adds Grenade Launcher

Skyfront VR, the fast-paced multiplayer first-person shooter (FPS) that pits players against each other in virtual reality (VR), has announced a major update that brings with it a long-awaited new weapon.

Skyfront Pineapple Pop

The release of the Pineapple – as the grenade launcher has come to be known – follows weeks of hype from developers Levity Play. The new weapon gives players a chance to deliver some explosive justice to other players with not one but three grenades that can then be detonated remotely. To mark the release of the new weapon, Levity Play have also announced the return of the Skyfront Skirmishes, giving players a chance to play a limited-time gamemode that is all about the new weapon.

Titled Skyfront Skirmish: Pineapple Pop, the gamemode will run from April 13th until April 16th at 5am UTC. The event is being run in collaboration with Aerial Sports League and will feature many prizes for players to earn, including a drone, Skyfront merch and Steam gift cards.

The weekend-long event will challenge players to compete for the highest score, with the top scorer crowned the victor. For this one, the number of grenade launcher kills is what counts, with deaths having no impact on the score. Other weapons and skills are allowed, but only kills with the grenade launcher will count towards the high score. The remaining full rules, including how abusive behavior will not be tolerated, can be read in the Steam announcement.

Skyfront Skirmish: Pineapple Pop will also bring the first real-life trial of the Skyfront tournament app, which will allow VR arcades and eSports event organizers to tweak the game’s settings, such as match length and number of players, alongside a spectator function designed for broadcasting tournaments. Though its release depends on Steam’s approval process, it is a big step forward for the title and for the business customers who want to expand their coverage and offering of Skyfront VR.

VRFocus will be sure to bring you all the latest on Skyfront VR in the future so stay tuned for more.