Since the earliest computing machines, designers and engineers have sought more intuitive ways to interact with them. Charles Babbage’s Analytical Engine, designed in the early 19th century, relied on punched cards and levers for interaction. Fast forward a hundred years and the keyboard, and later the mouse, would become the dominant input methods for decades.
The advent of the laptop, and then the smartphone, helped popularise the touchpad and the touchscreen. As devices have become ever more personal, our methods of interaction have become more intuitive and naturalistic. The overweight computational partner used to have a whole room to itself; eventually it slimmed down and sat on your desk. Next it hopped onto your lap, and you decided to take it home. Thereafter you were never more than a few feet away, and you spent much of your time holding it in your hand. The next transition, the leap onto your face, is when computing really gets personal.
Interaction methods have thus evolved from levers and buttons that you pull and push, to keys that you tap, mice that you click, touchpads that you slide and touchscreens that you swipe.
With face-worn computers, the challenges and potential benefits are significant. As the computer’s screen has shrunk from the 19-inch desktop monitor to virtually no screen at all (as with Magic Leap), the need for new methods of interaction has grown.
The 4 levels of interaction
I propose a classification of input devices that views VR/AR platforms in terms of their level of interactivity and their chronological introduction. Viewed in this way, the missing link that stops VR/AR from becoming truly immersive becomes clear.
First generation AR/VR input
The first generation of input can be considered the motion sensor incorporated into the head-mounted display (HMD), which translates head movement into changes in the rendered scene.
For VR this enables basic interaction by creating a gaze pointer (reticle) at the centre of the view.
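To illustrate how little is needed for this first level of interaction, here is a minimal sketch in Python (with a made-up scene and hit radius; real engines do this inside their physics or UI raycasting systems) of the core of a gaze reticle: cast a ray along the head’s forward direction and report which object it points at.

import numpy as np

def gaze_target(head_position, head_forward, objects, hit_radius=0.5):
    """Return the nearest object whose centre lies within hit_radius of the gaze ray."""
    direction = head_forward / np.linalg.norm(head_forward)
    best_name, best_distance = None, float("inf")
    for name, centre in objects.items():
        offset = centre - head_position
        along = np.dot(offset, direction)          # distance along the gaze ray
        if along <= 0:                             # object is behind the viewer
            continue
        perpendicular = np.linalg.norm(offset - along * direction)
        if perpendicular < hit_radius and along < best_distance:
            best_name, best_distance = name, along
    return best_name

# Hypothetical scene: the head tracker supplies position and forward vector each frame.
scene = {"door": np.array([0.0, 1.5, 3.0]), "lamp": np.array([2.0, 1.5, 3.0])}
print(gaze_target(np.array([0.0, 1.6, 0.0]), np.array([0.0, 0.0, 1.0]), scene))  # -> 'door'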
Second generation AR/VR input
VR systems literally and metaphorically took a step forward when the software was able to interact with the user’s limbs.
A variety of pointing devices were introduced in the 1990s, notably gloves and treadmills, which enabled the wearer to move around and see a representation of their hands in the virtual scene. Even without tactile feedback, the introduction of wireless and camera-based limb tracking, such as the Leap Motion for VR and similar technologies for AR, considerably improved interactivity.
Third generation VR input
Until recently, wearable eye-tracking has been a niche and comparatively expensive technology, mostly confined to academia and market research.
However, the potential for foveated rendering has increased interest, promising a marked reduction in the computational demands of high-resolution, low-latency image display.
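A rough back-of-the-envelope estimate shows why the saving could be so large. The figures below are assumptions for illustration, not measurements from any particular headset: shade only a small foveal region at full resolution and the periphery at a quarter of the linear resolution, and the pixel-shading work falls by roughly an order of magnitude.

# Illustrative foveated-rendering saving, using assumed numbers.
panel_pixels = 2000 * 2000            # per-eye render target (assumed)
field_of_view_deg = 100               # assumed field of view
fovea_deg = 20                        # full-resolution central region (assumed)

fovea_fraction = (fovea_deg / field_of_view_deg) ** 2       # ~4% of the image area
periphery_scale = 0.25                                       # quarter linear resolution
periphery_cost = (1 - fovea_fraction) * periphery_scale ** 2

full_cost = 1.0
foveated_cost = fovea_fraction + periphery_cost
print(f"Relative shading cost: {foveated_cost:.2f} (~{full_cost / foveated_cost:.0f}x fewer pixels shaded)")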
The other benefit of adding eye-tracking to VR is that it enables more realistic interactions between the user and virtual characters. Speech recognition, a technology that has also benefited from the smartphone revolution, can complement eye-tracking by enabling categorical commands, such as looking at a door and saying ‘open’.
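A sketch of how that ‘look at the door and say open’ pattern might be wired together, assuming the headset SDK reports the object currently under gaze and the speech engine returns a recognised keyword (the inputs below are placeholders, not any real API):

# Combining a gaze target with a recognised keyword to issue categorical commands.
# The gazed-object name and spoken word would come from the headset SDK and the
# speech engine respectively; here they are passed in as plain strings.

ACTIONS = {
    ("door", "open"): "play door-opening animation",
    ("lamp", "on"): "switch lamp material to emissive",
}

def handle_command(gazed_object, spoken_word):
    """Dispatch an action when the spoken keyword matches the object under gaze."""
    action = ACTIONS.get((gazed_object, spoken_word))
    if action:
        print(f"Executing: {action}")
    return action

handle_command("door", "open")   # -> Executing: play door-opening animation
handle_command("door", "on")     # no matching action, so nothing happens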
Facial expressions are the important missing element in VR and AR interaction. HMDs with attached depth cameras have been used to capture the lower face (e.g. Binary VR), but whether this method proves popular remains to be seen.
Three potential reasons why this approach may be problematic relate to i) the way humans interact, ii) ergonomic concerns and iii) computational and battery life considerations.
One lesson from eye-tracking research is that during face-to-face interactions we infer information from the eye region. Surprise, anger, disgust and a genuine (Duchenne) smile all require visibility of the brow area and the skin typically covered by the HMD. Hao Li, working with Oculus Research, has incorporated stretch sensors into the foam interface of the HMD to derive information from behind the headset, and it will be interesting to see how this performs when the final version is released.
Mindmaze have revealed their Mask prototype, which requires the user to wear a clip on their ear and, according to one account, “conductive gel” on the skin. Samsung have also announced a development nicknamed “FaceSense”, although details are still limited.
Emteq’s solution is called FaceTeq, a platform technology that uses novel sensor modalities to detect the minute electrical changes that occur when facial muscles contract. With each facial expression, a characteristic wave of electrical activity washes over the skin, and this can be detected non-invasively and without the need for cameras.
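For readers curious about the underlying principle, the sketch below is a generic, heavily simplified illustration of how surface facial-EMG signals are often processed in research settings, and is not Emteq’s algorithm: rectify the raw trace, smooth it into an envelope, and flag the samples where a channel crosses a calibrated threshold.

import numpy as np

def emg_envelope(raw_signal, window=50):
    """Rectify the raw EMG trace and smooth it with a moving average."""
    rectified = np.abs(raw_signal - np.mean(raw_signal))   # remove DC offset, rectify
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def muscle_active(raw_signal, threshold):
    """True wherever the envelope exceeds a per-channel calibration threshold."""
    return emg_envelope(raw_signal) > threshold

# Synthetic example: quiet baseline with a burst of 'muscle activity' in the middle.
rng = np.random.default_rng(0)
signal = rng.normal(0, 0.05, 1000)
signal[400:600] += rng.normal(0, 0.8, 200)
activity = muscle_active(signal, threshold=0.2)
print(f"Active samples detected: {activity.sum()} (expected roughly 200)")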
Our lightweight, low-cost open platform will herald the fourth generation of VR input. Researchers, developers and market researchers will undoubtedly be the initial adopters, but the real advance will be enabling genuinely face-to-face social experiences. There are many areas where facial expressions in VR could improve communication and interactivity, and we are excited to see what ideas developers come up with when we open the platform. At Emteq, we’re passionate about fostering this fourth generation of AR/VR interaction, and we look forward to partnering with headset manufacturers and content creators.
Follow us to learn more about the possibilities and to stay up to date with developments; you can of course also follow VRFocus for ongoing developments in the technology space at large.
Evidence for the human need to share experiences stretches back to the earliest cave paintings. Scenes from real life or the artist’s imagination were recreated and displayed for others to share. Social interaction has accompanied almost every communication platform: reading and writing facilitated theatre and the formal play, early movies gave rise to the cinema industry, and radio and television broadcasts saw families and friends huddled around a single device to consume sports and entertainment. This article explores how innovators are making Virtual Reality (VR) a social experience.
Social interaction within VR can be distilled into three core elements – speech, movement, and emotional expression. Speech is easily captured and communicated using a microphone and VoIP. All VR devices support capture of head movement, and many capture arm/hand movement. A surprising amount of non-verbal communication can be inferred from these movements, particularly gestures and gesticulations. Whilst elements of body language can be communicated in VR, the communication of emotional expression is lacking. As a work-around, some VR social apps rely on user-triggered emoticons and arm-movement tracking to imply feelings and reactions. We’ve seen some interesting demos from Oculus and others using cameras to capture mouth movements. Eye tracking will provide improved face-to-face interaction, but eye tracking by itself is not sufficient.
Whilst we await the release of emotionally expressive VR, there are still a number of companies creating social platforms. One of the largest is AltspaceVR, founded by former SpaceX engineer Eric Romo. AltspaceVR is freemium software that supports high- and low-end VR headsets, as well as a 2D experience on computers and mobile. It allows users to chat, watch videos, and join a range of special events, from NBC News Q&A sessions to live music. Like many early social VR spaces, it’s similar to a VR-based Second Life, built less around sophisticated communication and more around sharing experiences.
AltspaceVR focuses on simplicity and shared experiences.
For emotional interaction, Altspace focuses largely on voice and physical movement. As platform-agnostic software, it features many ways to communicate physical movement for social interaction – everything from simple controller-based movement through to full-body motion tracking with Microsoft’s Kinect. However, this approach limits the sophistication of social interaction between platforms – users won’t often have equally elaborate set-ups, and so some modes of interaction might not be reciprocated. In terms of emotional expression, Altspace supports a range of emoticons, largely selected by the user through a menu. It also supports eye tracking but, again, this depends on the VR platform in use supporting it. The main focus appears to be on connecting with friends and sharing experiences like live events or streamed video in a VR setting, which it does very effectively.
In real life, gaming is naturally a social experience, so it’s inevitable that social spaces are being built that enable playing together. The gaming community has always been quick to embrace new technology that allows them to share play time in new ways. As such, many social applications for VR are heavily game-based, offering up a variety of minigames and tools for users. For example, Sports Bar VR offers competitive pool, darts, and skeeball, Anyland invites users to add and tinker with anything (really, pretty much anything) to their avatars or environment, and Rec Room has online multiplayer paintball, dodgeball, disc golf, charades, and more. These games have simple avatars, often cartoony and without arms, but all players can communicate through voice, movement, emoticons, and hand gestures. In Rec Room, a fist bump results in an explosion of light and forms a private party: physical interaction is used to perform actions, and now you’re off to play paintball together.
Rec Room’s use of the game charades is great for showing the capacity for fun brought by physical interaction in the digital world; getting someone halfway across the world to correctly guess that you’re acting out the movie Jaws in your office is a strange but compelling pastime. VR gaming social spaces focus on the fun of physically interacting with and exploring the world and other users around you, and anything they miss in the subtleties of communication is often compensated for with absurdity and silliness from fellow players.
Gaming spaces like Rec Room revel in communicating through exaggerated avatars and situations.
In April, Facebook finally launched its own foray into social VR with Facebook Spaces. In Spaces, users are represented by a cartoon avatar with customised hair, face, and clothing. Spaces integrates Facebook services heavily – users can share photos and videos, take their own photos inside the space (to share on Facebook, of course), play simple games, or call non-Spaces users through Messenger.
Interaction in Facebook Spaces is simple, but effective.
Facebook Spaces is part of a third subset of social VR applications – one step beyond sandboxes like AltspaceVR that focus on sharing content, Spaces is a polished experience built around all aspects of communication. Spaces is sophisticated and modern, and pays a lot of attention to conveying authentic interaction. The Oculus Rift headset’s tracking communicates head, arm, and hand gestures to others in the social space reliably and universally. Facebook also invested time in making human-like avatars. Development lead Mike Book stated, “Facebook is about authentic identity, which is fundamentally about humans”, and this ethos is carried through to Spaces’ characters, who are stylised but also authentically human and full of emotional range.
Facebook Spaces’ avatars, though stylised, look and feel human in their actions.
What makes Facebook Spaces interesting is its focus on communicating the emotional aspects of conversation. As in many similar applications, avatars’ mouths move in time with microphone output. In addition, the eye positioning of all users is interpreted, creating “eye contact” with others. Given that eye contact is a key form of non-verbal communication, this is an important development. Spaces also integrates a wide range of emoticons, triggered by movement and by buttons on the Oculus Touch controllers. Movement-based emoticons enable some spontaneity in the conversation, but, as Book says, “You have to invoke them. They’re not supposed to be accidental.” The need to deliberately remember to respond in a certain way makes emotional communication feel less natural. Nevertheless, interaction-focused social spaces in VR are making big steps towards providing authentic human communication.
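The microphone-driven lip sync mentioned above is conceptually simple. The sketch below is a minimal illustration with assumed frame size, gain and smoothing constants, not Facebook’s implementation: measure the loudness of each short audio frame and map it onto a smoothed mouth-open value between 0 and 1.

import numpy as np

def mouth_open_values(samples, frame_size=512, gain=8.0, smoothing=0.7):
    """Map per-frame microphone loudness (RMS) to a smoothed 0..1 mouth-open value."""
    values, previous = [], 0.0
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        loudness = np.sqrt(np.mean(frame ** 2))            # RMS of the audio frame
        target = min(1.0, loudness * gain)                 # clamp to the blendshape range
        previous = smoothing * previous + (1 - smoothing) * target   # smooth to avoid jitter
        values.append(previous)
    return values

# Synthetic audio: silence, then a louder 'spoken' burst.
audio = np.concatenate([np.zeros(2048), 0.2 * np.sin(np.linspace(0, 200 * np.pi, 2048))])
print([round(v, 2) for v in mouth_open_values(audio)])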
BigScreen VR has an interesting approach. Here, the social element largely revolves around sharing 2D content within VR. Users can share their work, games or entertainment content by allowing others to view their PC screen. Lip sync and inferred gaze tracking add to the interactivity of the cartoony avatars. According to CEO Darshan Shankar, engagement levels have been impressive, and to show its commitment to this new way of collaborating the company holds its meetings in VR.
Most VR platforms can be divided into these three subsets – sharing experiences, gaming, or authentically communicating. In the fledgling VR industry, developers largely haven’t yet looked to tackle more than two of these at a time. While sharing experiences and gaming in VR are natural fits that have seen massive growth, authentic communication in VR is still difficult to implement successfully. While almost all platforms support good interactions in speech and movement, emotional expression is still largely based on emoticons that have to be purposefully triggered by users.
At Emteq, we are working to deliver a virtual reality experience that can interpret and respond to a user’s emotional state. Our Faceteq™ technology allows user avatars to react in line with the user’s own facial expressions – essential to truly authentic communication. Our expression recognition solution will integrate with common headsets and capture the wearer’s expressions accurately. We believe this affective computing is the key to authentic VR and AR social interaction, and will open up new avenues in digital social spaces. If you’re interested in learning more, do get in touch.
Cancer and heart disease are well-recognised causes of ill health; however, a recent report by the Institute of Global Health reveals that in terms of impact on quality of life (measured in Disability-Adjusted Life Years, DALYs), mental health conditions affect more people than cancer and heart disease combined.
Furthermore, according to the World Health Organisation, approximately one in four of us will suffer from some kind of mental disorder. Those who don’t might still experience substantial levels of anxiety and stress. Mental health disorders and psychological conditions influence our day-to-day activities, and in the U.S. they cost taxpayers about $467 billion in medical expenses ($2.5 trillion globally).
Alternative drug-free techniques like exposure therapy (ET) and cognitive behavioural therapy (CBT) have been shown to be very effective in overcoming conditions such as phobias, anxiety disorders, panic disorders, obsessive-compulsive disorder (OCD) and PTSD, amongst others. However, for many healthcare providers, drug-based treatments remain the mainstay in spite of their potential side effects and marginal benefits.
Virtual Reality and Mental Health
An exciting opportunity to minimise reliance on pharmacological treatments for mental health lies in Virtual Reality (VR). Advances in VR technology allow you to enter a world that is authentic enough to trigger your mind and body to behave as if it were the real world.
Exposure Therapy (ET) using VR is an increasingly popular way for practitioners to administer safe and regulated therapy for patients with mental health conditions. Previously, technological and cost barriers limited the use of Virtual Reality Exposure Therapy (VRET) to the private sector. The introduction of mobile VR headsets, including the Gear VR, presents an opportunity to use telemedicine for mental health treatment.
The use of VRET could lead to mobile tele-therapy that can work in collaboration with in-clinic VR therapy. Further still, patient-directed VR therapeutic approaches are currently operating that don’t require the therapist to be physically present.
As a relatively new form of treatment, more data from well-designed large trials and clinical evidence is needed to support VR as an effective tool for therapy. Organisations who want to target this market must conduct randomised controlled clinical trials to prove the efficacy of VRET. Once the technology’s effectiveness has been established, there will inevitably be an influx of VR apps attempting to digitally treat mental health issues.
Exposure therapy and cognitive behavioural therapy (CBT)
CBT is a psychotherapeutic treatment provided by a therapist specialised in mental health disorders. It involves patients participating in a number of sessions that concentrate on an isolated issue, helping the patient recognise and modify troublesome thoughts, feelings, and patterns that produce negative behaviours and beliefs.
CBT can complement ET well; over time, this gives patients the confidence to confront their disturbing fears and thoughts head-on, reducing the peak anxiety an individual experiences when faced with anxiety triggers.
VR is capable of isolating anxiety-related stimuli with a controlled and safe approach. Even though ET goes hand-in-hand with VR, there are several other psychiatric conditions, including autism (see below) and childhood developmental disorders, where VR might have a more active role in the coming years.
Using Virtual Reality to Treat PTSD
VR has been used to deliver prolonged exposure therapy for PTSD patients since the 1990s, mostly for war veterans and soldiers. This is particularly the case in America, where approximately 8 million adults suffer from PTSD. Dr. Albert “Skip” Rizzo is a pioneer in this field; his software, named Bravemind, was created in partnership with the developers at Virtually Better.
The system comprises a customisable and controllable VR environment, a vibro-tactile platform that provides sensations relevant to explosions and firefights, and a scent machine that emits smells such as garbage, diesel fuel and gunpowder. All of these simulated sensory components are released at precise times to enhance the digital scenario. There have been multiple clinical studies examining the effectiveness and safety of Bravemind, and ongoing trials are investigating the use of Bravemind VR therapy for sexual trauma in the military.
Recent research found that VR therapy alone was as effective as a combination of medication and VR therapy; in a direct comparison, medication produced more negative outcomes for patients than the VR therapy did.
Private and smaller organisations producing VR therapy software on tight budgets will still have to prove clinical effectiveness if they intend to make an impact on this market.
Severe Paranoia Treatment Using Virtual Reality
In Britain today, 1-2% of the population suffers from paranoia, showing significant mistrust of others to the extent of feeling threatened. In social situations, sufferers may use defensive mechanisms such as reducing eye contact or shortening social interactions; at its most severe, this means avoiding social interaction altogether. These behaviours only reinforce the paranoia.
Oxford researchers are trialling VR-based treatment for paranoia. Professor Daniel Freeman and colleagues from Oxford University used VR to test whether patients could ‘re-learn’ that social situations are safe, and to reduce the use of defence mechanisms when feeling threatened. Simulated scenarios such as train rides, a lift or an airplane, in which the patient must encounter several people, were gradually made busier. Using VR in this way allows patients to come face to face with their fears and attempt to overcome them. The patients then transferred the techniques used in VR to everyday life and showed a significant reduction in paranoid feelings, with 20% no longer presenting severe paranoia symptoms.
Treating Anxiety Disorders & Phobias With Virtual Reality
Approximately 40% of disability worldwide is due to anxiety and depression, which in the US costs the country an estimated $42 billion annually. With so many people affected and the significant cost this represents, technology provides an opportunity to decentralise treatment. Alternatives such as self-guided therapy or telemedicine offer low-cost and potentially equally effective results.
Phobias influence the behaviour of approximately 19 million Americans, and a recent review of 14 clinical trials suggested that VRET is effective in treating phobias. Outlined below are some examples of organisations using VR to treat anxiety disorders.
The Virtual Reality Medical Center has a procedure to treat the fear of flying. The system consists of hardware and software, as well as mock airplane seats and even a subwoofer system, to imitate the sounds, sights, and experience of flying.
Virtually Better developed a program that treats phobias of public speaking, heights and thunderstorms. The company is working with leading universities and research facilities on research and development projects addressing childhood social phobias and anxiety.
CleVR is a Netherlands-based business constructing VR systems to treat the fear of heights, flights and social phobias, all based on scientific research. The organization is conducting experiments to examine the effectiveness of VR as a therapeutic approach to treating social phobias and psychosis. Using dynamic virtual emotion technology, the general environment of such simulated social scenarios can be regulated.
Psious is a Spanish business that provides a toolkit therapists can use to control and administer VRET in order to treat patients with phobias. The toolkit consists of VR hardware, a programmable software platform, and biofeedback devices.
VirtualRet is a tool therapists and psychologists can use to treat and evaluate phobias, including flying, public speaking, the sight of blood, heights, and public places. The developers offer a variety of hardware, virtual environments and parallel services.
A Swedish business named Mimerse is creating psychological tools for VR treatment, and hopes to partner with Stockholm University and the Swedish Government for mass-market use. Their initial program, “Itsy,” is a game that treats arachnophobia through a digital therapist. In conjunction with the game’s release on the Gear VR app store, a controlled study is under way comparing VRET using Itsy with traditional exposure therapy. Since most people with phobias don’t obtain professional treatment, mass-market games such as Itsy may provide tremendous value for people all over the world.
High Functioning Autism and Virtual Reality
Autism can be classified in many ways. At one end of the spectrum is high-functioning autism, with Asperger’s-type symptoms, which include delayed motor skills, limited understanding of abstract language and obsessive interest in specific items or information. VR can provide a platform where children can safely practise and enhance their social skills.
Virtual Reality Social Cognition Training (VR-SCT) is able to support children and adults at different ages, adjusting the scenarios depending on the stage of development. For children with autism this could include confronting a bully for the first time, or meeting a new peer. Contexts may remain the same, but the content and complexity may differ depending on age.
Recent studies have suggested that VR-SCT can benefit a child’s emotion recognition, social attribution and the executive function of analogical reasoning. Patients were able to practise a dynamic range of social encounters, with outcomes dependent on their responses. VR-SCT can therefore allow for meaningful, close-to-life scenarios with immediate feedback, enhancing the child’s development.
Virtual Reality for Meditation and Stress Relief
Whether or not an individual suffers from a mental health condition, many of us go through varying levels of anxiety and stress at some point. Meditation is a long-established approach to improve one’s mood and bring about a more relaxed state of mind. Though relaxation and meditation might not be the sole treatment for any specific condition, their health benefits can be positive.
Besides their work on phobias, VirtualRet and Psious also have solutions for relaxation and generalised anxiety. Another tech start-up created DEEP, a meditative VR game in which the player moves through a beautiful underwater environment. The character’s movement is controlled by the player’s breath: proper breathing techniques are at the heart of relaxation and meditation, so the customised DEEP controller allows the user’s breathing to correspond with what is shown in the digital environment and determines how the player navigates through it.
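As a generic illustration of this kind of control scheme (not DEEP’s actual controller; the sensor readings and mapping below are assumptions), a breathing-belt value normalised between 0 and 1 can be turned into forward movement speed, so that deep, deliberate breaths carry the player further than shallow ones.

def glide_speed(expansion_history, max_speed=2.0):
    """Map recent chest-expansion readings (normalised 0..1) to forward glide speed.

    The deeper the most recent breath (max minus min over the window), the
    faster the player drifts forward; shallow breathing produces little motion.
    """
    depth = max(expansion_history) - min(expansion_history)
    return max_speed * max(0.0, min(1.0, depth))

# Hypothetical readings from a breathing belt: shallow breaths, then one deep breath.
shallow = [0.45, 0.5, 0.48, 0.52, 0.47]
deep = [0.2, 0.5, 0.9, 0.95, 0.3]
print(round(glide_speed(shallow), 2), round(glide_speed(deep), 2))  # small vs large speed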
Unello Design have developed several relaxation and meditation apps for Oculus Rift and Google Cardboard. Examples include Eden River, a nature adventure, and Zen Zone, a guided meditation journey. Players can also explore “sound sculptures” with their 3D music apps.
One of the most famous relaxation apps is Cubicle Ninjas’ Guided Meditation VR, which provides a quartet of soothing, deeply engaging environments to experience.
Depression and Virtual Reality
Self-compassion therapy for depression at UCL
People with depression often lack compassion for themselves and are highly self-critical. One method that has shown promising results uses VR to place patients in situations where they can, indirectly, direct compassion towards themselves. The patient is asked to comfort a child who is sad; as they do so, the child stops crying and begins responding positively to the compassion. The patient then takes the role of the child and hears their own compassionate words played back to them, reversing the roles and effectively practising self-compassion.
Feedback from the use of VR for treating depression has been very promising, with patients reporting how they transferred their virtual experiences into real-life scenarios. VR presents a low-cost alternative therapy, increasing the accessibility of depression treatment and support.
Shining Light On Mental Darkness
The demand for improved mental health via VR has been well documented, based on years of scientific research. That said, this market remains in its early stages, since the technology, at least so far, has not been perfected. Measuring emotional responses through facial EMG is a well-established research method. With advances in sensor technologies, machine learning and artificial intelligence, Emteq are launching a low-cost platform for researchers in 2017, with consumer versions shortly thereafter. If you’re interested in learning more, do get in touch.