Can VR and AR Help with Phobias?

In line with Mental Health Awareness Month this May, we’re doing a series of articles looking at the relationship between the technologies we cover and mental health. Today, we turn our eye to the use of augmented and virtual reality to treat something almost all of us will be affected by: phobias.

What are Phobias?

Phobias differ from simple fears in that they are excessive and debilitating. Crucially, they are also irrational. Phobias encompass a host of fears, experienced to varying degrees. While almost all of us will be mildly fearful of something, it becomes problematic when that fear is more intense – to the point of preventing people from living their daily lives. Consider someone with a needle phobia who cannot visit a doctor, for instance. Phobias can develop from critical, traumatic events, but can equally just build slowly over time.

With that in mind, many people seek treatment to overcome their phobias, with a practice known as “exposure therapy” being one of the most widely deployed. This involves a therapist helping a patient overcome fears, anxieties and phobias by gradually introducing them to the object of their fears. Importantly, exposure therapy does not seek to “cure” phobias, instead equipping patients with the confidence to manage an encounter with the object of their fears.

The analogue version of such an approach involves watching videos or encountering a stimulus in the real world. Increasingly, however, VR and AR technology is being deployed instead. The genius of using AR and VR in such an approach is that patients can be exposed to a virtual representation of something that scares them while ultimately knowing that they are in a safe environment.

Just knowing that what you are seeing is virtual does not, of course, eliminate the fear response (as countless videos of people screaming while falling off virtual planks can attest). While Richie’s Plank Experience is less an exercise in curing phobias and more an exercise in exploiting them, there are plenty of serious attempts at using VR and AR in a clinical setting.

XR Treatments

Happily, this is not some pipe dream. AR and VR are currently being used in a host of real-world medical scenarios. And medical practitioners themselves are increasingly realising the benefits of the technology. According to GlobalData’s 2021 poll on digital health in neurology, 18% of 109 industry respondents thought AR and VR solutions would be the most suitable technology to treat mental and behavioural health conditions.

In 2020 in the UK, the Norfolk and Suffolk NHS Foundation Trust (NSFT) announced it had introduced virtual reality headsets to help patients combat phobias ranging from a fear of needles, heights, flying and spiders to agoraphobia and claustrophobia, as well as more unusual fears such as exams, driving, public speaking and storms.

“We are also able to completely control the virtual environment, such as changing the weather conditions for someone who is scared of flying, which can really help to prepare them to face every possible scenario in their everyday lives,” said Nesta Reeve, consultant clinical psychologist and clinical lead for the Wellbeing service. “Ultimately, we hope that more people will come forward to ask for help with their phobias now that we have this technology, and that using VR will help them to get better more quickly so that they can enjoy activities that many of us take for granted.”

VR also has enormous utility when it comes to placing people in locations it wouldn’t be safe to visit in the real world. For phobias, this is perfect for addressing a fear of heights, and NHS services in Oxfordshire and Buckinghamshire in the UK are duly offering VR exposure to heights.

Their approach even leverages a virtual therapist who appears as a computer-generated avatar, voiced by a real person. In a randomised clinical trial, the VR therapy was shown to be effective for fear of heights (which affects one in five people). After spending an average of just over two hours in the simulation, all participants showed a reduction in their fear of heights, with an average reduction of 68%.

Retired paramedic Richard, who received the treatment, said: “I lived with a debilitating fear of heights for all of my life and had to organise my life so that I completely avoided all situations that exposed me to heights and altitude as I would experience intolerable anxiety. Since having the VR-enabled therapy I can now go to my local shopping centre and I am able to freely walk around and go to all floors and even look out over the balcony.  This is something that would have been simply impossible for me to do before having this treatment.”

Help at Home

The use of AR and VR is a potentially cheaper option for treating patients in a clinical session, particularly for those who only need a small amount of contact with a therapist. It also means help can be accessed remotely, in the comfort of one’s own home.

This concept of self-help powers services like oVRcome, which offers a range of VR simulations and modules for phobias, anxiety and depression – for use by children, teens and adults. The company provides its own virtual reality headset for accessing the content, into which a smartphone is inserted to display it.

Meanwhile, the fact that AR can be so easily accessed from the ubiquitous smartphone means it has the greatest potential for at-home use, democratising exposure therapy by significantly lowering the barrier to entry. AR also confers its own benefits compared to VR, by blending a user’s own body and real surroundings with virtual elements. 

One example in the world of AR is Phobys, a smartphone app which is designed to reduce the fear of spiders. The experience was developed as part of a study which found the “intervention led to significantly lower subjective fear” over a controlled two-week trial. 

Effectiveness and Benefits

Among the other benefits of AR and VR exposure therapy is its repeatability (with patients allowed infinite attempts at their own pace) as well as privacy, with patients not being forced to reveal their fears to the outside world.

Interestingly, a study of the efficacy of VR for exposure therapy found its potential varied depending on the specific phobia. Compared with real-world exposure therapy, no major difference in efficacy was found when using VR. As such, the study recommends that VR exposure therapy should be disseminated further – especially considering the potential for new technologies and techniques to deliver even greater benefit.

Summary

It’s clear that AR and VR technologies have enormous potential for helping people safely overcome their phobias. Even in their relatively nascent stage, they are being put to use to help people in the real world. As technology develops, and virtual worlds become ever more realistic and interactable, there’s every chance AR and VR will become the treatment of choice for phobias.

Feeling the Scenic Burn With VZfit

I’ve always wanted to explore more of Scotland. It isn’t that far away and I’ve only visited as far north as Glasgow, so there’s a whole Highlands adventure to be had. The need to get out and about has never weighed heavier than it has recently, and I’m sure many of you have already decided on where you’d want to travel first. But what about right now? And what if you could explore far-flung places whilst getting healthier at the same time? That’s been the goal of VirZOOM for the last few years, and with the recent arrival of its VZfit app for Oculus Quest making it easier than ever, now seemed as good a time as any to get back on the bike.

If you’ve not heard of VirZOOM before, the company originally started out in VR hardware, launching a virtual reality (VR) bike controller in 2016 alongside two apps, VZfit Play and VZfit Explore. The first featured a bunch of basic mini-games such as riding horseback whilst lassoing people or piloting a helicopter through a canyon. But it was VZfit Explore that really caught people’s attention, using Google Maps’ Street View imagery so you could cycle through Rome or the Alaskan mountains.

On your bike…

And it’s that experience that has been refined into VZfit, enabling you to cycle almost anywhere in the world without leaving your living room. It’s this sole feature that sets VZfit apart from every other VR fitness app on the Quest platform. Whilst all the rest focus on rhythm-action elements, here you can simply enjoy the open road – and it certainly helps the whole illusion to have a desk fan nearby wafting a gentle breeze.

Opening up VZfit you’re presented with two options: continue with or without a bike. Choose without and you’ll hop onto the virtual ‘exerboard’ (more on that later); choose with and you can connect a cadence sensor to use a normal exercise bike. VirZOOM has now moved to being purely a software company, so it no longer makes the bike controllers (which VRFocus was using). Instead, it has opened up its software so that adding a $20 sensor to an exercise bike you already own removes a lot of the friction.

It can still be a little fiddly to begin with, however, as you need the Oculus controllers whilst sat on the bike to work through the options and get yourself set up. The quickest way was to dive straight into a pre-set course like the nice long roads of Colorado. There’s a wealth of options to tailor and refine your cycling experience, from a leisurely 10-mile cycle to a gruelling 50+ miles. Once on a ride, it’s best to delve into the in-game options, where you can adjust what information is shown – such as distance and time – alongside comfort options and, crucially, turning.

This is essentially a cycling title after all, so you can choose to steer around corners by physically leaning, aiding that immersive aspect. It does feel a little weird leaning left and right on a stationary bike, yet after a few miles that natural feeling does kick in. You can always select auto-turn if it isn’t to your liking.
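For the curious, lean-based steering of this kind typically boils down to mapping the headset’s roll angle to a steering value, with a dead zone so natural head wobble doesn’t send you off the road. Here’s a minimal sketch – the function name, angles and thresholds are illustrative assumptions, not VZfit’s actual implementation:

```python
import math

def lean_to_steering(roll_degrees, dead_zone=5.0, max_lean=25.0):
    """Map headset roll (lean) to a steering value in [-1.0, 1.0].

    A small dead zone ignores natural head wobble; beyond it,
    steering scales linearly up to a maximum lean angle.
    """
    magnitude = abs(roll_degrees)
    if magnitude <= dead_zone:
        return 0.0
    # Scale the lean past the dead zone into the 0..1 range, capped at 1.
    scaled = min((magnitude - dead_zone) / (max_lean - dead_zone), 1.0)
    return math.copysign(scaled, roll_degrees)

# Sitting upright barely moves the bike; a firm lean turns hard.
print(lean_to_steering(2.0))    # 0.0 (inside the dead zone)
print(lean_to_steering(15.0))   # 0.5 (half steering to the right)
print(lean_to_steering(-40.0))  # -1.0 (full steering to the left)
```

The dead zone is the important design choice here: without it, every glance over your shoulder would register as a swerve.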

Street View in VR

What you really want to know is how well Google Maps’ Street View imagery translates into VR, taking flat 2D images and dropping you in the centre. Good, to a point. Let’s be clear: the method doesn’t offer a seamless journey through picturesque countryside, because each image is taken a few metres apart. So a long, open US road with distant mountains works very well, whilst a tight, winding road in Europe doesn’t fare as well. Also, the fewer cars the better – it does break reality somewhat when there are too many flat cars in view.

Google’s imagery can also be a little erratic, so certain stretches of road may jump between different times of day, for example. Thankfully, these moments weren’t too frequent. One option VZfit gives you is the ability to make your own route, pinning an A point and a B point on a map to cycle between. I set up a route around Loch Ness – part of my Scottish adventure – and there were no real issues, just a pleasant ride around the loch. The only downside was not being able to actually stop, jump off the bike and take in the view.

Most importantly, it sure as hell beats staring at the four walls of my living room whilst providing some much-needed motivation. I quite quickly forgave the occasional janky imagery as it was refreshing to cycle somewhere new. Even more so when it came to mountainous regions which my fitness level is nowhere near achieving in the real world. Plus you can always adjust the tension on the bike for an increased workout.

No bike required

So what if you don’t have an exercise bike, don’t want one or simply don’t have space for one? As mentioned, that’s where the exerboard comes in, and where VZfit really adds value for money. As the screenshots showcase, think of the exerboard as a circular exercise mat on wheels: the more you move, the faster it’ll go – not too fast, mind you.

You can still head out on all the same routes – no difference there – but you’re not on your own, as there’s a trainer with you. They’ll be up front encouraging you to keep moving, with a bunch of workout routines for you to copy. These range from jogging on the spot to lunges, knee raises and loads more – about 50 at the moment – making you work muscles you’d forgotten about.

Because this option provided a full-body workout with less hassle than grabbing the bike from the corner of the room, using the exerboard soon became my go-to choice in VZfit. Being able to grab the Oculus Quest, open up the app and continue a route from where I left off was faultless, and fun. There’s even a basic radio offering a selection of genres if you need some music – use this Spotify trick if you have a premium account. You can also take a selfie next to a famous monument should the mood take you.

I could walk 500 miles…

VZfit is part of a new slate of VR titles for Oculus Quest which offer a subscription model. Rather than a one-off videogame price, you sign up for a monthly membership – like a gym, but without all the mirrors and having to, you know, leave the house – which is $9.99. You can download the app and try it out for seven days before deciding whether a membership is right for you.

Signing up for a subscription is always a personal choice, weighing up finances, value for money and whether the service is good enough against rivals. As I like to travel, for me VZfit ticks both mental and physical wellness boxes. My body’s getting the workout it needs and my mind has room to breathe and discover new places. So I don’t see the monthly admission cost as too much of a sting to cycle the fjords of Iceland.  

Surgeons Use Mixed Reality to Conference Call & Consult On Surgery During a Live Colonoscopy Operation

New immersive technologies such as virtual reality (VR), augmented reality (AR) and mixed reality (MR) are currently being used in many different ways. From gaming and automation to education and therapy, these immersive technologies are helping to train people as well as simplify communication between them. (If you need a quick comparison guide on these technologies, check out VRFocus‘s guide here.) For the first time, three surgeons from Mumbai and London became digital 3D avatars in an operating theatre at The Royal London Hospital and were able to speak to one another in real time to discuss how to operate on the patient, with the aid of pre-uploaded patient scans.

Aetho’s Thrive software on the Microsoft Hololens is an MR application all about connecting people and information in immersive environments. Professor Shafi Ahmed at the NHS’s The Royal London Hospital performed a colonoscopy operation on a patient while wearing a Microsoft Hololens – as seen in the image below. Professor Ahmed explains that they chose to do this project to “think about the way we communicate from doctor-to-doctor or doctor-to-patient.”

Professor Shafi Ahmed in an operating theatre, wearing a Microsoft Hololens.

Professor Ahmed was joined by Professor Shailesh Shrikhande, a Cancer Surgeon at Tata Memorial Hospital in Mumbai (the largest cancer hospital in India), as well as Mr Hitesh Patel, Consultant Colorectal Surgeon at BMI The London Independent Hospital. They were also joined by Ian Nott, Co-Founder and CTO of Aetho who was based in Atlanta, USA. All four participants wore a Microsoft Hololens, appearing as moving graphic avatars to one another, with each able to see and hear one another. They were able to look at pre-uploaded patient scans that appeared as three-dimensional holograms of the tumour. In the video below, you can see each specialist discuss and analyse the patient’s data through Professor Ahmed’s perspective, the footage captured from his Hololens.

VRFocus spoke to Professor Ahmed about the project in the video interview below. He explains that the team were connected into a virtual space where they could share the scans and images of the patient, interact with them and then discuss the case in more detail, similar to the multidisciplinary team meetings that surgeons normally hold in healthcare practices. The experience was like having a very ‘lucid conversation’ about the patient. Apparently, after you get past the initial shock of feeling like Iron Man, the experience is no different to having a person sit next to you and converse.

Professor Ahmed is very excited about being in the healthcare space right now and believes that it is undergoing the fourth industrial revolution. “It’s a question about globalization, if you want help and support – well actually the whole world can support them. These are the type of technologies that will connect people, make the world much smaller and actually make healthcare more equitable”, he says. For the future of surgery, he’d like to teleport or ‘holoport’ himself into another part of the world, walk around the room, stand over the surgeon’s shoulder, see what they’re doing, give advice and then disappear. Although this might seem far in the future, it’s the direction he sees it going and is something he is working on.

Aetho approached Professor Ahmed at Cannes Lions after seeing his talk about creating a digital avatar of himself using photogrammetry. Aetho were working on the concept of avatars, holograms and telepresence for their software Thrive. The two met and Professor Ahmed’s VR company Medical Realities then collaborated with Aetho and co-ordinated the project with the hospitals to do a world’s first MR conference call with 3D digital assets during a real-time surgery.

He explains that new technologies are sorely needed because, globally, there is an increasing demand for healthcare but not enough capacity to cope with it. Unfortunately, with little funding it’s difficult for public services like the NHS to justify new healthcare services. He hopes that by using new technologies such as these, healthcare can become better and more efficient, eliminating the need to travel in order to perform certain operations. He believes A.I. and robotic machines will take over routine jobs, and doctors as well as surgeons will have to re-design their roles in this future landscape.

Whatever the future holds, this is an exciting step for future healthcare operations. It could save a lot of money on expensive travel, save time on treating patients and free up time for doctors and surgeons to treat more patients. If you want to find out more about the project, watch the video below. You can also find out more about how immersive technology is being implemented into the world of healthcare with VRFocus‘s The VR Doctor and Emotion Sensing series.

Emotion Sensing In VR: The New Face of Digital Interaction

Since the first mainframe computers were created, designers and engineers have sought more intuitive ways to interact with communication technologies. Charles Babbage’s 19th-century Analytical Engine used punch cards and levers for interaction. Fast forward 100 years and the keyboard, and later the mouse, would become the dominant input methods for decades.

The advent of the laptop, and then the smartphone, helped popularise the touchpad and the touchscreen. But as the devices have become ever more intuitive and naturalistic, so have our methods of interaction. The overweight computational partner used to have a whole room to itself; it eventually slimmed down and sat on your desk. Next it was hopping onto your lap, which made you decide that you wanted to take it home. Thereafter you were never more than a few feet away, and you spent much of your time holding it in your hand. The next transition – the leap onto your face – is when computing really gets personal.

Thus interaction methods have changed from levers and buttons to be pulled and pushed, to keys that you tap, mice that you click, touchpads that you slide and touchscreens that you swipe.

With face-worn computers, the challenges and potential benefits are significant. As the computer’s screen size has diminished from the 19-inch desktop to virtually no screen at all (Magic Leap), we have introduced a need for new methods of interaction.

The 4 levels of interaction

I propose a classification of input devices that views VR/AR platforms from the perspective of their level of interactivity and chronological introduction. Viewed in this way, the missing link that stops VR/AR from becoming truly immersive becomes clear.

First generation AR/VR input

A first-generation input device can be considered the motion sensor incorporated into the head-mounted display (HMD), which translates head position into changes in the scene.

For VR this enables basic interaction by creating a pointer (reticle) on the screen.
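As a rough illustration of how such a gaze reticle works, the headset’s yaw and pitch can be converted into a forward ray, and whichever object lies closest to that ray (within a small angular threshold) is treated as “pointed at”. A minimal sketch follows – the function names, target format and 5-degree threshold are illustrative assumptions, not any particular SDK’s API:

```python
import math

def forward_vector(yaw, pitch):
    """Head yaw/pitch (radians) -> unit forward direction (x, y, z)."""
    return (
        math.cos(pitch) * math.sin(yaw),
        math.sin(pitch),
        math.cos(pitch) * math.cos(yaw),
    )

def pick_target(yaw, pitch, targets, max_angle=math.radians(5)):
    """Return the name of the target nearest the reticle, if any lies
    within the angular threshold; otherwise None.

    targets: dict mapping name -> (x, y, z) position relative to the head.
    """
    fx, fy, fz = forward_vector(yaw, pitch)
    best_name, best_angle = None, max_angle
    for name, (tx, ty, tz) in targets.items():
        length = math.sqrt(tx * tx + ty * ty + tz * tz)
        # Angle between where the head points and the target direction.
        cos_a = (fx * tx + fy * ty + fz * tz) / length
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

targets = {"door": (0.0, 0.0, 5.0), "window": (5.0, 0.0, 0.0)}
print(pick_target(0.0, 0.0, targets))               # door (straight ahead)
print(pick_target(math.radians(90), 0.0, targets))  # window (head turned right)
```

Real engines do the equivalent with a raycast against collision geometry, but the principle – head orientation becomes a pointer – is the same.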

Second generation AR/VR input

VR systems literally and metaphorically made a step forward when the software was able to interact with the user’s limbs.

A variety of pointing devices were introduced in the 1990s, notably gloves and treadmills, which enable the wearer to move around and include a representation of their hands in the virtual scene. Even without tactile feedback, the introduction of wireless and camera-based limb tracking, such as the Leap Motion for VR and similar technologies for AR, considerably improved interactivity.

Third generation VR input

Until recently, wearable eye-tracking has been a niche and comparatively expensive technology, mostly confined to academic uses and market researchers.

However, the potential for foveated rendering has increased interest with the promise of a marked reduction in the computational demands of high resolution, low latency image display.

The other benefit of adding eye-tracking to VR is that it enables more realistic interactions between the user and virtual characters. Speech recognition, a technology that has also benefited from the smartphone revolution can add to eye-tracking by enabling categorical commands such as looking at a door and saying ‘open’.
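A sketch of how such gaze-plus-speech commands might be fused: each recognised word is paired with whatever object the user fixated on most recently, within a short time window. The event format, function name and one-second window here are illustrative assumptions rather than any shipping system’s design:

```python
def fuse_gaze_and_speech(gaze_events, speech_events, window=1.0):
    """Pair each spoken command with the object the user was looking at
    most recently (within `window` seconds), yielding actions such as
    ("open", "door").

    gaze_events:   list of (timestamp, object_name)
    speech_events: list of (timestamp, command_word)
    Both lists are assumed sorted by timestamp.
    """
    actions = []
    for spoken_at, command in speech_events:
        # Find the most recent fixation no older than `window` seconds.
        target = None
        for looked_at, obj in gaze_events:
            if looked_at <= spoken_at and spoken_at - looked_at <= window:
                target = obj
        if target is not None:
            actions.append((command, target))
    return actions

gaze = [(0.2, "door"), (3.0, "lamp")]
speech = [(0.8, "open"), (3.4, "on")]
print(fuse_gaze_and_speech(gaze, speech))  # [('open', 'door'), ('on', 'lamp')]
```

The time window is what makes the pairing robust: a command spoken long after the user looked away simply resolves to nothing.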

The major players have all purchased eye-tracking companies. Google acquired Eyefluence, Facebook purchased the Eye Tribe, and Apple has bought SensoMotoric Instruments (SMI).

Fourth Generation VR Input

Facial expressions are the important missing element from VR and AR interactions. HMDs with depth cameras attached have been used to visualise the lower face (e.g. Binary VR), but whether this method proves popular in the future is yet to be seen.

Three potential reasons why this approach may be problematic relate to i) the way humans interact, ii) ergonomic concerns and iii) computational and battery life considerations.

One learning from eye-tracking research is that during face-to-face interactions we infer information from the eye region. Surprise, anger, disgust and a genuine (Duchenne) smile all require visibility of the brow area and the skin typically covered by the HMD. Hao Li, for Oculus research, has incorporated stretch sensors in the foam interface of the HMD to derive information from behind the headset, and it will be interesting to see how this performs when the final version is released.

Mindmaze have revealed their Mask prototype, which requires the user to wear a clip on their ear and, according to one account, “conductive gel” on the skin. Samsung have also announced a development nicknamed “FaceSense”, although details are still limited.

Emteq’s solution is called FaceTeq and is a platform technology that uses novel sensor modalities to detect the minute electrical changes that occur when facial muscles contract. With each facial expression a characteristic wave of electrical activity washes over the skin and this can be detected non-invasively and without the need for cameras.
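As a loose illustration of the principle (not Emteq’s actual algorithm), detecting a muscle contraction from a facial EMG trace can be reduced to three steps: rectify the raw signal, smooth it into an amplitude envelope, and flag when that envelope crosses a threshold. The window size and threshold below are arbitrary illustrative values:

```python
def emg_envelope(samples, window=5):
    """Rectify an EMG trace and smooth it with a moving average,
    giving a rough amplitude envelope of muscle activity."""
    rectified = [abs(s) for s in samples]
    envelope = []
    for i in range(len(rectified)):
        start = max(0, i - window + 1)
        chunk = rectified[start:i + 1]
        envelope.append(sum(chunk) / len(chunk))
    return envelope

def detect_activation(samples, threshold=0.5, window=5):
    """Return True if the smoothed EMG amplitude ever crosses the
    threshold, i.e. the muscle visibly contracted."""
    return any(level > threshold for level in emg_envelope(samples, window))

# Quiet baseline noise vs. a burst of activity (a 'smile').
resting = [0.05, -0.04, 0.06, -0.05, 0.04, -0.06, 0.05]
smiling = [0.05, -0.04, 0.9, -1.1, 1.0, -0.95, 0.08]
print(detect_activation(resting))  # False
print(detect_activation(smiling))  # True
```

A production system would go on to classify *which* muscles fired and map that to an expression; this sketch only shows the raw detection step.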

Our lightweight, low-cost open platform will herald the 4th generation of VR. Researchers, developers and market researchers will undoubtedly be the initial adopters. However, the real advance will be the ability to enable face-to-face social experiences. There are so many areas where facial expressions in VR could improve communication and interactivity. We will be opening our platform and are excited to see what ideas developers come up with. At Emteq, we’re passionate about fostering the 4th generation of AR/VR interaction. We look forward to partnering with headset manufacturers and content creators.

Follow us to learn more about the possibilities and to stay up to date with developments, whilst you can of course also follow VRFocus for ongoing developments in the technology space at large.

Getting Social in VR

Evidence for the human need to share experiences stretches back to the earliest cave paintings. Scenes from real life or the artist’s imagination were recreated and displayed for others to share.   Social interactions have accompanied almost all communication platforms. Reading and writing facilitated theatre and the formal play, early movies facilitated the cinema industry, radio and television broadcasts resulted in families and friends huddled around a single device to consume sports and entertainment. This article explores how innovators are making Virtual Reality (VR) a social experience.

Social interaction within VR can be distilled into three core elements – speech, movement, and emotional expression. Speech is easily captured and communicated using a microphone and VoIP. All VR devices support capture of head movement, and many capture arm/hand movement. A surprising amount of non-verbal communication can be inferred from these movements, particularly gestures and gesticulations. Whilst elements of body language can be communicated in VR, the communication of emotional expression is lacking. As a workaround, some VR social apps rely on user-triggered emoticons and arm movement tracking to imply feelings and reactions. We’ve seen some interesting demos from Oculus and others using cameras to capture mouth movements. Eye tracking will provide improved face-to-face interaction, but eye tracking by itself is not sufficient.

Whilst we await the release of emotionally expressive VR, there are still a number of companies creating social platforms. One of the largest is AltspaceVR, founded by former SpaceX engineer Eric Romo. AltspaceVR is freemium software that supports high- and low-end VR headsets, as well as a 2D experience on computers and mobile. It allows users to chat, watch videos, and join a range of special events, from NBC News Q&A sessions to live music. Like many early social VR spaces, it’s similar to a VR-based Second Life – built less around sophisticated communication, and more around sharing experiences.

AltspaceVR focuses on simplicity and shared experiences.

For emotional interaction, Altspace focuses largely on voice and physical movement. As platform-agnostic software, it features many ways to communicate physical movement for social interaction – everything from simple controller-based movement, through to full-body motion tracking with Microsoft’s Kinect. However, this approach limits the sophistication of social interaction between platforms – users won’t often have equally elaborate set-ups, and so some modes of interaction might not be reciprocated. In terms of emotional expression Altspace supports a range of emoticons, largely selected by the user through a menu. It also supports eye tracking but, again, this is dependent on the VR platform being used supporting it. The main focus appears to be on connecting with friends and sharing experiences like live events or streamed video in a VR setting, which it does very effectively.

In real life, gaming is naturally a social experience, so it’s inevitable that social spaces are being built to enable playing together. The gaming community has always been quick to embrace new technology that allows them to share play time in new ways. As such, many social applications for VR are heavily game-based, offering up a variety of minigames and tools for users. For example, Sports Bar VR offers competitive pool, darts, and skeeball, Anyland invites users to add and tinker with anything (really, pretty much anything) in their avatars or environment, and Rec Room has online multiplayer paintball, dodgeball, disc golf, charades, and more. These games have simple avatars, often cartoony and without arms, but all players can communicate through voice, movement, emoticons, and hand gestures. In Rec Room, a fist bump results in an explosion of light – physical interaction is used to perform actions, and with that bump you’ve formed a private party to go play paintball.

Rec Room’s use of the game charades is great for showing the capacity for fun brought with physical interaction in the digital world; getting someone halfway across the world to correctly guess that you’re acting out the movie Jaws in your office is a strange but compelling pastime. VR gaming social spaces focus on the fun of physically interacting and exploring the world and other users around you, and anything they miss in the subtleties of communication are often compensated for with absurdity and silliness from fellow players.

Gaming spaces like Rec Room revel in communicating through exaggerated avatars and situations.

In April Facebook finally launched its own foray into social VR with Facebook Spaces. In Spaces, users are represented by a cartoon avatar, with customised hair, face, and clothing. Spaces integrates Facebook services heavily – users can share photos and videos, take their own inside the space (to share on Facebook, of course), play simple games, or call non-spaces users through Messenger.

Interaction in Facebook Spaces is simple, but effective.

Facebook Spaces is part of a third subset of social VR applications – one step beyond sandboxes like AltspaceVR that focus on sharing content, Spaces is a polished experience built around all aspects of communication. Spaces is sophisticated and modern, and seems to pay quite a lot of attention to conveying authentic interaction. The Oculus Rift headset’s tracking communicates head, arm, and hand gestures to others in the social space reliably and universally. Facebook also invested time in making human-like avatars. Development lead Mike Book stated, “Facebook is about authentic identity, which is fundamentally about humans”, and this ethos is carried through to Spaces’ characters, who are stylised, but also authentically human and full of emotional range.

Facebook Spaces’ avatars, though stylised, look and feel human in their actions.

What makes Facebook Spaces interesting is the focus on communicating the emotional aspects of conversation. Like many similar applications, avatars’ mouths move in time with microphone output. In addition, the eye positioning of all users is interpreted, creating “eye contact” with others. Given that eye contact is a key form of nonverbal communication, this is a very important development. Spaces also integrates a wide range of emoticons, triggered by movement and by buttons on the Oculus touch controllers. Movement-based emoticons enable some spontaneity in the conversation, but, as Book says, “You have to invoke them. They’re not supposed to be accidental.” The need to deliberately remember to respond in a certain way abstracts emotional communication. Nevertheless, interaction-focused social spaces in VR are making big steps forward to providing authentic human communication in the space.
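To make the idea of a movement-triggered emoticon concrete, here is a deliberately simple sketch: keep a streak counter and fire a “celebrate” emote once both hands have stayed above head height for a few consecutive frames. The gesture, heights and frame counts are illustrative assumptions, not Facebook’s implementation:

```python
def detect_hands_up(frames, head_height=1.6, hold_frames=3):
    """Fire a 'celebrate' emote once both hands stay above head
    height for `hold_frames` consecutive frames.

    frames: list of (left_hand_y, right_hand_y) samples in metres.
    Returns the frame index at which the emote triggers, or None.
    """
    streak = 0
    for i, (left_y, right_y) in enumerate(frames):
        if left_y > head_height and right_y > head_height:
            streak += 1
            if streak >= hold_frames:
                return i
        else:
            streak = 0  # gesture interrupted, start over
    return None

# Hands rise, hold for three frames, then drop: triggers on frame 3.
frames = [(1.2, 1.3), (1.7, 1.8), (1.8, 1.9), (1.9, 1.8), (1.2, 1.1)]
print(detect_hands_up(frames))  # 3
```

The hold requirement is exactly the “they’re not supposed to be accidental” principle: a momentary flail shouldn’t fire an emote, only a deliberate, sustained pose.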

BigScreen VR has an interesting approach. Here, the social element largely revolves around sharing 2D content within VR. Users can share their work, games or entertainment content by allowing others to view their PC screen. Lip sync and inferred gaze tracking add to the interactivity of the cartoony avatars. According to CEO Darshan Shankar, engagement levels have been impressive, and to show its commitment to this new way of collaborating, the company holds its meetings in VR.

Most VR platforms can be divided into these three subsets – sharing experiences, gaming, or authentically communicating. In the fledgling VR industry, developers largely haven’t yet looked to tackle more than two of these at a time. While sharing experiences and gaming in VR are natural fits that have seen massive growth, authentic communication in VR is still difficult to implement successfully. While almost all platforms support good interactions in speech and movement, emotional expression is still largely based on emoticons that have to be purposefully triggered by users.

At Emteq, we are working to deliver a virtual reality experience that can interpret and respond to a user’s emotional state. Our Faceteq™ technology allows user avatars to react in conjunction with the user’s own facial expressions – essential to truly authentic communication. Our expression recognition solution will integrate with common headsets and capture the wearer’s expressions accurately. We believe this affective computing is the key to authentic VR and AR social interaction, and will open up new avenues in digital social spaces. If you’re interested in learning more, do get in touch.


Emotion Sensing In VR: How VR Is Being Used To Benefit Mental Health

Cancer and heart disease are well-recognised causes of ill health; however, a recent report by the Institute of Global Health reveals that in terms of the impact on quality of life (measured by Disability-Adjusted Life Years, DALYs), mental health conditions affect more people than cancer and heart disease combined.

Furthermore, according to the World Health Organisation, approximately one in four of us will suffer from some kind of mental disorder. Those who don’t may still experience substantial anxiety and stress. Mental health disorders and psychological conditions affect our day-to-day activities, and in the U.S. they cost taxpayers about $467 billion in medical expenses ($2.5 trillion globally).

Alternative drug-free techniques like exposure therapy (ET) and cognitive behavioural therapy (CBT) have been shown to be very effective in overcoming conditions like phobias, anxiety disorders, panic disorders, obsessive-compulsive disorder (OCD) and PTSD, amongst others. However, for many healthcare providers, drug-based treatments remain the mainstay in spite of their potential side effects and marginal benefits.

Virtual Reality and Mental Health

An exciting opportunity to minimise reliance on pharmacological treatments for mental health lies in Virtual Reality (VR). Advances in VR technology allow you to enter a world that is authentic enough to trigger your mind and body to behave as if it were the real world.

Exposure Therapy (ET) using VR is an increasingly popular alternative method amongst some practitioners to administer safe and regulated therapy for patients suffering from mental health conditions. Previously, technological and cost barriers have limited the use of Virtual Reality Exposure Therapy (VRET) to the private sector. The introduction of mobile VR headsets, including the Gear VR, presents an opportunity to use telemedicine for mental health treatment.

The use of VRET could lead to mobile tele-therapy that works in tandem with in-clinic VR therapy. Further still, patient-directed VR therapeutic approaches that don’t require the therapist to be physically present are already in operation.

As a relatively new form of treatment, more data from well-designed large trials and clinical evidence is needed to support VR as an effective tool for therapy. Organisations who want to target this market must conduct randomised controlled clinical trials to prove the efficacy of VRET. Once the technology’s effectiveness has been established, there will inevitably be an influx of VR apps attempting to digitally treat mental health issues.

Exposure therapy and cognitive behavioural therapy (CBT)

CBT is a psychotherapeutic treatment provided by a therapist specialised in mental health disorders. It involves patients participating in a number of sessions that concentrate on an isolated issue, helping the patient recognise and modify troublesome thoughts, feelings, and patterns that produce negative behaviours and beliefs. 

CBT can complement ET well, which, over time, gives patients the confidence to confront their disturbing fears and thoughts head-on. This reduces the peak anxiety an individual experiences when faced with anxiety triggers.

VR is capable of isolating anxiety-related stimuli with a controlled and safe approach. Even though ET goes hand-in-hand with VR, there are several other psychiatric conditions, including autism (see below) and childhood developmental disorders, where VR might have a more active role in the coming years.

Using Virtual Reality to Treat PTSD

VR has been used to deliver prolonged exposure therapy to PTSD patients since the early 90s, mostly for war veterans and soldiers. This is particularly the case in America, where approximately 8 million adults suffer from PTSD. Dr. Albert “Skip” Rizzo is a pioneer in this field. His software, named Bravemind, was created in partnership with the VR therapy specialists Virtually Better.

The system comprises a customisable and controllable VR environment, a vibro-tactile platform that provides sensations relevant to explosions and firefights, and a scent machine that emits smells like garbage, diesel fuel and gunpowder. All of these simulated sensory components are released at precise times to enhance the digital scenario. Multiple clinical studies have examined the effectiveness and safety of Bravemind, and ongoing trials are looking into the use of Bravemind VR therapy for sexual trauma in the military.

Recent research found that VR therapy alone was as effective as a combination of medication and VR therapy. In a direct comparison, medication produced worse outcomes for patients than VR therapy did.

Private and smaller organisations producing VR therapy software on tight budgets will still have to prove clinical effectiveness if they intend to make an impact on this market.

Severe Paranoia Treatment Using Virtual Reality

In Britain today, 1–2% of the population suffers from paranoia: a significant mistrust of others, to the extent of feeling threatened. In social situations, sufferers may adopt defence mechanisms such as reducing eye contact or shortening social interactions – at its most severe, avoiding social interaction altogether. These behaviours only reinforce the paranoia.

Oxford researchers are trialling VR treatments for paranoia. Professor Daniel Freeman and colleagues from Oxford University used VR to test whether patients could ‘re-learn’ that social situations are safe, and to reduce the use of defence mechanisms when feeling threatened. Patients were immersed in simulations of train rides, a lift or an airplane, in which they must encounter several people, with the scenes gradually made busier. Using VR in this way allows patients to come face to face with their fears and attempt to overcome them. The patients then transferred the techniques practised in VR to real life, showing a significant reduction in paranoid feelings, with 20% no longer presenting severe paranoia symptoms.

Treating Anxiety Disorders & Phobias With Virtual Reality

Approximately 40% of disability worldwide is due to anxiety and depression, and anxiety disorders cost the US an estimated $42 billion annually. With so many people affected, and given the significant cost this represents, technology provides an opportunity to decentralise treatment. Alternatives such as self-guided therapy and telemedicine offer low-cost and potentially equally effective results.

Phobias affect the behaviour of approximately 19 million Americans. A recent review of 14 clinical trials suggested that Virtual Reality Exposure Therapy (VRET) is effective in treating phobias. Outlined below are some examples of organisations using VR to treat anxiety disorders.

  • The Virtual Reality Medical Center has a procedure to treat those afraid of being on an airplane. The system consists of hardware and software, as well as mock airplane seats, and even a subwoofer system to imitate the sounds, sights, and experience of flying.

  • Virtually Better developed a program that treats phobias of public speaking, heights and thunderstorms. The business is working with top-name schools and research facilities on research and development projects into childhood social phobias and anxiety.

  • CleVR is a Netherlands-based business constructing VR systems to treat fear of heights, flying and social phobias, all based on scientific research. The organisation is conducting experiments to examine the effectiveness of VR as a therapeutic approach to treating social phobias and psychosis. Using dynamic virtual emotion technology, the general atmosphere of such simulated social scenarios can be regulated.

  • Psious is a business in Spain that provides a toolkit therapists can use to control and administer VRET, in order to treat patients with phobias. The toolkit consists of VR hardware, a programmable software platform, and devices for biofeedback.

  • VirtualRet is a tool therapists and psychologists can use to treat and evaluate phobias, including flying, public speaking, the sight of blood, heights, and public places. The developers offer a variety of hardware, virtual environments and parallel services.

  • A Swedish business named Mimerse is creating psychological tools for VR treatment, and hopes to partner with Stockholm University and the Swedish Government for mass-market use. Their initial program, “Itsy,” is a game that treats arachnophobia through a digital therapist. In conjunction with the game’s release on the Gear VR app store, a controlled study is currently being run comparing VRET using Itsy against actual exposure therapy. Since most people with phobias don’t obtain professional treatment, mass-market games such as Itsy may provide tremendous value for people all over the world.

High Functioning Autism and Virtual Reality

Autism can be classified in many ways. At one end of the spectrum is high-functioning autism, with Asperger’s-type symptoms: delayed motor skills, limited understanding of abstract language, and obsessive interest in specific items or information. VR can provide a platform where children can safely practise and enhance their social skills.

Virtual Reality Social Cognition Training (VR-SCT) is able to support children and adults at different ages, adjusting the scenarios depending on the stage of development. For children with autism this could include confronting a bully for the first time, or meeting a new peer. Contexts may remain the same, but the content and complexity may differ depending on age. 

Recent studies have suggested that VR-SCT can benefit a child’s emotion recognition, social attribution and executive function of analogical reasoning. Patients were able to practise a dynamic range of social encounters with outcomes dependent on their responses. VR-SCT therefore allows for meaningful, close-to-life scenarios with immediate feedback, enhancing the child’s development.

Virtual Reality for Meditation and Stress Relief

Whether or not an individual suffers from a mental health condition, many of us go through varying levels of anxiety and stress at some point. Meditation is a long-established approach to improve one’s mood and bring about a more relaxed state of mind. Though relaxation and meditation might not be the sole treatment for any specific condition, their health benefits can be positive.

Besides their work on phobias, VirtualRet and Psious also have solutions for relaxation and generalised anxiety. Another tech start-up created DEEP, a meditative VR game in which the player moves through a beautiful underwater environment. The character’s movement is controlled by the player’s breath. Proper breathing technique is at the heart of relaxation and meditation, so DEEP’s customised controller maps the user’s breathing to what is shown in the digital environment, determining how the player navigates through it.
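The essence of a breath-to-movement mapping like DEEP’s can be sketched in a few lines. This is a hypothetical illustration, not the game’s actual code: the class name, the smoothing scheme and all the constants are assumptions. A chest-expansion sensor reading is smoothed to remove jitter, and its deviation from a resting baseline drives the forward speed, so slow, deep breaths propel the player while shallow breathing keeps them still.

```python
class BreathController:
    """Hypothetical breath-to-movement mapper: deep breathing propels the
    player forward; shallow breathing near the baseline keeps them still."""

    def __init__(self, baseline=0.5, smoothing=0.9, gain=2.0):
        self.baseline = baseline    # resting chest-expansion reading (0..1)
        self.smoothing = smoothing  # exponential smoothing factor
        self.gain = gain            # how strongly breath depth drives speed
        self.level = baseline       # smoothed sensor state

    def update(self, sensor_reading):
        """Feed one chest-expansion sample; return forward speed in 0..1."""
        # Exponentially smooth the raw sensor value to ignore jitter.
        self.level = self.smoothing * self.level + (1 - self.smoothing) * sensor_reading
        # Deviation above the resting baseline drives movement.
        depth = max(0.0, self.level - self.baseline)
        return min(1.0, depth * self.gain)

ctrl = BreathController()
ctrl.update(0.5)  # at rest: no movement
ctrl.update(0.9)  # a deep breath: speed gradually ramps up
```

Because the smoothing makes speed build up and decay gradually, movement stays calm and continuous – a design property that matters for a meditative experience, where abrupt locomotion would work against relaxation.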

Unello Design have developed several relaxation and meditation apps for Oculus Rift and Google Cardboard. Examples of these are Eden River, a nature adventure, and Zen Zone, a supervised meditation journey. Players can also check out “sound sculptures” with their 3D music apps.

One of the most famous relaxation apps is Cubicle Ninjas’ Guided Meditation VR, which provides a quartet of soothing, deeply engaging environments to experience.

Depression and Virtual Reality

Self-compassion therapy for depression at UCL

Depression often involves a lack of self-compassion and a high degree of self-criticism. One method with promising results uses VR to place patients in situations where they can indirectly show themselves compassion. The patient is asked to comfort a crying child; as they do so, the child stops crying and begins responding positively to the compassion. The patient then takes the role of the child and receives their own compassionate words back, reversing the roles and effectively practising self-compassion.

Feedback from the use of VR to treat depression has been very promising, with patients reporting that they transferred their virtual experiences into real-life scenarios. VR presents a low-cost alternative therapy, increasing the accessibility of depression treatment and support.

Shining Light On Mental Darkness

The demand for improved mental health via VR has been well documented through years of scientific research. That said, this market remains in its early stages, since the technology has not yet been perfected. Measuring emotional responses through facial EMG is a well-established research method. With advances in sensor technologies, machine learning and artificial intelligence, Emteq are launching a low-cost platform for researchers in 2017, with consumer versions shortly thereafter. If you’re interested in learning more, do get in touch.
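For readers curious about what facial-EMG measurement involves in practice, a classic first step in the research literature can be sketched as follows. This is a generic textbook-style illustration, not Emteq’s Faceteq implementation; the function names and the activation threshold are assumptions. The raw signal is full-wave rectified and smoothed into an amplitude envelope, which is then thresholded to detect activity in a facial muscle such as zygomaticus major (active when smiling).

```python
def emg_envelope(signal, window=50):
    """Classic facial-EMG feature: full-wave rectify the raw signal,
    then take a moving average to obtain an amplitude envelope."""
    rectified = [abs(s) for s in signal]
    envelope = []
    for i in range(len(rectified)):
        start = max(0, i - window + 1)
        segment = rectified[start:i + 1]
        envelope.append(sum(segment) / len(segment))
    return envelope

def muscle_active(signal, threshold=0.1, window=50):
    """Crude activation detector: the muscle counts as active if the
    envelope ever crosses a (hypothetical) threshold."""
    return max(emg_envelope(signal, window)) > threshold
```

Real systems go much further – bandpass filtering, per-user calibration, and machine-learned classifiers over multiple muscle sites – but rectify-and-smooth remains the foundation of turning noisy muscle potentials into an expression signal.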