Emteq Aims to Humanize VR by Capturing Your Facial Expressions
As researchers and VR start-ups make great strides toward a stronger sense of presence through larger fields of view, lower latency and more natural input, a Brighton-based start-up has announced a new technology that can track a user’s facial expressions without markers.
Emteq, which was founded with $1.5 million of seed funding, combines miniature sensors and artificial intelligence to let digital avatars mirror the user’s expressions and mood; the company hopes the technology will enhance social engagement in the growing social VR space.
Emteq’s technology, called “Faceteq”, builds a range of biometric sensors into the faceplate of VR headsets such as the Oculus Rift, using techniques including electromyography (EMG), electrooculography (EOG) and heart rate monitoring to track the electrical signals generated when facial muscles move.
Unlike camera-based facial tracking systems, it also registers movement in the eye, forehead and cheek muscles hidden underneath a VR headset. Although the technology has clear applications for gaming and social VR apps such as vTime and the upcoming Facebook VR platform, the founders also plan to use the recorded data to analyse how audiences react to conventional films and TV advertising.
Graeme Cox, chief executive and co-founder of Emteq and a serial tech entrepreneur, said: “Our machine learning combined with facial sensor technologies represents a significant leap forward in functionality and form for developers looking to introduce human expression and mood into a digital environment.
“Imagine playing an immersive role playing game where you need to emotionally engage with a character to progress. With our technology, that is possible – it literally enables your computer to know when you are having a bad day or when you are tired.”
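Emteq hasn’t published its processing pipeline, but the description above (muscle signals plus machine learning) maps onto a familiar pattern in surface-EMG work: window the raw sensor stream, extract simple per-channel features, and feed them to a supervised classifier that drives an avatar’s face rig. The sketch below is a minimal, purely illustrative Python example of that pattern on synthetic data; the feature set, channel count and expression labels are assumptions, not Emteq’s design.

```python
# Illustrative only: not Emteq's actual pipeline. Shows the general shape of
# an EMG-to-expression classifier: windowed signals -> features -> model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def emg_features(window: np.ndarray) -> np.ndarray:
    """Classic surface-EMG features per channel: mean absolute value,
    root mean square, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, rms, zc])

# Synthetic stand-in data: 200 windows of 64 samples across 8 hypothetical
# electrode channels, each labelled with one of three expressions.
windows = rng.normal(size=(200, 64, 8))
labels = rng.integers(0, 3, size=200)  # 0=neutral, 1=smile, 2=frown

X = np.array([emg_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

# At runtime, each new sensor window would be mapped to an expression label
# that could then animate the corresponding blendshapes on a digital avatar.
print(clf.predict(X[:5]))
```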
Emteq joins a small but growing group of companies hoping to bring the user’s facial expressions into VR applications. FOVE VR tracks a user’s eye movements, letting people take actions through gaze and blinking; it also makes viewing VR scenes more natural by allowing focus to shift with the eyes, as it does in reality.
In July we reported that Kickstarter project Veeso was aiming to be first to market with a mobile VR headset that tracks facial movement, using head-mounted infrared cameras to follow both the eyes and mouth for a greater sense of presence in virtual reality. Unfortunately that project was cancelled due to lack of funding, but what is exciting about Emteq is that its technology won’t be restricted to one headset.
What matters now is whether companies such as Emteq can garner enough support from developers and produce the required plugins for game engines such as Unity and Unreal to unlock the technology’s true potential.
Road to VR will be visiting Emteq within the next few weeks for a closer look and to try the technology first hand.
‘vTime’ Releases New Avatar Customization Tool
vTime, the social VR platform from Starship Group, today released an update to its avatar creation tool that gives users access to thousands of customization options, promising to “let you reflect more of yourself inside vTime.” The new feature is available across all of the app’s supported headsets, including Google Cardboard, Oculus Rift and Gear VR.
With the so-called ‘New vYou’ tool, users can control a wide range of characteristics such as body type, skin tone, hair, eyebrows, eye shape, eye color, nose, lips, ears, cheeks, jaw, face length, facial hair, and even age, providing a much more detailed way of recreating yourself in virtual reality. One thing the devs intentionally left out, however, is a hard gender distinction, meaning you can mix and match whatever clothes or body style you like to get the desired effect.
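vTime hasn’t published its avatar data model, but the slider-and-swatch system described above suggests a flat set of independent parameters with no gender gate on clothing or body style. The Python sketch below is hypothetical; every field name and value is an assumption for illustration only.

```python
# Hypothetical sketch of a slider-based, gender-neutral avatar schema.
# Not vTime's actual data model; all field names are illustrative guesses.
from dataclasses import dataclass, field

@dataclass
class Avatar:
    body_type: int = 0          # index into available body presets
    skin_tone: str = "#c68642"  # any swatch color, with no gendered gating
    hair: int = 0
    eye_shape: int = 0
    eye_color: str = "#5b3a1e"
    face_length: float = 0.5    # continuous sliders normalized to 0..1
    age: float = 0.5
    clothing: list[str] = field(default_factory=list)  # mix and match freely

# Because clothing and body style are independent fields rather than being
# gated behind a gender flag, any combination is valid:
avatar = Avatar(body_type=2, clothing=["suit_jacket", "floral_skirt"])
print(avatar)
```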
vTime Managing Director Clemens Wangerin says the new gender-neutral avatar system “sees vTime become truly diverse and inclusive, allowing us to see the faces and uniqueness of each of our users – something that has been missing from the wider social VR community, until now.”
To test the system, I tried to rebuild my face without looking at a mirror or a picture of myself. The results, although not perfect, are pretty good for only about five minutes of playing around with the tool’s sliders and color swatches.
But as anyone who spends an inordinate amount of time in character builders knows, a nondescript white guy is probably the easiest avatar to make, so I set off to max out the sliders and create something with a little more flair. In all earnestness, I was hoping for a result that the McElroy brothers over at Polygon’s YouTube show Monster Factory would be proud of, but the result was decidedly much more human than monster.
Disheartened somewhat by the fact that in my quest to build a monster I had actually ended up making a funky web developer, I drifted on to the second-favorite pastime of avatar creators: celebrities.
With the basic set of preset avatars at my disposal, I was able to create Samuel L. Jackson’s character from Django Unchained (2012) and a hipster version of Mahatma Gandhi, both in about 10 minutes.
Overall, vTime’s new avatar system proves to be pretty robust, offering a wide enough variety of clothing options, skin tones, and facial feature sliders to create any basic human form, be it young or old.
vTime says that the social VR app, along with the new character builder, is coming soon to iOS, Google Daydream devices, PlayStation VR and HTC Vive. The studio maintains that new features, events, tools and further customization options will be coming out next year.