Kite & Lightning Uses iPhone X in Experiment to Create ‘Cheap & Fast’ VR Facial Mocap

Packing a 7-megapixel TrueDepth front-facing camera system, Apple’s iPhone X can do some pretty impressive things with its facial recognition capabilities. While unlocking your phone and embodying an AR poop emoji are fun features, the developers at Kite & Lightning just published a video of an interesting experiment that aims to use the iPhone X as a “cheap and fast” facial motion capture camera for VR game development.

Created by Kite & Lightning dev Cory Strassburger, the video uses one of the studio’s Bebylon Battle Royale characters (work in progress) to demonstrate just how robust a capture the iPhone X can provide. Flexing through several facial movements, replete with a hammy New York(ish) accent, Strassburger shows off some silly sneers and a few cheeky smiles that really show the potential for capturing expressive facial movement.

While this is still just a quick first test, Strassburger says the iPhone X can already drive a character’s blendshapes at 60fps while tracking 52 motion groups across the face, but that “there’s a bit more to be done before I hit the quality ceiling in regards to the captured data.”
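Strassburger doesn’t share his capture code, but the motion groups he mentions correspond to the per-frame blendshape coefficients that ARKit exposes on the iPhone X through ARFaceAnchor. A minimal Swift sketch of reading those coefficients and pushing them onto a character’s morph targets might look like the following, where BebyCharacter and its setBlendshape(named:weight:) method are hypothetical stand-ins for the studio’s own rig code:

```swift
import ARKit
import SceneKit
import UIKit

/// Hypothetical stand-in for the Bebylon character's facial rig.
final class BebyCharacter {
    func setBlendshape(named name: String, weight: Float) {
        // A real implementation would drive the matching morph target on the skinned mesh.
    }
}

final class FaceCaptureViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()
    private let character = BebyCharacter()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking is only supported on devices with a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this each time the tracked face updates, up to 60 times per second.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // blendShapes holds the tracked coefficients (0.0–1.0), one per facial motion group.
        for (location, weight) in faceAnchor.blendShapes {
            character.setBlendshape(named: location.rawValue, weight: weight.floatValue)
        }
    }
}
```

Each coefficient arrives normalized between 0 and 1, so mapping them onto a rig built with matching blendshapes is mostly a matter of naming conventions and per-shape tuning.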

On the docket before the iPhone X’s TrueDepth camera can be leveraged as a VR game development workhorse, Strassburger says his next steps will include getting the eyes properly tracked, figuring out why blinking causes the whole head to jitter, re-sculpting some of the blendshapes from the Beby rig to better suit this setup, visually tuning the characters, and adding features to record the data.
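The last of those items, recording the data, could be as simple as appending each frame’s coefficient dictionary to a timestamped CSV for later retargeting. A rough sketch of that idea (the file layout here is purely illustrative, not Strassburger’s actual format):

```swift
import ARKit
import Foundation

/// Illustrative recorder: one CSV row of blendshape weights per captured frame.
final class BlendshapeRecorder {
    private var header: [ARFaceAnchor.BlendShapeLocation] = []
    private var rows: [String] = []

    func record(time: TimeInterval, blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
        // Fix the column order on the first frame so every row lines up.
        if header.isEmpty {
            header = blendShapes.keys.sorted { $0.rawValue < $1.rawValue }
        }
        let weights = header.map { String(format: "%.4f", blendShapes[$0]?.doubleValue ?? 0) }
        rows.append(([String(format: "%.3f", time)] + weights).joined(separator: ","))
    }

    func save(to url: URL) throws {
        let headerLine = (["time"] + header.map { $0.rawValue }).joined(separator: ",")
        let csv = ([headerLine] + rows).joined(separator: "\n")
        try csv.write(to: url, atomically: true, encoding: .utf8)
    }
}
```

The renderer callback from the earlier sketch could call record(time:blendShapes:) every frame, and save(to:) would then write the take out for import into a DCC tool or the game engine.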

Image courtesy Kite & Lightning

To top it off, Strassburger is thinking about creating a harness system to mount the iPhone into a mocap helmet so both face and body (with the help of a mocap suit) can be recorded simultaneously.

Bebylon Battle Royale, a comedic multiplayer arena brawler, is due out sometime in 2018 on Rift and Vive via Steam. We can’t wait to see what the devs have come up with, as the game already promises to be one of the silliest games in VR.


Researchers Showcase Impressive New Bar for Real-time Digital Human Rendering in VR

A broad team of graphics researchers, universities, and technology companies is showcasing the latest research into digital human representation in VR at SIGGRAPH 2017. Advanced capture, rigging, and rendering techniques have resulted in an impressive new bar for the art of recreating the human likeness inside of a computer in real-time.

MEETMIKE is the name of the VR experience being shown this week at the SIGGRAPH 2017 conference, which features a wholly digital version of VFX reporter Mike Seymour being ‘driven’ and rendered in real-time by the real-life Seymour. Inside the experience, Seymour plays host, interviewing industry veterans and researchers inside of VR during the conference. Several additional participants wearing VR headsets can watch the interview from inside the virtual studio.

The result is a rather stunning representation of Seymour—rendered at 90 FPS in VR using Epic’s Unreal Engine—standing up to extreme scrutiny, with shots showing detailed eyebrows and eyelashes, intricate specular highlights on the pores of the skin, and a detailed facial model.

To achieve this, Seymour wears a Technoprops stereo camera rig which watches his face as it moves. The images of the face are tracked and solved with technology from Cubic Motion, and that data is relayed to a facial rig created by 3Lateral and based on a scan of Seymour made as part of the Wikihuman project at USC-ICT. Seymour’s fxguide further details the project:

  • MEETMIKE renders about 440,000 triangles in real time, which means completing a stereo VR frame roughly every 9 milliseconds; about 75% of those triangles are used for the hair.
  • Mike’s face rig uses about 80 joints, mostly for the movement of the hair and facial hair.
  • For the face mesh, only about 10 joints are used, for the jaw, eyes, and tongue, in order to add more of an arcing motion.
  • These work in combination with around 750 blendshapes in the final version of the head mesh (sketched in the formula after this list).
  • The system uses complex traditional software design and three deep learning AI engines.
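
For readers unfamiliar with how a joint-based rig and blendshapes work together, the usual formulation (a generic sketch, not necessarily 3Lateral’s exact rig math) is linear blend skinning applied on top of a blendshape-deformed rest pose:

```latex
% v   : rest-pose vertex position
% d_k : delta of blendshape k, scaled by the animated coefficient a_k
% w_j : skinning weight of joint j, with joint transform M_j
v' = \sum_{j} w_j \, M_j \Big( v + \sum_{k} a_k \, d_k \Big)
```

Here the roughly 750 blendshapes supply the a_k terms driven by the facial solve, while the joints supply the M_j transforms that carry the jaw, eyes, tongue, and hair.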

A paper published by Seymour and Epic Games researchers Chris Evans and Kim Libreri titled Meet Mike: Epic Avatars offers more background on the project.

From our reading of the project, it’s somewhat unclear, but it sounds like the digital Seymour is rendered on a single PC with a GTX 1080 Ti GPU and 32GB of RAM, while additional computers accompany the setup so that the host’s guest and several audience members can view the scene in VR. We’ve reached out to confirm the exact hardware and rendering setup.


Emteq Aims to Humanize VR by Capturing Your Facial Expressions

As researchers and VR start-ups make great strides toward increasing the sense of presence, with larger fields of view, lower latency, and more natural input, a Brighton-based start-up has announced a new technology that can track a user’s facial expressions without markers.

Emteq, which was founded with $1.5 million in seed funding, combines miniature sensors and artificial intelligence to let digital avatars mirror the user’s expressions and mood; the company hopes its technology will enhance social engagement in the growing social VR space.

Emteq’s technology, called “Faceteq”, builds a range of biometric sensors, including electromyography (EMG), electrooculography (EOG), heart rate, and more, into the faceplate of VR headsets such as the Oculus Rift, tracking the electrical signals generated by the movement of facial muscles.

Unlike camera-based facial tracking systems, it also registers movements of the eye, forehead, and cheek muscles hidden underneath a VR headset. Although this technology has clear applications for gaming and social VR apps such as vTime and the upcoming Facebook VR platform, the founders also plan to use the recorded data to analyse how audiences react to regular films and TV advertising.
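Emteq hasn’t published its algorithms, but the basic step of turning a raw muscle signal into an expression weight is a well-known one: rectify the EMG samples, smooth them into an amplitude envelope, and normalize against calibrated rest and maximum levels. A hedged Swift sketch of that generic technique, with the channel name, calibration values, and smile mapping all invented for illustration:

```swift
import Foundation

/// Illustrative only: converts a window of raw EMG samples from one sensor
/// into a 0...1 activation for a single expression channel.
struct EMGChannel {
    let name: String         // e.g. the cheek "smile" muscle – invented label
    let restingLevel: Float  // envelope amplitude measured at rest during calibration
    let maxLevel: Float      // envelope amplitude at a full expression

    /// Root-mean-square envelope of the most recent sample window.
    func envelope(of samples: [Float]) -> Float {
        guard !samples.isEmpty else { return 0 }
        let meanSquare = samples.reduce(0) { $0 + $1 * $1 } / Float(samples.count)
        return meanSquare.squareRoot()
    }

    /// Normalized activation: 0 at rest, 1 at the calibrated maximum.
    func activation(of samples: [Float]) -> Float {
        let level = envelope(of: samples)
        let normalized = (level - restingLevel) / max(maxLevel - restingLevel, .ulpOfOne)
        return min(max(normalized, 0), 1)
    }
}

// Example: a strong cheek-muscle reading drives a "smile" weight on the avatar.
let cheek = EMGChannel(name: "cheekSmile", restingLevel: 0.02, maxLevel: 0.45)
let recentCheekSamples: [Float] = [0.30, -0.28, 0.33, -0.31, 0.29]  // simulated signal window
let smileWeight = cheek.activation(of: recentCheekSamples)
print("smile weight:", smileWeight)
```

A machine-learning layer like the one Emteq describes would sit on top of features like these, classifying combinations of channels into expressions and mood rather than relying on a single per-channel threshold.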


Graeme Cox, chief executive and co-founder of Emteq and a serial tech entrepreneur, said: “Our machine learning combined with facial sensor technologies represents a significant leap forward in functionality and form for developers looking to introduce human expression and mood into a digital environment.

“Imagine playing an immersive role playing game where you need to emotionally engage with a character to progress. With our technology, that is possible – it literally enables your computer to know when you are having a bad day or when you are tired.”

See Also: FOVE Debuts Latest Design for Eye Tracking VR Headset

Emteq joins a small but growing group of companies hoping to bring the user’s facial expressions into VR applications. FOVE’s headset can track a user’s eye movements, allowing people to take actions through eye gaze and blinking. Eye tracking also enables a more natural way of viewing scenes in VR, such as shifting focus with our eyes as we do in reality.

In July we reported that Kickstarter project Veeso was aiming to be first to market with a mobile VR headset that tracks your face’s movement, a system using head-mounted infrared cameras to track both your eyes and mouth for a greater sense of presence in virtual reality. Unfortunately that project was cancelled due to lack of funding, but what is exciting about Emteq is that its technology won’t be restricted to a single headset.

What matters now is whether companies such as Emteq can garner enough support from developers and produce the required plugins for game engines such as Unity and Unreal to unlock the technology’s true potential.

Road to VR will be visiting Emteq within the next few weeks for a closer look and to try the technology first-hand.
