Researchers Showcase Impressive New Bar for Real-time Digital Human Rendering in VR

A broad team of graphics researchers from universities and technology companies is showcasing the latest research into digital human representation in VR at SIGGRAPH 2017. Advanced capture, rigging, and rendering techniques have set an impressive new bar for the art of recreating the human likeness inside a computer in real time.

MEETMIKE is the name of the VR experience being shown this week at the SIGGRAPH 2017 conference, which features a wholly digital version of VFX reporter Mike Seymour being ‘driven’ and rendered in real time by the real-life Seymour. Inside the experience, Seymour plays host, interviewing industry veterans and researchers inside VR during the conference. Several additional participants wearing VR headsets can watch the interview from within the virtual studio.

The result is a rather stunning representation of Seymour—rendered at 90 FPS in VR using Epic’s Unreal Engine—standing up to extreme scrutiny, with shots showing detailed eyebrows and eyelashes, intricate specular highlights on the pores of the skin, and a detailed facial model.

To achieve this, Seymour wears a Technoprops stereo camera rig which watches his face as it moves. The images of his face are tracked and solved with technology from Cubic Motion, and that data drives a facial rig created by 3Lateral, based on a scan of Seymour made as part of the Wikihuman project at USC-ICT. Seymour’s fxguide further details the project:
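Conceptually, that pipeline is a per-frame loop: stereo camera images come in, a solver turns them into rig parameters, the rig deforms the head mesh, and the engine renders the result. The sketch below is a minimal illustration of that data flow; every class and function name in it is a hypothetical stand-in of ours, not an actual API from Technoprops, Cubic Motion, 3Lateral, or Unreal.

```python
# Minimal sketch of the per-frame data flow described above.
# All names here are hypothetical stand-ins, not the project's real APIs.
from dataclasses import dataclass, field

@dataclass
class FacialPose:
    joint_rotations: dict = field(default_factory=dict)     # ~80 joints (hair, jaw, eyes, tongue)
    blendshape_weights: dict = field(default_factory=dict)  # ~750 blendshapes on the head mesh

def run_frame(stereo_rig, solver, rig, renderer):
    images = stereo_rig.read()       # synchronized stereo pair from the head-mounted camera rig
    pose = solver.solve(images)      # Cubic Motion-style tracking/solving -> FacialPose
    mesh = rig.evaluate(pose)        # 3Lateral-style joint + blendshape rig deforms the head mesh
    renderer.draw_stereo(mesh)       # stereo frame rendered in Unreal Engine at 90 FPS
```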

  • MEETMIKE renders about 440,000 triangles in real time, which means delivering a stereo VR frame roughly every 9 milliseconds; about 75% of those triangles are used for the hair (see the sketch after this list for the frame-budget arithmetic).
  • Mike’s face rig uses about 80 joints, mostly for the movement of the hair and facial hair.
  • For the face mesh itself, only about 10 joints are used, for the jaw, eyes, and tongue, in order to add more of an arcing motion.
  • These work in combination with around 750 blendshapes in the final version of the head mesh.
  • The system uses complex traditional software design and three deep learning AI engines.
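Two of those figures are easy to sanity-check, and the joint-plus-blendshape combination boils down to standard linear blendshape math: the deformed mesh is the neutral mesh plus a weighted sum of per-shape vertex deltas, with a handful of joints layered on top for the jaw, eyes, and tongue. Here is a minimal sketch of both, using NumPy and toy array sizes of our own invention:

```python
import numpy as np

# Back-of-the-envelope frame budget: at 90 FPS a stereo frame must be
# delivered in about 11.1 ms, so the ~9 ms figure quoted above fits
# inside the budget with a little headroom.
frame_budget_ms = 1000 / 90           # ~11.1 ms per frame
hair_triangles = int(440_000 * 0.75)  # ~330,000 of the 440k triangles are hair

# Minimal linear blendshape evaluation: deformed = neutral + sum_i w_i * delta_i.
# (Joint skinning for the jaw, eyes, and tongue would be layered on top.)
def evaluate_blendshapes(neutral, deltas, weights):
    """neutral: (V, 3) vertices; deltas: (S, V, 3) per-shape offsets; weights: (S,)."""
    return neutral + np.tensordot(weights, deltas, axes=1)

# Toy example with 5 vertices and 3 shapes (the real rig uses ~750 shapes).
neutral = np.zeros((5, 3))
deltas = np.random.randn(3, 5, 3) * 0.01
weights = np.array([0.5, 0.0, 0.25])
deformed = evaluate_blendshapes(neutral, deltas, weights)
print(f"frame budget: {frame_budget_ms:.1f} ms, hair triangles: {hair_triangles}")
```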

A paper published by Seymour and Epic Games researchers Chris Evans and Kim Libreri, titled ‘Meet Mike: Epic Avatars’, offers more background on the project.

From our reading of the project, it isn’t entirely clear, but it sounds like the rendering of the digital Seymour is done on a single PC with a GTX 1080 Ti GPU and 32GB of RAM, while other computers accompany the setup to allow the host’s guest and several audience members to view the scene in VR. We’ve reached out to confirm the exact hardware and rendering setup.
