Owlchemy Labs Teases New In-Engine Mixed Reality Tech

Owlchemy Labs, the studio known for the genre-defying game Job Simulator, has cooked up a new way of doing mixed reality that not only promises to be more realistic, but is sure to grab the attention of VR streamers and content creators alike. They’re calling it ‘Depth-based Realtime In-app Mixed Reality Compositing’. It sounds complex, but it promises to simplify the entire production pipeline.

Green screen VR setups have littered expos ever since Northway Games teased mixed reality integration in Fantastic Contraption earlier this year. Requiring little more than a green sheet, an external camera and a few other bits and bobs (Northway published a step-by-step guide), the results speak for themselves.

That polished footage, however, is the result of extensive post-production, including effects like rotoscoping to correctly occlude items, making it appear that the player is in 3D space instead of flatly sandwiched between the foreground (the contraption) and the background (the virtual environment).

image courtesy Owlchemy Labs

Owlchemy Labs recently teased a new in-engine method of putting you in the middle of the action, correctly occluded, that promises to eliminate extra software like Adobe After Effects and compositing software like OBS from the equation.

They do it using a stereo depth camera that records video and depth data simultaneously. The stereo data is fed in real time into Unity using a custom plugin, and a custom shader cuts out and depth-sorts the user directly in the engine renderer. This method requires you to replace your simple webcam with a 3D camera like the ZED 2K stereo cam, a $500 dual RGB camera setup that importantly doesn’t use infrared sensors (like Kinect) which can screw with VR positional tracking. But if you’re pumping out mixed reality VR footage on the daily, then the time savings (and admittedly awesome-looking results) may be worth the initial investment.
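To make the idea concrete, here is a minimal sketch of depth-based compositing in the spirit of what Owlchemy describes, assuming the camera’s depth map has already been calibrated and reprojected into the virtual camera’s view. The function and parameter names are illustrative, not Owlchemy’s actual plugin API:

```python
import numpy as np

def composite(cam_rgb, cam_depth, scene_rgb, scene_depth, max_person_depth=2.5):
    """Merge a real camera feed into a rendered scene by per-pixel depth sorting.

    cam_rgb:     (H, W, 3) float array, the stereo camera's colour frame
    cam_depth:   (H, W)    float array, metres per pixel from the stereo camera
    scene_rgb:   (H, W, 3) float array, the engine's rendered frame
    scene_depth: (H, W)    float array, linearised engine depth in metres
    """
    # "Cut out" the user without a green screen: pixels beyond a depth
    # threshold belong to the real-world background and are discarded.
    person_mask = cam_depth < max_person_depth

    # Depth sort: a camera pixel only wins where it is closer to the
    # virtual camera than the rendered geometry at that pixel, which is
    # what produces correct occlusion by virtual foreground objects.
    in_front = cam_depth < scene_depth

    mask = (person_mask & in_front)[..., None]
    return np.where(mask, cam_rgb, scene_rgb)
```

In Owlchemy’s pipeline this comparison presumably runs in their custom Unity shader on the GPU; the numpy version above just makes the per-pixel logic explicit.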

Owlchemy says you’ll be able to capture footage with either static cameras or full-motion tracked cameras, and do it from a single computer. Because the method doesn’t actually require a VR headset or controllers, you can technically capture a VR scene with multiple, non-tracked users.

“Developing this pipeline was a large technical challenge as we encountered many potentially show-stopping problems, such as wrangling the process of getting 1080p video with depth data into Unity at 30fps without impacting performance such that the user in VR can still hit 90FPS in their HMD,” writes Owlchemy. “Additionally, calibrating the camera/video was a deeply complicated issue, as was syncing the depth feed and the engine renderer such that they align properly for the final result. After significant research and engineering we were able to solve these problems and the result is definitely worth the deep dive.”
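The performance constraint in that quote, a 30fps camera feed that must never stall a 90fps render loop, is a classic producer/consumer problem. Below is a minimal, hypothetical sketch of one standard solution, a latest-frame mailbox, where `grab_rgbd_frame` stands in for whatever capture call the camera SDK provides; this illustrates the general pattern, not Owlchemy’s code:

```python
import threading

class LatestFrame:
    """A one-slot 'mailbox' that always holds the newest camera frame."""
    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def put(self, frame):
        with self._lock:
            self._frame = frame   # overwrite: stale frames are simply dropped

    def get(self):
        with self._lock:
            return self._frame    # non-blocking; the renderer may reuse a frame

def capture_loop(slot, grab_rgbd_frame):
    # Runs on its own thread, paced by the camera (~30fps). The 90fps
    # render loop calls slot.get() each frame and never waits on the camera.
    while True:
        slot.put(grab_rgbd_frame())
```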

The studio says it still needs more time to complete the project, but they “have plans in the works to be able to eventually share some of our tech outside the walls of Owlchemy Labs.” We’ll be following their progress to see just how far reaching it becomes.


OwlchemyVR Shows Off a New Mixed Reality Method

Mixed reality is arguably the best way to present virtual reality content. Instead of following a shaky first-person view, the viewer can look over the player’s shoulder. It also lets viewers get a much better sense of how the player feels inside the world. One example is our own app, which we developed in collaboration with Beiersdorf.

A New Mixed Reality Method

The current method requires outputting at least two different images: an alpha channel and the normal view of the third-person camera. Unity games can deliver this output quickly with the current SteamVR plugin. The image is split into four zones, which makes a 4K output necessary. Naturally this has a significant impact on performance, so an optimal experience cannot always be guaranteed for a livestream.
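For comparison, here is a rough sketch of the compositing step such quadrant output implies. The exact quadrant layout varies between tools, so the arrangement below (foreground colour top-left, foreground alpha top-right, background bottom-left) and all names are assumptions for illustration; the camera feed is taken to be already chroma-keyed into a colour layer plus mask:

```python
import numpy as np

def composite_quadrants(game_frame, cam_rgb, cam_mask):
    """game_frame: (H, W, 3) uint8 4K quadrant capture from the game.
    cam_rgb:    (H/2, W/2, 3) keyed green-screen camera layer (float).
    cam_mask:   (H/2, W/2)    boolean mask of player pixels.
    """
    H, W, _ = game_frame.shape
    h, w = H // 2, W // 2
    fg       = game_frame[:h, :w].astype(float)              # foreground colour
    fg_alpha = game_frame[:h, w:, :1].astype(float) / 255.0  # foreground alpha
    bg       = game_frame[h:, :w].astype(float)              # background layer

    # Classic sandwich: camera feed keyed over the background,
    # then the foreground layer blended on top via its alpha channel.
    mid = np.where(cam_mask[..., None], cam_rgb, bg)
    return fg * fg_alpha + mid * (1.0 - fg_alpha)
```

This two-layer sandwich is exactly what OwlchemyVR’s depth-based approach does away with, as described next.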

With OwlchemyVR’s new mixed reality method, everything is supposed to become simpler. The data is processed entirely in Unity and no additional software is needed. This is achieved with a 3D camera and custom plugins in Unity.

The biggest advantage is that the player can be located more precisely in the scene. Two layers are no longer necessary, so there is no unwanted occlusion of objects by the player’s own body: whatever belongs in the foreground actually appears in the foreground. In addition, the lighting inside the Unity scene can be applied to the performer. Convincing results can therefore be achieved without heavy post-processing.

This method is still in development, however, and the team has not yet released the corresponding plugins. It is safe to assume that OwlchemyVR will soon make the method available to developers, after which video producers will be able to work with it.

[Source: OwlchemyVR]



This Live Mixed Reality Solution Uses Kinect to Eliminate Green Screen

This project uses Microsoft’s Kinect depth camera to fuse the virtual and the real, producing real-time mixed reality footage with no green screen required.

Mixed reality is all the rage and has become one of the most effective methods to convey the power of immersion afforded by new virtual reality technologies. Spearheaded most recently by the Fantastic Contraption team with their excellent teaser series, the technique typically pairs the Vive’s room-scale positional tracking with green screen backdrops to fuse the virtual and the real.

A new experimental project has come up with an alternative method, one that does away with the requirement to drape your demo area in green sheets and instead leverages Microsoft’s Kinect depth camera to achieve a similar effect in real time. The new technique allows potential exhibitors to show off a user interacting with a virtual application while retaining their demo space design (say, a stand at a conference venue), and still produce a compelling way to visualise what makes VR so special.

“We built a quick prototype using one of HTC Vive’s controllers for camera tracking, Microsoft Kinect v2, a Kinect v2 plugin and Unity running on 2x machines,” says the team, who have demonstrated their work in a YouTube video. “The server ran the actual VR scene and the client extracted data from the Kinect, placed the point cloud into the scene, resulting in the mixed reality feed. The depth threshold was altered dynamically based on the position of the headset.”
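The dynamic depth threshold the team mentions can be illustrated with a short sketch: points further from the Kinect than the headset (plus a margin) are culled, so the wall behind the player never bleeds into the composite. The names and margin value here are illustrative assumptions, not the team’s actual code:

```python
import numpy as np

def filter_point_cloud(points, hmd_pos, kinect_pos, kinect_forward, margin=0.4):
    """points: (N, 3) Kinect point cloud in world space.
    hmd_pos, kinect_pos: 3-vectors; kinect_forward: unit view direction.
    """
    # How far along the Kinect's view axis the headset currently sits.
    hmd_depth = np.dot(hmd_pos - kinect_pos, kinect_forward)

    # Per-point depth along the same axis.
    depths = (points - kinect_pos) @ kinect_forward

    # Keep everything up to just behind the player; the cutoff moves
    # with the headset, which is the "altered dynamically" part.
    return points[depths < hmd_depth + margin]
```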


Of course, the compositing is not as precise as a professionally produced green-screen equivalent, and there will be occasional pop-in from other objects that creep into the demo space, but it’s a neat, low-cost and potentially more practical approach to getting your VR app noticed at a venue.

The technique’s biggest drawback, however, will likely be the requirement for the target VR application to integrate support for displaying the point cloud imagery alongside the view captured by the virtual camera. No mean feat.


Nevertheless, it’s an intriguing approach that once again reminds us how Microsoft’s gaming peripheral seems to have found a life much more productive than its original, ill-fated purpose.

You can read about the project in detail over at the team’s YouTube channel.


Microsoft Teams With Intel For MR Projects, Reveals Windows Holographic Shell Update For Windows 10

Not only are Intel looking to redefine what virtual reality (VR) is with their wireless head-mounted display (HMD) codenamed Project Alloy, but today at the 2016 Intel Developer Forum (IDF), Intel CEO Brian Krzanich was joined on stage by Microsoft’s Executive Vice President of the Windows and Devices Group, Terry Myerson, to reveal a new development for Microsoft’s HoloLens mixed reality (MR) system.

The pair revealed Windows Holographic Shell as a future update to Windows 10 that will allow use of the HoloLens system on “mainstream” Windows 10 PCs. The update will arrive at an undisclosed time in 2017.

Commenting on the news, Myerson wrote on the Windows blog that “The Windows Holographic shell enables an entirely new experience for multi-tasking in mixed reality, blending 2D and 3D apps at the same time, while supporting a broad range of 6 degrees of freedom devices.”

The two companies also revealed they are working together on MR more broadly, along with Microsoft’s “hardware partners”, to formulate “a broad range of devices for the mainstream consumer and business markets”. Specifications for an initial version of these devices are set to be revealed this December at the Windows Hardware Engineering Community (WinHEC) event. Microsoft have also released a promotional video for the Windows Holographic Shell.

VRFocus will bring you more information as to these developments when we get them.

Virtual Reality Very Much On The Table At Edinburgh Digital Entertainment Festival

This coming Monday and Tuesday see virtual reality (VR) become a main point of focus at the Edinburgh Digital Entertainment Festival (EDEF). The event hosts the next session of its Ideas Studio, which will focus on what VR can bring from both a creative and cultural standpoint, and will feature speakers from REWIND:VR, Whispering Gibbon and ‘VR writers room’ Digital Jam Ltd. The VR focus covers a number of keynote speeches and panels, and follows the previous week’s session on video game technology.

The full programme for the two days can be found below.

MONDAY 15 AUGUST

11:00 – SOL ROGERS (REWIND:VR) – VIRTUAL REALITY: CAN YOU BELIEVE THE HYPE?

Looking at the history of VR over the last 20 years and how it has become a reality, from technology and platforms to smartphones and appetite for the medium. Sol has worked with major artists and brands including Bjork, Rolls Royce, Red Bull and BBC.

12:00 – ADRIAN HON (SIX TO START) – HOW VR WILL BREAK MUSEUMS

The creator of the world’s bestselling smartphone fitness app ‘Zombies, Run!’ will discuss how VR can do everything museums can but at greater scale and lower cost. Does that mean museums will disappear?

14:00 – SIMON BENSON (DIRECTOR AT IMMERSIVE TECHNOLOGY GROUP WITHIN SONY INTERACTIVE ENTERTAINMENT WORLDWIDE STUDIOS) – VIRTUAL REALITY THE PLAYSTATION WAY

Ahead of the PlayStation VR release on 13th October, Simon will offer insight into the VR platform and the key features that will be included. He will explain how the influence of PlayStation has steered Sony’s VR solution in a unique direction.

15:00 – PANEL – FUTURE REALITIES: EXPLORING AUGMENTED AND MIXED REALITY

Chair: Mark Atkin, Director – Crossover Labs

Tanya Laird, Founder – Digital Jam

Trevor Jones, Artist working with Augmented Reality

Keiichi Matsuda, Designer and Film-maker – Keiichi Matsuda Ltd

Is this real life? Or is this just fantasy? Pokemon Go has already taken over, and in the near future it will be commonplace to create and interact freely with holograms that blend into the rest of our surroundings. This panel will focus on where augmented and mixed realities will take us and how these new ways of seeing will revolutionise the way we work and play, heightening creativity and making the imaginary become real.

16:00 – PANEL – ARTIFICIAL INTELLIGENCE: CAN MACHINES POSSESS EMOTIONS?

Artificial intelligence, the science of making clever machines, has resulted in programs that can win a game, recognise your face and even appeal against your parking ticket. Research has produced machines with skills of emotional intelligence, allowing them to have feelings and develop a personality. This panel explores whether we are at the dawn of sentient artificial intelligence and what that really means for the human race.

17:00 – VR WRITERS’ ROOM

Chair: Tanya Laird, Founder – Digital Jam

The VR Writers Room Live brings together multiple areas of expertise (game, film, TV, comics, theatre) to discuss narrative design and storytelling for the new and emerging medium of immersive entertainment. This covers VR, AR, live 360, immersive theatre and any other entertainment format that includes an element of artificial reality.

TUESDAY 16 AUGUST

11:00 – JOE STEVENS (WHISPERING GIBBON) – TO REALITY AND BEYOND: TURNING VIRTUAL WORLDS INTO PHYSICAL MODELS THROUGH 3D PRINTING TECHNOLOGY

CEO of the 3D-printing tech company, Joe will discuss the latest technology for printing physical models of gaming content, bringing them into the real world.

12:00 – PAUL LONG, CO-ARTISTIC DIRECTOR, METRO BOULOT DODO – HACT: THE ART OF KEEPING TECH SIMPLE

Metro Boulot Dodo is an independent arts company known for combining live performance, original soundtracks and stunning visuals in a wide variety of contexts. Here, the Co-Artistic Director will discuss the relationship between technology and artwork and introduce the concept behind HACT: a vision to bring bite-size theatre experiences to venues and public spaces across the UK.

14:00 – AN INTRODUCTION TO MUSIC IN VIRTUAL REALITY WITH MELODYVR

When it comes to music, virtual reality will change the industry with its ability to disrupt how fans connect with the artists they love. People will be able to attend a gig anywhere in the world without worry of price tags or age restrictions, and sold-out events will no longer mean missing out on performances. New vantage points that put you on stage with the world’s biggest bands, backstage at a festival or in a DJ booth at an Ibiza superclub are no longer off limits. So how will this really affect the fans, artists, labels and promoters? Join the founders of virtual reality music platform MelodyVR as they host an exciting panel of speakers from across the music industry to focus on this new technology and consider what it means for the future of the music scene.

15:00 – PANEL – THE NEW STATE OF STORYTELLING: THE IMPORTANCE OF EMPATHY IN VR NARRATIVES

Chair: Oliver Franklin-Wallis, Assistant Editor, WIRED

Dan Efergan, Digital Group Creative Director at Aardman Animation Studios

Jane Gauntlett, Artist and Producer, Founder of In My Shoes project

Toby Coffey, Head of Digital Development – National Theatre

VR has been proclaimed the ultimate Empathy Machine. The medium will give us the ability to connect with humans in a different way, experiencing the view of the world through the eyes of a stranger. This panel brings together VR storytelling pioneers to discuss the challenges, advantages and future of this new medium.

16:00 – VICKY ROBERTS (HEAD OF COMMUNICATIONS AT STARSHIP GROUP) – HOW TO TURN SOCIAL NETWORKS INTO SOCIABLE NETWORKS

As Head of Communications at vTime, the first sociable network in VR, Vicky will discuss how VR will transform social networks from staring at computers and mobiles to virtual destinations where sharing experiences will be possible from virtually anywhere, at virtually any time, on virtually any device: spending quality time with family and friends around the world.

17:00 – PEGGY WU (SENIOR RESEARCHER AT SMART INFORMATION FLOW TECHNOLOGIES) – ANSIBLE: VIRTUAL REALITY FOR SUPPORTING FUTURE MARS ASTRONAUTS

In this fascinating talk, learn about NASA’s vision for the future manned mission to Mars and how a team of researchers are building the next generation communications tool using VR to provide psychological health and social support to our explorers as we venture beyond Earth’s orbit.

The event, which will take place at The George Hotel on George Street in Edinburgh, will also be streamed live online at thisistomorrow.info’s YouTube channel and on edef.co.uk. VRFocus will bring you more news about VR and AR related events as we get it.


Augmented World Expo 2013: It’s a wrap!

Augmented World Expo 2013 was really an amazing experience. I’m co-founder and co-organizer of the conference, along with Ori Inbar, so it has meant a lot to me to see our event grow over the last four years, and it was thrilling to make such a big splash this year. There were 1,163 attendees, and the expo showcased an ecosystem of emerging technologies: augmented reality, gesture interaction, eyewear, wearables, and connected hardware of many stripes, marking the beginning of natural computing entering the mainstream. It was a unique opportunity to get up close and personal with what it feels like to be an augmented human in an augmented world!

Videos of AWE 2013’s 35 hours of educational sessions and inspirational keynotes are now available on our YouTube channel. I am sharing my own talk (my slides are also up on slideshare) and a few of my favorites in this post, but there are far too many to post here, so please browse further on the Augmented World Expo YouTube channel.

One notable high point of AWE 2013, for me, was the showcase sponsored by Meta, a startup developing the first device allowing visualization of and interaction with 3D virtual objects in the real world using your hands. The showcase was made possible by generous contributions from the private collections of Paul Travers, Dan Cui, Steven Feiner, Steve Mann, and Chris Grayson, and by passionate volunteers who are helping advance the industry. Sean Hollister of The Verge filed an excellent report on the eyewear showcase, ‘35 years of wearable computing history at Augmented World Expo 2013’. For more on Meta, see the article by Dan Farber.

My colleagues at Syntertainment, Will Wright, Avi Bar-Zeev, Jason Shankel, and Lauren Elliott, all gave great talks. Ironically, we’re not building augmented reality apps or hardware; we all just happen to remain very interested in the field.

Thank you to everyone for supporting the event!

The press coverage was truly extensive:

In the shadow of Google Glass, an augmented reality industry revs its engines
The Verge, Sean Hollister, June 9, 2013, 271 Tweets

The next big thing in tech: Augmented reality
CNET, Dan Farber, June 7, 2013
Picked up by Current News Daily
350 Tweets

AWE 2013 Conference Report: Augmented Reality and Marketing
The Persuaders Marketing Podcast on Dublin City FM, June 23, 2013

AR Dirt Podcast – Ori Inbar AWE2013 Extravaganza Recap
AR Dirt by Joseph Rampolla, June 18, 2013

35 years of wearable computing history at Augmented World Expo 2013
The Verge, Sean Hollister, June 9, 2013
7 Tweets

Augmented Reality: Bruce Sterling, keynote at Augmented World Expo 2013
Wired, Bruce Sterling, June 9, 2013
9 Tweets

On the road for VR: Augmented World Expo 2013
Doc-Ok, Staff, June 7, 2013
3 Tweets

My Interview from Augmented World Expo 2013 [VIDEO]
Wassom.com, Brian Wassom, June 7, 2013

Augmented World Expo
ZenFri, Staff, June 7, 2013

AWE2013: Hardware for an augmented world
FBNSantos.com, Felipe Neves Dos Santos, June 6, 2013

Augmented Reality Will Be the New Reality
InvestorPlace, Brad Moon, June 6, 2013

Wearable computing pioneer Steve Mann: Who watches the watchmen?
TechHive, Armando Rodriguez, June 6, 2013

Expo puts augmented reality in the limelight
ABC 7 News, Jonathan Bloom, June 5, 2013

These OLED microdisplays are the future of augmented reality
DVICE, Evan Ackerman, June 5, 2013

Visualized: a history of augmented and virtual reality eyewear
Engadget, Michael Gorman, June 5, 2013

Wikitude announces Wikitude Studio and in-house developed IR & Tracking engine
PapiTV, KC Leung, June 5, 2013

Augmented reality expo aims for sci-fi future today
USA Today, Marco della Cava, June 5, 2013

Augmented Reality: High Dynamic Range (HDR) Video Image Processing For Digital Glass
Wired, Bruce Sterling, June 5, 2013

Will Wright at Augmented Reality Conference: Don’t Augment Reality, Decimate It
AllThingsD, Eric Johnson, June 4, 2013

Philip Rosedale’s Second Life with High Fidelity
CNET, Dan Farber, June 4, 2013

Google Glass competitors vie for attention as industry grows
PC World, Zack Miners for IDG News Service, June 4, 2013

4D Augmented Reality Leader Daqri Announces $15 Million Financing
Press Release, June 4, 2013

CrowdOptic Powers Lancome Virtual Gallery App, Crowd-powered Heat Map
TechZone 360, Peter Bernstein, June 3, 2013

Augmented humans, enhanced happiness?
Crave Culture, Angelica Weihs, June 2, 2013

Metaio & Vuzix to Showcase AR-Ready Smart Glasses at the 2013 Augmented World Expo
Press Release, May 30, 2013

Four ways augmented reality will invade your life in 2013
Quartz, Rachel Feltman, May 30, 2013

Augmented Reality: Augmented World Expo™ is next week
Wired, Bruce Sterling, May 28, 2013

Strike it Rich with Cachetown and AWE 2013 Playing the Gold Rush 49’er Challenge In Augmented Reality
Press Release, May 24, 2013

Local Community College Student Headed to Silicon Valley to Learn More about Augmented Reality
St. Louis Post-Dispatch, Staff, May 24, 2013

Explore an intricate labyrinth with smartphone AR
CNET Australia, Michelle Starr, May 21, 2013

Dartmouth firm lands super app
Herald Business, Remo Zaccagna, May 21, 2013

Augmented World Expo 2013–The Future of Augmented Reality
Silicon Angle, Saroj Kar, May 17, 2013