The founders at Survios are true believers in virtual reality, and they’ve poured a lot of effort into the hit VR games Raw Data and Sprint Vector. Now they’re switching from games to something more like an immersive music experience with Electronauts.
The VR app will enable music fans to remix, compose, and perform their own music, riffing on works by artists such as The Chainsmokers, Odesza, Steve Aoki & Boehm, and many other bands that I am intimately familiar with (not).
At the core of Electronauts is the Music Reality Engine, which lets anyone perform and produce their own versions of the hits. It doesn’t skip a beat thanks to a technology called quantization. I spoke with Nathan Burba, CEO and cofounder of Survios in Los Angeles, about the new technology and the creativity that it brings to VR music.
“You can take a song by the Chainsmokers, ‘Roses,’ and determine when the different elements in the song will play,” Burba said. “It’s like you are playing inside a song.”
He said you get a sensation of playing the song at the right tempo, thanks to the quantization and a little mind trick that helps you deal with latency when making music in a VR headset.
Electronauts debuts on August 7. It will be available on Steam and Oculus Home for HTC Vive and Oculus Rift at $20, and PlayStation Store for PS VR at $18. It’s also launching in VR arcades across 38 countries worldwide.
Here’s an edited transcript of our conversation.
Above: Nathan Burba is CEO of Survios.
Image Credit: Survios
GamesBeat: You were excited about the technology behind Electronauts. What makes it work? Can you explain that?
Nathan Burba: The project started with us creating quantized instruments. What that means is that you can perform an action in VR, and then, after a small amount of added latency, we play a sound. The reason we do that is that we can make the sound happen at the correct tempo. That way you don’t have to worry about the tempo yourself. You’re not pounding your hands in perfect time like a drummer. You just perform your actions in the game and it sounds like you’re in time.
The best way to describe it is, it’s similar to the trick of how the HMDs themselves work. Some of the people working on Electronauts are former hardware engineers who also worked on our hardware back in the day. The latency trick we’re doing fools the brain into thinking it’s played a note at the right tempo, even though it hasn’t, because we delay it slightly. But that delay isn’t long enough for your brain to pick up on it. Certain tempos allow for that delay to be under 30 milliseconds, and with that timing, your brain says, “Did that happen in time? Sure, why not?” It plays along.
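Burba didn’t share implementation details, but the trick he describes is easy to sketch. Here is a minimal, hypothetical Python version (not Survios code) that snaps each input event to the next slot on a beat grid and reports the delay that was added:

```python
import math

# A hypothetical sketch of input quantization, not Survios code: delay
# each hit to the next grid slot so playback always lands on the beat.
def quantize(event_time_s: float, bpm: float, subdivision: int) -> tuple[float, float]:
    """Return (play_time, added_delay) for an input at event_time_s.

    subdivision is grid slots per quarter note; 16 gives 64th notes.
    """
    slot = 60.0 / bpm / subdivision               # seconds per grid slot
    play_time = math.ceil(event_time_s / slot) * slot
    return play_time, play_time - event_time_s

# At 128 bpm with a 64th-note grid, one slot is ~29 ms, so the added
# delay stays inside the ~30 ms window Burba describes.
play_time, added_delay = quantize(event_time_s=10.003, bpm=128, subdivision=16)
```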
That creates the sensation of playing music at the right tempo, even if you aren’t necessarily doing so. It’s an amazing experience, to think you’re making this music. That’s at the core of the quantized instruments, in addition to the fact that they’re always in the right key. They always sound good, no matter what’s going on with the rest of the audio sync. Then we started layering other pieces on top of it, playing a bunch of samples in real time.
The way the game works, we request the stems for a song from a musician. These stems are the individual tracks the entire song is constructed from, and what we do is put those into the game and allow you to play around with them through various interfaces, as much as you want. You can completely change the mix of the song at any given time. You can take out the vocals and play a solo, or bring everything down to a simpler version of the track. You can build up to a drop. All the kinds of arranging that a composer normally does — you can rearrange a song at will. You can take a song that never even had a drop and add one that sounds correct and relevant to the song, even though it was never there to begin with. You can play with all these pieces like they’re Legos.
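To make the stem idea concrete, here is a hypothetical sketch of the core operation, again not Survios’ implementation: a song as a set of same-length stem buffers, with the player deciding which ones are audible at any moment.

```python
import numpy as np

# Hypothetical stem mixer. Each stem is a same-length audio buffer; the
# player's "arrangement" is simply which stems are audible right now.
SAMPLE_RATE = 44100
stems = {
    "vocals": np.zeros(SAMPLE_RATE),  # stand-ins for decoded stem audio
    "drums":  np.zeros(SAMPLE_RATE),
    "synth":  np.zeros(SAMPLE_RATE),
}

def mix(active: set[str]) -> np.ndarray:
    """Sum only the enabled stems into one output buffer."""
    out = np.zeros(SAMPLE_RATE)
    for name in active:
        out += stems[name]
    return out

solo = mix({"drums"})                      # strip it down for a solo
drop = mix({"vocals", "drums", "synth"})   # bring everything in for a drop
```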
It’s a revolutionary technology that has applications in a lot of places outside VR, but it synced up really well with VR, and we’re a VR company. It ended up working out well in Electronauts.
GamesBeat: If you’re mixing your own sounds that you’re creating into sounds that you’re hearing — are you making a song together with the game?
Burba: There are sounds that are taken from the song itself. Let’s say we take a popular song, like “Roses” by the Chainsmokers. That song has various elements. It has vocals. It has guitars playing in different keys. It has a drum kit. We’re letting the user determine when those different elements play, or whether they play or don’t play.
Some of those are turned on and off in real time and some of them aren’t. It depends on the kind of instrument. With the vocals, you cue them up and they play on time, and you can kind of remix them. With a guitar, you can play the actual notes yourself and sequence those notes to create melodies. It’s a big sonic playground. You play inside of the song.
It’s different from remixing a song using, say, a typical DJ turntable, because that’s designed to mix two different songs, alternating between one or the other, and all you have is the final mix. It’s like a cake. We’re not just giving you cake and frosting. We’re giving you the eggs, milk, and flour, everything, before it’s all put together. You get to decide how much or how little of those elements it has, and when it has those elements.
It’s the first time musicians have ever worked with a company, handing over the raw stems, as they’re called, to allow this to occur. They did this a bit in Rock Band and Guitar Hero, but it wasn’t quite to the depth that we’re seeing with this project.
GamesBeat: You can be inside VR and then touch something with your hands, then, to turn on or turn off something like the vocals?
Burba: It’s a combination of interfaces that you touch — or bang on, so to speak — and interfaces where you turn buttons on and off. There’s what we call the orb kit. You always have two drumsticks, one in each hand, and you bang on the orbs. The orbs themselves are designed to be incredibly juicy, like how it feels to play a real drum. There are a few instruments like that. There’s a laser harp. There’s a pinball, a kind of electric ball you can play around with.
Then there are various buttons you can turn on and off. There’s a backing track system, which has six different tracks. Usually they’re part of how the song itself is arranged. There’s an intro, a build, a drop, all the way to an outro. You play those in order. Then there are stems that are pieces of those tracks, and you can turn those on and off any time you want. You can make the entire track silent if you want.
A good example of why this is useful — have you played a game with the sub packs, the vibrating backpacks you can buy to go with VR games?
GamesBeat: Not really, no.
Burba: Imagine just having a giant rumble pack on your back. It adds a nice element to a game. But with Electronauts, because you’re in full control of the audio, you can control exactly how that thing massages your back. You can turn the bass drum on and start doing these giant pulses in certain parts of your back.
It gives you full control over the audio spectrum in a very easy way that’s intuitive for anyone. A kid can go in and immediately know what they’re doing. It’s that level of control, but then everything sounds really good. It’s a way to democratize music, so anyone can feel like a musician or DJ without needing any equipment or experience.
Above: Electronauts lets you mix your own songs.
Image Credit: Survios
GamesBeat: Getting back to some of the core innovation there, why do people care about the tempo matching in real time?
Burba: There are two primary difficulties with music. There’s what note you play, and then what time you play it. People who are really good at music know how to do both of those really well. You can put them down at a piano and they’ll start playing something that sounds good.
You remember The Jerk, with Steve Martin? In the first 10 minutes you see him with his whole adoptive family, and he has terrible rhythm. He’s trying to beat time with the music that everyone else is playing and he can’t quite do it. He can never be on time. This is designed for people like that. A lot of people out there who try to drum or sing just have awful rhythm. They have a hard time staying in time with a song, and it’s frustrating. They might love music, but they just can’t get the hang of that.
This essentially takes that away. You’re always in time, with certain restrictions. You can’t play everything, but within that window of possibilities, you’ll feel like you have rhythm. And then the other side of it is us choosing the right samples from the song, being very cognizant of what parts of the audio spectrum are being used, so that everything you play, the notes that you hit, they’re always in the right key. We’ll have seven notes that all sound good, and you can play them in any order or at any speed. They’ll still sound good to anyone’s ear. That takes away the need for you to know which notes are the correct notes and which are not.
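The note side of that is straightforward to illustrate. In this hypothetical Python sketch of key-snapping, the exposed pads map onto the seven degrees of a scale, so there is no wrong note to hit:

```python
# Hypothetical key-snapping: expose only in-key notes, so any pad the
# player hits sounds correct. C major is assumed here for illustration.
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets of the 7 degrees

def pad_to_midi(pad: int, root: int = 60) -> int:
    """Map pad 0..6 (wrapping up an octave past that) to an in-key MIDI note."""
    octave, degree = divmod(pad, len(MAJOR_SCALE))
    return root + 12 * octave + MAJOR_SCALE[degree]

# Pads 0..7 -> C4 D4 E4 F4 G4 A4 B4 C5: any order, any speed, always in key.
notes = [pad_to_midi(p) for p in range(8)]
```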
Imagine sitting down at a piano and you can just immediately start playing. That’s what the Music Reality Engine does. It fools your brain into thinking that you’re a great musician, just like the VR headset itself fools you into thinking that you’re somewhere else.
GamesBeat: When you call it the “Music Reality Engine,” what is that, really — the technical core of that?
Burba: It’s an engine, by which I mean it’s a script that runs continuously, every X frames. It’s an engine we built out of Pure Data. That project started with a primary electrical engineer, back when we were doing hardware. When we stopped doing hardware, we had to figure out something else to do with that.
That engineer’s other related passion was music. He always wanted to build a music engine, and he wanted to use Pure Data, because it mimics the signal-flow style of electrical engineering, those continuous, very low-level electrical processes. The project created a software synthesizer in Pure Data that fits inside of Electronauts.
Above: HTC Vive Pro headset.
Image Credit: HTC
He started off writing these Pure Data scripts. Pure Data is a visual scripting language, similar to Unreal’s Blueprints, where you just connect different pieces. It’s also similar to Max/MSP, which is used by a lot of DJs. He started building a system where, primarily, you could play simple quantized notes, quantized .WAV file samples. That was inspired by Plink, a quantized music game, a little web app made by an advertising company. It was stuck in my head for a year. I was obsessed with it, because I could see the possibilities.
We started making that, a simple little quantized sampler. It was outside of VR. You just played it with a game controller. We made that quantized sampler in Pure Data, and that by itself was incredibly fun to play with. We kept building pieces onto that, adding more tracks, adding more instruments, letting you turn tracks on and off, including the vocal tracks. We built a synthesizer and added that as well. It’s the ability to play all of these different things in one song package, play them in real time, and play them procedurally in a way that a modern computer can run. There’s enough variance that you can actually make a real song and put it in there. You have your vocals and percussion and all the elements that a song typically has.
That ability to play all the different pieces in real time with incredibly low delay — you have to be able to touch an orb in VR and then the sound has to happen virtually immediately. We optimize the delay, but that’s basically what the engine is. We also have a way to compile C libraries into Pure Data, something experimental. So we have this engine we’re running, and it sits separately from the game itself. Our game connects to it through Open Sound Control. It processes inputs and outputs through the Music Reality Engine to play all of the sounds. Because it sits separately, we could make a mobile app with it or hook it up to Unreal — it’s an extensible engine that could be used in many ways. It’s just an engine that lets you play components of a song in real time if you have the stems for that song.
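Burba didn’t spell out the message schema, but the decoupled design is easy to picture with the third-party python-osc library. The addresses and port below are invented for illustration:

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical game-side client for an audio engine that listens on OSC.
# Addresses and port are made up; Survios hasn't published its schema.
engine = SimpleUDPClient("127.0.0.1", 9000)

engine.send_message("/stem/vocals/enabled", 1)  # toggle a stem on
engine.send_message("/orb/hit", [2, 0.8])       # orb index, hit velocity
engine.send_message("/track/section", "drop")   # jump the backing track
```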
The engine also controls the music visualizer, which is a bunch of different visual events that are synced up to different musical events — when the drum happens, when the clap happens. That goes back to the engine to say, “Animate now.” It’s also a visualization driver.
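Events can flow the other way over the same protocol. A hypothetical listener on the game side, again using python-osc with invented addresses, would map engine beat events to visualizer animations:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Hypothetical game-side listener: the audio engine announces musical
# events, and the game triggers visuals in sync. Addresses are invented.
dispatcher = Dispatcher()
dispatcher.map("/beat/kick", lambda addr, *args: print("flash the floor"))
dispatcher.map("/beat/clap", lambda addr, *args: print("pulse the sky"))

BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher).serve_forever()
```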
GamesBeat: What’s the main gameplay mechanic?
Burba: It’s pretty simple and open-ended. You go in, choose a song, and have fun making music. We made the gameplay a bit more directed for the arcade version. The arcade version plays the backing track for six tracks, running through them one at a time. When it goes through the entire song, which takes a few minutes, the song select pops up, more arcade style, and you choose another song and go into that one.
Beyond that, the challenge of the game is just to be the best DJ and make music that sounds really good. We wanted to keep it fairly open-ended, not as game-ey, so we could encourage people to make music for music’s sake. It’s fun. People get lost in there for 30 or 40 minutes at a time, just jamming for themselves.
GamesBeat: Are you recording that music as well? Can you share it with other people?
Burba: We wanted to do that, originally. As we started working with the record labels — there’s a legal side of this that’s very compelling and innovative. We can get all these big record labels and music producers to work together. They want to send us their stems, which are like their babies. But one of the stipulations that we agreed to was to not allow recording functionality, because they don’t want people pirating their songs through this mechanism.
So from the standpoint of the features in the game, there’s no recording functionality. That said, there’s always what we call the analog loophole. We can’t stop anyone from Twitch-streaming this, or recording it with many kinds of recording software. You could record it and share. What we want to do eventually with the game is have mod support, so people can take their own songs that they’ve made and put them in the game. They could basically use the game as a DJ tool and perform their own music. In fact, we DJ’d the Unity party at GDC with this game, and our E3 party as well. We still don’t have that yet, though. Honestly, we ran out of time to get it in for shipping.
Above: Electronauts turns you into a music creator.
Image Credit: Survios
GamesBeat: Within the game, can you play back something you just performed?
Burba: What we do have available is the arrangement system. The music itself is broken down into loops. You can make little chunks of a song, where each chunk has a different backing track and an instrument sequence it plays using a certain set of stems. You can set up to 40 of those and make an entire arrangement for a song. You could make an arrangement that lasts up to about an hour, and that’s your custom mix of the song. That can play by itself. You can also play along with that. It’s a full song creation kit in the game.
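A hypothetical data model for that arrangement system might look like the following. The names and fields are guesses, but the shape follows what Burba describes: up to 40 chunks, each pairing a backing-track section with a set of stems.

```python
from dataclasses import dataclass, field

# Hypothetical arrangement model based on Burba's description; the
# class and field names are invented for illustration.
@dataclass
class Chunk:
    section: str                        # "intro", "build", "drop", "outro"...
    stems: set[str] = field(default_factory=set)
    bars: int = 8

MAX_CHUNKS = 40                         # the cap Burba mentions

def add_chunk(arrangement: list[Chunk], chunk: Chunk) -> None:
    """Append a chunk, enforcing the 40-chunk arrangement limit."""
    if len(arrangement) >= MAX_CHUNKS:
        raise ValueError("arrangement is full")
    arrangement.append(chunk)

my_mix: list[Chunk] = []
add_chunk(my_mix, Chunk("intro", {"synth"}))
add_chunk(my_mix, Chunk("drop", {"vocals", "drums", "synth"}))
```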
GamesBeat: What drove you guys to design it this way, with the different components you’ve talked about?
Burba: It started with the quantized instruments, the ability to play the instruments. Then we saw that playing an instrument without a track in the background kind of sucked, so we added the backing track system. It started off very basic. But as we were experimenting — we put homemade music in there to test things out, but really, what piqued our interest was taking stems from pre-existing songs and seeing what they sounded like.
One of the earliest ones — Trent Reznor released the full masters, the full stems, for two Nine Inch Nails albums. I think it was The Slip and Ghosts. He wanted people to remix those songs, and then he put together a remix album that came out afterward with some of the best fan remixes. No one had ever done this before. So these masters are just sitting there online, incredibly high quality, and the songs are great. We took some of the songs from Ghosts to test them out, and the result we got was incredible. It opened our eyes to understanding the other interfaces we needed to make.
Taking popular songs in general, looking at all the songs that are out there, pop songs and rap songs and EDM — we looked at all that and said, “What other interfaces do we need to represent a full song properly and still make this user-friendly?” That’s where the vocal tool came in. That’s where what we call the “la la looper” came in. We needed that looper tool because, without it, there was a certain design pattern that’s common in songs that we couldn’t replicate in Electronauts.
We built all of those pieces, and that allowed us to have the variance we needed. Now, when we go to artists to put their songs in the game, we can faithfully represent them and still have that dynamic interface. The songs guided us in the interface design process.
Above: Raw Data from Survios is now on the Oculus Touch.
Image Credit: Survios
GamesBeat: What made you decide that this would be good in VR, as opposed to just a 2D game?
Burba: We started off with a basic 2D prototype. The key differentiator for VR is the fact that you’re standing up. You’re already moving a bit. You’re on your toes. Back at Harmonix, they used to say, “Rock stars stand up.” It’s true for this game. Part of the fun is in how you move your body relative to the music you’re playing. You end up having a better rhythm, a better time, when you’re moving physically.
Originally the game was called Body Jam. That was the code name for it, because you’re jamming with your body. But that element, I think, helped us out a lot compared to just sitting down and playing with a controller. It helps you keep to a rhythm. The immersive aspect of it is also a big part of it. I’ve been going to music festivals with the game over the last three years, and there’s a very psychedelic, immersive, festival kind of experience. We’re trying to re-create that experience in the game itself. You’re very immersed in the visualizer. It hearkens back to an open music experience. If you’re a 12-year-old and you can’t go to Burning Man, this is your thing.
GamesBeat: When you’re holding these things, are you actually holding sticks, or is it just an HTC controller?
Burba: Yeah, it’s just the regular Vive controller or Rift controller. Your character doesn’t have animatable fingers, which is great, because it would have taken so long to do that. The character is holding sticks in the game. That’s another part of the design that made the game easier, the fact that you have this extra reach. You’re not grabbing items or anything like that. You just swing the sticks and press the trigger to do things with them. That’s the core set of controls.
In many ways this is the opposite of what we normally make here. We built it in Unity. It’s our only Unity title for the foreseeable future. It’s not an aggressive action game. It’s very laid back. The fun is the activity itself. It’s very much like a toy or a creative tool. We like to think of it as a Tilt Brush for music.
Above: Electronauts is available now for $20.
Image Credit: Survios
GamesBeat: Can you explain more about the multiplayer aspect?
Burba: When we put in multiplayer — there are not very many music games where you can play multiplayer and actually play music with someone else. Imagine the process of sitting down with a guitar while someone else is on drums, and you’re just jamming. Being able to internet-ify that process, for lack of a better word, lets people who are thousands of miles apart do that, just like talking on the phone together. That hasn’t really happened yet, because of latency and because of the tempo problem I mentioned.
You and I, on this phone call, probably have about 100 milliseconds of delay. That doesn’t affect our interaction too much. We don’t need to be synced up that closely. But when you’re playing music, you have to be synced really well. Otherwise you can’t jam together. Because of the quantization, we can simulate people playing music together even when they’re 100 or 200 milliseconds off. It feels pretty good. Similar to how you play an online action game and it still feels pretty good, even though you have lag.
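In code terms, the same grid-snapping that fixes your timing also absorbs the network delay. This is a hypothetical sketch, not Survios’ netcode:

```python
import math

# Hypothetical illustration of how quantization hides network latency:
# a remote player's hit arrives ~150 ms late, but every client snaps it
# to the next slot on a shared beat grid.
def snap(t: float, bpm: float = 128, subdivision: int = 4) -> float:
    slot = 60.0 / bpm / subdivision     # 16th notes: ~117 ms at 128 bpm
    return math.ceil(t / slot) * slot

local_hit = snap(10.000)               # when the sender hears the hit
remote_hit = snap(10.000 + 0.150)      # same hit after 150 ms of lag
# Both land on grid boundaries, one or two slots apart, which the ear
# reads as rhythm rather than lag.
```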
This has been done in one or two other applications, the ability to go online and jam together, but they’re usually just little demos or tests that one person made. This is the first time it’s ever been done in something like a full-blown video game.
GamesBeat: Did you guys set up a studio to make this game, any separate space for making music?
Burba: Not too much? I don’t know if you’ve been down to our new space on La Cienega. We’ve been here about two years, in a 17,000 square foot hangar. Most of our people are in the bullpen part of it, about 80 out of 120 employees. The Electronauts team recently moved to an adjacent place so they could play music in there. But generally, everyone’s been able to play and test, primarily using the Oculus Rift. You’ve got headphones, and we could do multiplayer from desk to desk. We like to do everything in a space where everyone is co-located. It just doesn’t feel right for us any other way.
GamesBeat: Are you doing anything interesting to get the word out with the different musicians?
Burba: We’re working right now with a number of the musicians in the project to film videos of them. It’s really funny to have them play their music in the game and see it through new eyes, so to speak. We’re putting together some video packages with the musicians. We’re working through their social media channels.
A big part of our strategy here at Survios is not only to develop our original IP, but to mix in other IP as well. The IP here is the musicians and the music we’re getting externally. They’ve been a big help. Several of them are very engaged with us, especially the ones in the Los Angeles area. There are some great artists right around here, and they’ve been able to film in our studio.
This post by Dean Takahashi originally appeared on VentureBeat.