Indie game studio Funomena, known for its interactive VR experience Luna (2017), has announced that it's bringing a new Luna experience, made especially for augmented reality. Called Luna: Moondust Garden, the experience is slated to launch on Magic Leap One this fall.
Funomena calls Luna: Moondust Garden a playful extension of the original Luna, one that brings a new story from Bird and Owl’s forest into your living room.
In Moondust Garden, the studio says, players “explore the storybook world of Luna from a new, intimate perspective by bringing its animals & forest right into their own reality. By planting and tending a variety of plants, flowers, trees and islands within their physical space, players can create a lovely garden where Fox can come out to play.”
Activities
Plant and interact with a musical landscape directly in your own physical playspace
Discover & collect the moondust that’s hidden throughout the forest
Sprinkle moondust on the plants and flowers to help them grow and transform
Build a beautiful garden so that the sad Fox will cheer up, come out and play!
Funomena co-founder and CEO Robin Hunicke, designer of Luna, will be at Magic Leap’s L.E.A.P. conference on October 10th to talk more about Luna: Moondust Garden and the design and development process the team applied in building the experience for Magic Leap One. During the session she will discuss the creative inspiration, design, and best practices for developing AR experiences.
The FIFA World Cup finals are nearly here, and while most everyone interested in the France vs. Croatia match will still be glued to a TV set, researchers from the University of Washington, Facebook, and Google have just given us a prescient look at what an AR soccer match could look like in the near future.
The researchers have devised an end-to-end system to create a moving 3D reconstruction of a real soccer match, which they say in their paper can be viewed with a 3D viewer or an AR device such as a HoloLens. They did it by training their convolutional neural network (CNN) with hours of virtual player data captured from EA’s FIFA video games, which essentially gave the team the data needed to ingest a single monocular YouTube video and output a sort of 2D/3D hybrid reconstruction.
Researchers involved in the project are University of Washington’s Konstantinos Rematas, Ira Kemelmacher-Shlizerman (also Facebook), Brian Curless, and Steve Seitz (also Google).
A few caveats should temper your expectations of seeing a ‘perfect’ 3D reconstruction you could watch from any angle: players are still projected as 2D textures, the positioning of individual players is still a bit jittery, and the ball isn’t tracked either—an indispensable part of the equation that’s coming in the future, the team says. And because the system works from a single monocular shot, occlusion is an issue too: when a player’s movements are hidden from the camera, their texture disappears from view.
The implication of watching a (nearly) live soccer match in AR is still pretty astounding though, especially on your living room coffee table.
Image courtesy University of Washington, Facebook, Google
“There are numerous challenges in monocular reconstruction of a soccer game. We must estimate the camera pose relative to the field, detect and track each of the players, re-construct their body shapes and poses, and render the combined reconstruction,” the team writes.
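One of those steps, placing each detected player on the pitch, reduces to intersecting a camera ray with the ground plane once the camera pose is known. Here is a minimal sketch of that idea using a generic pinhole camera model; the function and its parameters are illustrative, not the paper's actual calibration code:

```python
import numpy as np

def backproject_to_ground(pixel, K, R, t):
    """Intersect the camera ray through `pixel` with the ground plane z = 0.

    K is the 3x3 camera intrinsics matrix; R and t are the world-to-camera
    rotation and translation. Returns the 3D point on the pitch where the
    player's feet touch the ground.
    """
    # Ray direction through the pixel, first in camera coordinates,
    # then rotated into world coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    ray_world = R.T @ ray_cam

    # Camera center in world coordinates.
    cam_center = -R.T @ t

    # Solve cam_center + s * ray_world for the z = 0 (ground) plane.
    s = -cam_center[2] / ray_world[2]
    return cam_center + s * ray_world
```

For example, an overhead camera 10 meters above the origin looking straight down maps the principal point back to the pitch origin; off-center pixels land proportionally farther out on the field.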
Viewing live matches won’t be possible for a while either, the team says. To watch a full match on an AR device such as a HoloLens, the system still requires a real-time reconstruction method and a method for efficient data compression and streaming to deliver it to your AR headset.
Because the system relies on standard footage, it represents a sort of low-hanging fruit of what’s possible now with current capture tech. Even though it’s based on 4K video, there are still unwanted artifacts such as chromatic aberration, motion blur, and compression artifacts.
Ideally, a stadium would be outfitted with multiple cameras with the specific purpose of AR capture for the best possible outcome—not the overall goal of the paper, but it’s a definite building block on the way to live 3D sports in AR.
Niantic, the developer behind Pokémon Go and the upcoming Harry Potter AR game, announced they’ll be opening up access to their latest work in AR, dubbed the ‘Niantic Real World Platform’, to third-party developers soon. To boot, the company also introduced a few key AR technologies that will have you salivating over the possibilities of actually chasing down pocket monsters on your commute to that next Pokéstop.
Niantic says their Real World Platform blends machine learning and computer vision to tackle the classic challenge of building a useful and realistic AR experience on mobile devices—something that can sense small details, understand surroundings, and model them in an interactive 3D space that a smartphone can digest.
One area of research Niantic has been working on is proper occlusion: making sure digital imagery fits into the physical world correctly, and allowing it to be obscured naturally by objects in the environment. The company published a quick video on their blog showing off their latest work in the area of AR occlusion. What better test subject than Pikachu?
Creating correct occlusion in AR requires that the computer, in this case a smartphone, contextually understand the world around it. Slow the video down, however, and it becomes clear that the company still has a ways to go: the occlusion masks oftentimes overcompensate or misjudge the alignment of objects as Pikachu scampers about. While only a proof of concept, it’s definitely a tantalizing look at the near future of smartphone AR, and a clear departure from what we saw at Pokémon Go’s launch back in summer 2016.
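Under the hood, occlusion of this kind typically comes down to a per-pixel depth test: draw the virtual content only where it sits closer to the camera than the estimated real-world surface. Niantic hasn't published its method, so the sketch below is just a minimal illustration of that compositing step with NumPy, assuming depth maps for both the camera feed and the rendered content are already available:

```python
import numpy as np

def composite_with_occlusion(camera_rgb, camera_depth, virtual_rgb, virtual_depth):
    """Blend a rendered virtual layer into a camera frame with occlusion.

    Per pixel: the virtual content is visible only where its depth is
    smaller (nearer) than the real-world depth; everywhere else the real
    surface occludes it and the camera pixel shows through.
    """
    virtual_visible = virtual_depth < camera_depth
    out = camera_rgb.copy()
    out[virtual_visible] = virtual_rgb[virtual_visible]
    return out
```

The hard part, and what Niantic's Matrix Mill acquisition is presumably aimed at, is producing a good enough `camera_depth` estimate from a single phone camera in the first place; the blend itself is trivial once you have it.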
The Niantic Real World Platform is also focusing on cross-platform AR for shared, multiplayer experiences. The biggest obstacle, the company says, is invariably latency. To this effect, the company says they’ve developed “proprietary, low-latency AR networking techniques” to overcome this problem, which allowed them to realize a unified, cross-platform solution with a single code base. To demonstrate, Niantic built a multiplayer smartphone AR shooter, dubbed ‘Neon’, which shows six users playing at once.
We can attribute some of this to the company’s recent acquisitions; Niantic recently acquired Escher Reality, a studio touted for its cross-platform, multi-user AR platform, and the computer vision and machine learning company Matrix Mill—two decisive moves forward after the company’s $200 million Series B financing round.
“It’s through the coordination of these teams that we’ve been able to establish what the Niantic Real World Platform looks like today, and what it will be in the future,” Niantic says in a statement.
As for third-party developers looking to get in on Niantic’s platform, the company says they’ll be picking a “handful” of devs to begin working with their tools later this year. To receive more information, sign up here.
Disney and Lenovo teamed up last year to produce Star Wars: Jedi Challenges, a sampler of Star Wars games that worked with Lenovo’s smartphone-driven Mirage AR headset and wireless lightsaber. While the app supports both Android and iOS smartphones, now users on recent iPads and iPhones don’t need the headset at all to play one of the most iconic in the lot: Holochess. And it’s free.
The update comes to iPhone and iPad users running iOS 11 or higher. Thanks to the app’s new ARKit compatibility, the full Holochess game mode is available for free including 18 levels across 6 planets and 8 unlockable creatures with unique special abilities. It doesn’t appear you can play any form of person vs person multiplayer though, which seems kind of like a missed opportunity.
Using ARKit, developers tap into the iPhone’s rear-facing camera to create apps and games that are overlaid on top of the physical world. This means you won’t need a headset to play, as the iPhone acts as an augmented reality window.
There’s no word on when Android will get a similar ARCore integration. So Android users not so hot on the idea of shelling out $200 for the AR headset and lightsaber will have to wait a little longer, it seems.
The developers behind Modbox (2016), the VR physics sandbox for SteamVR-compatible headsets, have been experimenting with augmented reality recently, something they say is a perfect fit for the Modbox software.
Developer Lee Vermeulen didn’t wait for VR headsets like the Vive Pro to get an AR function, but rather strapped a stereoscopic Zed Mini to his original HTC Vive in an effort to push Modbox forward into augmented reality.
“We are planning to make Modbox into an AR building application, which we feel it’s perfect for,” Vermeulen told Road to VR.
Image courtesy Stereolabs
In VR, Modbox basically offers users a sandbox to experiment with toys and mods, and a place for impressive object destruction, although the app has taken a slightly more serious tone lately with a recent update that allows multiple users to concurrently script entity behavior, much like you can in the Unity game engine. Modbox is currently in Early Access on Steam, supporting HTC Vive and Oculus Rift, although that may change in the near future.
AR games, Vermeulen tells us, “need to match the environment they’re in; so to do that, games usually tend to adjust based on the environment in a procedural way. For Modbox though, we’re creating something that allows people to make games to exactly match their environment, and hook up their AR worlds to their smart home.”
Modbox AR Experiments
While the team hasn’t discussed when they’re releasing the AR version of Modbox, Vermeulen highlighted a few interesting use-cases, including a way to change the color of his WiFi-connected Philips Hue Lights, and to turn each light off by ‘shooting’ it with a virtual bow and arrow:
Here, they show off what a self-made AR version of Space Pirate Trainer (2017) might look like. Drone battle in the living room anyone?:
These AR experiments are possible because Vermeulen previously mapped his environment and placed the virtual items in himself, something that requires a little DIY but is entirely possible thanks to the application’s numerous entities, including weapons, objects, and NPCs.
Below we see Vermeulen importing an NPC and playing catch with a virtual basketball. To create the effect of ‘pulling’ the NPC from his monitor, he set a virtual camera to match the monitor’s position, making it seem like a magic window.
Virtual reality developers eyeballing augmented reality can get started using this setup, although admittedly the ZED Mini camera + VR headset combo actually overshoots the mark somewhat. It relies on a gaming computer’s elevated graphical capabilities, and presents the user with a comparatively wide field of view not possible on current AR headsets. The Vive controllers, which have 6 degrees of freedom (6DoF) positional tracking, are also unique to VR at the moment, although Magic Leap One is said to ship later this year with a single 6DoF controller which could allow for many of these sorts of interactions.
Just like in the early days of virtual reality, the first developers to create something fun and useful are kind of like the first miners to a gold rush. So while AR headsets still have a ways to go, seeing these early stabs at solving the question of content gives us an exciting look into what could be the near future of AR head-mounted displays.
Singh calls it the ‘Real World Warrior’ edition, which hypothetically lets multiple smartphone-clad players use on-screen controls to play the game just like the flatscreen version—except superimposed in the physical world standing on your table, or out in the town square.
As reported by the BBC, Singh says the game won’t be released for copyright reasons, but he hopes it will help generate further interest in augmented reality.
Board and parlor games have always been great fun for friends and family. Until now, combining these games with AR technology was something you could only see in science fiction films like Star Wars, where Chewbacca and C-3PO pass the time playing holochess aboard the Millennium Falcon. Now the company Helios Interactive is bringing augmented reality board games into the present day with its game Echelon.
Echelon: An AR Board Game for Multiple Players
The American company Helios Interactive is currently developing the AR board game Echelon, which multiple players can play together using the Microsoft HoloLens. With it, the game could set new standards for how we play in the future.
The game works as follows: each player puts on a HoloLens and starts the game. Players then use various cards to summon creatures meant to bring them victory. According to the game’s developers, the gameplay is based on classic board game mechanics.
Augmented reality, however, gives these mechanics a futuristic presentation: the creatures and various game pieces are displayed as hologram-like projections, offering players an immersive gaming experience.
Devin Fuller Knight of Helios Interactive says the following about the project:
“The fact that you can play through plenty of experiences on your own with a HoloLens is already incredibly fun. But when you use the HoloLens to interact with other players, with everyone seeing the same world, the whole thing gets even better.”
The game’s developer, Kristafer Vale, was indeed inspired by the holochess in Star Wars. He says that as soon as he saw the game in the film, he wanted to play it, but he had to wait until now for the necessary technology, such as Unity and the HoloLens, to become available.
Source: Lucasfilm
When Echelon will be released has not yet been announced. It will be exciting to watch, though, as AR games like this could spark a revolution in the classic board game market.
At Unite Europe 2017, the Unity developer conference in Amsterdam, game publisher Ubisoft unveiled two projects for Microsoft’s HoloLens: Toy Soldier and Rabbid Rockets. It’s unlikely that Ubisoft will build either title out into a full game, but it shows that the developer is experimenting with HoloLens, and that AR games could follow in the future.
AR Prototypes for Microsoft’s HoloLens
Ubisoft’s David Yue showed off Toy Soldier and Rabbid Rockets at Unite Europe 2017 during his talk “AR Prototyping for the HoloLens”. The two projects take different approaches: Toy Soldier is based on spatial mapping, while Rabbid Rockets has to do without it. In Toy Soldier, two toy armies fight each other; the demo shows the plastic-look warriors on a desk. The HoloLens wearer can interact with the figures, and thanks to spatial mapping the virtual soldiers are blended realistically into the real environment. In Rabbid Rockets, by contrast, the player controls two robot arms, with a crosshair apparently serving as a targeting aid.
Toy Soldier relies on spatial mapping
Rabbid Rockets, by contrast, does without spatial mapping
As unspectacular as the titles may seem at first glance, they once again show Ubisoft’s strong interest in new technologies. For VR, for example, the publisher has one of the most anticipated titles in its lineup with Star Trek: Bridge Crew, and at E3 Ubisoft announced Transference and Space Junkies. For the latter, Ubisoft’s studio in Montpellier even developed its own engine called Brigitte, which lets objects be adjusted in real time directly in virtual reality.