With Tango, Google has already gathered some AR experience, but that system requires special hardware. Now the search giant has released a preview of ARCore, which goes up directly against Apple's ARKit. ARCore requires Android 7 Nougat but no additional hardware. The augmented reality software already runs on the Samsung Galaxy S8 and Google's Pixel smartphones, and by the end of the year Google aims to support 100 million devices from manufacturers such as Huawei, LG, Asus, and others.
ARCore: AR for all Android Nougat smartphones, no extra hardware required
In a blog post published today, Google announced the release of ARCore. The move is a response to Apple's surprise unveiling of ARKit at its WWDC press conference. ARKit quickly became a favourite among developers, since the release of iOS 11 this autumn will put it in front of many millions of iPhone and iPad users in one stroke. Google is now taking the same route, and with ARCore it can, at least in theory, address a much larger audience, provided Android Nougat is installed on the device.
ARCore is based on Java and OpenGL. The framework supports the two major game development environments, Unity and the Unreal Engine, and an experimental browser version intended to enable the development of AR apps is also available. According to Google, lessons learned from the Tango project have fed into ARCore's development. On GitHub, developers can not only download the required files but also give feedback on ARCore, and a gallery offers first examples of AR applications. ARCore is still a preview, but the course has been set for AR's journey into the mass market.
It seems with Gamescom 2017 taking place last week the big tech companies have planned a slew of big announcements for this week, and it's only Tuesday. Microsoft had its Windows Mixed Reality news, and today Google has gone on the augmented reality (AR) offensive against Apple, unveiling the first developer preview of the ARCore software development kit (SDK). Just releasing ARCore on its own might not have been enough, so Google has enlisted the support of one of the biggest videogame engines in the world, Unity.
As Unity has become one of the most popular engines for AR and virtual reality (VR) development – the company regularly states that two-thirds of all immersive experiences are made with its software – it was only natural that support would be part of the SDK.
“We’re excited to work closely with Google to expand our Augmented Reality offering to our developer community and look forward to further exploration in this emerging landscape,” said Scott Flynn, Director of AR/VR Development at Unity Technologies. “We believe the accessibility of Augmented Reality presents a unique opportunity to reach a massive global audience through innovative applications and new possibilities for content delivery.”
ARCore leverages native integration with Unity 2017.2 Beta 9 and higher, enabling Unity developers to build AR experiences for Android devices, as well as add AR to existing Android apps.
“We’ve been working closely with Unity on Virtual and Augmented Reality for some time now. We share a vision in making this technology accessible to everyone, and a big part of that is giving developers the tools and platforms on which to build great immersive experiences,” said Clay Bavor, VP of Augmented and Virtual Reality, Google in a statement. “With ARCore, we’re taking the next step towards that goal, and we’re excited to be working together again to bring these new capabilities to developers large and small.”
Expect plenty more updates in the months to come for ARCore, so stay with VRFocus for the latest news.
Today the push for augmented reality (AR) advancement took another step with Google announcing ARCore, its version of Apple’s ARKit which released a short while back. Just like its rival, Google has ensured support from some of the biggest middleware companies, with Epic Games and Unreal Engine 4 officially supporting ARCore.
With the launch of ARCore’s preview today, developers will now be able to start creating AR experiences for millions of Android users across the world. And for those who are used to Unreal Engine, the knowledge they already have can now be utilised.
While today’s release is merely an early access look at ARCore, downloaded through GitHub, when Epic Games release Unreal Engine 4.18 in mid-October, the update will include deeper support for the AR software.
“Augmented reality is the next step in the evolution of the smartphone, and Unreal Engine developers are already hard at work on great AR experiences. ARCore will help further drive AR adoption by empowering developers to build and ship cross-platform AR experiences. We encourage the Unreal community to check out today’s early access Unreal Engine 4 support for ARCore on GitHub as well as the preview coming in Unreal Engine 4.18,” said Mark Rein, Co-Founder and Vice President, Epic Games in a statement.
Today’s ARCore SDK release supports the Google Pixel, Pixel XL, and the Samsung Galaxy S8 running Android 7.0 Nougat and above. By mid-October expect that to have been expanded.
As Epic Games release further details about Unreal Engine’s ARCore support, VRFocus will let you know.
No matter what Android device you own, you should soon see a greatly improved augmented reality experience. It's all thanks to ARCore, Google's new SDK that is designed to reach all devices running Nougat and higher.
Ever since Apple announced its augmented reality (AR) software ARKit a few months ago, developers across the world have been excitedly creating AR projects. Now it’s Google’s turn, with the company unveiling its own version, ARCore, for Android devices.
Just like ARKit for iOS, ARCore enables AR development across the Android ecosystem, giving developers the ability to make compelling AR experiences without the need for any additional hardware. The ARCore SDK supports the Google Pixel, Pixel XL, and the Samsung Galaxy S8 running Android 7.0 Nougat and above.
ARCore has three main components: motion tracking, environmental understanding and light estimation. With motion tracking ARCore combines visual data from the device’s camera and inertial measurements from the device’s IMU to estimate the position and orientation of the camera relative to the world over time. ARCore learns to understand the real world environment by detecting feature points and planes. The former are visually distinct features in the captured camera image while the latter are feature points that appear to lie on common horizontal surfaces, like tables and desks.
Using light estimation, ARCore enables developers to light their virtual objects under the same conditions as the environment around a user, increasing the sense of realism.
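To make the light estimation idea concrete, here is a minimal, self-contained Java sketch. It does not use the real ARCore API; the class and method names are invented for illustration. It shows the principle the articles describe: estimate ambient intensity from camera luminance samples, then scale a virtual object's colour by that estimate so it darkens as the room darkens.

```java
// Illustrative sketch only: ARCore exposes a per-frame light estimate;
// here we fake one from luminance samples to show how a renderer
// might apply it to a virtual object's colour.
public class LightEstimationDemo {
    // Hypothetical ambient estimate in [0, 1]: the mean luminance of
    // sampled camera pixels.
    static double estimateAmbientIntensity(double[] pixelLuminances) {
        if (pixelLuminances.length == 0) return 0.0;
        double sum = 0.0;
        for (double l : pixelLuminances) sum += l;
        return sum / pixelLuminances.length;
    }

    // Shade a virtual object's base colour (RGB in [0, 1]) by the
    // ambient estimate, so rendering matches the real environment.
    static double[] shade(double[] baseColor, double ambient) {
        return new double[] {
            baseColor[0] * ambient,
            baseColor[1] * ambient,
            baseColor[2] * ambient
        };
    }

    public static void main(String[] args) {
        double ambient = estimateAmbientIntensity(new double[] {0.2, 0.4, 0.6});
        double[] shaded = shade(new double[] {1.0, 0.5, 0.0}, ambient);
        System.out.printf("ambient=%.2f shaded=[%.2f, %.2f, %.2f]%n",
                ambient, shaded[0], shaded[1], shaded[2]);
    }
}
```

A real implementation would feed the estimate into the shader uniforms each frame; the scaling step itself is this simple.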
“ARCore is a foundational layer which provides similar capabilities, but it works across the Android ecosystem,” a Google spokesperson told Mashable. “Both give developers the ability to build motion tracking, environmental understanding, and light estimation into AR applications. It’s easy to imagine how ARCore works with Blocks, creating 3D assets in VR and then bringing them to AR, or with VPS [Visual Positioning Service] to map and annotate indoor spaces like museums or stores with AR.”
Google’s releasing a preview of the ARCore software development kit (SDK) today, with further details to be unveiled later this year. When they are, VRFocus will bring you the latest updates.
Today, Google is announcing ARCore, a software-based solution for making more Android devices AR-capable without the need for depth sensors and extra cameras. It will even work on the Google Pixel, Galaxy S8, and several other devices very soon and supports Java, Unity, and Unreal from day one. In short, it’s kind of like Google’s answer to Apple’s ARKit.
But that isn’t how Clay Bavor, VP of Augmented and Virtual Reality at Google, would describe it. Instead, when the topic came up, he reminded me that Google Tango had its first development kit all the way back in 2014 and that they’ve slowly been building towards this vision of a future where AR is democratized and available to millions around the world. Specifically, Google wants 100 million AR-capable Android phones within just the next few months.
“I like to call it immersive computing to sidestep some of the jargon and acronym debate — VR, AR, MR — just everything. Integrating computer-generated imagery seamlessly into experiences is what it’s all about,” Bavor explains at the beginning of our interview at Google’s San Francisco office last week. “Our goal here is to make AR mainstream on Android for developers and for consumers…We thought mobile smartphone-based AR was going to be a thing that was important years ago. The first Tango development kit was 2014 and by relaxing the constraints on the hardware, getting rid of the need for a depth sensor or additional tracking cameras we’ve honed in on our aim to prove out the technology and show the world that on consumer-grade sensors you can do really powerful AR experiences.”
From the demos I got the chance to try he’s not exaggerating. On standard Google Pixel and Samsung Galaxy S8 phones I watched robots walk across a tabletop and wave at me, trees shrink and grow from a few inches to several feet, and even a giant lion flex his muscles and look down at me as if I was really there. Similar to the first time someone tries VR, a powerful AR experience can feel like magic.
“There’s a lot of things that need to happen to make it successful though,” Bavor admits. “We’ve always known that it’s got to work at scale, so we’ve been investing in software-only solutions like ARCore, in addition to Tango. We feel that the technology is ready and we’ve figured out some of the core use cases for AR that we’re really excited about, so that’s why we’re so excited to get ARCore out there and start lighting up across the ecosystem.”
One great use case that I got to see first-hand is what AR can bring to shopping. Using a plugin on the Wayfair website I watched as a room was measured in real-time (similar to the GIF above) and a chair was placed from the website into the physical space. Imagine applying this same concept to other types of shopping and interior design as well.
Another future example Bavor gave was through the use of VPS (Visual Positioning Service). “We’ve been investing in a constellation of tools and services and applications around it to make it even more powerful for developers,” Bavor says. “One example is VPS. You’re gonna want, as a developer, to extend beyond just the tabletop or room to something that’s world-scale, or to anchor things in the world that persist so you can go back to them. ARCore and VPS we see as very natural partners and in fact we were building VPS in anticipation of scaling AR on Android with ARCore.”
Imagine being able to return to a specific building and see a sign in AR that has aged and rusted with years passing by. Or know where your friends recommend eating downtown just by looking around — Google Lens could be a big part of that too. It’s the type of stuff that’s been promised and imagined for a while, but we’re getting closer. We’re not there yet, but it’s an attainable goal in our lifetime.
“Another example, which is especially relevant for developers that build traditional smartphone apps in Java, is that we want to make it easier than ever for people to get into 3D modeling that haven’t done it before,” Bavor says. “We know there are a lot of people that want to get into 3D development and AR but aren’t experts in Maya, or Unity, or anything. So Blocks is an app we built with the intention of enabling people that have never done a 3D model in their life to feel comfortable building 3D assets. We even made it easy to export right from Blocks and pull into ARCore apps you’re developing.”
One of the demos I tried, the same one with little robots and trees on top of a table, had all of its assets created directly inside of Blocks and exported to ARCore.
“We’re also working on experimental browsers that combine all of the ARCore functionality into a web browser,” Bavor explains. “With just a little Java, some HTML, and a few assets you can create an AR experience. ARCore embeds parts of itself into the experimental browsers. Google was born in the web and we love the web and we want to enable more devs to build for AR. And notably the experimental browser will have a version for ARCore that uses Android and a version on iOS that uses ARKit. A developer can build one webpage with one Javascript library and have a cross-platform AR experience.”
More details will likely emerge about ARCore in the coming months and we can’t wait to see what intrepid AR developers are able to cook up with this new suite of accessible tools. Let us know what you think of Google’s ARCore, ARKit, WebAR functionality, and everything else in the world AR down in the comments below!
Today, Google is announcing the debut of ARCore, an initiative to bring mobile-powered AR experiences to the masses like never before. Previously Google’s Tango was the best way to see powerful AR projects in action, but that required extra cameras and sensors on high-end smartphones to work. Now, ARCore is aiming to democratize augmented reality for the Android ecosystem by offering a software-only solution.
“[Today] we’ll be announcing a preview of something we call ‘ARCore,'” said Clay Bavor, VP of Augmented and Virtual Reality at Google, during an interview with UploadVR. “It’s an SDK for Android developers to build AR experiences for Android phones — a software only solution for doing stuff. So basically we’re bringing much of the goodness of Tango onto a very broad range of AR devices.”
Starting right now, Google is making the ARCore SDK available to owners of the Google Pixel (running Oreo) and Samsung’s Galaxy S8 (running at least 7.0 Nougat), with a target of running on millions of devices by “this Winter,” according to Bavor. Other Android devices from Samsung, as well as smartphones from LG, Huawei, and ASUS, are expected to get support over time as well.
In an aim to make it as easy as possible to develop AR applications, ARCore will work with Java/OpenGL, Unity, and Unreal from day one. It’s aiming to leverage three core principles, according to a prepared statement from the company:
Motion tracking: Using the phone’s camera to observe feature points in the room and IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves. Virtual objects remain accurately placed.
Environmental understanding: It is common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
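The environmental understanding principle above can be pictured as grouping tracked feature points that share roughly the same height. The standalone Java sketch below is a deliberately simplified illustration of that idea, not ARCore's actual pipeline: points are clustered by their Y coordinate, and only clusters with enough support count as a horizontal surface.

```java
import java.util.*;

// Simplified sketch of horizontal-plane detection: group 3D feature
// points (x, y, z) whose Y values (heights) agree within a tolerance.
// ARCore's real pipeline is far more involved; this only illustrates
// how planes can emerge from the same points used for motion tracking.
public class PlaneDetectionDemo {
    static List<List<double[]>> detectHorizontalPlanes(List<double[]> points, double tol) {
        List<List<double[]>> planes = new ArrayList<>();
        List<double[]> sorted = new ArrayList<>(points);
        sorted.sort(Comparator.comparingDouble(p -> p[1])); // sort by height
        for (double[] p : sorted) {
            List<double[]> last = planes.isEmpty() ? null : planes.get(planes.size() - 1);
            if (last != null && Math.abs(last.get(0)[1] - p[1]) <= tol) {
                last.add(p); // same height band: same candidate plane
            } else {
                List<double[]> plane = new ArrayList<>();
                plane.add(p);
                planes.add(plane);
            }
        }
        // Keep only clusters with enough support to count as a surface.
        planes.removeIf(pl -> pl.size() < 3);
        return planes;
    }

    public static void main(String[] args) {
        List<double[]> pts = Arrays.asList(
            new double[]{0.1, 0.75, 0.2}, new double[]{0.5, 0.76, 0.1},
            new double[]{0.3, 0.74, 0.4},                 // tabletop at ~0.75 m
            new double[]{0.2, 0.01, 0.3}, new double[]{0.6, 0.00, 0.5},
            new double[]{0.4, 0.02, 0.7},                 // floor at ~0 m
            new double[]{0.9, 0.40, 0.9});                // stray point, ignored
        System.out.println(detectHorizontalPlanes(pts, 0.03).size() + " planes found");
    }
}
```

With the sample points above, the sketch finds two planes (the floor and the tabletop) and discards the lone stray point.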
Hands-On Impressions
During a visit to Google’s San Francisco office last week I got the chance to see three different ARCore demos showing off what the SDK can do. Everything has been built off of the foundation laid by Tango previously and is proof of the scalability of the technologies that Google is creating. At the meeting I got to speak with Jon Wiley, the Director of Immersive Design, and Nikhil Chandhok, Director of Product for Google AR, as well as the aforementioned Clay Bavor.
The first demo was a relatively standard AR proof of concept that let me place 3D models on a tabletop, move them around, and resize them. I could play around with a tree, house, mountain, and little Android robot mascot. As great as it was, the most impressive thing about the whole demo was that every model was created inside Blocks, the latest VR 3D modeling program from Google. This platform, combined with Tilt Brush, is dramatically lowering the barrier to entry for intrepid designers, and platforms like ARCore only serve to further expand access.
One of my other favorite bits of the demo is how the little Android robots wobbled around and walked across the table. If I leaned down and put the phone close enough they’d even look at me and wave. Everything persisted if I moved the phone away and pointed it at the ground, and the camera was even able to track the location and plane of flat surfaces such as the table and floor. This meant I could move models from one surface to the next and they’d retain their scale and size relative to the rest of the environment.
All of that without any depth sensors or extra cameras on the phones. It was running on a Google Pixel.
The second demo Google showed me was one focused on large, life-sized 3D modeled characters. The characters on display (a lion, tin man, and scarecrow) were all themed after The Wizard of Oz, because why not? They took me to another corner of the room and placed the lion next to a chair with a light source behind him. He stood there and the light cast shadows across his torso in a surprisingly realistic manner.
Then, Jon Wiley stepped into the frame and stood next to the lion as it towered over him, similar to how I’m standing in the image above. The lion then recognized his presence, looked down at him, and flexed its muscles to try to display its superiority. Then Elizabeth Markman, a Communications Manager at Google, turned off the lights. The lion grabbed its tail, looked up at the ceiling, and quivered in fear. It was a remarkable series of events and it all played out flawlessly right before my eyes.
The final demo I saw during my meeting was the most practical. Using a plugin on the Wayfair website Nikhil Chandhok measured a corner of our meeting room using his finger on the phone’s screen. He dragged a cursor to represent the length, width, and height of the type of chair he wanted and then the Wayfair website displayed results only for chairs that would fit in that space. I can see this type of technology being used to buy furniture as shown, to buy paint for walls, sheets and blankets for beds, pillows for couches, and so much more. It’s exciting to think about.
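Underneath, the Wayfair demo boils down to filtering a catalogue by an AR-measured bounding box. The toy Java sketch below shows that filtering step; the product data, names, and measurements are all invented for illustration and have nothing to do with Wayfair's actual implementation.

```java
import java.util.*;
import java.util.stream.*;

// Toy version of the fit-filtering behind the Wayfair demo: keep only
// items whose dimensions fit inside the space measured in AR.
// All product data here is invented for illustration.
public class FitFilterDemo {
    static class Item {
        final String name;
        final double w, d, h; // width, depth, height in metres
        Item(String name, double w, double d, double h) {
            this.name = name; this.w = w; this.d = d; this.h = h;
        }
    }

    static List<Item> fitting(List<Item> catalog, double w, double d, double h) {
        return catalog.stream()
                .filter(i -> i.w <= w && i.d <= d && i.h <= h)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Item> chairs = Arrays.asList(
                new Item("Compact armchair", 0.7, 0.7, 0.9),
                new Item("Wingback chair", 0.9, 0.95, 1.2),
                new Item("Stool", 0.4, 0.4, 0.5));
        // Space measured in AR: 0.8 m wide, 0.8 m deep, 1.0 m tall.
        for (Item i : fitting(chairs, 0.8, 0.8, 1.0)) {
            System.out.println(i.name);
        }
    }
}
```

The AR part supplies the three measured numbers; everything after that is an ordinary catalogue query, which is why the concept transfers so easily to paint, bedding, pillows, and the rest.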
Interestingly this latest example on the Wayfair website is the first I’ve seen of what Clay Bavor described as “WebAR” wherein the user doesn’t actually need to have a special application installed on their phone to get it to work. Instead, just by visiting the website that has the ARCore code implemented with a compatible browser, the phone can automatically channel an AR experience from the web directly.
In a world where Apple already has ARKit it was only a matter of time before Google unveiled something similar. With support for both Pixel and Galaxy S8 devices starting today, and even more Android phones in the near future, the number of AR-capable smartphones in the world is starting to dramatically increase. You can read more about Google’s plans for ARCore and what it means for immersive computing right here.
What do you think of Google’s ARCore? Do you have plans to develop for it? Let us know down in the comments below!