Apple ARKit Publicly Available, Plus a Minecraft AR Video

Apple's ARKit is here: iOS 11 brings advanced augmented reality to current iPhone and iPad hardware and is now available for download as a beta even for non-developers. What's still missing? The cross-platform game par excellence, of course: Minecraft. A brand-new video shows what the block-building game looks like in AR.

Apple ARKit in Public Beta

Apple's ARKit in iOS 11 has made quite a splash since its unveiling at WWDC, but until now it was only available to developers. That has changed: the public test version can now be downloaded free of charge from Apple's website. Anyone who uses their iPhone or iPad productively, or depends on the device, should think twice about updating to the iOS 11 preview, though, since it may still contain bugs and in the worst case could brick the device. Those who aren't put off by that can complete the registration.

However, not every user of Apple's touch devices will get to enjoy the new augmented reality world: the prerequisite is an iPhone or iPad with at least an A9 processor, as only these chips are fast enough for the required computations. That means Apple devices from the iPhone SE and iPhone 6s onward qualify, as do tablets from the 2017 iPad and the iPad Pro onward.
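For developers, that hardware cut-off is easy to respect in code: ARKit exposes a support flag that an app can check before starting an AR session. A minimal sketch, assuming the shipping iOS 11 API names:

```swift
import ARKit

// World tracking requires an A9 chip or newer; on older devices
// (for example the A8-based iPhone 6) this property returns false.
if ARWorldTrackingConfiguration.isSupported {
    // Safe to configure and run an AR session here.
} else {
    // Fall back to a non-AR mode of the app.
}
```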

Video: Minecraft for Apple ARKit

Developer Matthew Hallberg, meanwhile, makes the update tempting: he wondered what the block-building game Minecraft might look like in augmented reality and promptly built a prototype with ARKit. The video blends the real environment with the familiar pixelated 3D structures in Minecraft style. Hallberg says he is thoroughly impressed with ARKit, even though he is no fan of proprietary platforms; the tracking in particular is fantastic, according to the developer. He used Xcode and Unity for the implementation, and he has promised a how-to video on his YouTube channel if there is enough interest.

Minecraft is not the first game-inspired AR adaptation built with ARKit. Widowmaker from Overwatch, for example, has already made it into an AR environment, promising a high-profile visit to your own bedroom.


Watch This Amazing Prototype Video Of Minecraft In AR Using Apple’s ARKit

If we've learned one thing over the years, it's that if something has a screen or some other way of displaying visuals, it can and probably will play Minecraft. Modern smartphones, tablets, game systems, and computers all have Minecraft. In fact, it has already spread to VR devices like the Samsung Gear VR, Oculus Rift, and even HTC Vive. But since we're still waiting on the big augmented reality (AR) breakthrough to take the tech into the mainstream, it hasn't yet been implemented at that level in a consumer-facing way.

YouTube user Matthew Hallberg utilizes Apple's new ARKit to experiment with what it might be like to play Minecraft in AR. Anyone who has played the popular sandbox building and creation game knows that you typically start with a world procedurally generated from a "seed". No two game worlds are ever the same. You then explore, mine, and build things as you fight to survive. In the case of Minecraft AR, though, the world you're building in is just the real world around you with modern AR technology layered on top of it all. Check it out in the prototype video he made right here:

Sometimes it’s tough to imagine gaming-focused use cases for AR, but Minecraft seems like a great example. Imagine an official LEGO app that utilized AR displays — you’d never have to clean up those pesky blocks ever again.

Let us know what you think of the idea down in the comments below!


4 Dev Demos Showing Off Apple’s New ARKit Tracking

Showcased at Apple’s Worldwide Developer Conference this week, ARKit is a new core technology for iOS 11, due to launch this Fall, soon to enable augmented reality features on hundreds of millions of iPhones and iPads. As the iOS 11 developer beta is already available, we’re starting to see some interesting real-world tests of ARKit, showing off the tracking that’s achievable with nothing more than a camera.

Developer Cody Brown hacked together a quick demo using Overwatch assets as a ‘hello world’ test of ARKit running on an iPhone 6S:

Apple’s keynote included a couple of impressive live demonstrations of screen-based AR on the stage, including a sneak peek at an Unreal-powered experience from developer Wingnut AR:

Perhaps the most impressive aspect of that demo was that it was running on an iPad, using the single camera on the back of the device for tracking. This appeared to deliver fairly stable tracking, without the need for dedicated hardware, unlike Google Tango, which uses a suite of cameras and sensors.

Now that developers have their hands on ARKit, the early real-world tests are very promising, such as this clip of the Unity sample demo showing tracking points and plane estimation:

This video from Austrian augmented reality company ViewAR puts the technology through a demanding tracking test, covering the camera, moving quickly away from the virtual object, through multiple rooms with different lighting conditions to check for drift. The result is remarkable considering the limitations of using a single camera:

Apple is believed to be hard at work on AR technologies and is likely to make screen-based AR a key selling point of the next iPhone, which is anticipated to have a near bezel-free design that would certainly enhance the appeal of AR features.


Apple: ARKit for iOS 11 Unveiled

Apple's Worldwide Developer Conference (WWDC) brought several interesting announcements concerning virtual reality: a VR development kit was introduced, as were the new VR-capable iMac Pro machines. Alongside these, Apple also unveiled the AR developer kit for iOS 11, which according to Apple is set to become the largest AR platform in the world overnight.

ARKit as the Largest AR Platform for iPhone and iPad


With the release of ARKit, the new app development kit for iOS 11, Apple is integrating an AR platform into iPhones and iPads. Craig Federighi, Senior Vice President of Software Engineering, demonstrated the kit's features with a test app: using an iPhone, he made various virtual objects, such as a cup and a vase, appear on a real table. The virtual objects come with interactive animations and dynamic lighting.

ARKit is also said to offer reliable, fast motion tracking. As a result, objects appear more real than before and look as if they were actually standing in their intended spot. More complex experiences are possible as well: entire AR environments, such as complete games, can be built on a surface. To that end, the kit includes support for Unity, Unreal, SceneKit and Xcode. Since these are all iOS compatible and usable on hundreds of millions of iPhones and iPads, Apple expects, by its own account, to become the largest AR platform in the world overnight with ARKit. In doing so, Apple is of course also competing with Facebook's work in the AR space; Facebook recently presented its own AR demo at F8.
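As a rough sketch of what starting such a session with SceneKit rendering looks like in practice (class names taken from the shipping iOS 11 API; the beta names may differ slightly):

```swift
import ARKit

let sceneView = ARSCNView(frame: UIScreen.main.bounds)

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal     // find table-like surfaces
configuration.isLightEstimationEnabled = true  // dynamic lighting, as demoed

sceneView.automaticallyUpdatesLighting = true  // SceneKit applies the light estimate
sceneView.session.run(configuration)           // motion tracking starts here
```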

iOS 11 will be available for the iPhone 5s and newer generations, all iPad Air and iPad Pro models, the 5th-generation iPad, the iPad mini 2 and newer, and the 6th-generation iPod touch. Apple will release iOS 11 in fall 2017, presumably alongside the launch of the iPhone 8 and iPhone 7s smartphones. The public beta of the new operating system starts as early as June 2017.

(Source: Road to VR)



Apple betting on Windows-based AR

Well, well, well, look who wants to join and play! Apple's developer conference WWDC just started and it's finally the moment to see some in-house augmented reality development from Apple hit the stage! I didn't even have the time to sort all my AWE conference notes, check all videos or talk about Ori's keynote to push superheroes out into the world! Hm, guess I can't resist, but need to write up the Apple news today:

One more thing… AR

So, Apple talks about their own big brother speaker for your living room, some other hardware, iOS updates and so on, but then finally we get to learn about Apple's plans to jump into AR! Pokémon plays the well-known example for the masses again, but this time using the new "ARKit" by Apple: their new SDK toolset for developers that brings AR…

to your phone or tablet. Yep. No AR goggles (yet), but through a frame, a window to hold. As I discussed last week, this was highly expected. Apple AR will be seen through windows for the next few years, too. Apple won't spearhead the glasses approach.

The presentation of this new toolkit is nicely done, and it feels as if AR had never been seen before. Craig Federighi is really excited – "you guys are actually in the shot here" – so excited that one could think people at Apple had only been thinking about VR lately and are surprised to see a camera feed in the same scene. He claims that so many fake videos have been around and that now Apple is finally showing "something for real". (Nice chat, but honestly, there have been others before. But, let's focus.) Obviously Apple is good at marketing and knows their tech well. They have been investing in this a lot, and now we can see the first public piece: in the demo we see how the RGB camera of the tablet finds a plain wooden table surface and how easily a coffee cup, a vase or a lamp can be added to it. The objects are nicely rendered (as expected in 2017) and have fun little details like steam rising from the coffee. The demo shown is a developer demo snippet and shows how to move the objects around – and how they influence each other regarding lighting and shadows. The lamp causes the cup to cast a shadow on the real table, and the shadow changes according to object movements. In the demo section one could try it out and get a closer look – I've edited the short clip below to summarize this. Next, we see a pretty awesome Unreal-rendered "Wingnut AR" demo showing some gaming content in AR on the table. Let's take a look now:

The demos show pretty stable tracking (under the prepared demo conditions); Apple states that the mobile sensors (gyro, etc.) support the strong visual software part using the RGB camera. They talk about "fast stable motion tracking", and based on what was shown this can be given a "thumbs up". The starting point seems to be the plane estimation to register a surface to place objects on. They don't talk about the "basic boundaries" in detail – how is a surface registered? Does it have clear borders? In the Unreal demo we briefly see a character fall off the scenery into darkness, but maybe this works only in the prepped demo context. Would it work at home? Can the system register more than one surface? Or is it (today) limited to a single height level for augmented content? We don't learn about this, and the demo (I would have done the same) avoids these questions. But let's find out more about this below.

Apple seems pretty happy about the real-time light calculation that gives the scene a more realistic look. They talk about "ambient light estimation", but in the demo we only see the shadows of the cup and vase moving in reference to the (also virtual) lamp. This is out-of-the-box functionality of any 3D graphics engine. But it seems they plan way bigger things, actually considering the real-world light, hue, white balance or other details to better integrate AR objects. Metaio (now part of Apple and probably leading this development) showed some of these concepts during their 2014 conference in Munich (see my video from back then), using the secondary (front-facing) camera to estimate the real-world light situation. I would have been more pleased if Apple had shown some more on this, too. After all, it's the developer conference, not the consumer marketing event. Why don't they switch off the lights or use a changing spotlight with some real reference object on the table?

Federighi briefly talks about scale estimation, support for Unity, Unreal and SceneKit for rendering, and notes that developers will get Xcode app templates to start things quickly. With so many existing iOS devices out in the market, they claim to have become "the largest AR platform in the world" overnight. I don't know the numbers, but agreed: the phone will remain the AR platform of choice for the big consumer market these days. No doubt about that. But also no innovation from Apple seen today.

The Unreal Engine demo afterwards shows some more details on tracking stability (going closer, moving faster) and how good the rendering quality and performance can be. No real interaction concept is shown, though – what is the advantage? Also, the presentation felt a bit uninspired – reading from the teleprompter in a monotone voice. Let's get more excited, shall we? Or won't we? Maybe we are not so excited because it has all been seen before? Even the fun Lego demo reminds us of the really cool Lego Digital Box by Metaio.

A look at ARKit

The toolkit's documentation is now also available online, so I had planned to spend hours there last night. Admittedly, it's quite slim as of today, but it gives a good initial overview for developers. We learn a thing or two:

First, multiple planes are possible. The world detection might be (today) more limited than on a Tango or HoloLens device, but the system focuses on close-to-horizontal surfaces. The documentation says: "If you enable horizontal plane detection […] notifies you […] whenever its analysis of captured video images detects an area that appears to be a flat surface", and mentions the "orientations of a detected plane with respect to gravity". Further, it seems that surfaces are rectangular areas, since "the estimated width and length of the detected plane" can be read as attributes.
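As a minimal sketch of how this looks in code, using the class names of the shipping ARKit API (the beta documentation quoted above may still use slightly different names):

```swift
import ARKit

final class PlaneSpotter: NSObject, ARSCNViewDelegate {
    let sceneView: ARSCNView

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        sceneView.delegate = self

        // Opt in to horizontal plane detection, as described in the docs.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit registers a new flat surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        // extent.x and extent.z are the "estimated width and length" attributes.
        print("Plane detected: \(plane.extent.x) m x \(plane.extent.z) m")
    }
}
```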

Second, the lighting estimation seems to include only one value to use: "var ambientIntensity: CGFloat", which returns the estimated intensity, in lumens, of ambient light throughout the currently recognized scene. There is no light direction for cast shadows or other info so far, but it is obviously a solid start toward better integration.
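A hedged sketch of how a developer could feed that single value into a SceneKit light each frame (the ambientLight node here is a hypothetical light attached to the scene elsewhere):

```swift
import ARKit
import SceneKit

final class LightMatcher: NSObject, ARSessionDelegate {
    // Hypothetical ambient light, assumed to be added to the SceneKit scene elsewhere.
    let ambientLight = SCNLight()

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // lightEstimate is nil when light estimation is disabled or unavailable.
        guard let estimate = frame.lightEstimate else { return }
        // Around 1000 lumens corresponds to a well-lit scene; feeding the value
        // to the SceneKit light keeps virtual objects in step with the room.
        ambientLight.intensity = estimate.ambientIntensity
    }
}
```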

They don't talk about other aspects of world recognition. For example, there is no reconstruction listed that would allow assumed geometry to be used for occlusions. But, well, let's hit F5 in our browsers over the next weeks to see what's coming.

AR in the fall?

Speaking about what's next: what is next? Apple made a move that, to me, was overdue. I don't want to ruin it for third-party developers creating great AR toolkits, but it was inevitable. While a third-party SDK has the huge advantage of taking care of cross-platform support, it is obvious that companies like Apple or Google want to squeeze the best out of their devices by coding better low-level features into their systems (like ARKit or Tango). The announcement during WWDC felt more like "ah, yeah, finally! Now, please, can we play with it until you release something worthy of it in the fall?" Maybe we will see the iPhone 8 shipping a tri-camera setup like Tango – or maybe the twin-camera setup is enough for more world scanning?

I definitely want to see more possibilities to include the real world, be it lighting conditions, reflections or object recognition and room awareness (for walls, floors and mobile objects). AR is just more fun and useful if you really integrate it into your world and allow easier interaction. Real interaction, not only walking around a hologram. The Unreal demo surely was only meant to show off rendering capabilities, but what do I do with it? Where is the advantage over a VR game (possibly with added positional tracking for my device)? AR only wins if it plays to this advantage: seamlessly integrating into our lives, our real-world vision and our current situation, and enabling natural interaction.

Guess now it’s wait and see (and code and develop) with the SDK until we see some consumer update in November. This week, it was a geeky developer event, but we can only see if it all prevails when it hits the stores for all consumers. The race is on. While Microsoft claims the phone to be dead soon (but does not show a consumer alternative just yet), Google sure could step up and push some more Tango devices out there to take the lead during summer. So, let’s enjoy the sunny days!


Apple Embraces AR & VR, and What That Means

Apple, Inc. has been slow to jump on the augmented reality (AR) and virtual reality (VR) bandwagons, letting their competitors test the water before diving in themselves. That changed today, as the company’s Worldwide Developer Conference (WWDC), San Jose, hosted several key announcements for the immersive technology industries.

First up came the reveal that SteamVR would soon support the Mac. This doesn't mean that every VR title on Steam will automatically work on macOS devices, but giving developers the option to create videogames and experiences for the Mac is certainly a step in the right direction. This was reinforced by the announcement of Metal 2, the company's API for high-performance graphics, which promises to deliver a VR-optimised display pipeline.

This new technology was then showcased by Epic Games, who used the VR Editor component of Unreal Engine 4 to demonstrate a real-time build using assets from Industrial Light and Magic (ILM)'s digital Star Wars library through the HTC Vive, using green-screen mixed reality (MR) video. That one sentence contained a huge number of high-profile names, and the showcase was as impressive as you would expect. Lauren Ridge, Technical Writer at Epic Games, resized and positioned TIE Fighters and other craft before playing out the scene she created. A choreographed sequence featuring the infamous Darth Vader drew huge applause from the audience.

Apple then revealed their VR-compatible systems before moving on to AR. The equally remarkable showcase – again utilising Unreal Engine 4 and big names from the film industry – included a LEGO model that could be exploded, Pokemon GO with a Pikachu looking as if it was in direct contact with the ground, and an impressively rendered wild-west cyberpunk battle scene. The ARKit development suite features stable motion-tracking, ambient lighting estimation and support for Unity and Unreal Engine, amongst other features.

While the company stopped short of revealing the AR head-mounted display (HMD) many had been expecting, the development tools presented certainly have applications beyond iPads and iPhones. A showcase it was; now it's up to developers to take the technology further.

Ultimately, Apple coming late to the game is nothing new. What it does mean however, is that AR and VR will get a publicity injection and brand new opportunities to grow its audience. This is perhaps more important than the hardware line-up supporting VR or the graphical prowess of the AR presentation: it’s about the experiences you can have with the technology opposed to the technology itself, after all.

ARKit Support Now Available for Unreal Engine 4

Apple's new AR endeavour, ARKit, a development suite designed to enhance augmented reality (AR) on iOS devices, was announced only hours ago at the Worldwide Developers Conference (WWDC) in San Jose. Following a live on-stage demo, Epic Games has now made early-access support for ARKit available in Unreal Engine 4.

ARKit uses the iOS device's camera for positional tracking, potentially bringing millions of new consumers into the world of AR. The showcase at WWDC today included Pokemon GO and a new experience from director Peter Jackson's Wingnut AR studio, powered by Unreal Engine 4 and featuring a highly detailed spaceship battle scene.

"High-quality AR experiences truly demand the power and performance Unreal Engine provides: Realistic lighting and shadowing that match the real world. High-end cinematic tools. Filmic post-processing. Physically-based rendering with advanced materials. A solid engine foundation that scales," states Tim Sweeney, CEO of Epic Games, on the official Unreal Engine blog.

Unreal Engine 4's early-access support for ARKit is now available to download via GitHub. The download includes the complete source code, ready to compile and run immediately.

Further to this, Epic Games has confirmed to VRFocus that binary support will arrive with the Unreal Engine 4.17 preview in July 2017, followed by launch in early August. VRFocus will continue to keep you updated with all the latest details on ARKit and Epic Games’ endeavours in AR.

Apple’s ARKit is Bringing Augmented Reality to “hundreds of millions of iPhones and iPads”

Apple's annual Worldwide Developer Conference (WWDC) is here, and today's keynote saw a number of VR-specific announcements, including Apple's first VR-ready computers to go along with the launch of the company's newest macOS, High Sierra. While the company is finally going 'VR-native' for desktop, Apple is also zeroing in on augmented reality for iOS 11 with the introduction of its newly revealed app developer kit, 'ARKit'.

Possibly taking a swipe at Facebook's latest AR demo at F8 in April, Senior VP of software engineering Craig Federighi said: "We've all seen a lot of carefully edited vision videos on this topic recently, but in this case, I'd like to show you something for real."

Starting up a test application that will be made available to developers, Federighi explains that with the iPhone's computer vision capabilities it's able to map surfaces and add digital objects—replete with interactive animations and dynamic lighting. Adding a steaming coffee cup, a lamp and a vase to a bare, marker-less table, the tracking proves to be relatively solid.
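A sketch of how such a placement typically works with ARKit and SceneKit: a screen point is hit-tested against the mapped surfaces, and the virtual node is moved to the resulting real-world transform (makeCoffeeCupNode is a hypothetical helper, not part of Apple's demo):

```swift
import ARKit
import SceneKit

func placeCup(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    // Ray-cast from the screen point onto any detected plane.
    guard let hit = sceneView.hitTest(screenPoint,
                                      types: .existingPlaneUsingExtent).first else { return }

    let cup = makeCoffeeCupNode()           // hypothetical helper returning an SCNNode
    cup.simdTransform = hit.worldTransform  // place it on the real table
    sceneView.scene.rootNode.addChildNode(cup)
}
```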

Federighi says that ARKit provides fast and stable motion-tracking, plane estimation with basic boundaries, ambient lighting estimation, scale estimation, support for Unity, Unreal, SceneKit and Xcode app templates—all available on "hundreds of millions of iPhones and iPads […] making ARKit overnight the largest AR platform in the world."

Apple says iOS 11 will be made available to iPhone 5s and later, all iPad Air and iPad Pro models, iPad 5th generation, iPad mini 2 and later, and iPod touch 6th generation. iOS 11 will be released this fall, likely in tandem with iPhone 8 and iPhone 7S smartphones. A public beta is coming in June.

Image courtesy Apple

Apple is working with third-parties such as IKEA, Lego, and Niantic to use ARKit, with Apple showing an improved Pokémon Go on stage that looks to actually utilize augmented reality to bring the game to life. Because ARKit uses computer vision that relies on the device’s onboard sensors and CPU/GPU, no external equipment is required to run these sorts of AR experiences.

The keynote also revealed a new AR-focused company from critically-acclaimed director and FX guru Peter Jackson called ‘Wingnut AR’. A special demo showed off the graphical and camera-based tracking capabilities of Apple’s hardware featuring a complex, real-time rendered scene digitally placed on a tabletop using an iPad. Wingnut AR is bringing an AR experience to the App Store later this year. Check out the video below to see Wingnut AR’s special Apple demo.


Lord of the Rings Director Peter Jackson’s Wingnut AR Studio Uses Apple ARKit

Apple's WWDC 2017 conference has been a fairly low-key event, without all the grandeur of Google's I/O show, but that doesn't mean it was any less interesting. While it has long been suspected that Apple is working on virtual reality (VR) and augmented reality (AR) technologies, today's news was somewhat unexpected – mainly because the company is so secretive. As part of the keynote, ARKit was unveiled, and to properly showcase the software, Wingnut AR was brought on stage to demonstrate what it had built.

Wingnut AR isn't some random indie studio specialising in AR tech; in fact, it was founded by acclaimed director Peter Jackson, the man behind The Lord of the Rings and The Hobbit. It wasn't Jackson who took to the stage, however, but Creative Director Alasdair Coull, who, along with another Wingnut AR member, demoed an unnamed project that came to life in real-time on the stage.


A scene rolled out on the table with animated characters, vehicles and buildings, all rendered using Epic Games' Unreal Engine 4 – which is seemingly proving quite popular with developers. The level of detail and the quality of the graphics are apparent straight away, with the entire environment looking as good as a proper videogame running on a console or PC.

Things then liven up with explosions and chaos galore; the characters start to run about, and one even flies off the edge of the table. All of this happens while the presenter with the iPad moves around, stepping further back to get a wider viewpoint and then moving right up close for a better look.

While the content wasn't interactive – merely viewable – it did demonstrate the possibilities that AR, Unreal Engine and ARKit could create. Coull finished by saying that an AR experience from Wingnut AR would be launching on the App Store later in the year.

For the latest VR and AR updates from Apple, keep reading VRFocus.