You Can Now Photoshop the World in Real-time With AR on an iPhone

WarpAR is a free iOS app from developer Matt Bierner which lets you use Photoshop-like liquify tools to dynamically modify the world through AR. More than just a cool tech demo, it also helps us imagine what the future might be like when our physical reality becomes increasingly subject to digital whims.

WarpAR brings Photoshop-like liquify tools to augmented reality. The free app works on all iOS devices which support ARKit. On iOS devices with LiDAR, a bonus feature allows users to reach out with their hand to distort reality directly.

The app supports six different tools which will be familiar to Photoshop users:

Push – Move around parts of the world’s textures by tapping and dragging.
Restore – Move the texture back to its original, undistorted state.
Bloat – Expand the texture outwards from the center.
Pucker – Collapse the texture inwards from the center.
Swirl Left – Swirl the texture to the left (counterclockwise) around the center.
Swirl Right – Swirl the texture to the right (clockwise) around the center.
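The math behind liquify-style tools is approachable: each brush can be modeled as a displacement field applied to texture coordinates within a brush radius. Below is a minimal Python sketch of the ‘Bloat’ and ‘Swirl’ ideas; the function names and falloff curve are illustrative assumptions, not WarpAR’s actual implementation.

```python
import math

# Illustrative sketch only: these names and falloff curves are assumptions,
# not WarpAR's actual code. Each tool is a displacement field applied to
# texture coordinates within a brush radius.

def falloff(dist, radius):
    """Smooth weight: 1.0 at the brush center, 0.0 at (and beyond) the edge."""
    if dist >= radius:
        return 0.0
    t = dist / radius
    return (1.0 - t * t) ** 2

def bloat(point, center, radius, strength):
    """Push a texture coordinate outward from the brush center."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return point
    scale = 1.0 + falloff(dist, radius) * strength
    return (center[0] + dx * scale, center[1] + dy * scale)

def swirl(point, center, radius, angle):
    """Rotate a texture coordinate around the brush center;
    a positive angle is counterclockwise (Swirl Left)."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    theta = angle * falloff(math.hypot(dx, dy), radius)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return (center[0] + dx * cos_t - dy * sin_t,
            center[1] + dx * sin_t + dy * cos_t)
```

Push and Pucker follow the same pattern with different displacement directions, while Restore simply blends coordinates back toward their undistorted values.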

Once you’re done modifying the world, you can easily take photos and videos of the effect directly in the application for sharing.

Developer Matt Bierner has been experimenting with lots of reality-bending AR apps. A prior project, watAR, adds frighteningly convincing AR waves to the world:

And then there’s In The Walls, which projects the user’s face onto surfaces in a way that may or may not induce nightmares:

While largely playful tech demos, Bierner’s work is also a potent springboard to imagine what this sort of reality-manipulating capability might look like in the future.

Though these apps run in a handheld AR mode today, which offers only a small view into the modified reality, the future will bring always-on head-mounted AR devices with much larger fields of view that will make the experience more natural. While many AR applications focus on placing digital objects into the real world, apps like these show that modifying the world itself may be an equally compelling use of augmented reality.

Practical future use-cases for this sort of AR could include anything from whiting-out parts of the skyline you don’t like, to digitally sculpting the world around you, to turning on digital shades in your home when the sun is shining in too brightly. On the downside, some uses of these techniques could involve ‘erasing’ parts of the world users don’t want to confront, like rundown streets or even homeless individuals, which could deflect much-needed attention away from disadvantaged communities. As ever, technology itself is rarely good or bad; it’s what we choose to do with it that will determine whether it is a net positive or negative for humanity.

The post You Can Now Photoshop the World in Real-time With AR on an iPhone appeared first on Road to VR.

Millions of Sketchfab Models Are Now Available in Apple’s AR Format

During Apple’s WWDC 2018 event, where iOS 12 made its official debut, the company also unveiled USDZ, a new file format built specifically for augmented reality (AR) and ARKit. Today, 3D model library Sketchfab has announced it fully supports the format, unlocking millions of models in the process.

At launch, 400,000 USDZ files are instantly available for download, spanning both free Creative Commons-licensed models and royalty-free models. Creators can also use Sketchfab to convert most 3D formats to USDZ, giving ARKit developers access to a massive range of 3D models to use within their applications.

One of the biggest benefits of adding USDZ support comes in the form of AR Quick Look. This feature allows iOS users viewing models on Sketchfab to quickly view them in AR. All they need to do is hit the download button on any downloadable model while logged into Sketchfab and select the USDZ option.

The platform is also working to bring the same functionality to the Sketchfab viewer in the near future.


“Our goal at Sketchfab has always been to make 3D content easily shareable and discoverable as widely as possible, and Apple’s AR platform – enabled thanks to USDZ – has become a key part of the ecosystem we play in. We are excited to offer a great new way to leverage the massive Sketchfab library,” said Alban Denoyel, Co-founder and CEO of Sketchfab in a statement.

There have been plenty of advancements in the AR field, both for consumers as well as enterprise. Companies like Dr. Seuss Enterprises have created educational apps for mobile devices while BBC Studios and Preloaded built BBC Earth – Micro Kingdoms: Senses for Magic Leap 1. For those wishing to create their own AR content, Psychic VR Lab’s STYLY platform will be adding that functionality later this year.

As further progress is made within the AR industry, VRFocus will keep you updated.

New iPad Pro Gets LiDAR Scanner for Improved AR

LiDAR, the light detection and ranging technology, is usually found in commercial and industrial equipment for things like 3D scanning buildings, land, and other objects. Now, Apple announced that its latest iPad Pro includes LiDAR hardware, making it what the company calls “the world’s best device for augmented reality.”

LiDAR creates a depth map by measuring how long it takes light to reach an object and reflect back. To that end, Apple’s new custom-designed LiDAR scanner is said to operate at “nano-second speeds” and work up to five meters (~16.4 ft) from an object, something the company says it can do both indoors and out. This essentially gives the iPad a higher-fidelity way of mapping a room for more accurate AR scenarios.
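The ranging math itself is straightforward time-of-flight arithmetic: a light pulse’s round-trip time, multiplied by the speed of light and divided by two, gives the distance. A back-of-the-envelope sketch in Python (illustrative only, not Apple’s implementation):

```python
# Back-of-the-envelope time-of-flight ranging, not Apple's implementation:
# distance = speed of light * round-trip time / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(seconds):
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT * seconds / 2.0

def round_trip_time(distance_m):
    """Round-trip time, in seconds, for a surface distance_m away."""
    return 2.0 * distance_m / SPEED_OF_LIGHT
```

At the quoted five-meter maximum range, the round trip takes roughly 33 nanoseconds, which is why the scanner has to operate at “nano-second speeds.”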

Image courtesy Apple

To do this, Apple says its new depth frameworks in iPadOS combine depth points measured by the LiDAR scanner, data from both cameras and the motion sensors, and on-device computer vision algorithms running on its A12Z Bionic chip.

The company has also integrated the new scanner to plug into its existing ARKit framework, giving all of the platform’s AR apps improved motion capture and people occlusion, Apple says. This close integration likely points to other Apple devices getting LiDAR for improved AR in the future, with the most likely suspect being the next flagship iPhone.


“Using the latest update to ARKit with a new Scene Geometry API, developers can harness the power of the new LiDAR Scanner to unleash scenarios never before possible,” the company says. “The LiDAR Scanner improves the Measure app, making it faster and easier to automatically calculate someone’s height, while helpful vertical and edge guides automatically appear to let users more quickly and accurately measure objects. The Measure app also now comes with Ruler View for more granular measurements and allows users to save a list of all measurements, complete with screenshots for future use.”

Priced at $800 for the 11-inch version and $1,000 for the 12.9-inch, the new iPad Pro’s LiDAR scanner comes alongside a list of other “pro” features, including new cameras, motion sensors, “pro” performance & audio, and a Liquid Retina display. Can’t forget that new Magic Keyboard, which Apple hopes will entice more users to finally make the switch from PC laptops to iPads. Check out the new iPad Pro here.

The post New iPad Pro Gets LiDAR Scanner for Improved AR appeared first on Road to VR.

New iPad Pro Adds LiDAR And ‘Instant’ AR Placement

Apple unveiled a new line of iPad Pros which include a LiDAR scanner and “new depth frameworks” to combine depth information from all the device’s sensors and cameras “for a more detailed understanding of a scene.”

The new iPads start at $800 and include the LiDAR scanner and two wide angle cameras, the wider of which offers a 125-degree field of view.

According to Apple, “Every existing ARKit app automatically gets instant AR placement, improved motion capture and people occlusion. Using the latest update to ARKit with a new Scene Geometry API, developers can harness the power of the new LiDAR Scanner to unleash scenarios never before possible.”

Mixed reality startup LIV recently released to testers a version of its camera app for iOS. On devices with an A12 or newer processor, the app automatically recognizes the background of the scene, which can be used to composite a player wearing a VR headset with content from their virtual world without the need for a green screen. While the new iPad Pro features an A12Z Bionic chip, it is currently unknown whether it will work with an app like LIV.

Still, the new iPad Pro looks like it could be extremely useful for VR and AR. In 2018, Facebook showed an incredible tech demo at its OC5 developer conference featuring six people playing Dead and Buried on Oculus Quest at “arena” scale, while a tablet peered into the scene in real-time. Check it out here:

Facebook’s Oculus app already supports casting the view from an Oculus Quest to an iOS device. If Facebook could take advantage of the new depth information provided by this latest iPad, one day it might be possible to simply point the device at your friend wearing an Oculus Quest and peer into their virtual world.

Of course, Facebook has made no announcements about support for this kind of capture directly on an Apple device. We’ll provide updates as we hear whether developers are able to take advantage of the 3D-sensing capabilities of the new iPad Pro.

The post New iPad Pro Adds LiDAR And ‘Instant’ AR Placement appeared first on UploadVR.

Apple CEO Tim Cook Expects AR: ‘Will Pervade our Entire Lives’

When it comes to his preference between virtual reality (VR) and augmented reality (AR) technologies, Apple CEO Tim Cook has made his stance continually clear: AR is most certainly the future. Cook was in Dublin, Ireland this week to receive a Special Recognition Award for the company’s contributions to the country – 6,000 people work at its Cork office – and discussed the future of tech and AR’s role in it.

During a session chaired by IDA Ireland CEO Martin Shanahan, Cook was asked about his expectations for the next five to ten years: “I’m excited about AR. My view is it’s the next big thing, and it will pervade our entire lives,” reports Silicon Republic.

While Apple has yet to enter the AR headset market to compete against Microsoft HoloLens 2 or Magic Leap 1, the company has made great software inroads thanks to the launch of ARKit back in 2017 – now in its third iteration. This has helped developers create a wide range of apps and videogames for iOS devices. Besides entertainment, Cook sees plenty of useful applications of AR for home users: “You may be under the car changing the oil, and you’re not sure exactly how to do it. You can use AR,” he mentions.

He also seems to make a subtle nod to his dislike of VR and why AR is his tech of choice: “I think it’s something that doesn’t isolate people. We can use it to enhance our discussion, not substitute it for human connection, which I’ve always deeply worried about in some of the other technologies.”


During his visit to Ireland, Cook managed to pop into Dublin-based developer War Ducks, the team behind VR titles like Sneaky Bears and RollerCoaster Legends II: Thor’s Hammer. Last year the company announced a $3.8 million USD investment going towards a location-based AR experience.

“Yesterday, I visited a development company called War Ducks … in Dublin – 15 people and they’re staffing up and using AR for games,” Cook mentioned. “You can imagine, for games it’s incredible but even for our discussion here. You and I might be talking about an article and using AR we can pull it up, and can both be looking at the same thing at the same time.”

As Apple continues to expand its AR development, VRFocus will keep you updated.

Google ARCore Update Brings More Robust Cloud Anchors for Improved Multiuser AR

ARCore, Google’s developer platform for building augmented reality experiences, is getting an update today that aims to make shared AR experiences quicker and more reliable. Google is also rolling out iOS support for Augmented Faces, the company’s 3D face filter API.

Introduced last year, Google’s Cloud Anchors API essentially lets developers create shared, cross-platform AR experiences for Android and iOS, hosting the so-called anchors through Google’s cloud services. Users can add virtual objects to a scene and share them with others, so everyone can view and interact with them simultaneously.

In today’s update, Google says it’s made improvements to the Cloud Anchors API that make hosting and resolving anchors more efficient and robust, something the company says is due to improved anchor creation and visual processing in the cloud.


Google AR team product manager Christina Tong says in a blog post that developers will now have access to more angles across larger areas in the scene, making for what she calls a “more robust 3D feature map.”

This, Tong explains, will allow for multiple anchors in the scene to be resolved simultaneously, which she says reduces the app’s startup time.

Tong says that once a map is created from your physical surroundings, the visual data used to create the map is deleted, leaving only anchor IDs to be shared with other devices.
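That host-and-resolve flow can be pictured with a toy model: the host device uploads visual data, the service returns an anchor ID, and other devices resolve that ID back into a pose. Everything below is a hypothetical Python sketch; the real API lives in ARCore’s session classes, not in these invented names.

```python
# Toy model of Cloud Anchors hosting/resolving. All names here are
# hypothetical illustrations, not the real ARCore API.

class ToyAnchorCloud:
    def __init__(self):
        self._anchors = {}
        self._next_id = 0

    def host(self, visual_data, pose):
        """Process visual data into a stored pose; return a shareable ID.

        The visual_data argument is deliberately discarded, mirroring
        Google's statement that the imagery used to build the map is
        deleted, leaving only the anchor ID and pose.
        """
        anchor_id = f"anchor-{self._next_id}"
        self._next_id += 1
        self._anchors[anchor_id] = pose
        return anchor_id

    def resolve(self, anchor_id):
        """A second device turns a shared ID back into a pose (or None)."""
        return self._anchors.get(anchor_id)
```

The point of the model is the privacy-relevant asymmetry: only the opaque anchor ID ever travels between users, never the camera imagery that produced it.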


In the future, Google is also looking to further develop Persistent Cloud Anchors, which would allow users to map and anchor content over both a larger area and an extended period of time, something Tong calls a “save button” for AR.

This prospective ‘AR save button’ would, according to Tong, be an important method of bridging the digital and physical worlds, as users may one day be able to leave anchors anywhere they need to, attaching things like notes, video links, and 3D objects.

Apps like Mark AR, a graffiti-art app developed by Sybo and iDreamSky, already use Persistent Cloud Anchors to link user-made creations to real-world locations.

If you’re a developer, check out Google’s guide to creating Cloud Anchor-enabled apps here.

The post Google ARCore Update Brings More Robust Cloud Anchors for Improved Multiuser AR appeared first on Road to VR.

Google Maps’ ‘Live View’ AR Feature Available in Beta, Makes Getting Lost Harder

Google may seem to be losing interest in its virtual reality (VR) ventures such as Daydream View, but on the augmented reality (AR) side the company is still pressing forward with gusto. Having released an AR feature for Google Maps to Google Maps Local Guides and Google Pixel users earlier this year, the company has today begun a wider rollout of Live View.


Currently still in beta, the feature is rolling out to compatible iOS and Android devices which support ARKit and ARCore respectively. While the rollout begins today, you may not see the feature just yet, as it may take Google several days or weeks to reach your region.

The whole purpose of the AR option is to make navigation with the highly popular Google Maps even easier and more straightforward. All you need to do is tap on a location in the app, hit the Directions button, then select the Walking option. After that, you should find the option to ‘Live View’ those directions towards the bottom of the screen. With Live View enabled, you’ll see some gigantic, handy arrows appear in the middle of the street (or wherever you are) telling you the right direction to head.

Obviously, this sort of feature isn’t supposed to make you continually hold your phone up and look like a lost kitten. You can simply bring it up when required to check whether you’ve gone the wrong way – or are going the right way. It’s just one of a number of updates Google has added to the app, including being able to see all of your flight and hotel reservations in one place, or finding a nice restaurant and booking a table without leaving the app.


While AR might be seen as the little brother to VR, it’s often thought of as having the greatest potential in the long run. Apart from apps like Google Maps, a lot of the AR content consumers are coming across at the moment are videogames such as Harry Potter: Wizards Unite and Minecraft Earth. VRFocus will continue its coverage of AR, reporting back with the latest updates.

Analysis: Did Apple Really Cancel Its AR Headset?

Taiwanese news outlet DigiTimes recently reported that Apple canceled its long rumored AR hardware project.

The report cited issues making the device light enough as well as its high production cost. But is this really true? And if so, what exactly did Apple cancel?

What was Apple working on?

In late 2017, Bloomberg reported the company was working on AR glasses for release as early as 2020. The outlet provided few details on the product beyond its project name, T288.

The following year, Apple acquired Akonia Holographics. Akonia was working on novel holography-based optics for AR. It called the approach ‘HoloMirror’ and claimed it offered a “dramatically higher” field of view at lower production cost.

Last year, CNET reported T288 would feature dual 8K displays and be capable of VR too. Presumably this would have been achieved with video passthrough. The report claimed the headset would be wireless, powered by an external “box” with a 5-nanometer processor.

Avi Bar-Zeev

In 2016 Microsoft launched HoloLens, the first true 6DoF AR headset available for purchase. Avi Bar-Zeev was a principal architect early in the project, with contributions including “assembling the very first AR prototypes, demos and UX concepts”, before leaving Microsoft in 2012 for Amazon.

Bar-Zeev was also a co-founder of Keyhole, which became Google Earth. In the ’90s he worked on virtual reality for DisneyQuest, including Aladdin’s Magic Carpet Ride.

In 2016, Bar-Zeev moved to Apple, likely to work on an AR hardware project. A LinkedIn page in his name states he led “the Experience Prototyping (XP) team for a new effort”.

Earlier this year, however, he left Apple. No reason was given. We reached out and Bar-Zeev declined to comment on his work at Apple, except to say “the purpose of my team’s work was to inform leadership about what would be great user experiences, and sometimes show which things to avoid.”

More Than One Project?

CNET’s report of an 8K headset with both VR and AR capabilities does not fit with the idea of a lightweight pair of AR glasses.


The design goals for lightweight glasses would be radically different from those for a mixed reality headset with both VR and AR capabilities. T288, as described by CNET’s report, would likely have been a very expensive and heavy device. Oculus CTO John Carmack recently said that making a VR headset “with every hardware feature anyone asked for” would be “heavy and very expensive, so consumers wouldn’t buy it”.

The Case For Glasses

ARKit is already built into every iPhone released since 2015 and every iPad since 2017 – an estimated 500 million devices in total. The platform has already attracted hundreds of AR developers, including big names like IKEA, Edmunds and LEGO. The SDK is even used in Pokémon Go to more realistically place the digital creatures in a real-world scene.


Since Apple has full control over the iOS hardware and software, the company could use the iPhone to power such glasses, as described in a recent extensive patent application.

If Apple does use the iPhone to power AR glasses, adding support for the hardware to an existing ARKit app could potentially be trivial – or even automatic. Some have questioned the intensity of Apple’s investment in ARKit and how heavily it is pushed to developers; glasses could be the reason.

Summary

While we’ve seen no indication Apple will be competing with Facebook in the VR space anytime soon, the two companies will likely be fiercely competitive when it comes to AR glasses. If Google also enters the ring and Microsoft changes its enterprise-first strategy, the 2020s could see the four giants battle for control of the post-smartphone era.

While Apple cancels projects all the time — with T288 potentially being one of them — it is possible (probable even) that other AR and VR projects live on at Apple parallel to this one.

The post Analysis: Did Apple Really Cancel Its AR Headset? appeared first on UploadVR.

Minecraft Earth Beta Coming To iOS This Month, Android ‘Soon’ After

Minecraft Earth, an all-new mobile AR version of the global sensation sandbox building and survival game, is coming to iOS later this month and Android “soon thereafter” according to the official blog.

We first heard about Minecraft Earth roughly two months ago, in mid-May, when developer Mojang released an announcement trailer; last month it debuted actual gameplay for the first time. Now there’s a brand new beta announcement video that goes over more details regarding the game’s mechanics and gameplay:

Basically, you’re presented with an overworld map that looks strikingly similar to Pokémon Go, complete with a Minecraft-style block avatar and skins. It uses the actual world map to create the environment. You walk around and tap on items like animals and blocks to collect them; naturally, they’re called “tappables” in Minecraft Earth.

Collecting enough tappables levels you up, and once you have enough resources you can build things that are placed into the real world through your phone screen. It’s described as a “living, breathing” Minecraft world, with seamlessly integrated multiplayer where people can help “or hinder” your creations. You can then scale creations to life-size to explore and see in the world around you.


I haven’t tried it yet, but honestly, it looks impressive.

The limited iOS beta for Minecraft Earth is due out within the next two weeks, which means sometime before July 26th. Only a “limited number of players in a few select cities” will be chosen before a wider release this summer.

You can sign up for a chance to be selected in the closed beta right here and check out the main website for more details as they’re available.

h/t: Engadget

The post Minecraft Earth Beta Coming To iOS This Month, Android ‘Soon’ After appeared first on UploadVR.

First Minecraft Earth Gameplay Revealed, Uses ARKit 3 Body Occlusion

At Apple’s WWDC 2019 this week Microsoft showed off gameplay of Minecraft Earth for the first time:

Minecraft Earth is a smartphone augmented reality game. It’s loosely based on Minecraft, but it’s definitely not the same game. You make your own creations with blocks, much like creative mode, then place them in the real world with AR.

Through Microsoft’s Azure Spatial Anchors system, everyone else will see your creation in the same real-world position in AR.

ARKit 3

Being at WWDC, Microsoft showed off how Minecraft Earth will be enhanced on iOS compared to Android thanks to ARKit 3, which introduces people occlusion. This allows you to walk in front of virtual blocks, or even over blocks placed on the ground.
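Conceptually, people occlusion is a per-pixel depth test: virtual content is drawn only where it is nearer to the camera than any detected person. The Python sketch below is an assumed simplification of that logic, not Apple’s actual ARKit code.

```python
# Assumed simplification of people occlusion, not Apple's ARKit source:
# per pixel, show the camera feed whenever a detected person is closer
# to the camera than the virtual content rendered at that pixel.

def composite(camera_px, virtual_px, person_depth, virtual_depth):
    """Pick which pixel to display.

    person_depth: estimated meters to a person at this pixel (None = no person)
    virtual_depth: meters to the rendered virtual content (None = nothing drawn)
    """
    if virtual_px is None or virtual_depth is None:
        return camera_px              # nothing virtual at this pixel
    if person_depth is not None and person_depth < virtual_depth:
        return camera_px              # person occludes the virtual block
    return virtual_px                 # block drawn over the camera feed
```

Running this test over every pixel is what lets a player appear to step in front of a block instead of the block always being painted on top of them.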

Microsoft describes this feature as “only on iOS”, but it will be interesting to see whether Google’s ARCore adds a similar feature between now and the game’s eventual release.

HoloLens?

This same cloud anchors technology was shown off for HoloLens 2 earlier this year; one demo even had a HoloLens 2 user and an iPad user sharing the same AR session. While Microsoft hasn’t said anything about Minecraft Earth coming to HoloLens, the technology to make it possible seems to be there if the company wanted to.

Minecraft VR

While Earth brings the franchise into AR, many fans have been wondering about the VR version of the original game. While it was released on Rift and Gear VR back in 2016, it hasn’t seen a port to the Oculus Go or the recently released Oculus Quest.

Oculus CTO John Carmack has been trying to get the game onto Oculus standalone headsets for over a year now, but still hasn’t been successful.


The post First Minecraft Earth Gameplay Revealed, Uses ARKit 3 Body Occlusion appeared first on UploadVR.