How Augmented Reality Will Put People Back to Work

Through a combination of robotics and artificial intelligence (AI), a large number of jobs may soon disappear. A recent McKinsey report estimates that automation could take over activities accounting for up to 45 percent of the work people are currently paid to do in the United States, representing roughly $2 trillion in annual wages. Autonomous trucking alone has the potential to automate away 3.5 million driver jobs.

Augmented reality (AR), however, may become the tool that delays this fate by evolving the role of workers in a post-automation world, creating opportunities for continued employment across a more diverse set of occupations. Where retraining at a trade or technical school would take too long, or would be economically unfeasible for someone whose work has been displaced, AR can enable people to perform unfamiliar and complex tasks right away. At scale, this could take a displaced workforce and immediately put people back to work performing jobs guided by AR.

A worker using a Microsoft HoloLens performs inspections on a car door

There are already AR applications that allow users wearing an AR headset such as the Microsoft HoloLens, or holding a tablet computer, to see technical information and instructions overlaid on the real world. This information can take the form of a virtual wrench or hand prompting the user to perform a physical action, such as loosening a bolt or removing a part. Through sophisticated computer vision software, these virtual objects appear directly on the corresponding object in the physical world.

With AR, virtual instructions appear on real-world objects

A worker in the future may put on their AR headset and, just like a ride-sharing driver, start receiving job requests to physically perform technical operations. For example, the user may be instructed to change the fluid in a car engine. A digital wrench would appear over the cap they need to loosen, an oil can would appear prompting them to fill the tank, and so on until the task is complete. In this scenario, the worker does not need to fully understand the technicalities of the operation they are performing. The computer provides the knowledge in real time and in their field of vision, so their hands are freed up to actually do the work.
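To make the idea concrete, here is a minimal sketch of how such a guided task could be represented as an ordered list of instruction steps, each tied to a real-world anchor object and a virtual overlay. The structure, field names, and example steps are illustrative assumptions, not any vendor's actual format.

```python
# Minimal sketch of how a guided AR task could be represented and stepped
# through. All names and steps here are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class InstructionStep:
    anchor: str   # real-world object the overlay attaches to (e.g. "oil_cap")
    overlay: str  # virtual cue rendered in the headset (e.g. "wrench")
    action: str   # plain-language instruction shown to the worker

OIL_CHANGE_TASK = [
    InstructionStep("drain_plug", "wrench", "Loosen the drain plug and drain the old oil"),
    InstructionStep("drain_plug", "wrench", "Re-tighten the drain plug"),
    InstructionStep("filler_cap", "oil_can", "Remove the cap and fill with new oil"),
]

def run_task(steps):
    """Advance through the steps one at a time, as a headset app might."""
    for i, step in enumerate(steps, start=1):
        print(f"Step {i}/{len(steps)}: show '{step.overlay}' on '{step.anchor}' - {step.action}")
        input("Press Enter when the step is complete...")

if __name__ == "__main__":
    run_task(OIL_CHANGE_TASK)
```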

 

Workers can use ScopeAR to learn about a task as they perform it.

Additionally, this just-in-time knowledge benefits workers by letting them jump between different fields, increasing their chances of employment. Perhaps one day they are repairing a car and the next they are working as a plumber. With this technology, someone can learn practical skills on the job while being paid to perform them, rather than spending time away from earning an income at a continuing-education school whose curriculum may have no practical application by the time they graduate.

Finally, AR has many benefits for experienced workers. It augments the abilities of veteran workers by helping them complete jobs more quickly or take on more complex tasks, potentially increasing their earnings per day. AR also enables experts to stay in the workforce longer. Organizations can leverage veterans and domain experts by connecting them with on-site personnel who act as their hands in the field. Applications already exist that make it easy for remote knowledge workers to draw and instruct within an on-site technician's field of vision.

 

Remote assistance lets senior and junior team members collaborate using AR

This technology already exists and is deployed in real-world scenarios today. Companies like Scope AR — which is working with Lockheed Martin, Toyota, and Procter & Gamble to provide augmented reality maintenance solutions for their workforces and customers — and Daqri — which is developing a combined hardware and software solution to simplify deployment for industrial firms — are focused on building platforms that make it easy for companies to convert the 2D paper schematics and manuals they already give their technicians into interactive 3D AR “instructables” that can guide an untrained user through a highly technical task.

Daqri is focused on using AR to perform inspections

With all of the technological advancements out there, you may be wondering why AR in particular is such a big deal and why it should arguably be receiving more attention than it currently is in terms of workforce applications. I'd say that the main shift is that knowledge is no longer a limiting factor in being able to perform a job.

Knowledge may now come from a pre-created instruction module, artificial intelligence, or another, more experienced colleague. As a result, the on-site worker becomes a way of applying that digital knowledge to the real world, acting as a physical pair of hands for digital instructions.

Without AR


With AR


AR knowledge also has the ability to adapt immediately whereas traditional training does not. For example, if there is a shift in the market from gasoline-based automobiles to electric vehicles, existing mechanics would be able to adapt their physical skillset to a new type of vehicle on the job. In traditional schooling, a new set of course materials and curriculum would need to be developed. Teachers would also need to be trained to properly educate students. AR sidesteps all of these issues and provides a worker with the latest information on a task.

While AR isn't a permanent solution to unemployment, it is one that could ease the transition for displaced workers. It's true that, in the long run, even these sorts of jobs may be automated by specialized robots. That level of automation, however, could be decades away or may never become economically viable. In the meantime, augmented reality can provide a way for the human worker to remain productive and of value to society.

 Amitt Mahajan is a Managing Partner at Presence Capital, an early-stage venture firm focused on investing in VR/AR companies. Thanks to Daniel Hu, Neil Mehta, Scott Montgomerie, Don Mosites, and Justin Waldron for reading drafts. This post originally appeared on Medium.


Lifeliqe Is Bringing HoloLens To The Classroom

Microsoft’s HoloLens and other mixed reality devices have enormous potential to inform and educate, arguably even more so than VR. Immersive education startup Lifeliqe is looking to capitalize on that potential.

You may have already heard of Lifeliqe; last year the company partnered with HTC to make educational VR experiences for the Vive headset. With HoloLens, though, the company is looking to move into the classroom. In fact, it has already run pilot lessons using the headset in classes at Renton Prep in Seattle, Washington, and Castro Valley Unified College in California. You can see a video of the students' impressions below.

Lifeliqe's HoloLens apps used interactive 3D models to provide a new kind of visual learning for students. They got to explore the human body, bringing up 3D models of organs, blood vessels and more. In a statement, Michelle Zimmerman, Director of Innovative Teaching and Learning Sciences, said it actually looked like students preferred using MR for education over VR, which the school had also been working with.

HoloLens isn’t the only headset that could one day take over the classroom; Google has been pushing VR into educational territory with its Expeditions initiative, which uses mobile-based headsets like Cardboard to take students on virtual field trips. We expect to see plenty more examples of VR, AR and MR in schools as the technology continues to grow in popularity, too.

Lifeliqe is designing HoloLens experiences for grade 6 – 12 classrooms. However, with the developer edition of the kit costing $3,000 and a full consumer version still likely years away, it’s probably going to be a long time before we see MR commonly used in schools across the globe. VR will be a good stepping stone in the meantime, and Microsoft has that angle covered with its upcoming Windows 10 headsets.


Rapid Prototyping Projects with CapitolaVR: Car Customization and Driving

Editor's Note: In this weekly column, David Robustelli will break down the latest rapid prototype he and his team at CapitolaVR have created for VR and/or AR. They are responsible for games like Duckpocalypse as well as prototype projects such as HoloLens Golf, Gear VR Mirroring, and Pokemon GO for HoloLens. Check back each weekend for new prototypes!

CapitolaVR works with a "rapid prototype" strategy: developers can invest 20% to 30% of their time in creating their own VR or AR ideas. Each week a team presents its work and, based on feedback, the prototype may be developed further.

With the latest prototype, the team's goal was to explore the possibilities of an augmented reality car customization system. We experimented a lot with the interaction and with how to present the car's info and features. Interaction was one of the biggest challenges, and we tried over 20 different menus before we had something we thought could actually work. The car model was also created with specific shaders, and we had to make sure the HoloLens could handle it visually, which wasn't always the case. With some downgrading we made it work without losing too much of the visual quality.

Another big challenge was driving the car; we tried multiple ways of controlling it. One option we explored was voice commands, telling the car to slow down, speed up, or take a left or right turn. Although it was highly entertaining, we felt the user lacked the precision needed to make the car go where they wanted it to. Another, less user-intensive driving mode we explored was making the car drive on its own, using pathfinding algorithms to avoid obstacles. That was fun to see but obviously lacked user interaction. In the end we settled on a marker system: the user can place multiple markers and the car will drive through them sequentially, as sketched below. This gives the user plenty of precision to control the car without having to micromanage it every second.
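The article doesn't detail how the prototype implements this, but the core of the marker approach is simple: keep an ordered list of user-placed markers and steer toward each one until it is reached. The following is a minimal, engine-agnostic sketch in Python with made-up 2D coordinates; the actual HoloLens prototype would do the equivalent with tracked positions inside its engine.

```python
# Illustrative sketch of the marker-driving idea: the car heads toward each
# placed marker in order and moves on once it gets close enough.
import math

def step_towards(position, target, speed):
    """Move `position` toward `target` by at most `speed` units."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (position[0] + dx / dist * speed, position[1] + dy / dist * speed)

def drive_through_markers(start, markers, speed=0.5, reach_radius=0.1):
    """Visit the markers sequentially, yielding the car position each step."""
    position = start
    for marker in markers:
        while math.hypot(marker[0] - position[0], marker[1] - position[1]) > reach_radius:
            position = step_towards(position, marker, speed)
            yield position

# Example: markers a user might have placed around the room (hypothetical values).
if __name__ == "__main__":
    for pos in drive_through_markers((0.0, 0.0), [(2.0, 0.0), (2.0, 3.0), (0.0, 3.0)]):
        print(f"car at ({pos[0]:.2f}, {pos[1]:.2f})")
```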


This is a guest contribution by David Robustelli, Head of Digital at CapitolaVR


Mixed Reality Easter Egg Hunt to Feature at VRLA

The VRLA Conference and Expo in Los Angeles will this year have a new feature as part of its virtual reality (VR) and augmented reality (AR) demos – a Mixed Reality Easter Egg hunt.

Organised by AR/VR development agency and content creators AfterNow and FLARB in cooperation with Microsoft HoloLens, the egg hunt will let visitors to VRLA look through a HoloLens headset and see virtual Easter eggs hidden throughout an indoor park space. The hunt is just one of hundreds of AR and VR demos featuring across the VRLA weekend.

VRLA is set to feature a full schedule of panels, talks and discussions. The creative team from Epic Games who worked on Robo Recall will be offering details of the development process behind the successful videogame. Rick & Morty co-creator Justin Roiland and Unity CEO John Riccitiello will be offering keynote speeches on the Saturday and Friday respectively.

A slate of exhibitors from industry names like Samsung, Oculus, Nvidia, HTC and Google will be in the exhibition hall, alongside a dedicated ‘Indie Zone’ where smaller development teams can show off their products. In addition, a ‘Girls Make VR’ workshop is being planned, where young women will be able to get to grips with the Unity 3D engine and have the opportunity to produce their own VR applications.

Tickets for VRLA are still available, with the business-focused 2-Day Pro Pass at $299 USD and a 1-Day Pass for Saturday priced at $40. The event will take place on 14th-15th April at the Los Angeles Convention Center.

VRFocus will continue to bring you the latest updates on VRLA and other industry events.

Me Vs. A Decade

Today’s VR vs. story isn’t really about virtual reality. It’s more a story about the writer, as today marks a very important day for me. Let’s begin 24 hours ago though.

It was Monday. My phone was ringing.
It was ringing and it was over on the other side of the flat.

Bugger.

Groaning, I drop the speaker I'm trying to repair with one hand and break away from the Twitter post I'm writing with the other to sprint across the flat. Dodging the overly long and overly patched-up internet cable, hurdling the two steps up to the, weirdly, slightly higher level that side of the flat sits at, before pouncing on the phone lying on my bed before it rings off. I knew who it was, of course, if they were still there on the other end. Or, more precisely, I knew what type of call it would be. Someone from Manchester, or Liverpool, or Dublin or Abergavenny – that was a recent one – who wanted to talk to me about either:

a) The amount of money I could claim from the car accident I had. Which I’m reasonably confident is £0.00 since I don’t drive.

b) Have I thought about pensions and life insurance? Answer: Yes, but do they think about me?

or c) Whether or not I had heard about Payment Protection Insurance (PPI) from mortgages or home-buying or something. How it had been mis-sold or misused and how I was due funds worth hundreds of pounds. Have I checked? To which the answer is I have never done anything financially that involved PPI. The last caller on that demanded to know how I would magically know this. They were told, forcefully, that I think I would remember such a transfer. Also, since I rent, the likelihood of any of this is rather on the low side.

I was surprised, as it was not actually any of these but a number I recognised from an employment agency. I picked up, and a somewhat more masculine voice than I expected wheezed, "Hello, it's Derek from Kitten Whisperers!" The names have been changed to protect the guilty. "I was wondering if we could have a chat." Turned out Derek was after a catch-up on things, since the CV they had from me was a bit out of date, and since you never know and it's always good to have such companies thinking of you, I agreed.

I 'hmm'-ed and we went through some run-of-the-mill questions. "Are you doing okay?" "Are you still living here?" "Are you still working for VRFocus?" Yes. "What do they do?" Well… Then Derek asked, "So, do you have much experience in Community Management?" And for a brief moment I was stumped. I mean, presumably he had my "kinda out of date" CV in front of him. What was he expecting? That I'd suddenly go 'well, actually, I made it all up' and fill him in with a completely different work history? 'No, in truth from 2008-2009 I was a matchstick-seller and part-time snowboarding clown, and from 2011-13 I lectured at Harvard in Esperanto.'

I pursed my lips together. “Actually it’s ten years on Tuesday.”

"Oh," he said, a bit bored. I slumped, because I was actually telling the truth. On the 28th of March 2007 I was bundled through into an office at SEGA Europe and quickly made to sign an NDA. It was all a bit hectic in the office and I wondered what was going on. I was then told that in about five minutes they were going to announce the fact that Mario and Sonic were going to be in a game together for the first time, and I was hurled into a chair and signed up to the official forum with full-on mod powers.

"Track what they say," said my new line-boss as the press release for what was Mario & Sonic At The Olympic Games rolled out to the press. "If they start getting worked up." He paused and pursed his lips together. "Well, we'll come to that." He shrugged and patted my shoulder.

Ten years ago…

After the call ended I thought for a while about those ten years. I'd accomplished quite a lot in that time, not that you'd know it. But the truth of it is most people don't know what I do, what any of us do. But that's my career. A ten-year stretch during which I had several years at SEGA setting up and managing their social media and working hard to rebuild community trust from the ground up. Which is mighty impressive considering I've never had a day's worth of proper training in any of it before then – or indeed, astoundingly, since. I co-created a world-record-owning international convention with that community. Wrote blogs every day. Was the first one in and the last one out, and did my damnedest to fix an impossible-to-fix situation (and took a lot of flak for caring enough to do so) before I left several years later with my head held high, despite being left exhausted in every sense of the word by the whole thing. Still, I'd left my mark.

Of course they then erased everything I ever wrote after I left because they were too lazy to keep the European branch’s blogs when they merged them. Which was nice of them.

Whilst I wasn't well known by name (I didn't exactly promote myself as a 'figure' during that time), for those in the know I had gained a reputation for hard work (to the point of exhaustion) and dedication, and became known for my ability to conjure up miracles from essentially nothing. A social media MacGyver, able to put together content plans with nothing but half a screenshot and a second-hand paperclip. I was hired in the short term at Square Enix essentially to rescue a project after the previous Community Manager (CM) disappeared straight after it was announced. I ended up writing a bunch of game lore and cobbling together the foundation of something that could be built on. From there, after some disappointment, I ended up in Belgium, where I led a tight-knit multinational team of newcomers to the role as we dealt with all manner of projects, instructing them as mentor and teacher.

I worked on multiple projects; I turned my hand to advertising campaigns, having never previously been given a dime except for the convention, and essentially doubled the revenue being made while halving the cost. In time one project was announced to be wound up and, again with nothing, I took over the reins to somehow bring a social game people had spent money on to a conclusion and salvage the situation for its creators. I became the de facto Producer and, with nothing in my resources and a product announced to be closing, I grew the English community by 50,000 in one and a half months, sent session numbers through the roof, and actually brought the game to a resolution which didn't involve people screaming for blood. They had their money's worth and they were happy. I still get messages asking if I can somehow bring it back.

After the Belgian firm turned heel on its own employees, I left, and my team joined me as soon as they were able. Unemployment was better than staying, at a time when there was a global recession going on. That says more than anything else I could say. But that team was good, very good. Two have gone on to work with big companies within the games industry and I'm beyond proud of them.

Life took me back to the UK and I ended up working here at VRFocus. Did you know I've been here over two years now? It doesn't feel like it, but I have. I'm still a CM, albeit "Community Manager & Writer" now; I do what I can, and that reputation I have is still very much in effect, although the person behind it is rather more tired and worn-looking than his 2007 equivalent.

True story: after Square I applied for a job at a major UK studio and during the interview was surprised to be asked if I wasn't too old to be a Community Manager. I was then told, dumbfounded, in a phone call that I wouldn't be progressing further, and one of the reasons given was "we think you're too old for the role". I also didn't have "the look we are going for", apparently. Which made no real sense. Apart from the fact that discriminating on the grounds of age (as well as, apparently, my face) is illegal, I was 28. They made me sound like Methuselah. They'd probably have a coronary to discover I'm still one at 34! (Before anyone asks, I was so shocked at what I was hearing that it took some considerable time before I'd really realised what had been said, and by then it was too late to suddenly go "hey, hang on a minute!".)

It all evolves. Much like VR – which we will come back to, I promise.

In fact this reminiscing is partly due to reading an excellent article on what the job entails by my opposite number (I… guess? Although she has a much better title than me – and she has a fancy electronic follower clock/counter that I desperately want to steal) from Upload VR, Elizabeth Scott, who got me thinking about what it is I do here and have done previously. If you're unsure what it is I do: I write this and Life in 360 and a number of other posts/features as required. I sort out most of the graphics and the moderation, and I'm the person you talk to on Twitter, or Facebook, or Reddit if you see VRFocus being chatty there. I sort the social media in general when I'm in. I work with various partners and the guest writers we have to produce content, and I work on the website itself – now with the new site's designer. I'm HR; I run the time sheets. I edit videos when required. I run events when we run them, but you'll probably never see me at a main one. I search for stories and allocate them to the writers, with whom I work on their stories as needed. I'm, as my author description says, the unofficial Deputy Editor.

I fix.

I’m basically a cross between an online janitor and a hatstand.

But the core of the job is that you help, and at present I'm more on social than anything else. It's kind of ironic that a guy who is heavy on the social anxiety made this his career. But hey, I never said I was smart. Ten years, four companies and a lot of projects have passed. The job has changed and evolved throughout those years, and at some point in the future it will change again – and it might be VR that changes it.

Community Management is part of that family of Customer Relations-type roles in business. It sort-of-kinda sits between everything. It's marketing, it's public relations, it's creative and design, it's finance and even legal (sometimes), and several of those are already being touched on and altered by other types of technology, the most obvious one being Artificial Intelligence (A.I.). In the same way, will there come a time when a CM's role will also be to respond to discussions on an article using such a system? Will a young wide-eyed fan be thrust into a virtual forum room to monitor reactions to Mario & Sonic At The Lunar 2028 Olympic Games, appearing as a cartoony avatar holding up the announcement trailer for you to then step into, all care of Oculus and Facebook's Rooms system? Perhaps they'll appear in your office or classroom as a virtual projection, displayed by Microsoft HoloLens, to discuss a news story.

Will my career be supplanted by something else, all travel and interaction made virtual? I'm not sure I'd like that, if I'm honest. But that's a question to be answered by the future – and the future is coming fast. For now I'll continue to evolve as best I can. Will I be doing the same role in 10 years? Who's to say.

Here’s to a decade.

 

HoloLens: MyLab Makes Chemistry Lessons Interactive with AR

The influence of AR and VR on teaching is a perennially interesting topic. The software and hardware keep evolving, creating new educational opportunities for schools and professions. Bringing AR and VR into the classroom is not only exciting for students, it also offers many advantages, for example in chemistry lessons, as the AR app MyLab shows.

An interactive periodic table for chemistry class

Lessons in the natural sciences can often turn boring and dry, as many people experienced during their own school days. Chemistry in particular teaches very theoretical content that cannot even be seen directly, such as the study of atoms. To make matters worse, this is usually done through outdated textbooks. Things only got exciting during the simple experiments that teachers occasionally demonstrated. Wouldn't it be great to have such an experiment for every topic, to better understand the interactions involved? That is exactly what Lucas Rizzotto, the developer of the app MyLab, thought. He built his app for Microsoft's augmented reality headset, the HoloLens.


It works like this: a hologram of an interactive periodic table appears in front of the user. From it, individual atoms can be selected, which are then displayed in the surrounding environment. The structures of the atoms are visualized and can be explored and compared. The app thus combines the concept of teaching from textbooks with the engaging interactions of experiments, through augmented reality. Another advantage is the freedom provided by the HoloLens's floating interface: when wearing lab clothing with gloves and handling chemicals, you are grateful for every free hand.

MyLab is available as a free download from the Microsoft Store. The idea is nothing new, but the execution here is quite well done. The app is also another step towards deeper integration of AR and VR into teaching. We are curious to see what the future brings for this area.

(Sources: roadtovr, Microsoft Store)

This post first appeared on VR·Nerds.

Adobe Plans to Integrate with HoloLens and Amazon Alexa

Adobe have unveiled new technology to integrate their advertising products with Amazon's digital assistant Alexa and Microsoft's HoloLens mixed reality (MR) headset.

The new technology uses Adobe Sensei, a machine learning platform, to facilitate the integration. Adobe suggest that with the new technology, someone wearing a HoloLens could stand in Times Square in New York and the iconic billboards around them would all display personalised adverts.

The HoloLens integration would also allow retail employees using the technology to see which products within a store are doing well, in order to better emphasise different products and improve store layout.

Adobe also want to personalise the experience for those using Amazon Alexa; for example, a user could ask Alexa for the number of air miles they have via Adobe's Experience Cloud. Alexa and the Adobe Experience Cloud could then combine this information to alert the user when relevant offers or promotions become available.

Adobe are experimenting heavily with virtual reality and augmented reality advertising and analytics. They are working on technology to introduce interactive adverts to the VR cinema viewing experience, which they displayed at the Mobile World Congress in Barcelona in February.

You can watch a video going into detail on how the integration will work below.

VRFocus will continue to bring you news on Adobe’s VR projects.

Theorem Solutions Preview AR/VR App

British engineering solutions firm Theorem Solutions will be displaying a new hybrid VR/AR/MR app at the Develop3D Live event on the 28th March 2017.

The new application uses the Unity 3D graphics engine to produce graphical representations of data, such as CAD drawings, on a variety of commercial virtual reality (VR), augmented reality (AR) and mixed reality (MR) devices. It also allows information from databases to be integrated into the VR/AR representation.

Theorem Solutions' new application allows the same data to be shared across multiple devices, and the company say it is compatible with all current low-cost VR/AR/MR devices. Theorem say the application is lightweight and easy to install. The data only needs to be prepared once and can then be delivered to multiple platforms from Theorem's servers. The application is currently compatible with the following:

Augmented Reality – Windows 10 and Android Tablets and Phones
Mixed Reality – Microsoft HoloLens
Virtual Reality – HTC Vive

The company say support for the Oculus Rift will be added in April.

Theorem Solutions will be demonstrating the technology at the Develop 3D Live event, which will take place at the Warwick Arts Centre, Warwick University on 28th March 2017 on Stand VR3.
You can find out more about Develop 3D Live at the official website.

VRFocus will bring you further updates on Theorem Solutions’ VR app and related products as they come in.

HoloLens Sees Use in Norway Classrooms

The Pointmedia company in Norway are experimenting with new ways to teach children, including using mixed reality (MR) to help them learn more about the solar system.

Jo Jørgen Stordal, an MR producer for Pointmedia, is the major driving force behind the project, working in cooperation with Stig Halvorsen; together they have written a report containing guidelines on how to use MR to engage children with learning in a new way.


In this experimental science lesson, children were given the opportunity to see the solar system in a new way. The students had been studying space and the solar system for some weeks prior to the introduction of the HoloLens technology into the classroom. First, the children listed what they already knew about the solar system, which was written on a whiteboard. Then, some children were given the opportunity to use the HoloLens glasses, while the rest watched the AR projection on an interactive whiteboard.

Teacher Stig Halvorsen believed that his students were more involved with the lesson, and that it facilitated greater curiosity and active student participation.


You can watch a video (with English subtitles) on the Mixed Reality experiment below.

Microsoft's HoloLens has so far seen a variety of uses, from designing new operating rooms in hospitals, to showing off a Red Bull Racing Formula One car, to displaying how a connected city would work. Virtual reality and AR/MR are seeing increasing uptake in education, as seen with UC Berkeley's new VR lab.

VRFocus will continue to bring you news on MR/AR/VR use in education.
