Augmenteum Reveals Thorassist: An AR Education And Training Tool For Learning Pulmonary Anatomy

One of the topics that always generates a lot of interest on VRFocus, and for me personally, is any development that involves virtual reality (VR) – or any form of immersive technology for that matter – in the field of healthcare or medical technology (medtech).

In fact, now that the technology has been readily available for a good couple of years, we’re seeing an uptick in the number of stories and updates specifically related to matters of health, as studies and other experiments begun months ago now come to fruition.

The latest development comes out of the US and actually relates to augmented reality (AR). A presentation is being made today at the American College of Chest Physicians’ (CHEST) annual meeting that will feature Thorassist – a multi-user AR education and training experience created by the company Augmenteum. Thorassist is a learning tool that will assist in teaching the anatomy of the lungs, using AR to display a digital image reconstructed from real-life imagery. It will also allow medical students to visualise the bronchoscopic procedures a physician would perform.

“Thorassist is an exciting new tool to help educate doctors in the complex field of interventional pulmonology,” explains creator Carla Lamb, M.D. “Knowledge of the anatomy and the experience of procedures is challenging for new specialists to understand and retain. Thorassist provides an environment in which I can help students understand the anatomy and learn procedures using detailed models, at much lower cost than other technologies such as virtual reality.”


“Classroom [the template utilised by Augmenteum, in which users use iPads to see and interact with digital 3D models] begins to realize our vision of delivering AR experiences for everyone, everywhere,” explains David Palacios, Augmenteum’s Founder and Chief Technical Officer. “It shows the tremendous potential of an AR experience shared by many simultaneous users. We will create additional shared multi-user experiences that bring AR to many different applications, from the workplace to home.”

“Thorassist is an outstanding implementation of our Classroom experience,” adds CEO Andrew O’Brien. “It utilizes AR to visualize 3D models to improve end users’ understanding of very complex content and procedures. It leverages Augmenteum’s ability to deliver shared AR experiences for many simultaneous users, as well as provide a ‘take-home’ experience for individuals to use on their own.”

You can find out more about Augmenteum’s work on their website. VRFocus will bring you more news about developments in VR, AR and beyond throughout the week.

Google Develop AR Microscope That Can Detect Cancer

Immersive technologies have already been put to a variety of uses in healthcare to improve patient outcomes. A team of researchers from Google is taking things further with the reveal of a prototype Augmented Reality Microscope (ARM) which can help detect cancer in real time.

The research around the prototype AR microscope was unveiled at the meeting of the American Association for Cancer Research (AACR) in Chicago, Illinois, where Google described the prototype platform as using AR and deep learning tools that could assist pathologists all over the world.

The platform consists of a modified light microscope that enables real-time image analysis and presents the results directly in the user’s field of view. The device can be retrofitted into existing light microscopes using low-cost components, without the need for whole-slide digital versions of the analysed tissue.

“In principle, the ARM can provide a wide variety of visual feedback, including text, arrows, contours, heatmaps or animations, and is capable of running many types of machine learning algorithms aimed at solving different problems such as object detection, quantification or classification,” wrote Martin Stumpe (Technical Lead) and Craig Mermel (Product Manager) of the Google Brain Team.
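Google has not released the ARM’s code alongside its blog post, but the general loop it describes – grab the current microscope frame, run a detection model over it, and draw the result back into the viewer’s field of view – can be sketched in a few lines. The following is a minimal, hypothetical Python sketch using OpenCV; the camera source, the stand-in classifier and the heatmap styling are illustrative assumptions for this article, not part of Google’s system.

# Minimal sketch (not Google's code): overlay a tumour-probability heatmap
# from a patch classifier onto a live microscope-style camera frame.
import cv2
import numpy as np

def heatmap_overlay(frame, prob_map, alpha=0.4):
    # Blend a per-pixel probability map (values 0-1) onto the BGR frame.
    heat = cv2.applyColorMap((prob_map * 255).astype(np.uint8), cv2.COLORMAP_JET)
    heat = cv2.resize(heat, (frame.shape[1], frame.shape[0]))
    return cv2.addWeighted(frame, 1 - alpha, heat, alpha, 0)

def placeholder_classifier(frame):
    # Stand-in for a real CNN: returns a dummy probability map.
    # A real system would run a trained detector (e.g. for metastases)
    # over image patches at this point.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    return cv2.GaussianBlur(gray, (31, 31), 0)

cap = cv2.VideoCapture(0)                      # assumed camera feed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    prob_map = placeholder_classifier(frame)   # inference step
    cv2.imshow("ARM sketch", heatmap_overlay(frame, prob_map))
    if cv2.waitKey(1) & 0xFF == ord("q"):      # press q to quit
        break
cap.release()
cv2.destroyAllWindows()

In the real device, the placeholder classifier would be replaced by a trained model such as the detectors described below, and the overlay would be projected into the eyepiece rather than shown on a monitor.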

Google has tested the AR microscope to run two different cancer detection algorithms – one for breast cancer metastases in lymph node specimens, and another for prostate cancer in prostatectomy specimens. The results were said to be impressive, though Google said that further studies and assessments needed to be conducted.

“At Google, we have also published results showing that a convolutional neural network is able to detect breast cancer metastases in lymph nodes at a level of accuracy comparable to a trained pathologist,” the Google team said in its blog post. “We believe that the ARM has potential for a large impact on global health, particularly for the diagnosis of infectious diseases, including tuberculosis and malaria, in the developing countries.”

It’s just the latest in a number of developments using immersive technologies in the medtech space, and further information can be found on the Google Research blog. For continued coverage of immersive technology use in healthcare, keep watching VRFocus.

First VR Radiology Training Video Unveiled At SIR 2018

The Society of Interventional Radiology (SIR) debuted the first-ever virtual reality (VR) 360-degree training video for interventional radiology in practice at its recent Annual Scientific Meeting in Los Angeles.


The hour-long, multisegment film was a special project of SIR’s peer-reviewed journal, the Journal of Vascular and Interventional Radiology (JVIR), and its pioneering editor-in-chief, Ziv J Haskal, M.D., FSIR, a professor in the department of radiology and medical imaging at the University of Virginia Health System in Charlottesville.

The VR training video shows Haskal and his colleagues at the University of Virginia Health System performing a TIPS (transjugular intrahepatic portosystemic shunt) procedure, in which they create a new blood vessel within the liver using tiny catheters, balloons and stents under image guidance.


“Interventional radiology has always been on the forefront of modern medicine and VR360 is the cutting-edge of medical simulation, so this project embodies the innovative spirit of our specialty,” said Haskal. “We took one of the hardest procedures we perform and created an all-enveloping, in-the-room VR film allowing an operating physician to return to any complex segment they wish for learning, review and perspective.”

The video premiered at the Extreme IR session at SIR 2018, where attendees were given VR head-mounted displays (HMDs) so they could experience it and become immersed in the procedure. With treatments in interventional radiology and other medical areas continuing to develop rapidly, making use of technology like this allows for more detailed and realistic training material.

“VR is a force multiplier, providing expert training to physicians around the world, those wishing to refresh their skills or gain confidence for delivering care in environments where clinician experts cannot provide them in-room training,” Haskal said.

Haskal is already planning further VR projects to educate patients, trainees and attending physicians about interventional radiology, leveraging VR to make it as beneficial to the viewer as possible. A short segment of the hour-long video is available to watch below.

For more stories like this in the future keep reading VRFocus.

Touch Surgery Doubles Its Funding As It Looks To 2018

There have been many ongoing stories throughout the year for both augmented reality (AR) and virtual reality (VR). We mentioned just some of these earlier this week in a news post about how VR and AR are gaining momentum in academic circles, and how companies are using these new courses as a way to help gauge their own future on the platforms.
Another common thread throughout 2017 has been the continued development of VR, AR and mixed reality (MR) within the world of healthcare. As well as seeing how things continue to develop thanks to Dr. Raphael Olaiya’s ongoing feature series The VR Doctor on VRFocus, late last month we featured a story about how surgeons from three different continents had collaborated on a procedure using Microsoft’s Hololens headset.

Now another company developing similar technology has revealed a new influx of funding as it looks to take the next step in bringing together the two fields mentioned above: healthcare and education.

At the beginning of the year, London-based start-up Touch Surgery, which also operates out of New York, revealed it was working on a new AR platform in partnership with smart glasses manufacturer DAQRI. (DAQRI itself recently revealed its latest line of smart glasses was shipping to customers.) Now the company has revealed a new round of funding led in part by 8VC, an American venture capital firm whose founders also invested in Oculus before it was bought by Facebook back in 2014. The £15 million (GBP) investment doubles its funding to date and sees it well placed going into 2018, which will also see it launch a new training app called Go Surgery that will give medical trainees a step-by-step guide to various procedures using AR. The app is set to undergo trials in facilities on either side of the Atlantic next year.

“We found that tens of thousands of people were downloading the app. We started out trying to build a technology that we would use. We wanted to know how we could train and perform surgery better,” explained Dr. Jean Nehme, co-founder of Touch Surgery, to the UK’s The Telegraph newspaper. “It is early days for this technology but we are very bullish on how augmented reality and virtual reality are going to be key technologies in the operating room of the future.”

Touch Surgery was recently announced as one of LinkedIn’s top UK start-up companies for 2017 and was also named by FirstCapital’s Hazel Moore as one of the UK’s top five British VR/AR companies in an article on VRFocus earlier this year. VRFocus will have more news on the developments in the medtech space very soon.

Surgeons Use Mixed Reality to Conference Call & Consult On Surgery During a Live Colonoscopy Operation

New immersive technologies such as virtual reality (VR), augmented reality (AR) and mixed reality (MR) are currently being used in many different ways. From gaming and automation to education and therapy, these immersive technologies are helping to train people as well as simplify communication between them. (If you need a quick comparison guide to these technologies, check out VRFocus‘s guide here.) For the first time, three surgeons from Mumbai and London became digital 3D avatars in an operating theatre at The Royal London Hospital and were then able to speak to one another in real time to discuss how to operate on the patient with the aid of pre-uploaded patient scans.

Aetho’s Thrive software on the Microsoft Hololens is an MR application all about connecting people and information in immersive environments. Professor Shafi Ahmed, wearing a Microsoft Hololens, was performing a colonoscopy operation on a patient at the NHS’s The Royal London Hospital – as seen in the image below. Professor Ahmed explains that they chose to do this project to “think about the way we communicate from doctor-to-doctor or doctor-to-patient.”

Professor Shafi Ahmed in an operating theatre, wearing a Microsoft Hololens.

Professor Ahmed was joined by Professor Shailesh Shrikhande, a cancer surgeon at Tata Memorial Hospital in Mumbai (the largest cancer hospital in India), as well as Mr Hitesh Patel, Consultant Colorectal Surgeon at BMI The London Independent Hospital. They were also joined by Ian Nott, Co-Founder and CTO of Aetho, who was based in Atlanta, USA. All four participants wore a Microsoft Hololens, appearing to one another as moving graphic avatars, each able to see and hear the others. They were able to look at pre-uploaded patient scans that appeared as three-dimensional holograms of the tumour. In the video below, you can see each specialist discuss and analyse the patient’s data from Professor Ahmed’s perspective, with the footage captured from his Hololens.

VRFocus spoke to Professor Ahmed about the project in the video interview below. He explains that the team were connected into a virtual space where they could share the scans and images of the patient, interact with them, and then discuss the case in more detail, similar to the multidisciplinary team meetings that surgeons normally hold in healthcare practices. The experience was like having a very ‘lucid conversation’ about the patient. Apparently, after you get past the initial shock of feeling like Iron Man, the experience is no different to having a person sitting next to you and conversing.

Professor Ahmed is very excited about being in the healthcare space right now and believes we are undergoing the fourth industrial revolution. “It’s a question about globalization, if you want help and support – well actually the whole world can support them. These are the type of technologies that will connect people, make the world much smaller and actually make healthcare more equitable”, he says. For the future of surgery, he’d like to teleport or ‘holoport’ himself into another part of the world, walk around the room, stand over the surgeon’s shoulder, see what they’re doing, give advice and then disappear. Although this might seem far in the future, it’s the direction he sees things going and is something he is working on.

Aetho approached Professor Ahmed at Cannes Lions after seeing his talk about creating a digital avatar of himself using photogrammetry. Aetho had been working on the concept of avatars, holograms and telepresence for its software Thrive. The two met, and Professor Ahmed’s VR company Medical Realities then collaborated with Aetho and coordinated the project with the hospitals to carry out a world-first MR conference call with 3D digital assets during real-time surgery.

He explains that new technologies are sorely needed because demand for healthcare is rising globally, but there is not enough capacity to cope with it. Unfortunately, with limited funding it’s difficult for public services like the NHS to justify new healthcare services. He hopes that by using new technologies such as these, healthcare can become better and more efficient, and eliminate the need to travel in order to carry out certain operations. He believes A.I. and robotic machines will take over routine jobs, and that doctors and surgeons will have to redesign their roles in this future landscape.

Whatever the future holds, this is an exciting step for healthcare operations. It could save a lot of money on expensive travel, cut the time needed to treat patients, and free up doctors and surgeons to treat more of them. If you want to find out more about the project, watch the video below. You can also find out more about how immersive technology is being implemented in the world of healthcare with VRFocus’s The VR Doctor and Emotion Sensing series.

VR/AR Focused Realities Centre Hosting Medtech Hackathon in December

Next month a new venue focused on virtual reality (VR), augmented reality (AR) and mixed reality (MR) technology will be opening in London, UK. The Realities Centre has been created to address the need for co-working spaces in the city for immersive developers, and today it has announced two events to celebrate its launch, both dedicated to MedTech.

First there will be ‘Using Virtual Reality CGI & Haptics MedTech in Surgery’, an evening event featuring panels and presentations from developers and academics on MedTech applications and research. They’ll be demonstrating what VR has achieved in the medical industry thus far and what it can provide in the future. It’ll be held on 15th December 2016, open to all and completely free of charge to attend.


That evening show will set the groundwork for a hackathon a couple of days later. Taking place on 17th – 18th December 2016, the MedTech VR Surgery Hackathon will see small teams of developers share their knowledge and ideas to create a new VR application focused on the MedTech space over the short time period. Participants will be assigned a topic on the day of the event, looking at VR for surgery training and the role of haptics, and the teams can interpret it in any way they choose. After that, it’s up to their imaginations and the restricted development time to build a project.

The MedTech VR Surgery Hackathon will be free to participate in and is supported by the TLA AR/VR and TLA HealthTech Working groups, as well as enterprise partners such as Medical Realities, Touch Surgery, UCL and AMD.

Those interested can register for tickets on the Realities Centre’s official Eventbrite pages: Using Virtual Reality CGI & Haptics MedTech in Surgery and MedTech VR Surgery Hackathon.

For further coverage of Realities Centre and the hackathon, keep reading VRFocus.