Developer Designing Digitally have revealed that they were approached by an unnamed client who needed a rather specialised form of training for employees. Namely, the client needed a way to teach power line engineers how to clear trees and debris away from power lines. Designing Digitally turned to virtual reality (VR) for a solution.
Designing Digitally determined that the best way to tackle this problem was to immerse learners in the environment they would be operating in, teaching them to cut down trees in a way that was both safe and fun.
The task is threefold: trees near the power lines need to be cut without damaging equipment, without hurting anyone, and without affecting the company's bottom line. The VR scenario built by Designing Digitally therefore follows a branching structure, where different actions and decisions lead to different outcomes.
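The article doesn't say how the branching is implemented; purely as a hypothetical sketch, a scenario of this kind could be modelled as a small graph of decision nodes, with the learner's actions selecting which branch and outcome they reach (all names and outcomes below are invented).

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One decision point (or ending) in a branching training scenario."""
    prompt: str
    choices: dict = field(default_factory=dict)  # choice label -> next node id
    outcome: str = ""                            # non-empty on terminal nodes

# A tiny, invented slice of a tree-clearing scenario.
scenario = {
    "start": Node("A limb is leaning over a live line. What do you do?",
                  {"isolate the line first": "safe_cut",
                   "cut immediately": "rushed_cut"}),
    "safe_cut": Node("Line isolated.",
                     outcome="Safe removal: no damage, no injury, no outage."),
    "rushed_cut": Node("The limb falls onto the conductor.",
                       outcome="Equipment damage and an outage; remedial guidance is shown."),
}

def play(scenario, node_id="start", choose=lambda options: next(iter(options))):
    """Walk the branch graph; in VR, `choose` would be driven by the learner's actions."""
    node = scenario[node_id]
    while not node.outcome:
        node = scenario[node.choices[choose(node.choices)]]
    return node.outcome

print(play(scenario))  # -> "Safe removal: no damage, no injury, no outage."
```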
The developer reproduced within the VR environment several incidents the client had already experienced (some dangerous, some involving a heavy monetary cost to the company), all of them reflective of the sort of scenarios engineers are likely to face in the field. If a learner makes errors, they are given further information and teaching without it costing the company money and without anyone being injured.
The resulting experience is called VR Serious Game: Chop & Drop. It is compatible with Oculus Rift and HTC Vive. Chop & Drop uses 3D sound and haptic feedback to make the experience as immersive as possible.
Reportedly, the client company have received VR Serious Game: Chop & Drop very favourably, and it has resulted in a reduction in the number of accidents and errors, which means reduced liability for the client. Designing Digitally concludes by saying that it will be pleased to offer further updates as they are released by the client company.
Scientists at the University of Exeter are working together with Cineon Productions and experts from the nuclear industry to develop a joint training programme for safety-critical industries.
Increased Safety Through Virtual Reality
The training unit is called CineonTraining and is intended for future use in the military and defence services, aviation, and nuclear work. The new VR training is meant to prevent accidents in these hazardous fields. Dr. Sam Vine of the University of Exeter leads the project in collaboration with Cineon Productions and the nuclear experts.
CineonTraining is still in development and is intended to be applicable across many different fields. The training is based on 360-degree technology delivered through VR headsets, with the aim of increasing employee effectiveness and avoiding the dangers posed by accidents. In addition, eye tracking and physiological monitoring of trainees are used to gain an understanding of the learning effects. The project leads also want to use this data to find out how errors arise during an operation, particularly in stressful situations.
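The article doesn't specify which gaze metrics CineonTraining records; as a purely hypothetical example of the kind of measure eye tracking enables, the sketch below computes dwell time on an area of interest from raw gaze samples.

```python
def dwell_time(gaze_samples, region, sample_rate_hz=90):
    """Total time (seconds) the gaze point falls inside a rectangular
    area of interest. gaze_samples is a list of (x, y) points in
    normalised screen coordinates; region is (x_min, y_min, x_max, y_max).
    Purely illustrative -- not how CineonTraining actually works."""
    x_min, y_min, x_max, y_max = region
    hits = sum(1 for x, y in gaze_samples
               if x_min <= x <= x_max and y_min <= y <= y_max)
    return hits / sample_rate_hz

# Example: did the trainee look at a pressure gauge during a stressful step?
samples = [(0.42, 0.58), (0.44, 0.60), (0.80, 0.20), (0.43, 0.59)]
print(dwell_time(samples, region=(0.40, 0.55, 0.50, 0.65)))  # ~0.033 s
```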
Well-Founded Training Based on Research Findings
The goal of the project is to create comprehensive training through a combination of technology, scientific theory, and measurement methods such as eye tracking. This should allow employees to learn to act more effectively in critical situations without putting them at physical risk. It is known that simulating a dangerous situation can genuinely improve reactions to real danger: the employee is able to fall back on learned behaviours. The project builds on ten years of research at the university, which is also developing the software used. Work with the nuclear industry has already borne fruit, but in future those responsible want to work more closely with experts in aviation, emergency medicine, mining, and construction.
The team is holding a one-day workshop on 27 April for safety experts within the nuclear industry and trainers from other fields. According to Dr. Sam Vine, the methods and VR technologies used simulate stressful and high-risk environments through the VR headsets.
The training sounds promising and will hopefully soon be used internationally. The long-term effects still need to be researched, but it is already clear how much influence VR technologies now have in the real world.
Industrial Training International is readying their VR Mobile Crane Simulator for the March ConExpo Event in Las Vegas. The simulator uses an Oculus Rift headset in combination with a modular rig, in order to significantly reduce the costs of training, compared to both real-world and older VR solutions.
Last September, Industrial Training International (ITI) announced the development of a ‘VR Mobile Crane Simulator’ (in this case, the term ‘mobile’ refers to the type of crane, rather than VR optimised for a smartphone). Created in partnership with Canadian developer Serious Labs Inc, the system is on schedule for a March launch, having run a beta program since October.
The simulator, which uses an Oculus Rift headset, comes in two forms – ‘Desktop’ and ‘Motion Base’. They use the same modular control system, including five joystick modules for the user to swap out to match the model of crane they are operating. The layout is mapped accurately to the real crane ergonomics to retain immersion, with foot pedals coming soon. The modular design allows for custom features to be added (such as force-feedback joysticks) if required. The Motion Base version uses the same joystick layout, but instead of being clamped to a table, it is a standalone unit, adding a chair with four actuators in the base, creating a convincing sense of movement. The development models are seen in ITI’s ‘First Look’ video (above) – the Xbox One controller shown was a temporary option while the desktop hardware was being fabricated.
ITI VR Product Manager Caleb Steinborn explains the effectiveness of the Motion Base, which is also used for their Aerial Work Platform simulator – another VR development from Serious Labs with equipment rental giant, United Rentals. “I have seen many people experience fairly extreme fear of heights while being hoisted into the air with the AWP simulator, all the while having never left the ground. The realism is truly an experience unlike anything else.”
There is also a small training benefit from sensing some of the forces acting on the crane, and in the case of the Aerial Work Platform, the vestibular feedback created by the actuators minimises simulator sickness. For the crane sim, the accelerations are lower, so nausea isn’t an issue, but it can reproduce an “extremely powerful” feeling of tipping a 100-tonne crane.
The Oculus Rift headset was selected by Serious Labs as they believe it to be the more comfortable and portable solution at this stage; however, the sim supports OpenVR and could therefore work with the HTC Vive and other headsets. “Currently we use the Oculus Rift simply because at the time the developers felt like it was the stronger headset of the two. As the technology emerges, we will keep up with latest and greatest.”
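As a rough illustration of what OpenVR support buys here, the snippet below (using the community pyopenvr bindings, not ITI's or Serious Labs' actual code) initialises whichever OpenVR-compatible headset is attached and reports which tracking system is driving it; the simulator code on top would stay the same regardless of the headset.

```python
import openvr  # community Python bindings for Valve's OpenVR API

# Initialise whatever OpenVR-compatible headset is connected (Rift via its
# OpenVR driver, Vive natively, etc.); the application code stays the same.
vr_system = openvr.init(openvr.VRApplication_Scene)
try:
    hmd = openvr.k_unTrackedDeviceIndex_Hmd
    tracking = vr_system.getStringTrackedDeviceProperty(
        hmd, openvr.Prop_TrackingSystemName_String)
    model = vr_system.getStringTrackedDeviceProperty(
        hmd, openvr.Prop_ModelNumber_String)
    print(f"Headset: {model} (tracking system: {tracking})")
finally:
    openvr.shutdown()
```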
Founded in 1986, ITI is a world leader in crane and rigging training and consulting; they are relied on by multiple industries such as mining, construction, and energy. Having dismissed older simulator options due to their high costs, impracticality, lack of depth perception and realism, ITI is now introducing a new generation of simulators that utilize VR headsets to create a far more compact, affordable option. Legacy solutions typically cost well over $100,000 – ITI’s simulator will be a fraction of the price.
The sheer scale of a legacy crane sim (ranging from the size of an actual crane cab to a semi-truck trailer) is the other major hurdle, with the shipping, setup and commissioning alone typically costing more than the entire purchase price of the ITI desktop sim. Having trained operators for three decades, ITI has received plenty of feedback from the industry, lamenting the high cost and low portability of legacy solutions, according to Caleb. “Most operators are not physically close to a simulator installation, so even when simulators are owned by the companies in question they are often underused.”
Offering the hardware at cost and charging a subscription for the software means that, in addition to the low cost of entry, subscribers gain access to the full training ecosystem as it becomes available. That will include new content (such as new crane models/types and new environments), new features (such as training events and networked “multiplayer”, enabling multi-crane lifts as well as multi-user lifts performing different functions), and future courses utilizing hand controls to train riggers, signal persons, crane assembly/disassembly, and more.
ITI provides real-world training courses at seven training centres in the United States and Canada – the mobile crane operator courses are capped at 12 students to ensure decent seat time in the crane for each trainee, and attention from the instructor. While real-world training is high quality, there are limitations in terms of seat time, available lift scenarios, and possible external conditions. The new VR solution can provide practically unlimited seat time, and can cover topics and events that aren’t easily reproduced (such as dealing with inclement weather) in real life. The cost benefits can be enormous, particularly when you consider the equivalent real-world worksite preparation and dealing with potential damages. It’s possible for a job that involves a single critical lift to be billed at a million dollars or more.
The simulation is realistic enough to provide all the operational practice needed to pass a practical exam from the NCCCO, but real-world training still has its place. “Our goal is to provide every possible training solution option to those who are in need”, says Caleb. “Sometimes that training solution will be a live, instructor-led course, and sometimes it is going to be a VR Simulator. It is our job to ensure that the quality of every option is of the highest caliber, and our VR Mobile Crane simulator is only just beginning.”
We speak to Trinity VR about their Vive Tracker powered Diamond FX project which aims to provide realistic virtual reality simulations for major league baseball teams, all powered by years of real pitching statistics.
One of the more off-beat uses of HTC’s Vive Tracker, announced at CES this week, came from Trinity VR, who slapped the tracker onto a real baseball bat and used it to demonstrate their baseball simulator.
Ben Lang spoke to Trinity VR‘s Chief of Product Julian Volyn (see full video interview at the top of this page) to find out a little more about the project; it transpires that the developers had only around three weeks to integrate Diamond FX with HTC’s new Vive Tracker. Tellingly, and positively for HTC, Volyn says that in comparison to their previous solution – essentially strapping a SteamVR controller to the bat – the new tracker allowed the company to improve both tracking fidelity and the bat’s weight distribution for added realism.
As for Diamond FX itself, Volyn says it’s a serious simulation designed very much for professional use, a statement that seems to be backed up by the impressive statistics the application is built on. “It’s a simulation and training platform looking at professional teams and minor league teams,” says Volyn. “What we’re able to do is create scenarios with real world pitchers and simulate their real world play styles.” How is this done? Years of statistics, it turns out. “We’re able to do this through a system called Pitch FX. Pitch FX was installed in major league stadiums in about 2007; it’s collected hundreds of thousands of data points for every pitch thrown for the past 7 years, and we’re able to take those data points and reverse engineer those pitches.” What this means, of course, is that teams who use the system will be able to instantly recall and replay, over and over again, individual pitches taken from real games.
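The article doesn't describe Trinity VR's pipeline in detail, but PITCHf/x-style data describes each pitch with an initial position, initial velocity, and constant acceleration, from which the ball's flight can be replayed with basic kinematics. The sketch below reconstructs a trajectory from invented, illustrative parameters of that form.

```python
import numpy as np

def pitch_position(t, p0, v0, a):
    """Constant-acceleration kinematics: position of the ball t seconds
    after release, given initial position p0 (ft), velocity v0 (ft/s) and
    acceleration a (ft/s^2) -- the nine-parameter description published
    by PITCHf/x-style systems."""
    return p0 + v0 * t + 0.5 * a * t**2

# Illustrative values only (not real data): roughly a 92 mph fastball
# released about 50 ft from home plate.
p0 = np.array([-1.5, 50.0, 6.0])     # x (ft, toward 1st base), y (ft, toward pitcher), z (ft, up)
v0 = np.array([5.0, -135.0, -5.0])   # ft/s; negative y points toward home plate
a  = np.array([-8.0, 25.0, -20.0])   # drag + Magnus + gravity, ft/s^2

# Sample the flight until the ball reaches the front of home plate (y ~ 1.4 ft).
for t in np.linspace(0.0, 0.45, 10):
    x, y, z = pitch_position(t, p0, v0, a)
    if y < 1.4:
        break
    print(f"t={t:.3f}s  x={x:+.2f}ft  y={y:5.1f}ft  z={z:.2f}ft")
```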
Of course, all that data isn’t much use if it can’t be presented to the player in a useful way. Diamond FX was therefore built for virtual reality and, using a motion-tracked bat (see the aforementioned Vive Tracker), brings the real player’s motions into the simulation, playing their performance out against the virtual pitches. “In a player development or training sense, there’s not a ton of opportunities for a player to go up against any given pitcher in a season. What we’re able to do is create large sample sizes through … nobody gets tired, in our simulation.”
We’re almost constantly surprised by new and interesting uses we’d never considered for virtual reality here on Road to VR, even after reporting on the technology for over 5 years. Trinity VR have managed it again with Diamond FX: with that excellent statistical grounding fused with this new generation of tracked motion controllers and VR, they seem to have a genuinely compelling product, and it’s hard to imagine it not being attractive to both minor and major league players alike.
STRIVR Labs, a company focused on using VR for sports training, plans to use a new $5 million investment to expand the scope of its “experiential learning” platform.
According to a press release from the company, Strivr’s $5 million Series A investment was led by Signia Venture Partners with participation from BMW i Ventures and AdvancIt Capital.
Strivr was co-founded by former Stanford Cardinal kicker Derek Belch and Jeremy Bailenson, the founding director of Stanford’s Virtual Human Interaction Lab. The company now counts 25 professional and collegiate teams among the clientele making use of its VR sports training technology. The platform has been used by players to review more than 50,000 different plays and scenarios over thousands of collective hours, according to Strivr.
Now the company plans to focus on the broader category of “experiential learning,” which includes enterprise training for areas like sales, operations, customer service, safety, and HR.
“STRIVR’s success to date has come from being able to improve reaction time, pattern recognition, and decision making in athletes—the same outcomes sought by organizations of all types and sizes. STRIVR is already off and running with its platform expansion, as the company is already working with a handful of Fortune 500 companies on comprehensive training programs utilizing VR,” the company writes in its announcement.
Strivr’s refreshed website also shows the company focusing on more general branded VR content, with a bent toward measuring engagement data and analytics insights.
Valve opened up their SteamVR tracking technology for third-party development in August, and since then, 50 developers have completed the mandatory training course provided by Synapse, the first company to enter the field.
Synapse says that most of the participants have been interested in gaming specifically, but some have expressed interest in applying the technology to the automotive industry, science, sports, education, and general consumer electronics.
Synapse has received more registrations for the training course than originally anticipated, so they’ve added additional slots for the course in November and December. A representative for Synapse said that there are currently no plans to continue the course past December, so interested developers should sign up as soon as possible to get into the remaining classes.
Synapse will also be presenting a compressed version of their training course at SXSW as part of the VR/AR track in March of 2017.
At Valve’s annual Steam Dev Days event earlier this month, the company laid heavy emphasis on making its Lighthouse room-scale tracking technology available to companies wishing to integrate it into third-party products, claiming that its 300 licensees span multiple industries ranging from “entertainment VR to automotive to televisions and toys.” Further, Valve says we can look forward to seeing many of these products appear in 2017.
Shortly after the announcement that the SteamVR Tracking technology would finally begin to open up to third parties, semiconductor firm Triad Semiconductor announced that it was collaborating with Valve to create the ‘light to digital’ chips that form an important foundation of the sensors, enable the system’s impressively accurate tracking, and are recommended by Valve for use in products integrating SteamVR Tracking.
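As background on how such sensors feed into tracking: the photodiode and light-to-digital chip essentially timestamp the moment a base station’s swept laser passes over them, and those timestamps translate into angles from the base station. The sketch below shows that conversion under assumed, illustrative numbers (a 60 Hz rotor and a simple sync-pulse reference); it is not Valve’s actual implementation.

```python
import math

ROTOR_HZ = 60.0  # assumed sweep rate of one base-station rotor (illustrative)

def sweep_angle(t_sync: float, t_hit: float) -> float:
    """Convert the delay between a base station's sync pulse and the moment
    the swept laser hits a photodiode into an angle (radians). The rotor
    turns a full revolution every 1/ROTOR_HZ seconds, so elapsed time maps
    linearly onto rotation angle."""
    dt = t_hit - t_sync
    return 2.0 * math.pi * ROTOR_HZ * dt

# Example: a sensor sees the sweep 4.2 ms after the sync flash.
angle = sweep_angle(t_sync=0.0, t_hit=0.0042)
print(f"angle from base station: {math.degrees(angle):.1f} degrees")

# With horizontal and vertical sweep angles from several sensors at known
# positions on a rigid object, the object's pose can then be solved as a
# perspective-n-point problem (not shown here).
```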
Dominic Brennan gets his hands on ESI Group’s latest iteration of IC.IDO, their automotive visualisation tool that uses CAD-accurate data to generate extremely realistic VR simulations which let you take apart (or fix) your own virtual car.
The GPU Technology Conference in Amsterdam drew much interest from the automotive sector, with a focus on autonomous driving, AI and visualisation. ESI Group’s IC.IDO program is an established virtual reality software solution with a long history in automotive and other manufacturing industries, through the use of CAVE systems, Powerwalls and 3D displays. For many years, it has been the leading software for workflow testing and decision making in a VR environment, with customers such as Bombardier, Ford and MAN.
In the VR Village, ESI Group were showcasing their upcoming Version 11 of IC.IDO, which enables HMD support (it was running on the HTC Vive, but will support the Oculus Rift too). While the beta version had been unveiled at ESI’s North American VR Summit a couple of weeks before GTC Europe, this was the first time it was being shown to the wider press.
The demo was presented as a “Virtual Service” scenario for a vehicle, which involved mounting the “Brake Booster” in an offline assembly operation, and raising the car on a hoist to access the exhaust pipe. According to Eric Kam, Product Marketing and Community Manager at ESI Group, this was a simplified demo, which would typically involve more accessibility and serviceability checks with engineers, technicians and designers present to validate the process. “What makes IC.IDO special is that we enable engineers to work with CAD data with little or no pre-processing of CAD into new or reduced formats.” The software is powerful, modelling multi-solid-body mechanics and the elastic behaviour of wires and hoses.
With everything set to real-life scale, picking up a virtual tool with a Vive controller, stepping under the car, loosening some bolts and removing part of the exhaust felt natural and, most importantly, it was clear that each piece was interacting with the others with accurate collision detection. “Often time during service it would be desired to remove components without having to take all subsystems apart – modelling real-time interactions of the various parts is essential,” says Eric. “Most CAD mock up tools will not live simulate the collisions possible during assembly.”
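ESI’s actual collision algorithms are not described, but as a minimal illustration of the kind of query such a tool must answer continuously, the sketch below checks whether two parts’ axis-aligned bounding boxes overlap (real systems refine this with exact mesh-level tests).

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box: a coarse stand-in for one part's geometry."""
    min_corner: tuple  # (x, y, z)
    max_corner: tuple  # (x, y, z)

def overlaps(a: AABB, b: AABB) -> bool:
    """Two boxes collide only if their extents overlap on every axis."""
    return all(a.min_corner[i] <= b.max_corner[i] and
               b.min_corner[i] <= a.max_corner[i]
               for i in range(3))

# Example: does the exhaust section being withdrawn clash with a hanger bracket?
exhaust = AABB((0.0, -0.2, 0.3), (1.2, 0.2, 0.5))
bracket = AABB((1.1, -0.1, 0.45), (1.3, 0.1, 0.6))
print(overlaps(exhaust, bracket))  # True -> the withdrawal path is blocked
```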
One aspect of virtual testing that differs from reality is the mass of the components involved. The brake booster I clumsily fitted in VR is certainly not as light as a Vive controller in real life, for instance. IC.IDO has a few solutions to offer, depending on the application, according to Eric. “We often use our ergonomics plugin to enable useful feedback. These types of analysis allow the ‘pantomime’ of VR to evaluate what would happen to humans if they followed those same actions under load. It indicates whether supporting a heavy virtual object under a given posture would stress the body in a bad way.” Using tracked objects with realistic mass, like the actual tools or assembly components, is something they’ve done in the past with CAVEs, but it is less practical with a VR headset. “Some mix between VR and AR would be needed,” says Eric.
But the benefits of HMDs over CAVEs are pretty clear – for one, they are far more cost-effective, and can be set up more easily and used more frequently. Achieving the actions in the demo of getting underneath a car on a hoist would require ceiling projection, which not all CAVE systems have. Ultimately, the HMD is a simpler solution and “key to democratising the engineering benefit”, according to Eric. “The reduced cost means that now engineering decisions that are often “batched” into larger reviews can be made more frequently and earlier. CAVEs are very powerful collaboration tools that are often used by executives and managers of engineering groups to make key gate reviews. Because these reviews are made infrequently, that means that intermediate decisions are often made without the benefit of VR. HMDs might mean that two rank-and-file engineers at different locations could potentially collaborate immersively to make a better design decision without waiting until the CAVE is free.”
Version 11 of IC.IDO rolls out in November. HMD support will be an add-on for the existing desktop version of the software. “We believe that with version 11.0 our customers will be more effective in being able to review immersively that which could not be experienced without a CAVE”, says Eric. “More frequent and earlier application of VR will mean that engineering designs will be more mature and more cost effective.”
Disclosure: Nvidia paid for accommodation for one Road to VR correspondent to attend an event where information for this article was gathered.