During the Game Developers Conference (GDC) back in March, data glove specialist Manus showcased its next generation of finger tracking gloves, the Quantum Metagloves. Today, the company announced that pre-orders for the new gloves have begun, with a pair setting customers back an eye-watering $9,000 USD (€7,500 EUR).
Just like Manus’ previous models, the Quantum Metagloves are focused on enterprise use, but even so, these are the company’s most expensive gloves to date. They utilise what Manus calls its new Quantum Tracking technology to provide millimetre-accurate finger tracking, producing precise hand data for a faithful recreation of a user’s hand.
In combination with Manus Core software, the Quantum Metagloves can stream hand and finger data directly into Unreal or Unity, or record it for export into animation software like Blender and Maya. As you’d expect from $9k gloves, the sensors are adaptable depending on the situation: the stretchable finger caps are designed for quick, comfortable use, while Manus Finger Tape can be applied for those action scenes.
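To illustrate the record-and-export half of that workflow, here is a minimal Python sketch that logs per-joint flexion angles to a CSV file an animation pipeline could ingest. It does not use the actual Manus Core SDK; the get_hand_frame() stub and the joint naming are hypothetical stand-ins for whatever the real streaming API provides.

```python
import csv
import random
import time

# Hypothetical stand-in for a hand-data source; the real Manus Core SDK
# exposes its own streaming API, which is not shown here.
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
JOINTS = ["mcp", "pip", "dip"]

def get_hand_frame():
    """Return one frame of per-joint flexion angles in degrees (stubbed)."""
    return {f"{f}_{j}": round(random.uniform(0.0, 90.0), 2)
            for f in FINGERS for j in JOINTS}

def record_hand_data(path, seconds=2.0, rate_hz=60):
    """Record frames to CSV so they can be imported into animation tools."""
    fieldnames = ["timestamp"] + [f"{f}_{j}" for f in FINGERS for j in JOINTS]
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        writer.writeheader()
        start = time.time()
        while time.time() - start < seconds:
            row = {"timestamp": round(time.time() - start, 4)}
            row.update(get_hand_frame())
            writer.writerow(row)
            time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    record_hand_data("hand_capture.csv")
```

In practice the capture rate, joint set and file format would follow whatever the glove SDK and the target animation package expect; the point here is simply the stream-or-record split described above.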
“The new Quantum Tracking technology has enabled us to achieve a new level of detail in finger tracking. As part of the detailed and lifelike finger tracking, we’ve also added tracking of the flexion of the palm of the hand,” says Bart Loosman, CEO of Manus in a statement. “As such, this generation of gloves takes the detail of capture to a new level, allowing creators to convey more emotion and intent through their performance than before. In addition, this level of detail will save significant time in cleanup and post-processing, enabling creators to reach their desired results quicker and easier.”
The Manus Quantum Metagloves will come in three flavours: the standard model, plus versions compatible with Xsens and OptiTrack systems. All three will retail for the same $9,000, which includes a perpetual licence and a two-year warranty. Shipping is expected to begin in September.
There’s good news if you’re already a Manus customer as a trade-in offer is available. Should you happen to own the Prime II, Prime II Haptic, Prime II Xsens, Prime X, Xsens Gloves by Manus, or OptiTrack Gloves by Manus then you’ll be eligible for a $3000/€2500 discount.
For the rest of us who don’t have a spare nine grand for VR gloves, there’s always the new 2.0 hand tracking update for Meta Quest. For all the latest Manus updates, keep reading gmw3.
Meta has made no secret of the fact that it is planning to release a new virtual reality (VR) headset towards the end of 2022, currently codenamed Project Cambria. Recent details suggest it won’t rival Meta Quest 2, as it’ll be focused on the high-end, more enterprise area of the market. With new reports suggesting it’ll cost $800 USD, the company has unusually come out and said that’s not the case: it will cost “significantly” more than $800.
The Information (paywalled) published a report saying it had seen an internal roadmap of Meta’s VR headset plans, suggesting that four are slated to arrive between now and 2024. One of those will be Project Cambria which Meta has described as: “more focused on work use cases and eventually replacing your laptop or work setup.”
While the report suggested Meta is targeting a $799 price tag, Road to VR reports that a Meta spokesperson rubbished those claims, saying they were inaccurate and that Project Cambria will be priced “significantly higher” than the claimed figure. How much higher wasn’t divulged, but such a statement doesn’t exactly indicate a jump of just $100 or so.
While no specifications have yet been released, Project Cambria will house plenty of features including eye and face tracking capabilities, controllers that could possibly track themselves, full-colour passthrough mixed reality (MR) and possibly pancake optics for a slimmer profile. It is far more likely that Project Cambria’s main rival will be the HTC Vive Focus 3, a standalone headset priced at $1,300 USD (£1,272 GBP).
As for the other headsets mentioned in the report, one is the second iteration of Cambria (codenamed Funston), supposedly arriving in 2024. The other two are next-generation Quests, reportedly codenamed Stinson and Cardiff, slated to arrive in 2023 and 2024 respectively. Whilst this is certainly an aggressive hardware rollout, with supply chains as they are, those launch windows should be taken with a pinch of salt.
Project Cambria is slated to arrive in September 2022, most likely showcased during Meta’s annual Connect event which takes place around that time. For continued updates on Meta’s VR and AR plans, keep reading gmw3.
Augmented reality (AR) is a key pillar of Industry 4.0 (or the fourth industrial revolution), side-by-side with other potentially transformative technologies like machine learning and big data. Indeed, consultancy firm PwC has estimated that industrial manufacturing and design is one of the biggest areas for augmented and virtual reality (VR), with their use in heavy industry having the potential to deliver a $360bn GDP boost by 2030.
In this latest edition of our series on how augmented reality is faring across a range of industries, we’ll be taking a closer look at why AR is proving so useful in heavy industry, in particular the fields of construction, manufacturing and energy.
Construction
AR is proving to be a key tool for the construction industry, whether at the design stage or during the construction process itself, leading a 2020 study of the architecture, engineering and construction (AEC) industry to predict “strong growth” for AR and VR over the next five to ten years.
On the design side, numerous architectural tools exist to help with space visualisation using augmented reality. One such example is The Wild, which allows designers to view 3D models in both virtual and augmented reality. Such tools can layer virtual details onto a building plan so that plans can be more readily understood by stakeholders.
That requires highly detailed and accurate 3D models, which is where the technology overlaps with digital twins. Using those digital twins, companies like Akular enable clients to see, via a mobile application, what a building would look like on-site in the real world before it is built.
When it comes to actual construction, augmented reality again finds a number of uses, not least training workers on safety. That might involve AR headsets that interact with tags on potentially dangerous areas to bring up safety information, but even before workers are on-site, AR can help with training them on how to use heavy machinery – as with the construction equipment training simulators offered by CM Labs or the Arvizio AR Instructor.
“Industries are experiencing a shortage of skilled front-line employees and view augmented reality as a means to accelerate training and efficiently transfer the expertise of experienced workers,” said Jonathan Reeves, CEO of Arvizio. “Arvizio enables organizations to rapidly upskill employees without the need for on-site coaching and mentoring. By delivering no-code authored augmented reality instruction and remote expert connectivity, AR Instructor can substantially increase productivity and reduce errors of workers performing complex operational activities.”
Meanwhile, progress capture and tracking functionality directly compares real-world sites with virtual models to ensure they aren’t deviating – all in real-time. A host of companies provide variations on that technology, such as VisualLive, which enables users to view 3D models overlaid on the real world via headsets such as the Microsoft HoloLens or mobile devices.
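At its core, that progress capture step boils down to measuring how far an as-built scan strays from the design model. The Python sketch below illustrates the idea with a simple nearest-neighbour check between a synthetic scan and model points; it is not how VisualLive or any other vendor necessarily implements it, and the geometry and tolerance are made up for the example.

```python
import numpy as np
from scipy.spatial import cKDTree

def deviation_report(model_points, scanned_points, tolerance_m=0.05):
    """Flag scanned points further than `tolerance_m` from the design model."""
    tree = cKDTree(model_points)
    distances, _ = tree.query(scanned_points)  # nearest model point per scan point
    deviating = distances > tolerance_m
    return {
        "points_checked": len(scanned_points),
        "points_out_of_tolerance": int(deviating.sum()),
        "max_deviation_m": float(distances.max()),
    }

if __name__ == "__main__":
    # Synthetic example: a planar wall from the design model...
    rng = np.random.default_rng(0)
    model = np.stack(np.meshgrid(np.linspace(0, 5, 50),
                                 np.linspace(0, 3, 30),
                                 [0.0]), axis=-1).reshape(-1, 3)
    # ...and a noisy "as-built" scan of it, bowing outwards by up to 8 cm.
    scan = model + rng.normal(0, 0.005, model.shape)
    scan[:, 2] += 0.08 * np.sin(scan[:, 0])
    print(deviation_report(model, scan))
```

Real systems work with full BIM meshes, registered laser scans or photogrammetry rather than toy point sets, but the underlying comparison is the same: align, measure distance, flag anything outside tolerance.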
Manufacturing
Much of the technology we’ve covered for construction can equally apply to the manufacturing industry, whether that’s learning how to use dangerous equipment or visualising the layout of equipment and machinery in a factory. None of this is to say there aren’t plenty of bespoke uses for augmented reality in the manufacturing space, however.
One early pioneer was Volkswagen, which was using augmented reality to assist service workers way back in 2013. The MARTA app showed step-by-step instructions on how to repair and replace certain components, overlaying its advice on the car via an iPad app. Along similar lines is Boeing’s more recent use of augmented reality to give technicians real-time, hands-free, interactive 3D wiring diagrams.
Interestingly, that technology has bled over into the consumer space with AR manuals that assist car-owners with basic maintenance operations by showing precisely where components are located within a car.
In the design space, AR has been deployed by the largest manufacturers to rapidly iterate and do away with expensive, time-consuming physical prototypes. In the case of Ford and its use of HoloLens, changes to a design can be made and reflected in real-time, allowing teams to collaboratively sculpt a new vehicle.
AR has been trusted at the very highest levels of manufacturing, too. Lockheed Martin utilised augmented reality in the creation of NASA’s Orion Spacecraft, overlaying information to help with mission-critical procedures such as precisely aligning fasteners.
Energy
In the energy sector, AR has the potential to remedy significant problems faced by the industry, chief among them a brain drain caused by an ageing workforce. Indeed, the US Department of Labor estimated in 2019 that 50% of the current energy utility workforce will retire within the next ten years. The institutional knowledge being lost could be replenished more quickly with the help of AR technology.
Shell is already using the remote collaboration possibilities of AR to educate workers in the field. Expert consultants are able to see through a worker’s eyes via an AR headset, and even draw on the screen of the augmented reality display they are using. That increases safety as workers interact with potentially dangerous heavy oil and gas equipment, as well as allowing experienced but ageing employees to work remotely.
The energy sector is no slouch when it comes to more specific AR solutions either, such as Upskill’s Skylight platform, which allows companies to more easily develop bespoke augmented reality apps for AR devices ranging from Google Glass to Microsoft HoloLens 2 and mobile devices. Then there are solutions such as Adroit, which can provide guidance on repairing high-stakes equipment such as oil rigs by scanning and identifying faulty components and machinery.
Final Thoughts
In heavy industry, where the costs of prototyping are enormous and the potential risks from machinery are significant, leaning on the virtual possibilities of augmented reality is common sense – hence the interest in the technology from across the sector.
To find out more about how AR is progressing in other fields, read the previous entry in the series, where we explored the healthcare industry in particular.
One of the definitive leaders in hand tracking technology is Ultraleap, with its tech integrated into devices such as Varjo’s headsets or available as a third-party accessory. It’s the latter that Ultraleap is announcing today, bringing hand tracking to Pico Interactive’s Neo 3 Pro and Pro Eye headsets.
As you can see in the image above, the setup consists of an Ultraleap Stereo IR 170 camera inside a bespoke mount, with a power cable running to the Pico Neo 3’s USB-C socket. The accessory then runs Ultraleap’s fifth-generation hand tracking software, Gemini, with the Unity and Unreal platforms supported for developers.
It seems the Ultraleap Hand Tracking Accessory won’t be sold as an individual unit for current Neo 3 Pro and Pro Eye owners to upgrade with. Instead, it’ll be sold as a new bundle with one of the aforementioned headsets (Gemini comes pre-installed) through select retailers, available now in early access for developers and enterprise customers. An official launch will then take place this summer, with prices yet to be revealed.
“VR for training is on the cusp of mainstream adoption and we truly believe hand tracking plays an important part in tipping it over the edge. We’re already seeing significant wins from customers who have deployed VR training programmes or LBE experiences with hand tracking,” said Matt Tullis, VP, XR at Ultraleap in a statement. “This first phase of the Pico relationship will mean more developers and organisations will be able to test, pilot and refine their applications to unlock the true power of VR now and deploy at scale in a few months.”
“We’re very excited to bring Ultraleap hand tracking to our latest VR headsets through this accessory. When applications need the highest performing hand tracking for complex interactions or challenging environments, Ultraleap’s hand tracking really is world-class. We can’t wait to see what developers and organisations will create from this joint effort,” adds Leland Hedges, GM for Pico Interactive Europe.
Hand tracking has been gaining ground of late, featuring in devices like the HTC Vive Focus 3, whilst the upcoming Lynx-R1 utilises hand tracking (Ultraleap’s again) as its default input method. And, of course, let’s not forget about Meta Quest 2, which supports hand tracking out of the box, with titles like Cubism, Vacation Simulator and Clash of Chefs VR all adding hand tracking updates.
gmw3 will continue its coverage of hand tracking as further announcements are made.
While all the focus might be on virtual reality (VR) gaming, the enterprise side of the industry is a hotbed of advancing tech and ever-evolving workflows. Another nod to that fact arrives today with the reveal that AEC (architecture, engineering, construction) software specialist Autodesk has acquired immersive collaboration platform The Wild.
Thanks to its own acquisition of IrisVR in 2021, The Wild is also heavily involved in the AEC sector, allowing Autodesk to gain an even stronger foothold as businesses look towards XR to help deliver projects in a world becoming more attuned to remote working practices. Integrating with tools such as Revit, SketchUp and BIM 360, The Wild’s cross-platform ecosystem supports Meta Quest, HP Reverb, Pico Neo, HTC Vive, PC and even AR (on iOS devices), making it easily accessible to all co-workers.
Between them, The Wild and IrisVR serve over 700 customers worldwide, a base Autodesk will now be able to build upon. No acquisition sum has been revealed, and there’s been no mention of how the deal will affect the current teams going forward.
“Our acquisition of The Wild reflects the rapid transformation taking place in the building industry, from the complexity of projects to the geographic diversity of teams who design, construct, and operate them,” said Andrew Anagnost, CEO and president, Autodesk in a statement. “XR is a must-have business imperative for today and an important part of Autodesk’s Forge platform vision.”
“The Wild and Autodesk share a common mission of encouraging a more productive and collaborative AEC industry, and in this case, one where teams can resolve issues in minutes from their desks rather than the traditional miles of costly travel,” Gabe Paez, founder and CEO of The Wild adds. “The Wild’s customers understand the value from the get-go, building consensus as a team in the virtual world with the ability to make changes to their designs at the speed of thought.”
Considering the way prices for essentials like gas and electricity are going up, more companies are going to be looking at ways of reducing costs. Travel is an easy one to take out of the equation, and with remote solutions now prevalent, making the switch to a collaboration platform like The Wild is a simpler sell.
For continued updates on the latest enterprise use cases for VR and how immersive collaboration is changing industries, keep reading gmw3.
Almost two years into the COVID-19 pandemic, there’s no doubt that this seemingly neverending crisis has shifted how our societies work and connect with one another. It’s also rapidly accelerated the adoption of various integral technologies — particularly XR technology, blockchain, NFTs and Web 3.0 — the next phase of the internet that will bring us closer to the metaverse.
The term “metaverse” has existed for many years, having first appeared in Neal Stephenson’s iconic 1992 sci-fi novel Snow Crash. The idea is that the metaverse is a virtual space that appears to be completely real and three-dimensional, allowing for a more immersive and interactive experience for connected users. In a matter of months, it’s also become one of the biggest buzzwords of our current era — especially after Facebook CEO Mark Zuckerberg announced his plans to rebrand the company as Meta and turn the social media giant into a leading metaverse platform.
Microsoft co-founder Bill Gates has also recently shared his belief that within two or three years, most remote meetings will take place in the metaverse. While we can’t be certain that we are headed into the metaverse on such a proverbial bullet train, we do know that in due time, much of our professional and social lives will find footing within the next phase of the web.
As our work lives carry on, what might office culture look like in the metaverse? Here are some of the key changes we can expect to see in the not-so-distant future.
More personalised remote connections
It seems that remote and hybrid work is here to stay — albeit, still through “flat-screen” applications such as Zoom, Slack and Google Meet. As functional and familiar as these applications have become, their “two-dimensional” experiences haven’t quite managed to replace the efficiency of meeting with people in-person (an ordeal that has led to the now popularly coined term “Zoom fatigue”).
While employees seem to have enjoyed the idea that they can work from anywhere, prolonged periods of remote meetings have also made workplace cultures feel more bland and impersonal. Without things like body language or a shared physical setting, remote work has offered less room for people to form organic, human connections. Humans are spatial learners who learn most efficiently by doing — which explains why it can be harder for us to feel like we’re really in the presence of our colleagues or friends when speaking to them over an ordinary video chat.
Leading brands, such as Meta and Microsoft, believe we can improve the art of connecting remotely in the metaverse. Cognitively, the use of VR and metaverse platforms is likely to make us feel more focused and present with our connections. And instead of speaking with coworkers over a “flat” screen, multiple parties will be able to experience more immersive, lifelike meetings that simulate the sensation that everyone is in the same place at the same time.
In an effort to make remote communication easier for employees, Big Tech platforms have started unveiling more immersive communication tools. Facebook (now Meta) has already explored the idea with its early metaverse platform Horizon Worlds, where users can put on their Oculus Quest headsets to access and hold meetings in VR.
Microsoft’s new Mesh for Teams software, meanwhile, combines Teams with the mixed-reality capabilities of Microsoft Mesh — a platform that allows people in various locations to create digital avatars of themselves, collaborate within a shared virtual space, chat with one another, complete projects inside shared documents and much more.
More enhanced collaboration software for employees
With a massive rise in remote and hybrid working, several technology firms have seen opportunities to offer more enhanced digital collaboration solutions for teams. Collaborating on projects in real time also presented itself as one of many post-pandemic challenges, with employees often struggling with logistics or team communication while working on projects simultaneously.
3D design platform Gravity Sketch has recently launched its innovative LandingPad virtual collaboration room, making real-time collaboration between professional designers and teams much easier and more accessible through VR. Users can create personal collaboration rooms, invite team members and design at scale in 3D. There are also functions for in-app voice conversations, editing others’ work and moving around projects at scale.
NVIDIA also recently made its popular metaverse-building Omniverse software free for individual creators and artists to access in 2022. Omniverse has been a leading contender in the growing collaboration software market, with downloads from over 50,000 creators and counting. So far, Omniverse has been adopted as an industry standard across a range of sectors, such as robotics, automotive, construction, media and architecture.
When we think about what collaborating in the metaverse may look like 10 years from now, platforms like Omniverse are leading the way. With its stunning interface and cross-disciplinary functions, NVIDIA has taken input from several developers, customers and partners to produce real-time renderings and interactive workflows that, well… work.
More diverse and inclusive teams
Working in a metaverse office, as opposed to a physical office, means there are zero limitations on who can access it. In our post-COVID era, we may look back on a time when we were restricted to the job market tethered to our home city or physical location. Those who choose to work remotely can already wave goodbye to the days of spending two hours commuting to get to work on time, or feeling pressured to relocate for the sole purpose of seeking employment.
Companies that adapt to metaverse technology should also consider how this will impact their diversity and inclusion targets. A non-physical office or remote team will allow them to hire nationally or internationally, providing them with greater access to global talent.
Final thoughts
According to recent research released by Owl Labs, nearly half of the UK population believes that working in the metaverse will be an asset to workplaces. 52% of respondents also claim to be confident that the metaverse will “bridge the gap between in-person and remote workers by creating a more immersive environment.”
A smart move for workplace vendors may be to implement a metaverse strategy that positions them well to access the new opportunities offered by Web3. This may include staying on top of metaverse products — or looking into more streamlined integrations between space reservation interfaces and collaboration platforms.
Either way, the emergence of the metaverse is an exciting time for workplaces — offering many possibilities for companies to improve their workflows, advance their collaborative capabilities and hire more diverse talent. These possibilities, combined with the optimistic view from survey respondents, suggest that we will soon see more immersive, embodied office environments come to life.
Manufacturing is a highly complex process, in addition to being the most important part of supply chain management. There are several components that affect the manufacturing production process — such as the availability of raw materials, labour costs, inventory costs and overall marketplace demand.
Since the start of the Industrial Revolution, the effective marriage of systems and machines has allowed us to increase production times, reduce product costs and find new ways of organising work. Within the last 50 years, digital transformation has continued this trend, enabling us to better understand the physical through digital operations.
That said, the physical has still taken precedence over the digital for most of modern times. The rise of the metaverse will allow us to reverse this dichotomy, giving us access to a primarily digital space. In the case of the manufacturing industry, we will be able to translate this digital space onto the physical world, rather than simply enhancing it.
Let’s look at some of the key ways where we can expect to see the manufacturing industry change within the metaverse.
An entrance into the creator economy
The metaverse will provide users with easier access to digital materials — a major shift that may very well encourage more creators and consumers to pursue industrial design. This will inevitably create new industry demands and completely change how products are made.
3D content creation tools will also become more widely available in the metaverse. This will add manufacturing to the creator economy, providing the general public with more tools to render and simulate 3D prototypes at their own convenience.
Just as with gaming platforms, streaming services and various other forms of online content creation, we can expect to see the same type of growth proliferate within manufacturing and supply chain management. According to analyst firm TrendForce, industrial metaverse revenue is set to reach $540 billion by 2025.
Easier collaboration on product development
The metaverse will also make collaboration on all aspects of product development much easier. Given that it will be capable of serving as a communal space for all stakeholders involved with a project, multiple processes — such as product design, sharing with manufacturers and iterating based on feedback — will be able to happen more rapidly and in parallel.
NVIDIA’s VR-based collaboration tool Omniverse has experienced a successful launch in the enterprise sphere. As a multi-GPU, real-time development platform for design teamwork and 3D simulation, it has become a staple for those working in the industrial sector or for those who specialise in the creation of digital twin applications.
To date, Omniverse has been downloaded by over 50,000 creators, with NVIDIA recently launching a platform subscription to allow for wider reach. The platform has already experienced tremendous growth, with integrations from popular design platforms (such as Blender and Adobe) being made available for developers to use from any location. These integrations have positioned NVIDIA as a leader for collaborative product development in the metaverse.
Workplace changes due to the pandemic have also led to a rise in collaborative XR solutions within the enterprise sector. SkyReal, an aerospace-focused software company, started out by helping companies collaboratively approach their various stages of manufacturing — from conception and industrialisation, through to training and marketing. Now, SkyReal helps aerospace teams work on CAD files in real-time, offering them an immersive experience that allows for even better collaboration.
More streamlined processes through digital twins
Digital twins are virtual representations that serve as real-time replicas of physical objects. From gaming companies to automotive manufacturers, many industries have already started using digital twins to collect real-time data and predict how objects will perform before they are manufactured and sold.
The digital twin market has been projected to grow to an incredible $86 billion by 2025. This level of growth is largely being fueled by an increase in demand for things such as predictive maintenance, industrial IoT solutions and a smarter and more energy-efficient infrastructure.
Digital twins also provide real-time data for users, allowing them to gain better insights on overall production processes. For example, automotive manufacturers are already using digital twins to better pinpoint equipment failures and ensure that all parts are meeting quality standards before being delivered to customers.
BMW has already started using a simulated system to better streamline its production process. A version of the company’s Regensburg-based production line exists solely within a computer simulation, serving as a digital twin to its physical counterpart. Before any parts enter the production line, the entire manufacturing process runs in a hyper-realistic virtual iteration of the factory. By adopting this technology, managers can now plan their production process in greater detail.
Other large companies that have adopted the use of digital twins include Microsoft, Unilever, Boeing, Siemens Energy and Ericsson. With Azure Digital Twins, Microsoft has created a leading IoT platform that features a live execution environment, allowing users to create digital representations of real-life things, people, places and processes.
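As a rough illustration of what building on such a platform can look like, the Python sketch below uses the azure-digitaltwins-core SDK to create a twin, push a sensor reading into it and query for twins running out of spec. The endpoint URL, model ID and twin ID are invented for the example, and it assumes an Azure Digital Twins instance and a matching model already exist.

```python
# Sketch only: assumes an existing Azure Digital Twins instance and a
# 'dtmi:example:PressLine;1' model already uploaded; IDs and URLs are made up.
from azure.identity import DefaultAzureCredential
from azure.digitaltwins.core import DigitalTwinsClient

client = DigitalTwinsClient(
    "https://example-factory.api.weu.digitaltwins.azure.net",
    DefaultAzureCredential(),
)

# Create (or update) a twin representing one press on the production line.
press_twin = {
    "$metadata": {"$model": "dtmi:example:PressLine;1"},
    "temperature": 72.0,
    "cycleCount": 0,
}
client.upsert_digital_twin("press-07", press_twin)

# Mirror a live sensor reading from the physical press into the twin.
patch = [{"op": "replace", "path": "/temperature", "value": 95.5}]
client.update_digital_twin("press-07", patch)

# Query every press running hot, e.g. to trigger predictive maintenance.
hot_presses = client.query_twins(
    "SELECT * FROM digitaltwins T "
    "WHERE IS_OF_MODEL(T, 'dtmi:example:PressLine;1') AND T.temperature > 90"
)
for twin in hot_presses:
    print(twin["$dtId"], twin["temperature"])
```

The specifics vary by platform, but the pattern is the same one described above: a live representation of a physical asset that can be updated, monitored and queried in real time.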
In all, digital twins will be an integral building block of the metaverse. They will provide us with lifelike representations of things from our physical world, equipped with live feeds from every sensor and component they contain.
Shorter lead times
The collaborative approach offered by working in the metaverse will certainly shorten project life cycles. More robust virtual spaces will also allow manufacturers to quickly see how moving assets around can impact a production cycle. By simulating real physics and identifying potential errors early, this approach helps manufacturers achieve greater efficiency and faster turnaround times.
Down the road, greater interoperability initiatives will also make product designs generally easier and faster to implement. Designers and creators will no longer have to jump through as many hoops to complete their designs and get them into the hands of manufacturers. This will result in shorter lead times, as well as a significant increase in the number of product designs they can complete.
Supply chain transparency
In recent years, demand for supply chain transparency has been on the rise. According to the MIT Sloan School of Management, consumers are reportedly willing to pay between 2% and 10% more for products that offer greater supply chain transparency.
What we can deduce from this is that consumers find value in the treatment of workers in a supply chain, as well as in a company’s efforts to provide decent working conditions. Ethical concerns, such as slave labour or deforestation, have made consumers increasingly averse to purchasing products that don’t meet these standards.
With this being said, the truth is that supply chains were not originally designed to be transparent. However, access to the supply chain or to digital twin management in the metaverse could resolve this issue for good.
Working in the metaverse will also provide far better project visibility, for both staff members and consumers alike. Given that multiple collaborators will be able to work within the same space, regardless of their physical location, all parties will have access to 3D design representations of how products are designed, built, sold and distributed. Customers may even grow used to tracking their orders throughout the entire cycle, from raw materials through to a finished product. With this added insight, customers will gain full transparency into the entire production process.
Greater supply chain transparency will also give customers greater visibility of lead times. This will offer them a better sense of real-time shipping costs and allow them to better prepare for potential pitfalls (such as shipping delays).
Final thoughts
The metaverse will pave the way towards a digital-first approach to manufacturing. This shift will be driven both by consumer preferences and by the new ways of working required to operate inside a virtual world.
There are valuable steps that manufacturers can take to bring us closer to an ideal metaverse system. For starters, it is critical that they work on harvesting data from their processes — and also that they implement the best interoperability protocols for connecting said data across the entire supply chain.
Recent innovations — such as NVIDIA’s CloudXR platform (which has been configured to work with Google Cloud) — have begun enabling organizations to securely access their data through cloud-based solutions. This will allow creators to access their work and collaborate on projects from anywhere in the world, all while doing so through the lens of an immersive, high-quality user experience.
All of these areas are currently being worked on, and together they stand to disrupt and change the concept of supply chains for good. This is an extremely exciting and innovative time for manufacturing technology — and we look forward to tracking the eventual paradigm shift that’s to come.
Finnish headset manufacturer Varjo launched its immersive collaboration tool Reality Cloud last year. Today, the company has announced an upgrade to the platform, adding cloud streaming for enterprise customers.
Varjo’s vision is for metaverse collaboration that feels just like real life, combining its professional-grade VR/XR hardware with intuitive immersive tools. The cloud streaming rollout on the Varjo Reality Cloud platform will be an early access release for select customers – one of them being electric vehicle manufacturer Rivian – whether they’re in VR with the Varjo Aero or in mixed reality using the flagship Varjo XR-3.
Just like NVIDIA’s CloudXR, the whole point of cloud streaming is to provide high-end workflows on PCs that weren’t built for intensive VR applications. Varjo has its own foveated transport algorithm that can stream immersive content at a bandwidth of 35 megabits per second. To make this possible, Varjo’s collaborative platform is powered by Amazon Web Services (AWS) and NVIDIA GPUs.
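To put that 35 megabits per second in context, a quick back-of-envelope calculation shows how much a foveated transport has to shave off an uncompressed stereo video feed. The resolution, refresh rate and bit depth below are assumptions chosen for illustration, not Varjo’s published streaming parameters.

```python
# Rough, illustrative numbers only, not Varjo's published stream parameters.
width, height = 2880, 2720      # assumed per-eye resolution (Aero-class panel)
eyes = 2
refresh_hz = 90                 # assumed refresh rate
bits_per_pixel = 24             # 8-bit RGB, uncompressed

raw_bps = width * height * eyes * refresh_hz * bits_per_pixel
stream_mbps = 35                # figure quoted for Varjo's foveated transport

print(f"Uncompressed: ~{raw_bps / 1e9:.1f} Gbit/s")
print(f"Streamed:      {stream_mbps} Mbit/s")
print(f"Reduction:    ~{raw_bps / (stream_mbps * 1e6):.0f}x")
```

Under those assumptions the raw feed sits in the tens of gigabits per second, which is why a foveation-aware compression scheme is needed to get it down to something an office network can carry.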
“Being able to achieve the same quality experience through Varjo Reality Cloud with less powerful local PCs is a game-changer for companies looking to scale their use of virtual and mixed reality,” said Urho Konttori, founder and CTO of Varjo in a statement. “Now, with our new cloud streaming service, users can join photorealistic virtual experiences with almost any laptop with a dedicated NVIDIA GPU and a Varjo headset and start collaborating in an immersive environment.”
Varjo Reality Cloud is still under development itself with an official commercial launch expected to take place during the first half of 2022.
Alongside Varjo Reality Cloud, the company is making it (a bit) easier to access its products thanks to the release of Varjo Aero several months back. The cheapest headset the company has made to date – it’s still $2,000 USD – the Aero packs in some serious specs including 2880 x 2720 px per eye resolution, eye tracking and SteamVR compatibility for those that want the best VR gaming experience.
As Varjo continues to enhance its product lineup gmw3 will keep you updated.
Varjo, the Finnish hardware manufacturer behind some very expensive (and highly regarded) virtual reality (VR) and mixed reality (MR) headsets, has begun teasing that something new is in the pipeline. So far, the only information available comes via a brief mention of a launch event taking place in a couple of weeks.
Varjo XR-3 and VR-3
Over on its website, Varjo simply states: “This is the one you’ve been waiting for. Join us for a live event on Thursday October 21st, and witness the unveiling of our most highly anticipated product release yet.” There’s the option to sign up and register for the launch event, which begins at 12:00 pm ET/5:00 pm BST, where Varjo co-founder & CTO Urho Konttori will reveal all.
Currently, Varjo has two products available for enterprise use cases. The VR-3 is priced at $3,990 USD (inc. a 1-year Varjo subscription) whilst the XR-3 comes in at a substantial $6,990 (inc. the subscription). So that gives you an idea of where Varjo has positioned itself in the market. Could this “highly anticipated product release” be a new headset, and if it is, where will it be positioned?
Varjo has built its business around very high-end hardware, with both headsets featuring the company’s “human-eye resolution” Bionic Display, a combination of two screens: a small central 70 PPD uOLED at 1920 x 1920 px per eye, and a secondary peripheral 30 PPD LCD at 2880 x 2720 px per eye. They also feature eye tracking, Ultraleap hand tracking, and more. So whatever the new product is, if it is hardware it won’t be cheap and cheerful.
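Those pixels-per-degree figures hint at how the two panels split the field of view. As a rough back-of-envelope check, ignoring lens distortion and assuming uniform pixel density, dividing each panel’s horizontal pixel count by its PPD gives an approximate angular coverage.

```python
# Back-of-envelope only: ignores lens distortion and assumes uniform PPD.
displays = {
    "central uOLED": {"pixels_wide": 1920, "ppd": 70},
    "peripheral LCD": {"pixels_wide": 2880, "ppd": 30},
}

for name, d in displays.items():
    approx_fov_deg = d["pixels_wide"] / d["ppd"]
    print(f"{name}: ~{approx_fov_deg:.0f} degrees horizontal coverage")
```

Under those simplifying assumptions that works out to roughly a 27-degree high-detail central area sitting inside a far wider peripheral view, which is the essence of the two-screen Bionic Display approach.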
The website does feature a single image of someone wearing a headset, but it looks exactly the same as Varjo’s other products. If it isn’t a headset then there’s always software, although that seems less likely considering Varjo’s last major announcement, in June, was for Varjo Reality Cloud.
Whatever it is, the timing couldn’t be better. HTC Vive has its “Go with the Flow” event on 14th October whilst Facebook Connect is being held on 28th, making for an exciting few weeks for the XR industry. Could Varjo steal some of Facebook’s thunder? As further details come to light, VRFocus will keep you updated.
KAT VR has always been in the business of omnidirectional treadmills for virtual reality (VR) applications, mainly for the professional and enterprise market with devices like 2018’s Kat Walk Mini. Today, the company has announced its second generation of that model, the Kat Walk Mini S, set to hit the market this month.
The Kat Walk Mini S aims to refine the walking VR experience with a range of improvements over the previous model. A new vibration module built into the base adds immersive feedback from events happening in the player’s virtual surroundings, such as explosions or earthquakes, so when a player is running through a battle zone it’ll certainly feel like it.
Even more importantly, the overall walking experience is said to be easier thanks to an optimised base. The learning curve for using the device has been reduced, accommodating a user’s natural gait more effectively whilst supporting both KAT VR’s quick-boarding shoe covers – great for location-based entertainment (LBE) venues – and its own dedicated shoes which have four adjustable levels of friction.
Also on the improvement list are better ergonomics, allowing users to more easily bend down, squat or kneel if they need to reach an item low down or use the environment for cover. The overall look and feel of the Kat Walk Mini S has also been enhanced, with a far more professional, eye-catching design, built-in lights for a nice futuristic look, and a new cable management system for venues using PC-tethered headsets.
As the Kat Walk Mini S isn’t a consumer product, KAT VR hasn’t released pricing information, but it has confirmed it’ll be available to purchase worldwide this month. KAT VR does sell consumer products, the most recently released being the Kat Walk C treadmill, which completed a Kickstarter last year and retails for $1,499 USD on KAT VR’s website.
As further details on the Kat Walk Mini S are released, VRFocus will keep you updated.