IC.IDO and Varjo: When mixed reality feels like magic
This summer, I conducted my first IC.IDO process review while wearing a Varjo XR-3 headset, during a live event broadcast to a few hundred people around the world. Participants in the event, my colleagues at ESI, and shared customers who already had early access to the beta all told us that IC.IDO with the Varjo XR-3 was “like magic”.
As pleasing as it is to hear such praise weeks before the release of version 16.0—we had been using a “release candidate” during the event and had only first touched the XR-3 a few days earlier—it is far from magic. A more scientific analogy would be time travel.
When immersed in an IC.IDO process review, wearing a Varjo XR-3, you are effectively transported to the future, where you can evaluate product accessibility in the context of assembly or service processes – often faster and more comfortably than users wearing a more “common” VR headset.
Because IC.IDO seamlessly mixes virtual objects with one’s own body and surroundings, human-centric processes like assembly or maintenance can be evaluated very naturally. It is as if future production plants and products were already available today, further accelerating product development.
Human-Centric Process Validation Accelerated
Crucial milestones in a new product’s development are reached when people first build the product in serial production, when they first use it as intended, or when a maintenance procedure is successfully completed. It is not until these achievements that a product is truly realized. CAE and FEA analysis might show a product is “feasible” – but that is different from knowing that people can build it. Analyzing people as they use physical models, or running simulations with digital mock-ups, is different from predicting how individuals will experience a future product in a proposed procedure.
Discovering after the start of production that a process is impossible for people to perform on a new product is far too late to resolve the issue. Delivering interactive experiences with new products in proposed processes, earlier in the product development timeline, is the core value of ESI IC.IDO: human-centric process validation and virtual product integration.
Reach it, Load It, and Install It
Prior to the acceptance of new products, cross-functional teams traditionally convened at a common site, with physical mock-ups of parts and the necessary tooling, to evaluate the many human-performed assembly or manual service tasks intended for that new product. Historically, engineering and product development teams had ready access to physical builds to inform engineering decisions, which gave engineers and method planners several opportunities to experience those future products in context.
However, in digitally transformed enterprises, physical builds have been reduced and, in some cases, eliminated. This has left many organizations without the experiential discoveries made when people can hold the products until after start-of-production (SOP).
In virtual reality experiences, an immersed team member can evaluate whether their virtual reality avatar—a virtual embodiment of the immersed individual—can reach, load, or install key components of an assembly or subassembly.
Unfortunately, using virtual reality for this purpose has limitations. What a digital avatar represents, and what one perceives in VR, can vary depending on how a person holds their controllers, how they wear their head-mounted display, or whether their body differs significantly in size from the digital representation, potentially eroding trust in decisions made during a review.
Mixed Reality Changes Everything
With IC.IDO, we empower enterprises to, effectively, transcend time and space to experience human-performed procedures with yet-to-be-made products weeks, months, or years ahead of production.
Unlike merely reviewing a simulation result stereoscopically, participating in a human-centric review using IC.IDO makes the immersed person an active participant in the process being simulated. The person in the HMD, and how they behave while performing a task, becomes an input:
Is a component comfortable for ME to reach?
Can MY arms get past an obstruction?
Is there room in between for ME or OTHERS to step?
Is there a clearance sufficient for our hands?
The ability to interact with large CAD datasets at true-to-life scale, with your own hands and arms integrated into the virtual scene, accelerates process planning decisions and the product development timeline overall. Interactive experiences with products in the context of human-performed processes, like assembly or maintenance, inform process planning in ways often not possible until after start-of-production (SOP).
When YOU wear the headset in the same scenario, you will know in an instant whether you can reach it as well. The process is highly intuitive, and understanding is immediate. As engineers, we might still need to do further work to document or communicate our findings, but we already know the outcome: fewer surprises at start-of-production.