Do HMDs Belong Inside The CAVE?
Andrew Wheeler posted on January 20, 2017
An interview with Eric Kam from ESI about the future of virtual reality in industrial settings.

I recently had the pleasure of corresponding with Eric Kam, Product Marketing and Community Manager of Immersive Experience at ESI Group, to get his perspective on CAVEs (Cave Automatic Virtual Environments), HMDs (head-mounted displays) and the future of industrial engineering and design in virtual reality.

There are two solutions from ESI I wanted to know more about: IC.IDO, and VRify, a cloud-based virtual reality solution that lets designers and engineers immerse themselves and others in their 3D designs. I wanted to focus on understanding the connection (if any) between CAVEs and HMDs like the HTC Vive.
Assembly process review conducted using ESI IC.IDO 11 with an HTC Vive Headset and controllers. (Image courtesy of ESI Group.)

For those unfamiliar, IC.IDO combines high-end visualization with real-time simulation of product behavior at actual size, allowing product operation very close to reality. ESI occupies an interesting intersection of industrial VR: it continues to update and innovate in the CAVE space while also incorporating the newer VR experiences found in HMDs as those devices improve.

Caterpillar using ESI's CAVE system to design and visualize new heavy equipment and machinery. (Video courtesy of ESI Group and Caterpillar).

Obviously, the biggest difference between CAVE systems and HMDs is the cost of entry. CAVE systems are far more expensive, so the obvious conclusion is that larger companies can afford them, while smaller engineering firms will eschew them in favor of more traditional methods: sticking with CAD on 2D displays, or perhaps investing a small amount in a VR headset like the HTC Vive for a fancier way to visualize 3D environments and models.

Are HMDs and CAVE systems mutually exclusive? What are the benefits of one over the other, and how do you see their uses converging and diverging in industrial engineering settings?

I prefer to think of HMDs as increasing access to the VR ecosystem. CAVEs are typically used for large group reviews: multiple stakeholders come together for a consensus or gate review. HMDs give engineers the ability to review designs individually in an interactive, immersive environment without needing access to the CAVE.

In general, CAVEs are relatively rare resources that are often in high demand. If every borderline decision that an engineer makes has to wait until one of the major reviews to be validated or explored in a larger context amid other subsystems, the engineer's capabilities are diminished.

Using ESI's CAVE to develop new aircraft, Safran Nacelles can be immersed in virtual prototypes to validate design and installation of manufacturing lines as well as train new operators.

Using HMDs for more frequent intermediate reviews puts one back on the path to higher performance and productivity. Of course, HMDs are also compelling to use in association with a CAVE. For example, say a review is happening among all the stakeholders except those in a remote manufacturing facility. The VR session at the CAVE could act as host for a cooperative session between two instances of the same review. In other words, the remote site could join the session and interact much as they would if they were standing in the CAVE. We are already doing this between geographically separated CAVEs, and it is possible for a remote HMD user to join this interaction.

Accessibility and visibility are crucial reasons why the virtual worker simulation in ESI’s IC.IDO was built and designed for collaboration. (Image courtesy of ESI Group).

How do VRify and VR in the cloud help a startup that wants to tackle the problems associated with untethered VR served from the cloud (broadband speeds, latency, etc.)?

Cloud-hosted VR, which is different from the cooperative session I just mentioned, does pose some unique challenges. In the cloud, the goal is to make the experience independent of the user’s hardware. The hardware is all remote, or “in the cloud,” which means that rendering 3D views and simulating interactions with the model are computed there. When a user in an HMD turns their head, that motion signals a view change, and the updated view must make a round trip to the cloud and back to the headset. The elapsed time from cause to effect is twice that of simply downloading a view, and that kind of delay would be unpleasant for most users.

Resolving challenges like this is something that our technology partners at NVIDIA and our development team are very interested in addressing. Just how we are going to accomplish this is something that we are very excited to discover.
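The round-trip concern can be made concrete with some back-of-the-envelope arithmetic. The sketch below is illustrative only; all of the millisecond figures are assumptions for the sake of example, not ESI or NVIDIA measurements:

```python
# Back-of-the-envelope motion-to-photon latency for cloud-rendered VR.
# All timing figures below are illustrative assumptions, not measurements.

def motion_to_photon_ms(one_way_network_ms, render_ms, encode_ms,
                        decode_ms, display_ms):
    """Head pose travels up to the renderer; the finished frame travels back,
    so the network hop is paid twice."""
    return 2 * one_way_network_ms + render_ms + encode_ms + decode_ms + display_ms

# Local rendering: no network hop, no video encode/decode.
local = motion_to_photon_ms(0, render_ms=11, encode_ms=0, decode_ms=0, display_ms=11)

# Cloud rendering over a decent broadband link (~20 ms each way, assumed).
cloud = motion_to_photon_ms(20, render_ms=11, encode_ms=5, decode_ms=5, display_ms=11)

COMFORT_BUDGET_MS = 20  # commonly cited motion-to-photon target for comfortable VR

print(f"local: {local} ms")   # 22 ms -- already near the comfort budget
print(f"cloud: {cloud} ms")   # 72 ms -- well past it without mitigation
```

Even with optimistic numbers, the doubled network hop dominates, which is why techniques that hide it (such as reprojecting the last frame against the newest head pose on the client) matter so much for cloud VR.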

How do startups leverage the benefits of IC.IDO?

Once you evolve to a cloud-served immersive environment, you begin to see that, no matter what, virtual engineering is cheaper than producing physical prototypes for most products. That said, the hurdle of buying the technology is perhaps a bit high for a bootstrapped startup to clear. For exactly that reason, we have kicked off a small startup inside ESI to bring a more democratized virtual engineering tool to market. We call it VRify.

We are only testing with a very small prototype community at this time, but with this tool we aim to bring some very specific benefits of IC.IDO to the cloud as a highly specialized app.

As with any app, the use case would be very narrow, perhaps addressing only one specific “job to be done.” That is why we are bringing it to a small, focused “trial community” first: it allows for more rapid iteration of the app.

With VRify, we aim to bring the benefits of interactive and immersive engineering tools to the masses. So far our users have been very vocal about their likes and dislikes, and they have been aiding the evolution of the app. We hope to widen our testing phase soon and even start a public beta in the not-too-distant future. But we do want to get the product closer to what people need before hitting the market.

What are some obstacles and challenges to improving industrial VR and tying together CAVE and HMD experiences for engineers?

VR is no longer limited to the large companies that can afford it in the form of CAVEs. Now that consumer-class VR is improving (Oculus, HTC Vive, Samsung Gear VR, Google Daydream/Cardboard), and the Vive and Oculus are delivering fairly high-fidelity VR experiences, we are seeing the emergence of a converging “prosumer” class of HMD that we are addressing for the wider market.

And while this does lower the investment threshold for bringing VR online, it still does not completely remove the hurdles to adoption. Workstations that power the Vive Business Edition still require fairly hefty professional graphics cards, which are expensive. Beyond that, a CAVE serves as many users as can fit in the room and wear the available tracked glasses, whereas an HMD is a single-user experience. It is possible to link multiple users into a shared session of IC.IDO, but only while maintaining a one-to-one ratio of PCs to HMD users.

How well does IC.IDO work with lower-end VR such as Google Daydream or Cardboard?

IC.IDO does not interface with Google Daydream or Samsung Gear VR. Those devices have limited interaction potential: only three rotational degrees of freedom, with no positional tracking. You can look up and down and side to side, and with the Daydream remote you can perhaps navigate, but generally these are not tracked interfaces. They also don’t carry much rendering capability onboard. With our VRify startup we are experimenting with ways to stream the immersive experience to the mobile device and offer stereo viewing, but the method and level of interaction is something still being worked out.
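The distinction between orientation-only (“3DoF”) mobile viewers and fully tracked (“6DoF”) systems like the Vive or a CAVE can be sketched as two pose types. This is a hypothetical illustration, not ESI’s or any vendor’s actual API:

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Orientation only: yaw/pitch/roll in degrees.
    Mobile viewers like Daydream or Gear VR report roughly this,
    so you can look around but not lean in or walk."""
    yaw: float
    pitch: float
    roll: float

@dataclass
class Pose6DoF(Pose3DoF):
    """Adds position in meters. Room-scale systems (Vive, CAVE tracking)
    report where the head or controller actually is, enabling walk-around
    inspection of a full-scale virtual prototype."""
    x: float
    y: float
    z: float
```

The extra three positional values are what make interactions like crouching under a virtual chassis or reaching into an assembly possible, which is why orientation-only devices remain viewing rather than engineering tools.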

Again, the delay between an action and the resulting view change would be very unnerving for many users. Different modes of collaboration between HMDs and CAVEs are possible; HMDs won’t replace CAVEs, but they may provide new synergy.

How can CAVE and HMDs work together? Is there an example of a small business using HMDs to collaborate with a larger firm using CAVE environments? Or within a large engineering firm, do they ever use your software to collaborate between HMDs and CAVE 3D models? View captured real-time sequences?

We will introduce some modes of cooperation between CAVEs and HMDs in a coming version. It is clear that not everything done in a CAVE can be achieved with HMDs, such as rendering massive datasets (for which we use clustered machines) or tracking objects other than the Vive controllers and the HMD itself.

But we do envision the remote access potential that a CAVE session could have in one locale. For example, someone might share their session with HMD users who are participating remotely in another room, building, or city.

Are people doing this today? Not quite. The HMD version is in its final beta, but we do know that our users share Powerwall sessions with smaller projection- and monitor-based systems. A user with an HMD would not experience this all that differently.

In many ways the HMD is far more immersive than a CAVE. But that completely immersive quality does make basic person-to-person interaction more difficult; it is not as intuitive as being able to truly see and perceive the others in the session with you. Collaborating while under an HMD is something people will have to get used to.

ESI Group sponsored this article but provided no editorial input. All opinions are mine. —Andrew Wheeler
