More Features Added to Mixed Reality Software from Arvizio for AEC
Andrew Wheeler posted on May 03, 2018

HoloLens and the Windows Mixed Reality headsets from Samsung, Acer, HP, Lenovo and Dell use the same operating system designed by Microsoft. Engineers, manufacturers and product design teams are looking for ROI (return on investment) information: benchmarks or timed data proving that mixed reality head-mounted displays (HMDs) can make specific workflows efficient enough to justify the expense. Most companies and organizations interested in the set of Windows Mixed Reality headsets (including HoloLens) need to see more enterprise-level solutions.

One company that is attempting to persuade newcomers to the wonders of enterprise level mixed reality applications is Ottawa-based Arvizio, Inc. Launched in March of 2017, the company first offered a glimpse of what they were calling “the industry’s first enterprise class mixed reality server platform.”

In May of 2017, Arvizio introduced MR Studio for HoloLens, Android AR/MR devices and Windows Mixed Reality headsets. Using a proprietary technology they call the Advanced Spatial Processing Engine (ASPEN), MR Studio promised the ability to process complex 3D models, including AEC 3D data, LiDAR and Geographic Information System (GIS) data, and medical imaging data. (Image courtesy of Arvizio.)

In July of 2017, Arvizio followed through on its promise to integrate complex and large-scale 3D data into MR Studio, targeting the AEC, mining, surveying, and oil and gas sectors, which rely on point cloud data for engineering and operational decisions. These 3D scans can be composed of billions of points spanning many square kilometers. Arvizio designed MR Studio to handle these vast 3D data sets by constructing a spatial data pyramid with rapid data indexing. This effectively segments the data, which users can then access, display and navigate through, depending on attributes that define a user's model viewing requirements.
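Arvizio has not published the internals of its spatial data pyramid, but a common way to build one is grid-based downsampling: each pyramid level keeps one representative point per grid cell, with the cell size doubling at every level so coarser levels hold far fewer points. The sketch below is an illustration of that general technique, not Arvizio's code; the function names and parameters are hypothetical.

```python
import random

def build_pyramid(points, base_cell=1.0, levels=4):
    """Build a multi-resolution pyramid over a point cloud.

    Level 0 uses the finest grid (base_cell); each subsequent level
    doubles the cell size and keeps one representative point per cell,
    so coarser levels contain progressively fewer points.
    """
    pyramid = []
    for level in range(levels):
        cell = base_cell * (2 ** level)
        cells = {}
        for (x, y, z) in points:
            key = (int(x // cell), int(y // cell), int(z // cell))
            cells.setdefault(key, (x, y, z))  # first point in a cell wins
        pyramid.append(list(cells.values()))
    return pyramid

# Synthetic stand-in for a LiDAR scan: 20,000 points over a 100 m x 100 m site.
random.seed(0)
cloud = [(random.uniform(0, 100), random.uniform(0, 100), random.uniform(0, 10))
         for _ in range(20_000)]

pyramid = build_pyramid(cloud, base_cell=2.0, levels=4)
for lvl, pts in enumerate(pyramid):
    print(f"level {lvl}: {len(pts)} points")
```

Because each coarse cell is a union of finer cells, the point count can only shrink from one level to the next, which is what lets a viewer start from a lightweight overview and drill into denser data on demand.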

In September of 2017, Arvizio introduced MR Studio ViewPoint, a feature that leverages a depth-sensing camera to capture physical data and distribute it to a network of HoloLens headsets. HoloLens users in this distributed network can view, annotate and share captured data from ViewPoint. However, users do not need a HoloLens to view the captured 3D data—any viewing device will work as well. (Image courtesy of Arvizio.)

At the I/ITSEC 2017 conference in Orlando in December of 2017, Arvizio demonstrated how MR Studio could be used to create training simulations for a distributed team. In January of this year, Arvizio announced that it had completed a feature for MR Studio that enables users to integrate massive point cloud scans from DotProduct 3D scanners. The DotProduct handheld scanners capture very dense point cloud data, and Arvizio MR Studio Director compresses that data and prepares it for viewing in mixed reality. In its raw form, the point cloud data is too dense to be viewed on mixed reality devices like HoloLens.

Multi-User Collaboration Suite for HoloLens

In April of 2018, MR Studio evolved into an “enterprise mixed reality software platform” with some new and refined features:

A feature called HoloSync gives users the ability to gather in a group around a digital object and walk through it, making annotations for the others to see over the course of a meeting. Each user has an independent visual perspective from their own HMD, whichever of the Windows Mixed Reality options they happen to be wearing.

The Multi-Room feature allows a team of users to walk through shared models in sync while enjoying their own individual perspectives. They can talk, share documents and see shared annotations as though on a conference call, all while interacting with each other's 3D avatars in real time. (Image courtesy of Arvizio.)

The Remote feature gives users the ability to collaborate by viewing real-time video feeds from specified headsets and gives them the ability to chat and annotate as well.

The MR Studio team added alignment and scaling tools to improve the accuracy of superimposing digital 3D models and heat maps, aligning them closely with their intended positions in the physical world.

They also added support for more 3D file formats so that users of popular BIM and CAD software can import their work into MR Studio. Importantly, this includes real-world scans from DotProduct, Matterport OBJ and XYZ point cloud exports, and several file formats for LiDAR point cloud data. Files in these 3D scan formats can be extremely large, so to make them viewable on mixed reality devices, MR Studio optimizes them with automatic level-of-detail (LOD) generation.

The processing power of standalone mixed reality devices only approaches that of most smartphones, which makes integrating diverse types of 3D data files challenging. Rendering large, dense sets of 3D data so that they function well as immersive experiences on HoloLens and other mixed reality devices is also challenging. Most approaches to rendering are like those used in video games, segmenting and processing only the data within the user's view as they move along.

Arvizio chose another route by designing MR Studio to process all the data necessary for mixed reality viewing first. After pre-processing, the user gets simplified versions of the whole model to navigate through, with detailed sub-views becoming available as they move through the 3D data.
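The usual way to decide when a detailed sub-view replaces the simplified model is distance-based LOD selection: nearby tiles get the finest level, distant ones a coarser level. The sketch below illustrates that standard technique under assumed parameters (the `base_range` value and level count are invented for the example, not taken from MR Studio).

```python
def lod_for_distance(distance, base_range=5.0, levels=4):
    """Pick a detail level for a tile of the model.

    Level 0 (finest) applies within base_range meters of the viewer;
    each coarser level covers twice the range of the previous one,
    capped at the coarsest level.
    """
    level = 0
    rng = base_range
    while distance > rng and level < levels - 1:
        level += 1
        rng *= 2.0
    return level

# A viewer standing 7 m from one tile and 60 m from another would get
# a mid-detail view of the first and the coarsest view of the second.
near_level = lod_for_distance(7.0)    # within 10 m band -> level 1
far_level = lod_for_distance(60.0)    # beyond 40 m -> level 3
```

Because the pre-processed levels already exist on disk, switching between them as the user walks is a lookup rather than a re-computation, which is the payoff of doing all the heavy processing up front.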

Each 3D model is composed of polygons, and HoloLens can comfortably render only on the order of 100,000 – 200,000 polygons. The trick for developers is to deliver the best detail possible within the user's field of view while sequencing and prioritizing the instructions MR Studio sends to the various Windows Mixed Reality devices and Microsoft HoloLens.
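One simple way to honor a hard polygon budget across several loaded models is to scale each model's triangle count proportionally until the scene total fits. The sketch below is an illustration of that budgeting idea, not Arvizio's algorithm; the 150,000-polygon budget is an assumed midpoint of the 100,000 – 200,000 range above, and the model names are hypothetical.

```python
def fit_scene(models, budget=150_000):
    """Scale each model's triangle count proportionally so the scene
    total stays within the headset's polygon budget.

    models: dict mapping a model name to its full polygon count.
    Returns a dict of reduced polygon targets for the decimator.
    """
    total = sum(models.values())
    if total <= budget:
        return dict(models)  # already fits; keep full detail
    ratio = budget / total
    return {name: int(count * ratio) for name, count in models.items()}

# A hypothetical AEC scene: a BIM model, a site scan and a crane model.
scene = {"building": 400_000, "site_scan": 250_000, "crane": 50_000}
reduced = fit_scene(scene)
print(reduced, sum(reduced.values()))
```

In practice a renderer would weight the ratio by visibility or viewer distance rather than scaling uniformly, but the constraint being satisfied is the same: the sum across models must stay under what the device can draw per frame.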

Adding compatibility for more 3D file formats has led Arvizio to pursue a proprietary universal spatial format for processing and rendering in mixed reality. This is partly a reaction to the increasingly diverse array of systems producing 3D data file formats.

Bottom Line

The latest step in MR Studio's evolution as a mixed reality platform is the new ability to display real-time IoT sensor data, adding live data to each user's mixed reality device.

If you’re headed to AWE 2018, held from May 30 – June 1 in Santa Clara, California, or attending Microsoft Inspire in Las Vegas from July 15 – 19, keep an eye out for Arvizio.
