The Weather Channel unveiled new immersive graphics, powered by an interactive mixed-reality system.
If you don’t live in Tornado Alley or on a coast barraged by hurricanes, it’s hard to fully understand what it feels like to be in the midst of a powerful storm. While TV graphics have come a long way, simply seeing footage is not the same as experiencing it.
The Weather Channel (TWC) wants viewers to be immersed in a real weather event. To make that happen, it launched a new broadcast technology: immersive mixed reality.
“We’ve talked about transforming the way that we present weather, evolving it into something that’s a visceral kind of experience, where you just want to watch the presentation because it’s amazing, because it’s beautiful,” said Michael Potts, TWC vice president of design.
The technology’s debut included a full-fledged tornado ripping the walls off the studio, and a power line and a car falling from the sky as the anchor ran for safety. While these stunts gave viewers a peek at the possibilities, it wasn’t until Hurricane Florence that viewers got a firsthand experience of a storm surge rising around the meteorologist.
So, how did they make it happen? TWC began enhancing its graphics 18 months ago, when it invested in Unreal Engine and Future Group, a company specializing in interactive mixed reality. Next came building the elements, such as a water animation that can be placed at different heights, needed to make an array of weather events seem to be happening in the studio.
“All the graphic elements are loaded up into the system,” Potts said. “Then, each one of the scenarios is called upon by the data from the National Hurricane Center, so the map that’s displayed is live and in real time. And that’s informing what the environment’s going to be. The operator has a tool that lets them choose the right scenario.”
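Neither TWC nor Future Group has published the details of this pipeline, but the data-driven step Potts describes (live forecast values nominating a pre-built scenario, with an operator confirming the choice) can be sketched in plain C++. Every name and threshold below is hypothetical, chosen only to illustrate the idea:

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical slice of a National Hurricane Center forecast feed.
// Field names are illustrative, not the NHC's actual schema.
struct Forecast {
    double surgeHeightFt;  // predicted storm-surge height
    double windMph;        // sustained wind speed
};

// A pre-built studio scenario: graphics loaded into the engine ahead
// of time, keyed to a minimum forecast value it represents.
struct Scenario {
    std::string name;
    double minSurgeFt;
};

// Suggest the most severe scenario whose threshold the forecast meets.
// In the rig Potts describes, a human operator still makes the final call.
const Scenario* pickScenario(const std::vector<Scenario>& presets,
                             const Forecast& f) {
    const Scenario* best = nullptr;
    for (const auto& s : presets) {
        if (f.surgeHeightFt >= s.minSurgeFt &&
            (best == nullptr || s.minSurgeFt > best->minSurgeFt)) {
            best = &s;
        }
    }
    return best;
}

int main() {
    std::vector<Scenario> presets = {
        {"street flooding", 3.0},
        {"waist-deep surge", 6.0},
        {"life-threatening surge", 9.0},
    };
    Forecast florence{9.4, 90.0};  // invented numbers for illustration
    if (const Scenario* s = pickScenario(presets, florence))
        std::cout << "Suggested scenario: " << s->name << "\n";
}
```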
TWC had just finished its new wraparound green-screen immersive studio at its Atlanta headquarters when Hurricane Florence hit. The studio, currently the only one with this technology, uses a custom rig from Future Group built around a Mo-Sys camera-tracking system: a physical box that attaches to the camera and uses sensors and an IR signal to triangulate the camera’s position in virtual space. Unreal Engine software translates the graphics into a broadcast-ready format. The Hurricane Florence segment took 90 minutes to create, from the arrival of National Hurricane Center data to broadcast.
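Mo-Sys has not published the internals of its tracking box, and “triangulate” here compresses what is in practice a more elaborate optical system. As a purely geometric illustration of the underlying idea, here is how a position can be recovered from distances to three known reference points (2D trilateration); every name and coordinate is invented:

```cpp
#include <cmath>
#include <cstdio>

// A fixed reference point at a known studio position, plus the
// distance from it to the camera inferred from a signal.
struct Beacon { double x, y, r; };

// 2D trilateration: intersect three distance circles by subtracting
// the circle equations pairwise, which leaves a 2x2 linear system.
bool trilaterate(const Beacon& a, const Beacon& b, const Beacon& c,
                 double& px, double& py) {
    double A1 = 2 * (b.x - a.x), B1 = 2 * (b.y - a.y);
    double C1 = a.r*a.r - b.r*b.r - a.x*a.x + b.x*b.x - a.y*a.y + b.y*b.y;
    double A2 = 2 * (c.x - a.x), B2 = 2 * (c.y - a.y);
    double C2 = a.r*a.r - c.r*c.r - a.x*a.x + c.x*c.x - a.y*a.y + c.y*c.y;
    double det = A1 * B2 - A2 * B1;
    if (std::fabs(det) < 1e-9) return false;  // reference points collinear
    px = (C1 * B2 - C2 * B1) / det;           // Cramer's rule
    py = (A1 * C2 - A2 * C1) / det;
    return true;
}

int main() {
    // True camera position (3, 4); radii are the exact distances to it.
    Beacon a{0, 0, 5.0}, b{10, 0, std::sqrt(65.0)}, c{0, 10, std::sqrt(45.0)};
    double x, y;
    if (trilaterate(a, b, c, x, y))
        std::printf("camera at (%.1f, %.1f)\n", x, y);  // prints (3.0, 4.0)
}
```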
“We can control any number of scenarios, from how high the water needs to be, the wave height and the speed of the waves on top to rain density, the clouds and how dark and overcast it’s going to be,” Potts said. “The entire goal is to try to paint and recreate a reality that’s in the future. This is what to expect. This is really honestly what it could look like if you looked out your window and weren’t prepared.”
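The dials Potts lists map naturally onto a single block of scene state that the operator’s tool could drive. The following is a minimal sketch under that assumption; the parameter names, ranges, and the per-frame smoothing step are all illustrative guesses, not the rig’s actual controls:

```cpp
#include <algorithm>
#include <cstdio>

// The controls Potts describes, gathered into one state block.
// Names and units are hypothetical; the real rig's parameters are not public.
struct WeatherParams {
    float waterHeightFt;   // how high the surge sits in the studio
    float waveHeightFt;    // waves riding on top of the surge
    float waveSpeed;       // surface motion, arbitrary units
    float rainDensity;     // 0 = dry, 1 = downpour
    float cloudCover;      // 0 = clear, 1 = dark and overcast
};

// Ease the live scene toward a target preset each frame so the
// environment changes smoothly on air instead of snapping.
WeatherParams stepToward(const WeatherParams& cur,
                         const WeatherParams& target, float t) {
    auto lerp = [t](float a, float b) { return a + (b - a) * t; };
    return {
        lerp(cur.waterHeightFt, target.waterHeightFt),
        lerp(cur.waveHeightFt,  target.waveHeightFt),
        lerp(cur.waveSpeed,     target.waveSpeed),
        lerp(cur.rainDensity,   std::clamp(target.rainDensity, 0.0f, 1.0f)),
        lerp(cur.cloudCover,    std::clamp(target.cloudCover,  0.0f, 1.0f)),
    };
}

int main() {
    WeatherParams now{0, 0, 0, 0, 0.2f};                 // calm studio
    WeatherParams surge{9.4f, 2.0f, 1.5f, 0.9f, 1.0f};   // invented preset
    for (int frame = 0; frame < 3; ++frame) {
        now = stepToward(now, surge, 0.5f);
        std::printf("frame %d: water %.2f ft, rain %.2f\n",
                    frame, now.waterHeightFt, now.rainDensity);
    }
}
```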