How Hollywood is a testing ground for the computer graphics tech that permeates design and manufacturing.
AMD has sponsored this post.
Think of the last movie you watched that, when the credits began to roll and the lights went up, gave you the feeling of waking from a dream. The last movie you got completely lost in. The last movie that made you think of the saying “movie magic.”
Well, to paraphrase legendary sci-fi author Arthur C. Clarke, movie magic is indistinguishable from sufficiently advanced technology. In fact, it exists thanks to that technology.
In the entertainment industry, technology is getting more magical every day. And it continues to revolutionize how movies are made, from the mainstream breakthrough of computer-generated (CG) visuals in the 1990s to the so-called virtual production of the modern day.
“Media and entertainment is considered responsible for most of the innovation in computer graphics,” says James Knight, media and entertainment and visual effects director for chipmaker AMD.
But the innovation doesn’t stop at the silver screen, says Knight. “That then trickles to design and manufacturing and other verticals.”
It’s done more than trickle—computer graphics have flooded the engineering discipline. Pencil-and-paper drawings began giving ground to computer-aided design (CAD) in the 1960s, and by the end of the 1990s, 3D CAD had become mainstream. Today, CAD is complemented by sophisticated computer-aided engineering (CAE) technology, and entire engineering workflows are being rebuilt around a common, graphics-driven theme: virtual.
Virtual prototyping, virtual twins, virtual reality—if you can see it on a screen, it might as well be real. It might even be better than real.
Engineering and Hollywood: Two Sides of a Technology Coin
Digital innovations in media and engineering feed off one another in an ongoing symbiosis. Digital technology is, of course, built by engineers, but it’s often artists who find creative ways to push the tech to its limits. To expand those limits, engineers must devise creative new solutions of their own.
James Knight understands the symbiosis between engineering and entertainment as well as anyone. He began his career in the entertainment industry, working on visual effects for music videos and DVD menus. He would go on to manage motion capture for James Cameron’s Avatar, a $230-million production that premiered in 2009 and was widely acclaimed for its ground-breaking graphics (it won Best Visual Effects at that year’s Academy Awards, along with Best Art Direction and Best Cinematography).
In 2015, Knight flipped to the engineering side of the coin by joining computer processor maker AMD. At the time, AMD didn’t have a media and entertainment (M&E) division; it focused on selling its most powerful processors to industries like design and manufacturing, oil and gas, finance, biotech, and government labs. But Knight’s experience in M&E made him the perfect ambassador to an industry that was rapidly embracing computer technology—and lots of it.
“[M&E] is a great battle testing ground for our new technologies,” says Knight. “They’re the ones that push the envelope in computer graphics, because audiences want to have a vacation for 90 minutes. Great content is all about suspension of disbelief.”
Take Avatar, a film that took motion-captured performances and turned them into CG characters set against a CG backdrop. It was almost entirely CG, but it didn’t feel that way to audiences. Millions of moviegoers suspended their disbelief and lost themselves in the photorealistic world of Pandora, which was built as much by engineers as by visual artists. That immersion is a big reason Avatar made a record-breaking $2.8 billion at the box office.
An insane amount of computer processing power made Avatar possible. The data center used by the production—operated by Weta FX, Peter Jackson’s visual effects production house—covered 10,000 square feet and included more than 4,000 blade servers packing nearly 35,000 processing cores and 104 TB of RAM. The so-called render farm churned through roughly eight gigabytes of data every second.
“Render farms are the media and entertainment version of HPC [high-performance computing],” says Knight.
Knight’s colleague Brock Taylor, global HPC solutions director at AMD, illustrates the similarity by pointing to a particularly compute-intensive task in M&E: realistic simulations of hair and fabric.
“Being able to create effects around hair and cloth is extremely challenging, and at its core, that is a high-performance computing workload in structural mechanics,” says Taylor. “The physics behind it is practically the same whether you’re creating part of a crane that needs to hold 2,000 tons or you’re simulating fabric blowing in the wind.”
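To make the comparison concrete, here is a minimal sketch of the kind of computation Taylor is describing: a toy mass-spring cloth step in Python with NumPy. Every constant below (grid size, stiffness, time step) is an illustrative assumption, and production tools use far more sophisticated finite-element or position-based solvers running on render-farm hardware. The point is only that the inner loop follows the same pattern as a structural-mechanics code: accumulate forces from local connectivity, then integrate.

```python
import numpy as np

# Toy mass-spring cloth: every vertex of an N x N sheet is a point mass,
# and each pair of grid neighbors is joined by a Hooke's-law spring.
# All constants are illustrative assumptions, not production values.
N = 32                      # grid resolution
REST = 1.0 / (N - 1)        # rest length between neighboring masses
K = 500.0                   # spring stiffness
MASS = 0.01                 # mass per vertex (kg)
DT = 1e-3                   # time step (s)
GRAVITY = np.array([0.0, -9.81, 0.0])

# Start as a flat sheet in the XZ plane; y is "up."
pos = np.zeros((N, N, 3))
pos[..., 0], pos[..., 2] = np.meshgrid(
    np.linspace(0.0, 1.0, N), np.linspace(0.0, 1.0, N), indexing="ij")
vel = np.zeros_like(pos)
pinned = pos[0].copy()      # pin the top row, like cloth hung from a rod

def spring_forces(p):
    """Sum Hooke's-law forces over horizontal and vertical grid edges."""
    f = np.zeros_like(p)
    for axis in (0, 1):                        # rows, then columns
        d = np.diff(p, axis=axis)              # edge vectors
        length = np.linalg.norm(d, axis=-1, keepdims=True)
        fe = K * (length - REST) * d / np.maximum(length, 1e-9)
        lo = [slice(None)] * 3; lo[axis] = slice(0, -1)
        hi = [slice(None)] * 3; hi[axis] = slice(1, None)
        f[tuple(lo)] += fe                     # each edge pulls both of
        f[tuple(hi)] -= fe                     # its endpoints together
    return f

for _ in range(1000):                          # simulate one second
    accel = spring_forces(pos) / MASS + GRAVITY
    vel = 0.999 * (vel + DT * accel)           # semi-implicit Euler + damping
    pos += DT * vel
    pos[0], vel[0] = pinned, 0.0               # re-apply the pin constraint
```

Swap the sheet of cloth for a lattice of beam elements and the same loop becomes a crude structural solver, which is why the two workloads land on the same class of hardware.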
Behind the Scenes of Virtual Production
Today, engineering and M&E are both rapidly embracing the power of virtual. Engineers will be familiar with terms like virtual twin (aka digital twin), virtual reality (VR) and virtual prototyping (aka simulation-driven design). In Hollywood, the buzzword du jour is virtual production, a digital-first approach to making movies.
“Virtual production is where the physical and digital worlds meet,” according to Weta FX.
In a standard production, actors are filmed in front of a green screen and visual effects are added in post-production. But this doesn’t always pass the suspension-of-disbelief test. For example, it can be hard to line up the camera perspective for both real actors and CG visuals, resulting in scenes that can look off, if not downright bad. And on set, the cast and crew must try their best to imagine the final scene—a barrier to the creative process.
Virtual production fixes these problems by bringing the CG visuals directly into filming. There are two types of virtual production, says Knight, but you can think of both as “computer graphics on stage.”
One option is to change what filmmakers can see in the camera viewfinder. Similar to augmented reality, this combines the real and virtual worlds in one live image.
Knight explained how an early version of this approach worked on Avatar. In real time, his team could see a live composite of the real set and a CG character. Actor Sam Worthington, who played protagonist Jake Sully, drove the CG character by performing in a suit of motion capture (mocap) markers that tracked his every move; the film camera carried mocap markers, too. By combining the tracked movements with a virtual 3D model of the set, the team created two versions of the scene: the real scene in the real world and a CG scene in the virtual world.
“Because we had the exact geometry of the real world represented in the virtual world, we married the two in the viewfinder, so it was a one-to-one match,” Knight explained. From there, all it took was some clever keying (removing one part of an image, such as the green color of a green screen).
“We keyed out the virtual world, so you wouldn’t see an overlay in the camera, but the one thing we didn’t key out was the motion of Jake Sully. So, we would hit play on the actor’s motion and follow around the CG character on a live-action film plate, as if he really existed,” Knight said. (A film plate is a shot of a background which is composited with a subject.)
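To make keying concrete, here is a minimal sketch of a chroma key and over-composite in Python with NumPy. The greenness measure and the threshold below are simplified assumptions for illustration, not how the Avatar pipeline worked; production keyers also handle spill suppression, motion blur, and soft matte edges.

```python
import numpy as np

# Bare-bones chroma key: build a matte from how "green" each pixel is,
# then blend the foreground over a background plate. The greenness test
# and threshold are simplified, illustrative assumptions.
def chroma_key_composite(fg, bg, threshold=0.25):
    """fg, bg: float images of shape (H, W, 3) with values in [0, 1]."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    greenness = g - np.maximum(r, b)            # how strongly green dominates
    # alpha -> 0 where the pixel is screen-green, -> 1 where it is subject
    alpha = np.clip((threshold - greenness) / threshold, 0.0, 1.0)[..., None]
    return alpha * fg + (1.0 - alpha) * bg      # standard "over" composite

# Usage: pure screen-green pixels key out fully; everything else stays.
H, W = 540, 960
plate = np.random.default_rng(0).random((H, W, 3))   # stand-in film plate
frame = np.zeros((H, W, 3)); frame[..., 1] = 1.0     # all-green test frame
out = chroma_key_composite(frame, plate)             # out == plate
np.testing.assert_allclose(out, plate)
```

The on-set version Knight describes runs the same idea in reverse, keying the virtual world out of the viewfinder so only the CG character remains over the live plate.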
The second type of virtual production surrounds actors not with a green screen but with the scene itself. Actors are filmed in front of a photorealistic wall of LEDs that displays CG environments. These backdrops serve as virtual sets that can be controlled completely from behind the camera. With this approach, both the cast and crew can visualize what the final scene will look like, because they’re standing right in it.
“In that form of virtual production, you don’t necessarily need any visual effects, in theory,” Knight continued. “In some cases, the last thing you do is color correction and edit. That’s it. So it eliminates guesswork, and it also speeds up the process.”
Speeding Up the Process
Though perhaps not as glamorous as blockbuster filmmaking, engineering has its own forms of virtual production. Product design teams use simulation to study virtual 3D models before building expensive physical prototypes. Automotive engineers generate virtual roads to train their self-driving cars. Manufacturers cut virtual blocks of material before hitting start on a real machine. Factory planners create virtual plants to optimize production lines in silico.
“The progression of digital manufacturing is almost a parallel to visual effects,” says Taylor. “It used to be you had to do all of this by hand. You had to physically crash the car, you had to physically drop the product. And then do it over and over and over again. Digital simulation takes care of that now.”
For artists and engineers both—Taylor uses the umbrella term “creator”—the biggest advantage of all this is time. The less time spent building product prototypes or compositing visual effects, the more time creators have to, well, create.
“It’s about having that maximum time and flexibility to create and explore and do interesting things, and to be able to get things wrong,” says Taylor. “That’s part of what AMD is able to do, provide the solutions that give creators more time.”
In Knight’s experience, creators are very grateful for that gift. During his visits to film sets over the past few years, Knight has been greeted with an enthusiasm he never expected as a technology partner.
“People are genuinely excited about AMD [being] on set,” says Knight. “When we go to Industrial Light & Magic or to another [production studio], they want to pile in and tell us their war stories. When I ask an engineer, ‘How does it compare to what you’re used to?’ they go, ‘There’s no comparison. We haven’t been able to do this before.’”
To learn more about AMD’s role in M&E and how it transfers to engineering, visit AMD.com.