Engineer Naomi Nagata (played by Dominique Tipper) watches a projectile whiz past her in an episode of “The Expanse.” Mavericks VFX was responsible for the whiz. (Mavericks VFX Photo)

It used to take a cast of thousands to create cinematic extravaganzas, but now the job can be done with a cast of dozens of artists and developers, plus thousands of cloud-connected computer servers.

The proof of that can be seen today in science-fiction epics ranging from “Star Wars” to “The Expanse.” And those shows merely hint at the beginning of a computer-generated revolution in visual effects, or VFX. Just wait until artificial intelligence hits its prime.

“That’s changing the game for all of us,” Brendan Taylor, president and visual effects supervisor for Mavericks VFX, told me. “That’s going to turn the VFX industry on its head in the next couple of years.”

Taylor should know: His Toronto-based company is doing visual effects for “The Expanse,” the interplanetary space opera whose fourth season just made its debut on Amazon Prime Video. Mavericks VFX also has a hand in the look for “The Handmaid’s Tale,” Hulu’s award-winning dystopian series; “The Boys,” Amazon’s dark superhero sendup; “What We Do in the Shadows,” a campy vampire mockumentary series on FX; and more than a dozen other productions for big and small screens.

Brendan Taylor is president and visual effects supervisor for Mavericks VFX.

Cloud computing is already transforming the way Taylor and his colleagues in the trade do their thing. The VFX wizards at Mavericks have servers onsite to render scenes such as the mammoth Martian shipyard depicted in “The Expanse,” but sometimes that’s not enough. That’s when they turn to the banks of render blade servers that are accessible through Amazon Web Services.

“The great thing about cloud is, we can surge,” Taylor said. “So if we desperately need to get a render up, now you have to pay a bit more for it, but we can surge up to 300, 400, a thousand if we want. … And the great thing is, it’s secure. It’s very secure.”
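A concrete way to picture that surge: the studio’s render farm can ask AWS for a burst of extra machines on demand. Here’s a minimal sketch of the idea using boto3, Amazon’s Python SDK. The machine image, instance type and tags below are placeholder assumptions for illustration, not details of Mavericks’ actual setup.

```python
# Minimal sketch: burst-launching extra render nodes on EC2 with boto3.
# The AMI ID, instance type and tag names are illustrative placeholders,
# not details of Mavericks VFX's real pipeline.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

def surge_render_nodes(count: int) -> list[str]:
    """Request `count` extra render nodes as Spot instances to keep cost down."""
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",      # hypothetical render-node image
        InstanceType="g4dn.xlarge",           # GPU instance, for GPU rendering
        MinCount=count,
        MaxCount=count,
        InstanceMarketOptions={"MarketType": "spot"},  # cheaper, reclaimable
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "render-blade"}],
        }],
    )
    return [inst["InstanceId"] for inst in response["Instances"]]

# e.g. surge past the on-site farm with 300 extra nodes for a deadline:
# node_ids = surge_render_nodes(300)
```

Spot capacity fits bursty render work well: frames render independently, so losing a reclaimed node just means re-queuing its frames.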

In a wide-ranging interview, Taylor and I talked about the technologies behind visual effects and how they’re pushing science-fiction sagas further into the final frontier. Here’s a transcript of the Q&A, edited for brevity and clarity:

GeekWire: Tell me about Mavericks VFX and how it is that you came to be involved with “The Expanse.”

Mavericks VFX’s Brendan Taylor: Well, we started operating in about 2014 — just myself and another guy. Then we had a little bit too much work for two people to handle, so we brought on another person. Then we had a little bit more work and brought on another person. Cut to 2019, and we’re at 50 people, having to move offices like every two years.

We’re really close to Toronto’s movie studio district. Our offices are actually three blocks away from “The Expanse” studio and office, which is great. … In the beginning, I had a bunch of friends working on the show. Then I saw it, and I was like, “Oh, this is something I can be a part of.” I reached out to them, and we did a couple of concepts just on spec. We were a lot smaller then, but they decided to take a chance on us. We got some good work for season two, and we continued to get more for three and four, and now we’re working on season five as well.

The work is split between maybe six or seven VFX companies, for various reasons. Some companies are better at something in particular, and there’s a lot of work to go around. No single company in Toronto could handle it all. So we split it up, and we’re one of the main vendors now.

Q: Are there types of visual effects you specialize in?

A: Environments. That’s what we like to do. We do a lot of what’s called set extension. There’s a practical foreground, and we just extend the background digitally with matte paintings — you know, the full computer-generated world.

In season four, we were responsible for the Martian shipyard — basically, it’s a circular domed area where they disassemble ships. They shot scenes in front of a blue screen, and we were responsible for everything else.
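At its simplest, that blue-screen work is a composite: keep the practical foreground, and wherever the camera sees blue, reveal the digital environment. Here’s a toy sketch of the idea with NumPy; the hard threshold is an arbitrary stand-in for the soft mattes, spill suppression and tracking a real pipeline uses.

```python
# Toy sketch of the idea behind a blue-screen composite: keep the practical
# foreground, reveal the digital environment wherever the plate is blue.
# The crude key below is an arbitrary illustration, not production logic.
import numpy as np

def composite(plate: np.ndarray, background: np.ndarray) -> np.ndarray:
    """plate, background: float RGB images in [0, 1] with identical shapes."""
    r, g, b = plate[..., 0], plate[..., 1], plate[..., 2]
    # Crude key: a pixel is "blue screen" when blue clearly dominates red/green.
    matte = (b > 0.5) & (b > r + 0.2) & (b > g + 0.2)
    out = plate.copy()
    out[matte] = background[matte]   # hard swap; real mattes are soft-edged
    return out

# Tiny demo: a solid-blue 4x4 plate composited over a gray background.
plate = np.zeros((4, 4, 3)); plate[..., 2] = 1.0
background = np.full((4, 4, 3), 0.5)
print(composite(plate, background)[0, 0])  # -> [0.5 0.5 0.5]
```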

The design part of it is really interesting. We stole, took the inspiration, paid homage to the Vehicle Assembly Building at NASA — and a lot of the decommissioned Russian Soyuz launch sites, stuff like that.

The look of the Mariner Valley shipyard on “The Expanse” pays tribute to NASA’s Vehicle Assembly Building as well as launch facilities at Russia’s Baikonur Cosmodrome. (Amazon Prime Video)

Q: What kinds of computer tools do you use, and how has the work changed?

A: We’re still using the same software that we’ve been using for 20 years. That’s Maya. But where things have been changing is in how we render. We’ve moved over to a program called V-Ray, and we’ve moved from CPU rendering to GPU, which is just a lot faster.

I don’t think there’s any way we could have gotten this done without moving to a GPU renderer. It needs to be very fast, because the environment is so big, and what we need to calculate is the light bouncing all over the place. If we were to use the older renderers, it would take something like 10 times as long as it took us, and we’d never be able to complete it on time.
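The “light bouncing all over the place” Taylor mentions is global illumination, which renderers estimate by averaging many random light samples per shading point. Each sample is independent of the others, which is exactly the kind of work a GPU parallelizes well. Here’s a stripped-down, hypothetical illustration of the sampling idea (not V-Ray’s actual algorithm):

```python
# Stripped-down illustration of why ray-based lighting is costly and why it
# parallelizes so well: shading one point means averaging many independent
# random samples. A toy hemisphere-sampling estimator, not V-Ray's algorithm.
import numpy as np

rng = np.random.default_rng(0)

def sample_hemisphere(normal: np.ndarray, n: int) -> np.ndarray:
    """Draw n random unit directions in the hemisphere around `normal`."""
    v = rng.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    v[v @ normal < 0] *= -1.0            # mirror stragglers into the hemisphere
    return v

def blocked(direction: np.ndarray) -> bool:
    """Stand-in for a real scene ray test: a wall blocks every ray heading east."""
    return direction[0] > 0.0

def sky_visibility(normal: np.ndarray, samples: int) -> float:
    """Monte Carlo estimate of how much open sky a surface point can see."""
    dirs = sample_hemisphere(normal, samples)
    hits = sum(blocked(d) for d in dirs)
    return 1.0 - hits / samples          # fraction of unblocked directions

# Every pixel repeats this with hundreds of samples, and every sample is
# independent of the others, which is why GPUs chew through it so quickly.
print(sky_visibility(np.array([0.0, 0.0, 1.0]), samples=1000))  # ~0.5
```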

That’s the big thing that’s changed. What hasn’t changed is the approach. It’s still the same: It’s research-based, compiling as much stuff as you can and conceptualizing it. Then you just hand different pieces off to different people who base everything they do on the concept work.

One of the things that we’ve started doing very recently is cloud rendering. We used to have render blades in our server room, taking up space and giving off heat. Eventually, we had to move to the cloud, because in our current space, we didn’t have enough room to add another rack of render blades.

My old university roommate ended up at Amazon, so we contacted him, and he put us in touch with the right people there to start some cloud rendering. Basically, we have a mirror of our own system up on the cloud, and it’s as if we have an additional hundred render blades in the studio.
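That “mirror” arrangement amounts to one render queue backed by two identical pools of machines, with overflow spilling into the cloud. Here’s a hypothetical sketch of the dispatch logic; the class names and capacities are invented for illustration, and production farms use dedicated queue managers rather than anything this simple.

```python
# Hypothetical sketch of the "mirror" idea: one job queue, two identical
# pools of render blades, overflow going to the cloud pool. Names and
# capacities are invented; real farms use dedicated queue managers.
from dataclasses import dataclass, field

@dataclass
class BladePool:
    name: str
    capacity: int                     # how many frames render at once
    jobs: list[str] = field(default_factory=list)

    def has_room(self) -> bool:
        return len(self.jobs) < self.capacity

def dispatch(frame: str, local: BladePool, cloud: BladePool) -> str:
    """Prefer the already-paid-for local blades; surge to the cloud mirror when full."""
    pool = local if local.has_room() else cloud
    pool.jobs.append(frame)
    return pool.name

local = BladePool("on-site", capacity=40)
cloud = BladePool("cloud-mirror", capacity=100)   # the "extra hundred blades"

for f in range(60):
    dispatch(f"shipyard_shot.{f:04d}.exr", local, cloud)

print(len(local.jobs), "frames local,", len(cloud.jobs), "frames in the cloud")
# -> 40 frames local, 20 frames in the cloud
```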

Obviously it needs to be accounted for financially, but it gives us the power to render massive scenes.

Q: Did the fact that you’re using Amazon Web Services lead to some interesting conversations when you were talking with Amazon Studios about working on “The Expanse” and “The Boys”?

A: No, no, no. We didn’t talk about that with them. It would make sense, but the line of communication is a little bit divided. We don’t really talk with Amazon Studios. We talk to an intermediary, like the visual effects supervisor.

Q: Is there an example of the work that you’ve done for “The Expanse” that fans might recognize?

A: Sure. One of our favorite scenes that we’ve ever done as a studio is the tool scene in season three. The crew members of the Rocinante are in the ship, and the gravity starts changing, and the tools go flying.

Q: Oh, right: You see all these dangerous tools whizzing past the crew’s faces.

A: Exactly. We did all that, and it was really fun because we created an exact digital replica of the set, and then we just ran a simulation that said, “OK, gravity’s now going this way.” So the tools would change position, and we had collision objects for all the railings in the spaceship. It’s like, “OK, this tool would hit this other tool, and then spin off the railing and pinwheel over this way.” We tweaked it a little bit with animation, but for the most part, it was all just simulated.
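What Taylor is describing is a rigid-body simulation: rebuild the set digitally, flip the gravity vector mid-shot, and let the solver carry the props into the collision geometry. Here’s a bare-bones sketch of that loop, with point masses and a single plane standing in for the full digital Rocinante:

```python
# Bare-bones version of the simulation Taylor describes: flip the gravity
# vector and integrate the free-floating tools until they hit collision
# geometry. Point masses and one "railing" plane stand in for the full set.
import numpy as np

dt = 1.0 / 24.0                              # one film frame per step
gravity = np.array([0.0, 0.0, 0.0])          # starts in free fall

pos = np.array([[0.0, 0.0, 2.0],             # three loose tools
                [1.0, 0.0, 2.5],
                [2.0, 0.0, 1.5]])
vel = np.zeros_like(pos)

RAILING_Z = 0.0                              # collision plane at z = 0
BOUNCE = 0.4                                 # fraction of speed kept on impact

for frame in range(48):                      # two seconds of footage
    if frame == 12:                          # "OK, gravity's now going this way"
        gravity = np.array([0.0, 0.0, -9.8])

    vel += gravity * dt                      # integrate velocity, then position
    pos += vel * dt

    hit = pos[:, 2] < RAILING_Z              # tools that reached the railing
    pos[hit, 2] = RAILING_Z
    vel[hit, 2] *= -BOUNCE                   # rebound off the collision object

print(pos.round(2))                          # final tool positions
```

The animators’ tweaks Taylor mentions would come on top of output like this, nudging individual tools without re-simulating the whole shot.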

Q: Beyond the cloud, what are you seeing on the technological frontier for visual effects?

A: I think the big thing is machine learning. We didn’t use it in “The Expanse” … But Thanos from the “Avengers” movies is a great example. They used machine learning to drive the animation. They would film a ton of Josh Brolin talking, and then they would use machine learning to transfer that to a digital model. They took all the different permutations and combinations of what the muscles were doing, the twitches and all the small stuff, and applied it to the digital model. The more they filmed him, the more realistic it looked.

That’s doing the kind of stuff an animator could never do. I mean, maybe they could, but it’s a lot easier. What machine learning allows them to do is analyze and apply many different variables, many different sources of information. And that’s going to be huge in the future.
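One simple way to picture that workflow is as a learned mapping from measurements tracked on the actor’s face to the digital model’s control values, a mapping that improves as more footage is captured. Here’s a toy stand-in using plain least squares on fabricated data; the production systems are far more sophisticated, but the principle is the same:

```python
# Toy stand-in for ML-driven facial animation: learn a mapping from
# measurements tracked on the actor's face to the digital model's control
# values (blendshape weights). Plain least-squares on fake data; real
# systems are far more sophisticated, but more footage = better predictions.
import numpy as np

rng = np.random.default_rng(1)

n_frames, n_features, n_controls = 500, 20, 8
# Pretend these are per-frame landmark distances tracked on the actor's face.
face_features = rng.normal(size=(n_frames, n_features))
# Pretend an artist hand-keyed matching rig controls for those frames.
true_map = rng.normal(size=(n_features, n_controls))
rig_controls = face_features @ true_map + 0.01 * rng.normal(size=(n_frames, n_controls))

# "Training": solve for the mapping that best explains the paired data.
learned_map, *_ = np.linalg.lstsq(face_features, rig_controls, rcond=None)

# Drive the digital model from a new frame of tracked footage.
new_frame = rng.normal(size=(1, n_features))
print((new_frame @ learned_map).round(3))    # predicted rig control values
```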

Part of me is very excited. Part of me is a little bit nervous about what the future holds for us. Maybe we would be a much smaller studio. Who knows?
