I’m standing on the rear cargo door of a S.H.I.E.L.D. Quinjet, just as Captain America did before his famous leap without a parachute. I am flanked by two assailants, each holding a Terrigen crystal. The woman smashes hers on the deck of the Quinjet and a blue haze rises. As it does, a black, lava-like shell forms around my body, a cocoon of sorts. If that cocoon is hollow, I am dead. If I have the right gene, I emerge as an Inhuman.
Those are the kinds of thoughts that go through your mind on the set of ABC’s superhero series, Marvel’s Agents of S.H.I.E.L.D. In addition to the Quinjet, I also had the opportunity to stand on the main body of Zephyr One, the plane that replaced Agent Coulson’s famed “Bus” — and I got to hang out in the S.H.I.E.L.D. break room.
And those two assailants? One was my daughter, Alyssa, and the other was Visual Effects Supervisor, Mark Kolpack, who invented the effect known as “volcanic necrosis,” which consumes humans exposed to the Kree gas.
What brought all of us together was a sensor that fits on the back of an Apple iPad. The Structure Sensor (structure.io) from Occipital, Inc., is an add-on camera that clips to an iPad and transforms it into a three-dimensional scanner.
As I walk the sound stages, I imagine I’m a lowly extra, billed at the bottom of the credits for one week only. Unlike Clark Gregg or Ming-Na Wen, I don’t rate a full laser body scan to capture all of my physical subtleties for Visual Effects and prosthetics. I’m in one show, I die, and that’s it.
In the old days, I might not get an effect at all, or I would get something subpar: a shot from a distance, a cutaway that returns to me in an ill-fitting rubber coating, or a blur and a shimmer standing in for the transition between states. But Marvel’s Agents of S.H.I.E.L.D. seeks theatrical-film-level Visual Effects every week, which means high quality, delivered quickly and on a budget.
Mark Kolpack, who has created Visual Effects for the likes of Buffy the Vampire Slayer and Heroes, describes the process of using the Structure Sensor on set:
“There was a scene where we had a couple, one stunt guy and one actress, that came in for one scene at a dinner party and restaurant. I scanned her face and his face and it worked out really nicely. We were able to use the scan geo to map the volcanic necrosis animation perfectly onto those scans very precisely. And I just did it recently on Episode 304 (‘Devils You Know’). We had a requirement where essentially we had to apply something to Chad Lindberg’s face — sort of like a rash. Once the makeup was applied, I scanned him and I also did some single camera photogrammetry with my Canon 5D Mark II. I shot three series of images around the actor, high, level and low where he’s sitting on a chair. Those became textures. We were able to match them, moving with the scanned head.”
In our conversation, I ask how the new tool fits with his work, and what reactions he gets on the set:
“It’s extremely beneficial. I enjoy it, and it’s always a great conversation piece because people see this apparatus go onto my iPad, and they’re like, ‘What the hell is that?’ And I show them. I scan them and they’re like, ‘Oh my God.’ It’s blowing the minds of people on the crew and directors. It’s really fun. But it serves a real-world practical purpose for me because it gives us an exact representation of an actor or an actor’s face. I also used it on the Lash hole, where Lash blasts holes through people’s chests. The effect in Episode 301 at the hospital. At the morgue, you see the chests are blown out. I needed to apply one digitally, so I took out the scanner and I scanned it and photographed it and sent it off to the Visual Effects guys and they used that. It’s a cool tool.”
Although costs are variable and production numbers not available to the public, Kolpack says the Structure Sensor approach is much less expensive than a traditional laser scan. When he needs a quick scan of a prop or geo of an actor, it is a handy tool. The kind of effects seen on S.H.I.E.L.D., Kolpack reflects, were “pretty much relegated to the feature world not too long ago, say like seven years ago, ten years ago.”
The fact that Kolpack can scan on-set, with anybody, also makes it more convenient, expanding the use of high-quality visual effects to scenes where they might not have made economic sense in the past. “When laser scanners first came out in the late 90s,” Kolpack says, “they were these giant contraptions that people stood in and they stayed still and it went all the way around. And then it became the hand-held scanners, and that was revolutionary, and that’s what a lot of people use today to scan both props and people. And then, of course, this thing comes along and this takes us to a whole other level of accessibility. If you need something in a pinch, wham, you have it.”
So what is the Structure Sensor? It is a depth camera that augments the iPad’s built-in optics, giving the device a second eye for capturing depth. Like many technologies, the sensor started out as a Kickstarter project. It attaches to the iOS device via a bracket made specifically for each iPad model, and it must be charged prior to use. A Lightning cable between the iPad and the Structure Sensor provides real-time communications.
In use, an infrared laser projector casts a structured pattern onto the objects in the field of view. The sensor’s infrared camera reads the distortions of that pattern to discern shape and distance. The Structure Sensor employs the iPad’s own camera, precisely aligned via the bracket, to capture color.
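The result of that pairing is a per-pixel depth map registered to a color frame. As a rough illustration (this is a generic sketch of pinhole-camera back-projection in Python, not Occipital’s actual SDK, and the intrinsics are made-up values), depth plus color can be combined into a colored point cloud like so:

```python
# Hypothetical sketch: turn a depth frame plus an aligned color frame
# into a colored point cloud. Intrinsics (fx, fy, cx, cy) are illustrative.

def depth_to_points(depth, color, fx, fy, cx, cy):
    """Back-project a depth map (meters) and aligned RGB frame into
    a list of (x, y, z, r, g, b) points using the pinhole camera model."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:              # zero means "no depth reading" here
                continue
            x = (u - cx) * z / fx   # standard pinhole back-projection
            y = (v - cy) * z / fy
            r, g, b = color[v][u]
            points.append((x, y, z, r, g, b))
    return points

# Tiny 2x2 synthetic frame; one pixel has no depth reading.
depth = [[0.5, 0.0],
         [0.5, 1.0]]
color = [[(255, 0, 0), (0, 0, 0)],
         [(0, 255, 0), (0, 0, 255)]]
cloud = depth_to_points(depth, color, fx=288.0, fy=288.0, cx=1.0, cy=1.0)
print(len(cloud))  # 3 valid points
```

From a cloud like this, scanning software fuses many frames into the single watertight mesh an effects vendor actually receives.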
Kolpack and his effects houses end up with an object file, or OBJ, that can be used for complicated post-production tasks: matched to footage, applied frame by frame, or used for object tracking.
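Part of why OBJ travels so easily between a set and multiple vendors is that it is plain text. A minimal, hypothetical writer shows the essentials of the format: `v` lines list vertex coordinates, and `f` lines reference them by 1-based index:

```python
# Minimal OBJ writer: vertices and triangular faces only.
# Real scan output also carries normals (vn) and texture coords (vt).

def write_obj(vertices, faces):
    """Serialize (x, y, z) vertices and 0-based triangle index tuples
    into Wavefront OBJ text."""
    lines = ["# scanned geometry (illustrative)"]
    for x, y, z in vertices:
        lines.append(f"v {x} {y} {z}")
    for a, b, c in faces:
        lines.append(f"f {a + 1} {b + 1} {c + 1}")  # OBJ indices are 1-based
    return "\n".join(lines) + "\n"

# A single triangle:
obj_text = write_obj(
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    [(0, 1, 2)],
)
print(obj_text)
```

Because any 3D package can read this format, the same scan can move from the iPad to a tracking artist to an animation vendor without conversion.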
The Structure Sensor also brings Kolpack some calming control that is all-too-rare in weekly television productions.
“I think it’s completely crazy. I can take my time, I don’t feel rushed with a vendor. If I am working with an actor, I can say, ‘Okay, now give me an expression where you’re shocked; now give me one where you’re angry; now where you’re really in pain.’ And I just scan him or her and it just works.”
As I stand in Director Phil Coulson’s office, looking at the detailed model of the “Bus,” the axe on the wall that severed his hand when he was threatened by Terrigenesis, and the currently blank giant projection screen used to inform the S.H.I.E.L.D. team, it becomes clear that science fiction television has entered a phase that demands a new level of realism. For television viewers, the particular technology being used might not be all that important. But when they look at the screen, camera tricks and animation shortcuts no longer suffice. And computer-generated effects need to, well, not look like computer-generated effects. At CES this year, many were abuzz about virtual and augmented reality. The Structure Sensor is really a tool of augmented reality, permitting the blending of humans and real objects with effects, in ways that were not possible before, and at a cost that was unimaginable just a few years ago.
Kolpack sees the Structure Sensor as “a pretty revolutionary kind of tool to be able to use, and not just in a feature environment. It helps give viewers Marvel feature Visual Effects for television. We have to uphold the brand and make them the best that we can with the time and budget that we have. And I think we’ve raised the bar pretty damn high. And a lot of shows have followed suit, so that’s kind of nice.”