Sci-Fi Magic
Issue: Volume 39, Issue 1 (Jan/Feb 2016)

An enviable combination of critical acclaim and box-office returns has catapulted Star Wars: The Force Awakens, the seventh episode in the Lucasfilm series, into the record books. In less than a month, the Disney/Lucasfilm sequel rocketed to number one at the box office, becoming the highest-grossing domestic release in history. After four weeks, it had earned $1.8 billion globally, with the China market just coming on board. 

A scan of reviews on Rotten Tomatoes, where the science-fiction action-adventure achieved a 93 percent positive rating, shows little if any mention of the visual effects. No complaints. No praise. One critic actually extolled the filmmakers’ success in avoiding “the deficiencies of those latter films: the orgiastic overuse of CGI.” 

Another critic raved about BB-8: “The simple design is ingenious and how wonderful that he/it is not computer-generated but of our world.” 

But, Star Wars: The Force Awakens has CG effects in 2,100 of its 2,500 shots, a proportion closely matching that of each of the three prequels. And, BB-8 is digital perhaps a third of the time in the film. 

No wonder the feature has received Oscar and BAFTA nominations for best visual effects, and seven VES nominations. In addition, it received the AFI award for movie of the year. 

In the run-up to the film’s release, numerous articles talked about the practical effects. Few, if any, mentioned CG. Finally, on January 15, the day after Star Wars: The Force Awakens received a visual effects Oscar nomination, Disney released a video showing before and after visual effects shots. Greenscreens? Yep. From the first scene of Rey scavenging. The “Millennium Falcon”? Digital. The TIE Fighter sinking into the sand near the beginning? The sand in the background is real, but the ship and sinking sand are digital. Maz’s castle? Digital. That lightsaber battle? The entire forest surrounding the actors is digital. The explosions? A combination of real pyro and CG simulations. And the final battle, surely to no one’s surprise, has all-digital shots.

How did the artists at Industrial Light & Magic achieve the sleight of hand that convinced viewers and critics what they were seeing was real – or at least as real as the first three films? 

The answer starts with Roger Guyett, visual effects supervisor and second unit director. This is his fourth film with Director JJ Abrams, fifth if you count a brief bit of help with Super 8. 

“[JJ and I] were in the process of finishing Star Trek when he said yes to [Producer] Kathy Kennedy,” Guyett says. “Thankfully, he asked if I was interested. Once Star Trek came out, we were on Star Wars full time. I went down to LA once or twice a week. We talked a lot about the tone and style we were trying to capture.” 

Guyett had been a visual effects supervisor for Episode III, and was keenly aware that for the seventh episode, people wanted something with more of the flavor of the first films, Episodes IV, V, and VI.

“I’m incredibly proud of the work we did on III, and have fond memories of working with George [Lucas], who was inspirational, with an incredible imagination,” Guyett says. “But on this one, we wanted to recapture more of the tactile, visceral quality of the early films. We wanted to find the right balance for the fans between historically correct with the right levels of respect and diligence, and at the same time bring something new and exciting. A lot was done practically. But, a lot is digital. The trick wasn’t worrying about approaches. It was the end result. To convince the audience that what they were seeing was really unfolding.”

Also joining the project early from the VFX side were the previs team at Halon led by partner Bradley Alexander, and the modeling team at ILM led by David Fogler, asset build supervisor.

Alexander adds, “We started around October 2013 and jumped right into talking with ILM to figure out file formats and scales that would work, so what we handed over would be nice and tidy. They gave us assets, and we also built a lot.”

Shipbuilders

Fogler and his team began working on the assets in Autodesk’s Maya soon after Fogler started on the project, also in October 2013.

“I was 10 years old when I saw the first Star Wars [Episode IV], and I was blown away,” Fogler says. “What struck me was the aesthetic. The dirty, worn, rusty, real world. I could relate to it. It has stuck with me to this day. It fed all my decision making for the [The Force Awakens] world.” 

One of Fogler’s first jobs at ILM had been building practical models for Star Wars Episode I, and he continued that work for Episodes II and III. He became a digital modeler for Pirates of the Caribbean: Dead Man’s Chest, and went on to receive five VES nominations and two awards (Transformers and Transformers: Dark of the Moon).

“Finding the balance between old and new was hard,” Fogler says of the modelers’ work on The Force Awakens. “In building our new versions of these ships, we had to figure out the collective memory of those ships and then decide where to take them. We’d have conversations about what these things looked like. Was the Star Destroyer white or gray? Well, it was white in the film but gray on stage. We’d look at the old miniatures. We’d look at them on-screen, and then plug that information into our new matrix.”

For the “Millennium Falcon,” Fogler scanned a five-foot-long miniature built for Episode IV and created a mechanically logical digital version to match. “If you squint at it, it looks like the five-foot miniature,” Fogler says. “But, you can look at full-scale details a foot away.”

“In addition,” Fogler says, “we designed and built the Star Destroyer, which is huge. It’s bigger than the Super Star Destroyer in Empire, more armored feeling, sleeker. We tried to strike a balance between things that looked photoreal but did not feel foreign to the miniature work done before. Our Star Destroyer could have been built as a miniature.”

As for practical ships, the production team built full-scale TIE Fighter and X-Wing ships, which were used on location. The ILM modelers scanned and photographed those models to create digital versions.

“Ours needed to function mechanically and work in multiple lighting environments,” Fogler says. “The camera gets very close to our builds.” 

Previs

Meanwhile, Halon’s artists previs’d that short, early sequence in which Rey digs through a Star Destroyer and jumps down a rope, and then began a more complicated sequence with the TIE Fighters and the “Millennium Falcon” – a desert chase sequence through the graveyard of starships.

Anytime a spaceship is in the air, it’s computer-generated, as is this X-Wing.
Here, TIE Fighters are shown against the sun in an all-CG shot.

“Roger Guyett pretty much held our hands as we figured out what JJ [Abrams] wanted,” Alexander says. “Once we got the script, we started building the foundation shot by shot. Then my whole team met with JJ, which was cool. We ended up with a good starting point for a sequence before everyone left LA for London. I previs’d the shot when the ‘Millennium Falcon’ flies up into the sky, twists, and does an inverted U-turn. When I saw the final shot with the John Williams score, I got tears in my eyes.”

Halon artists in LA also previs’d the TIE Fighter escape from the hangar and the Stormtroopers’ attack on the village. Then, Alexander followed Abrams and Guyett to London. 

“They filmed pretty much everything in the village attack in London and the hangar sequence,” Alexander says. “They shot some of the desert chase in Abu Dhabi. The interior of the ‘Falcon’ was in London, but the exterior was mostly CG.”

The Halon team continued working on previs through production and well into postvis (see “Postvis,” page 14).

“We use the previs and storyboards to flesh out ideas,” Guyett says. “But on the shoot, we invent new ideas.” 

Those ideas sometimes resulted in black cards within edited footage.

“You’d look at an edit of a sequence and there would be an actor on bluescreen reacting to something,” says Halon Senior Previsualization Supervisor AJ Briones, “then there would be a few seconds of black cards with ‘action’ written on them.” 

At ILM, Animation Supervisor Paul Kavanagh helped design shots that would replace those black cards. With the exception of the CG characters Maz and Supreme Leader Snoke, a CG creature, and digital doubles, the primary focus for the animation department was shot design and ship animation for the all-CG sequences – the desert chase with the “Millennium Falcon” and TIE Fighters, the TIE Fighter escape from the hangar, the shots with ships traveling between locations, and an extensive battle between the X-Wings and TIE Fighters in the third act.

Animation

“Nowadays we see CG ships and creatures doing impossible things for the sake of spectacle,” Kavanagh says. “It’s tempting to go over the top with CG. But we wanted the feel of the original.”

To get that feeling and honor the director’s guidelines, the animation team relied on Guyett, Halon, the original films, and advice from artists at ILM who were on the original crews.

“Roger [Guyett] had a real grasp of what JJ [Abrams] wanted as far as shot design,” Kavanagh says. “We had a lot of previs from Halon. And, we looked at the old movies. We went back to the original trilogy and looked at the motion-control work as a guide to what we should do. They had restrictions on how fast the miniatures could travel. The longest motion-control track was 60 feet, and the camera could go back only so far in the building. We took some liberties, but the ‘Millennium Falcon’ doesn’t fly 2,000 miles per hour.”

During postproduction, Dennis Muren would often look at the work done by ILM’s artists and animators, and give advice. Muren had worked on the first Star Wars; received Special Achievement Awards from the Academy for the second and third films, Star Wars: Episode V – The Empire Strikes Back and Star Wars: Episode VI – Return of the Jedi; and has since earned a Technical Achievement Award and six more Academy Awards for best visual effects. 

“He would come every week,” Kavanagh says. “He might say, ‘Yeah, that’s how I would have shot it.’ Or, he might have a different idea. He gave us a mini presentation about how they did the ships and camera moves for the original trilogy. The lenses they used. The depth of field. It was fantastic.”

Kavanagh’s team included 25 animators in ILM’s San Francisco studio, six in Vancouver, 12 in London, and 10 in Singapore. He also worked with animators at Base FX and Hybride. Singapore animators handled the TIE escape sequence, some face replacements for digital doubles, and a cute moment between BB-8, R2-D2, and C-3PO. ILM London animators worked on the two CG characters, Maz and Snoke.

“The San Francisco team did everything else,” Kavanagh says. “The ‘Falcon’ chase. The spaceship sequences. Animators in Vancouver basically worked as an extension of San Francisco.”

CG Characters

To better achieve the nostalgic look they were after, the filmmakers used puppets and prosthetics for most of the alien creatures, many of whom appear in Maz’s bar in her castle.

Maz (Lupita Nyong’o), however, is CG, as is Supreme Leader Snoke (Andy Serkis). The animators had motion-capture data for both. 

“The actors gave great performances,” Kavanagh says. “They were so good, we wanted to make their performances come through the characters. We’ve done a lot of that at ILM; we have a system.”

The crew captured Serkis at his Imaginarium studio in London and on set. Facial capture was most important for this character, who rarely leaves his elevated chair. Maz, however, was a tiny, wizened character with facial features that differ greatly from Nyong’o’s. For her, the ILM crew relied more on the Medusa system from Disney Research to capture a library of expressions that modelers converted into shapes on the 3D model. Then, rather than apply motion capture directly onto the model, the animators matched high-resolution video of Nyong’o’s performance, much as animators had done to create the character Davy Jones in Pirates of the Caribbean.
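The approach described above, a library of captured expressions converted into shapes that animators mix on the 3D model, is at heart a blendshape system. A minimal numpy sketch of that general idea (illustrative only; the function, data, and weights are hypothetical, not ILM’s pipeline):

```python
import numpy as np

def blend_shapes(neutral, shapes, weights):
    """Combine a neutral mesh with weighted expression deltas.

    neutral: (V, 3) array of vertex positions
    shapes:  (S, V, 3) array of captured expression targets
    weights: (S,) per-expression weights, typically in [0, 1]
    """
    deltas = shapes - neutral                    # offset of each expression from neutral
    return neutral + np.tensordot(weights, deltas, axes=1)

# Toy example: a one-vertex "mesh" with two captured expressions.
neutral = np.array([[0.0, 0.0, 0.0]])
shapes = np.array([[[1.0, 0.0, 0.0]],            # e.g., a "smile" target
                   [[0.0, 1.0, 0.0]]])           # e.g., a "brow raise" target
posed = blend_shapes(neutral, shapes, np.array([0.5, 0.25]))
# posed -> [[0.5, 0.25, 0.0]]
```

In production the weights per frame would come from matching the actor’s filmed performance, as the article describes, rather than being set by hand.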

For the digital double of BB-8, animators matched that character’s performance as well – that is, the performance of the practical droid puppeteered on location. However, Kavanagh was instrumental in giving the lovable droid its first moves. He put a simple rig into the CAD model used to build the practical robot, and did quick animation tests for Abrams. 

“I had his head sort of lead the big ball,” Kavanagh says. “His head would go down first, and then he’d rock back to stop himself. I showed that to JJ, and he liked it.” The rest of BB-8’s personality developed as the puppeteers had the droid interact with the actors, but in the film, he is sometimes digital. 


(top to bottom) Modelers referenced “Millennium Falcon” miniatures from the original Star Wars to create the current CG model. TIE Fighters chase the “Falcon” over a digitally reconstructed desert. The castle, X-Wings, sparks, and destruction are all digital.

“There’s a shot where he’s spinning inside the ‘Falcon’ and falling around until he supports himself with grappling hooks,” Kavanagh says. “We put him in the back of ships. Also, in some areas, it would have been harder to paint out low material on the ground than to use a CG version. The guys in modeling and look development did such an amazing job on BB-8; once he’s rendered, you can’t tell he’s CG.”

Animators also performed the Rathtar, a vicious creature with eight arms, suckers, and a giant mouth filled with gnashing teeth. 

“We couldn’t use too much simulation and still have it live in a world of foam latex creatures,” Fogler says. “Animators provided the balance.”

Environments

In addition to building and performing spaceships and a few CG characters, the artists at ILM also created massive environments – set extensions and all-CG environments.

Postvis

Halon artists created previs and postvis for Star Wars: The Force Awakens, working with Director JJ Abrams and Visual Effects Supervisor Roger Guyett. Sometimes the lines between previs and postvis blurred.

“You think of postvis as getting footage shot on the day in which something needs to happen – a set extension or CG element – so the editors can make a cogent story,” says Senior Previsualization Supervisor AJ Briones. “Or, you might combine plates. But, there are also CG shots that haven’t gone through previs, so a lot of post is still previs.”

Briones gives an example from the third act.

“We added set extensions and temporary lightsabers to the plates for the lightsaber battle,” Briones says. “At the same time, we were doing postvis on the space battle between the X-Wings and TIE Fighters to track the camera, add snow, take out bluescreens, and extend the shots.”

And then one day, in between shots in the edited footage, Briones saw a black card. 

“It said something like, ‘The planet explodes. As it explodes, it turns into a sun,’ ” Briones says. “That was a lot of fun, challenging. You relish the opportunity to do those shots. When I was a kid playing with Star Wars toys, I never imagined I’d be flying those things in a real film. It was pretty surreal.”


Supreme Leader Snoke is CG, performed by Actor Andy Serkis and animated at ILM London.

“There were a lot of fantastic locations,” says Visual Effects Supervisor Pat Tubach. “When we created the ‘Falcon’ chase sequence, because the ship is flying at high speed, we had to re-create a lot of that environment. We had a ton of plates shot in Abu Dhabi, and Roger [Guyett] shot a lot of aerial footage. We used some in shots, some as reference, and some as projections for our digital environments.”

Tubach worked with Guyett, as he had done on previous films, to manage the crew of approximately 200 artists in ILM’s San Francisco studio, 75 in Singapore, 75 in London, and 75 in Vancouver, as well as the work done by 80 artists at Abrams’ Kelvin Optical, and at Base, Hybride, and Virtuos. 

The photography and scans of the desert in Abu Dhabi, where the village battle, the desert chase, and other sequences on Jakku take place, gave the artists at ILM what Environments Supervisor Susumu Yukuhiro calls a “recipe for the environment.” 

“We knew how the sky looks, how colors reflect on the dunes,” Yukuhiro says. “You think you know what a dune looks like, but the sun on particles of sand creates weird colors.”

To make the desert dunes, the artists sculpted 3D shapes, and then projected photography on top. Because the geometry was so complex, Yukuhiro began experimenting with Isotropix’s Clarisse iFX. 

“That worked fantastic,” Yukuhiro says. “Throughout the show, the geometry is big and heavy, but we didn’t have to optimize or make proxy versions of the environments. Clarisse doesn’t do modeling, but we could bring in 3D geometry, interactively light it, interactively place things in the environment, and then render it. It was a really great way of working.”

Yukuhiro counts the “Millennium Falcon” chase through the desert graveyard as the team’s most challenging work.

“It was not a technical challenge,” Yukuhiro says. “It was an aesthetic challenge. We weren’t making another sci-fi movie; we were making Star Wars. Not the prequels, the original Star Wars. Star Wars has iconic shots that are about simplicity, not complexity. For example, in one of the first sequences, there’s a shot of the speeder bike. The camera pans and we see the Star Destroyer in the desert. It’s a simple flat shot with the iconic silhouette of the Star Destroyer. We did that shot four times. We really pushed that look.”

A second major location was Maz’s castle. The filmmakers shot footage for that sequence in the back lot at Pinewood Studios outside London. 

“There was no actual castle,” Tubach says, “only a destroyed version. So we reverse engineered how it was oriented and worked out what it looked like. Then, using plates from the Lake District in England, we created a picturesque English countryside location with a beautiful, ancient castle. Of course, we then had to figure out how to bring it down in an interesting way.” (See “Believable Destruction” on page 16.) 

Trees in the background were a combination of photo elements and CG trees. “Anytime a tree moves, we created it in Speedtree so we could animate it,” Yukuhiro says. “Also, the grass.”

The largest environment was the Starkiller base planet, which the ILM artists created by extending plates shot in Iceland and creating fully CG worlds. 

“Instead of having a base on top of a planet, the First Order went to that mind-blowing level where they dug in and built the base inside the planet,” Tubach says.

The environments team again relied on Clarisse for shots of the “Falcon” landing on the snowy planet, rendering the ships with a Katana-to-RenderMan pipeline and the environments in Clarisse.

“We had a lot of full-CG shots, but because we had photographic reference and because JJ [Abrams] wanted a practical, nostalgic look and feel, we also did a lot of 2.5D matte paintings with photo projections,” Yukuhiro says. “We worked closely with effects. They could take our scene, add effects, and place lights interactively. It was a really good way of working.”

Pulling It All Together

In the end, as always, the compositing teams created the final images, and for this film, Compositing Supervisor Jay Cooper led a team of ILM artists who worked on 1,200 shots. Abrams’ Kelvin Optical composited the rest. Thirty artists in San Francisco handled 60 percent of ILM’s shots, with 15 artists in London and 18 in Singapore taking the other 40 percent.

“There was a concerted effort to make this film feel like the originals, which were optical, with hand-drawn elements, rotoscoping, and glass matte paintings,” Cooper says, “and, of course, tons of models. We wanted to bridge the past and make it feel like those films, but not antiquated or anachronistic. No one was in any rush to go back to matte lines.”


(top) Artists at ILM extended footage shot on set with digital environments.
(bottom) Everything in this shot except the actors is CG.

To achieve that look, Cooper and his team worked in The Foundry’s Nuke with deep compositing. Hundreds of layers. With each shot, they tried to tease out elements they thought were important in the past. 

“Our goal was to make the film look organic, whether through lens flares, dirt, or grime,” Cooper says. “When people think of CG, they think of clean, antiseptic, razor-sharp edges.”

For rendering, various artists on the teams used Clarisse, Pixar’s RenderMan, Chaos Group’s V-Ray, and Solid Angle’s Arnold. 

“It’s whatever knives the chefs bring to the kitchen,” Cooper says. “It’s what artists feel comfortable with. There are things, obviously, like real-world rendering and environment lighting, that are huge helpers in the way you get the right falloff and the right reaction to materials. But at the end of the day, it’s an artistic process. We noodle everything – does this feel too sharp? Does it look too metallic? Too plastic? We have a matte painter on the show, Paul Houston, who has been doing this for 40 years. We reviewed shots with Dennis Muren and [longtime, award-winning VFX Supervisor] Scott Farrar. Getting feedback from people like them is worth way more than any tool.”

Star Wars’ nomination for best visual effects affirms the team’s success in achieving that goal. 


“JJ [Abrams] wanted this to be a film that looked like it could have taken place during the original trilogies, but he didn’t want to ignore modern filmmaking techniques,” says Tubach. “The idea was to marry current technology and techniques into the old-school techniques, to maintain the legacy of the project.”

Given the studio’s history and the high level of its artists’ skills, ILM was probably the only one that could.  

Barbara Robertson (BarbaraRR@comcast.net) is an award-winning writer and a contributing editor for CGW. 

Believable Destruction

Effects TD Supervisor Daniel Pearson led a team of 24 artists in ILM’s San Francisco studio, four in Singapore, 14 in Vancouver, and four in London who created fire, water, smoke, dust, explosions, and tons of destruction – ships blown apart, buildings collapsing, planets exploding – for Star Wars: The Force Awakens. 

“On this project, fire, water, and smoke were grid-based simulations using proprietary software based on the FLIP solver. We used our proprietary tool called Plume, our Zeno particle system, and for certain effects, [Side Effects’] Houdini. Because we didn’t have many water effects, we simulated them inside Zeno and rendered inside Houdini,” says Pearson.

In addition, the creature development team handles some simulations. 

“We do cloth-, flesh-, and spring-based simulations on top of our rigid-body simulations,” Pearson says. “That gives them bending and tearing. So their work overlaps what we do. We have a mix of both.”

For destruction, however, the effects team wanted a new technique that would be faster, better, and easier for the artists. 

“The big destruction shots are usually the last to finish because they’re so complex, so that gave us the lead time to build a new system, refine the pipeline, and solve issues,” Pearson says. “Rick Hankins, effects TD and R&D engineer, came on early and probably spent a year in development. He wrote a position-based dynamics system (PBDyn).”

Pearson explains how the new system works: “It takes a set of positions and, based on contact with neighbors, solves constraints. Constraints might include friction, attraction, and repulsion. We also have a shape-matching constraint so groups of particles react like rigid bodies. We can set different strengths for the constraints. The positions are calculated first. It derives velocities from where the points move.”

Before, the simulation system the team used treated velocity within one continuous volume. When applied in one area, the rest of the grid would compensate to reach equilibrium. With PBDyn, the simulation is more akin to spheres in space. As they touch, they exert forces on each other, and the constraints dictate how they act when that happens. As a result, one simulation can handle multiple levels of detail and material types. 

 “We start with every particle having the same set of constraints,” Pearson says. “Friction determines how far one point can rub against another before it slows down. Repulsion makes sure they all stack on one another. Shape matching on a subset of particles moves bigger groups as one unit but with single constraints on individual particles. That way we can have dry dirt on top of a rigid body, and snow on top of that – a set of particles that stick together and operate in the same simulation at the same time.”
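The loop Pearson describes, positions projected against contact constraints first, velocities derived afterward from how far each point actually moved, is the core of position-based dynamics. A toy Python sketch under those assumptions (gravity, pairwise repulsion, and a ground-plane constraint only; the names and numbers are illustrative, not PBDyn’s):

```python
import numpy as np

def pbd_step(x, v, dt, radius=0.5, iters=4):
    """One position-based dynamics step: predict positions, iteratively
    enforce contact constraints, then derive velocities from the moves."""
    gravity = np.array([0.0, -9.8])
    p = x + dt * v + dt * dt * gravity          # predicted positions

    for _ in range(iters):
        # Repulsion constraint: push overlapping particle pairs apart.
        for i in range(len(p)):
            for j in range(i + 1, len(p)):
                d = p[j] - p[i]
                dist = np.linalg.norm(d)
                overlap = 2 * radius - dist
                if dist > 1e-9 and overlap > 0:
                    corr = 0.5 * overlap * d / dist
                    p[i] -= corr
                    p[j] += corr
        # Ground contact: keep particle centers above the floor.
        p[:, 1] = np.maximum(p[:, 1], radius)

    v_new = (p - x) / dt                        # velocities follow positions
    return p, v_new

# Two particles dropped from different heights settle onto the ground.
x = np.array([[0.0, 3.0], [0.05, 1.0]])
v = np.zeros_like(x)
for _ in range(200):
    x, v = pbd_step(x, v, dt=1 / 60)
```

Friction, attraction, and shape matching, as described in the article, would slot in as additional constraint projections inside the same iteration loop.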

Hankins developed a pipeline around the system so that it can procedurally generate the rocks, snow, and so forth based on models from the modeling group or the generalist [digital matte-painting and environments] group. 

A simulation might start with a big chunk of rock populated with particles. Voronoi partitioning breaks the rock into big and small chunks distributed in organic ways. 

“Then, we manipulate the data to erode certain sections based on edges and distance to the ground surface,” Pearson says. “That creates dirt. We defined other areas based on distance from the surface as snow. It’s meaningless to the simulation – it’s just a different setting on the attraction constraint.”
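At the “partition and tag” level, the setup Pearson outlines can be sketched as nearest-seed Voronoi partitioning of a particle-filled volume, followed by material tags based on distance from the surface. A hypothetical numpy illustration (seed counts, thresholds, and labels are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(7)

# A block of "rock": particles filling a unit cube.
points = rng.uniform(0.0, 1.0, size=(2000, 3))

# Voronoi partitioning: each particle joins the cell of its nearest seed,
# which breaks the block into irregular, organic-looking chunks.
seeds = rng.uniform(0.0, 1.0, size=(12, 3))
dists = np.linalg.norm(points[:, None, :] - seeds[None, :, :], axis=2)
chunk_id = np.argmin(dists, axis=1)          # chunk label per particle

# Tag material by depth below the top surface (y = 1), echoing the
# dirt/snow layering: the tag only changes which constraint settings
# (e.g., attraction strength) the simulation applies.
depth = 1.0 - points[:, 1]
material = np.where(depth < 0.1, "snow",
            np.where(depth < 0.3, "dirt", "rock"))
```

As the article notes, the labels are meaningless to the solver itself; they simply select different constraint parameters within one simulation.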

In practice, an animator might block in the destruction over time – the ground breaking apart, for example. The TDs could use that as a visual reference or, sometimes, as a raw footprint.

“In one sequence, the ground is collapsing,” Pearson says. “Animators roughed that in to show which sections should collapse at a particular time. We used those models to drive the sim at first. Then, we decided to use them as a visual guide instead, to make updates easier.”

For rendering these large simulations, the crew used Clarisse. 

“Clarisse can handle massive amounts of geometry,” Pearson says. “It was great.”