Final Fantasy: Pushing CGI to the Brink of Reality

The Making of Final Fantasy: The Spirits Within






Final Fantasy Takes Computer Animation to New Levels of Reality -- So How’d They Do It?
By Andy Comer

“Nothing you’ve ever seen, nothing you’ve ever experienced can prepare you for where the next evolution in reality will take you!” So claims the preview for the completely digital movie Final Fantasy: The Spirits Within. Are they exaggerating? Of course. But it’s tough to argue that Final Fantasy isn’t a great leap forward in the history of computer animation quality. Whether or not you actually believe that Dr. Aki Ross is a real person, you’ve got to admit that CGI technology has come a long way since it was first introduced to feature filmmaking in 1982’s Tron.

So how did they do it? Browse these fast facts to find out how director Hironobu Sakaguchi and his team of computer geeks pushed CGI to the brink of reality. 

• The Japanese gaming company SquareSoft, which has sold a combined 26 million copies of its nine “Final Fantasy” role-playing-game titles, spent an estimated $115 million on the movie. The Japanese government even got involved, releasing investment bonds to help foot the bill. It’s a long-term investment: Some 200 programmers, animators and software designers were hired to develop proprietary technology that the company’s U.S. CEO and “Final Fantasy” game creator Hironobu Sakaguchi plans to use in game and movie projects for years to come. 

• The film was produced largely at a state-of-the-art computer animation studio in Honolulu. An estimated $10 million worth of equipment was used, including cutting-edge, $20,000 Silicon Graphics Octane workstations for every animator. The project was as secret as a military operation -- entry to any of the main offices was by electronic key, and cameras were mounted outside each entrance to track the comings and goings of employees. 

• A team of actors (which included an “American Gladiator” contestant and even a former regular on “The Love Boat”) played every major scene from the movie in skintight black body stockings, each laced with 37 reflective markers. A ring of 16 specialized cameras surrounded an all-black stage and recorded their movements, capturing true-to-life human motion. 

• Data from the reflective markers was downloaded into a computer, which translated it into movements in a virtual 3-D space, creating stick figures that matched the actors’ gestures. 
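
SquareSoft’s motion-capture tools and data formats were proprietary and never published, so the following is only a minimal Python sketch of the general idea, with invented joint names and made-up numbers: per-frame 3-D marker positions are mapped onto named joints and connected into the kind of stick figure the animators started from.

```python
# Hypothetical sketch: turn captured marker positions into a stick figure.
# Joint names, the bone list and the data layout are invented for illustration;
# the studio's actual pipeline and formats were proprietary.

# One frame of capture: joint name -> (x, y, z) position reconstructed from the cameras.
Frame = dict[str, tuple[float, float, float]]

# "Bones" of the stick figure: pairs of joints to connect with a line segment.
BONES = [
    ("head", "neck"), ("neck", "pelvis"),
    ("neck", "l_shoulder"), ("l_shoulder", "l_elbow"), ("l_elbow", "l_wrist"),
    ("neck", "r_shoulder"), ("r_shoulder", "r_elbow"), ("r_elbow", "r_wrist"),
    ("pelvis", "l_knee"), ("l_knee", "l_ankle"),
    ("pelvis", "r_knee"), ("r_knee", "r_ankle"),
]

def stick_figure(frame: Frame):
    """Return the line segments of the stick figure for one captured frame."""
    return [(frame[a], frame[b]) for a, b in BONES if a in frame and b in frame]

# Example: one fabricated frame with only a few joints tracked.
frame = {"head": (0.0, 1.7, 0.0), "neck": (0.0, 1.5, 0.0), "pelvis": (0.0, 1.0, 0.0),
         "l_shoulder": (-0.2, 1.5, 0.0), "l_elbow": (-0.3, 1.2, 0.0), "l_wrist": (-0.3, 0.9, 0.0)}
for start, end in stick_figure(frame):
    print(start, "->", end)
```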

• Teams of animators then added human traits and characteristics by lacing as many as 50,000 flat, shaded polygons together. The result: faces that look like near-photographic representations of human beings. 
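
“Lacing polygons together” means building a mesh: a shared list of 3-D vertex positions plus faces that index into it, so neighboring polygons move together. A bare-bones sketch of that structure follows, using a made-up four-triangle patch in place of the tens of thousands of polygons in a production face.

```python
# Minimal triangle-mesh sketch (vertices and faces invented for illustration).
# A face mesh is just shared vertex positions plus triangles indexing into them;
# a production face uses tens of thousands of these instead of four.

vertices = [
    (0.0, 0.0, 0.0),   # 0
    (1.0, 0.0, 0.1),   # 1
    (1.0, 1.0, 0.2),   # 2
    (0.0, 1.0, 0.1),   # 3
    (0.5, 0.5, 0.3),   # 4: a raised centre point, like the tip of a nose
]

# Each triangle is a trio of vertex indices; neighbours share edges, so the
# surface stays "laced together" when any shared vertex moves.
triangles = [(0, 1, 4), (1, 2, 4), (2, 3, 4), (3, 0, 4)]

def triangle_points(mesh_vertices, face):
    """Resolve one triangle's indices into actual 3-D coordinates."""
    return [mesh_vertices[i] for i in face]

for tri in triangles:
    print(triangle_points(vertices, tri))
```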

• Dr. Aki Ross’s face was modeled on an artist’s sculpture. She was then rendered in “wire frame mode,” in which a three-dimensional wire frame is superimposed over a sketched drawing of the character. That frame becomes the character’s skeleton, allowing animators to give it lifelike movement and form, down to the movements of Aki’s spine. 
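
The article doesn’t spell out how the wire-frame skeleton drives the surface, but the standard technique is skeletal skinning: each vertex follows a weighted blend of bone movements. Here is a tiny, hand-rolled 2-D sketch of that idea; the bones, weights and numbers are invented for illustration, not taken from SquareSoft’s pipeline.

```python
import math

# Hypothetical 2-D skinning sketch: each vertex of the mesh follows a weighted
# blend of bone rotations. The bones, angles, weights and vertex are invented.

def rotate(point, angle, pivot):
    """Rotate a 2-D point around a pivot by `angle` radians."""
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + c * px - s * py, pivot[1] + s * px + c * py)

def skin(vertex, bones, weights):
    """Blend the positions produced by each bone's rotation (linear blend skinning)."""
    x = y = 0.0
    for (pivot, angle), w in zip(bones, weights):
        rx, ry = rotate(vertex, angle, pivot)
        x += w * rx
        y += w * ry
    return (x, y)

# Two bones of an arm: an upper arm pivoting at the shoulder, a forearm at the elbow.
bones = [((0.0, 0.0), math.radians(10)),   # shoulder joint rotated 10 degrees
         ((1.0, 0.0), math.radians(45))]   # elbow joint rotated 45 degrees

# A vertex near the elbow is influenced half by each bone, so the surface bends
# smoothly instead of creasing where the two bones meet.
print(skin((1.1, 0.05), bones, weights=[0.5, 0.5]))
```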

• Next, the wire frame was given a skin, which is known as “shaded mode.” Then, texture mapping added lighting, texture, shadow, reflections in the eyes, imperfections and other details. Freckles and pores were added by hand-painting on the computer. Because these details presented such enormous challenges, SquareSoft programmers again wrote their own software tools. A costume was separately rendered and layered on top of the form. Technical directors reportedly spent months ripping up clothes and learning to sew in order to faithfully render the behavior of fabric in motion. 
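
Texture mapping of this kind boils down to two lookups per point on the surface: a UV coordinate fetches a painted color (which is where the hand-painted freckles and pores live), and a lighting term brightens or darkens it. A toy sketch of that combination, with made-up data, follows.

```python
# Toy texture-mapping and shading sketch; all colors and coordinates are invented.
# A UV coordinate picks a painted color, and a diffuse lighting term scales it.

# A tiny 2x2 "painted" texture: skin tones with one darker "freckle" texel.
TEXTURE = [
    [(230, 190, 170), (228, 188, 168)],
    [(226, 186, 166), (150, 110, 90)],   # bottom-right texel is the freckle
]

def sample(texture, u, v):
    """Nearest-neighbour texture lookup; u and v are in the range [0, 1]."""
    rows, cols = len(texture), len(texture[0])
    x = min(int(u * cols), cols - 1)
    y = min(int(v * rows), rows - 1)
    return texture[y][x]

def lambert(normal, light_dir):
    """Diffuse term: how directly the surface faces the light (unit-length vectors)."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

def shade(u, v, normal, light_dir):
    """Combine the painted texture color with the lighting term."""
    r, g, b = sample(TEXTURE, u, v)
    k = lambert(normal, light_dir)
    return (int(r * k), int(g * k), int(b * k))

# A point on the cheek that lands on the freckle texel, lit from above and in front.
print(shade(0.8, 0.9, normal=(0.0, 0.0, 1.0), light_dir=(0.0, 0.6, 0.8)))
```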

• How long did all this take? A whole month to produce a mere three to five minutes of film. And 20 percent of the film’s total production time was spent on Aki Ross’s hair alone. There are 60,000 hairs on her head, each of which had to be individually manipulated by the animator. 
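
Sixty thousand of anything adds up quickly. As a rough illustration (every number below is invented except the strand count), here is a toy model that treats each strand as a short chain of points drooping under gravity and drifting with the wind; even this trivial pass touches nearly half a million points per frame.

```python
import random

# Toy hair sketch with invented parameters: each strand is a chain of points
# hanging from a root on the scalp, pulled down by gravity and drifting with wind.

STRANDS = 60_000          # the reported strand count for Aki's hair
SEGMENTS = 8              # points per strand in this toy model

def grow_strand(root):
    """Grow one strand downward from its root, drifting slightly to one side."""
    x, y, z = root
    points = [root]
    for _ in range(1, SEGMENTS):
        x += random.uniform(-0.002, 0.004)   # wind drift
        y -= 0.01                            # gravity pulls each segment down
        points.append((x, y, z))
    return points

# Roots scattered over a crude "scalp" patch.
roots = [(random.uniform(-0.1, 0.1), 1.7, random.uniform(-0.1, 0.1)) for _ in range(STRANDS)]
hair = [grow_strand(r) for r in roots]

print(f"{len(hair):,} strands x {SEGMENTS} points = {len(hair) * SEGMENTS:,} points per frame")
```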

This narrowing of the gap between reality and its digital simulation has some actors worried they might soon be out of a job. “I am very troubled by it,” said Tom Hanks in the July 8th edition of The New York Times. “But it’s coming down, man. It’s going to happen. And I’m not sure what actors can do about it.” After all, adequately humanized CGI actors could do whatever their producers and directors wanted -- without demanding a cut of the profits. Could this be the future?

     
