Need a break from seeing the latest comic-book hero movie or found-footage horror flick? Learn a little about the behind-the-scenes efforts of the diverse crowd that helps make these movies. Last week, for our summer blockbusters series, we featured an excerpt from Lab Coats in Hollywood, which explores the relationship between science and the film industry. For our second post, we have a Q&A with Moving Innovation author Tom Sito, a legend in the CGI world.
In what way was the dream of computer animation and its ultimate potential created and pushed along by “outsiders” and artists on the fringes of mainstream Hollywood?
In the early years, graphics was never a major focus of mainstream computer science programs, except perhaps for demonstration purposes. Computer animation began as the project of experimental filmmakers, gifted students, and scientists with time on their hands. The pioneers of the medium can be categorized into two types: (1) the engineer-scientist who longed to be an artist, and (2) the artist who sought to create by other than traditional means. Since the 1940s, these people labored at night or on downtime, scrounged for funding, and shared information on their breakthroughs. Computer animation pioneer John Whitney Sr. said, “We were outsiders, inside Hollywood.”
Your book centers on how drastically computer animation altered the movie industry. How resistant was Hollywood and its studio executives to computer animation initially?
At first the movie industry felt computer animation was too expensive, too complicated, and yielded results that often fell below expectations. Digital missionaries like John Whitney and Ivan Sutherland would make the rounds of the studios trying to interest them, but more often than not they spent their time fruitlessly sitting in waiting rooms. CG production houses were advised not to use the words “computer” or “digital” in their sales pitches, because of the immediate negative connotation. For some studios it was easier and cheaper to buy a CG student film, like Ed Catmull’s thesis Computer Animated Hand (1972), and optically insert it into a movie, than to actually create new animation.
How much has the film industry’s growth and evolution simply been driven by the emergence of new technology, such as computer animation?
Hollywood has periodically undergone revolutions that overturned its established way of doing things: in 1928 with sound, in 1948 with television and magnetic tape recording, and in the 1990s with computer-generated imaging. In each case fortunes are made and lost, power bases crumble and new ones rise, and old production methods yield to new ones. This is as much a Hollywood tradition as the Chinese chicken salad lunch meeting.
In what ways did the U.S. government contribute to the growth of computer animation?
The important formative work creating the basic tools of computer animation occurred at a time when Hollywood was in financial decline, the late 1950s and 1960s. This coincided with the Cold War, when the US government engaged in open-ended spending on research projects to compete technologically with the Soviet Union. Among the Defense Department’s priorities was computer graphics for simulations, both for training purposes and for research films. In a flight simulator for an advanced supersonic aircraft, when the pilot turned his controls, the virtual landscape on his screen had to move in real time in accordance with his commands. This necessitated instantaneous computations on a very advanced level, in what was then called “virtual reality.” The first wireframe constructs of people and devices were created in government labs. The term “Computer Graphic Imaging” (CGI) was coined by an engineer at Boeing working on ergonomically structured cockpit seats for jet fighters.
In the build-down after the Vietnam War in 1973, much of this research and the researchers themselves moved into the private sector.
What films were most pivotal in leading to the mainstream acceptance of CGI?
The turn in the public’s acceptance of digital filmmaking, what we now refer to as the Digital Revolution in Hollywood, occurred in the early 1990s with three films: Terminator 2: Judgment Day (1991), Jurassic Park (1993), and Toy Story (1995). The effects created for Terminator and Jurassic were so breathtakingly real as to be indistinguishable from flesh-and-blood figures. Toy Story established that computers weren’t only good for making killer cyborgs, but could create warm and loveable Disney-style characters. Before these three films, the idea of making a movie with computers seemed ridiculous. After these movies, the idea of making a movie without computers seemed equally ridiculous. This year (2013) Hollywood is completing its transition to digital by shutting down the last producers of celluloid film. It marks the end of a way of making movies that lasted over a century.
The MIT PressLog is the official blog of MIT Press. Founded in 2005, the Log chronicles news about MIT Press authors and books. The MIT PressLog also serves as a forum for our authors to discuss issues related to their books and scholarship. Views expressed by guest contributors to the blog do not necessarily represent those of MIT Press.