The Downfall of CGI
One of the most ubiquitous and fundamental elements unifying modern media is computer-generated imagery, or CGI. Found throughout film, TV, music videos, and video games, CGI has seeped into the modern understanding of visual media, so much so that the term ‘CGI’ has entered common vernacular. You may often find yourself commenting on the quality of the CGI in a newly released movie, in awe or perhaps even in disdain. Many viewers have also noticed that the CGI in recent features tends to be less believable than that of older works. While this could easily be explained as a decline in quality, the issue is not as simple as it seems.
To understand CGI, we must go back to its beginnings. Surprisingly, CGI is not a particularly new technique, especially if one defines it according to its name, ‘computer-generated.’ CGI has been in use since as early as 1958, when Alfred Hitchcock made use of animated patterns in the title sequence of his highly acclaimed thriller Vertigo. Early methods of creating computer animation were mechanized, often requiring the animator to double as something like a design engineer. One early system, the Scanimate, required the animator to manipulate the video signal produced by scanning an image in order to give it “movement,” which was then adjusted to perfection via hundreds of knobs.
As technology improved, the process of creating CGI became more streamlined, allowing the artist to engage more intuitively with the work, though it remained tedious. These developments reached an apogee in 1995 with the release of Toy Story, which revolutionized both the medium itself and the public’s understanding of it. The first feature film animated entirely with CGI, Toy Story proved innovative and groundbreaking. The shiny, sterile textures gave the toys a realistic finish, and the intricate backgrounds showed how well-lived-in Andy’s house was. Through CGI, the film captured minute, fleeting details unachievable through traditional animation, creating an entirely new relationship between film and viewer. Toy Story showed that CGI was capable of achieving what was once impossible: emulating the details of real life to the finest degree.
The twenty-first century brought more groundbreaking accomplishments in filmmaking, achieved with the aid of state-of-the-art CGI. James Cameron’s 2009 film Avatar is roughly 60% CGI, making use of revolutionary motion-capture technology that allowed computer animation to be overlaid onto the live actors’ performances. Such techniques certainly contributed to the film’s success: it broke box office records by an impressive margin.
In recent years, the public perception of CGI has shifted. The audience has grown accustomed to the feats achievable through CGI and is therefore acutely aware when it falls short. There are now clear distinctions between ‘good’ and ‘bad’ CGI: textures, rendering, and realism are all weighed before judgment is passed on the quality of a film.
Due to the audience’s keen eye, many films (and consequently their studios) have been lambasted for featuring ‘bad’ CGI. Common criticisms include a lack of consistency with the real-life actors and sets, easily discernible CGI, and what appears to be corner-cutting to minimize production costs. Marvel Studios is a notorious example of this perceived poor quality, so much so that its own cast and crew have commented on it.
In July of 2022, Vanity Fair uploaded a video to its YouTube channel titled “Taika Waititi and Tessa Thompson Break Down Thor: Love and Thunder ‘Taste The Rainbow’ Scene.” The video featured director Taika Waititi and actress Tessa Thompson analyzing a scene from their newly premiered film. Early in the video, Waititi offhandedly remarks on the poor CGI. Pointing at Korg, the hulking Kronan warrior he portrays through motion capture and CGI, he asks whether his character looks ‘real.’ Thompson responds that he does not, prompting a short exchange in which the two laugh at the unbelievability of the rock creature. Although seemingly harmless, these comments reflect the overarching attitude towards modern CGI.
How can CGI that was considered jaw-droppingly impressive a decade ago have devolved to the point where movies boasting budgets in the hundreds of millions are laughed at by their own directors? Have we as the audience become so accustomed to the technology that the medium has lost its magic? Or has CGI truly nosedived in quality? The answer lies in the perspective of the artists behind the CGI.
A multitude of VFX artists spoke up in response to the video, explaining how they are underpaid and overworked. Through their testimonies, the picture became clear: this is not an isolated incident but a recurring transgression committed by various film studios. Most importantly, the final product is not bad because of laziness or a lack of skill; it is bad because the artists were given impossible expectations to work with.
A VFX artist explained to Vulture writer Chris Lee that they are “working seven days a week, averaging 64 hours a week on a good week.” Furthermore, they describe the phenomenon of being ‘pixel f—ed’ by studios, a VFX term for a situation where “the client will nitpick over every little pixel.”
VFX houses bid on a project in order to secure the rights to work on it. The studio accepts a bid it deems fair; in Marvel’s case, it often accepts lower bids in order to save on production costs. VFX artists are then given shots to work on. In theory, this is fine. In practice, however, Marvel often forces its artists to make immense changes at the last minute, such as asking that the entire third act be reanimated a month before the deadline, as they did on Thor. It doesn’t help that the films’ directors often have little understanding of how VFX works, frequently failing to plan for the empty spaces on set that the artists must later fill.
Perhaps, then, the issue of ‘bad’ CGI in recent releases stems not from a decline in objective quality but from a disconnect between the artists and the directors.
What, then, can be done?
The first is obvious: VFX artists must be given better pay and working conditions. Good art cannot be created if the artists are subjected to inhumane conditions.
Another solution is to rethink how CGI functions in modern cinema. The audience has become so familiar with the use of CGI in film that they are quick to hold it to high standards. In his essay “The Myth of Total Cinema,” André Bazin notes that cinema originated as a nebulous concept combining the visions of various inventors, thinkers, and artists, and was thus a “myth” in its formation. For cinema to evolve, “every new development added to cinema must, paradoxically, take it nearer and nearer to its origins.” In other words, he claims, “cinema has not yet been invented.” Perhaps we should take this view into account, both as filmmakers and as audiences, and thoroughly consider the role CGI plays in film. Do we rely on it too much? What comes next for computer-generated imagery?