//literary and arts//
Fall 2019
CGI Realism in a Post-Truth World
Harry Ottensoser
Film has reached a crisis point. The medium once celebrated as uniquely capable of presenting truth in a world of subjectivity has entered a phase where that very idea must be challenged.
When film was first introduced into the world, it was seen as a revolutionary medium—one that could convey the world exactly as it was. Early films, such as Thomas Edison’s Fred Ott’s Sneeze (1894), often featured mundane scenes of everyday life, their draw being the sheer spectacle of getting to witness what felt like an exact recreation of the real world. Unlike any medium that preceded it, film offered a window into the world unfiltered by human subjectivity. A painting relies on the painter’s interpretation of what they see, a poem on the poet’s conception of what they feel. But a celluloid film strip would always contain images that were an exact replication of the scenes that had played out before the camera, free of any human intervention that could corrupt that objectivity. Yes, filmmakers determined the positioning of the camera, the lens used, and in the case of narrative filmmaking, what sort of costumes or characters passed before it, but the viewer always knew that everything they saw had really existed.
The realist quality of cinema was a gift that had never existed before in history. No other medium could be convincingly brought as evidence in a trial. No other device could so objectively capture the lives, worlds, and actions of its subjects. No other art form empowered its viewers to rely on their own subjectivity first and foremost, drawing their own conclusions from the information before them with the confidence that what they were seeing was there.
Not anymore.
Computer-generated imagery (CGI) is not a new phenomenon in film; it has been used to alter films since the 1970s, and its use has only grown in the decades since. The term refers to digital imagery added to a film after its initial shooting, often with the aid of a green screen or motion-capture dots that track the slightest movements of an actor’s face. It’s the technology that turns Mark Ruffalo into the Hulk, conjures a surreal Spartan landscape in 300, and transforms a pillow on a stick into a dragon in Game of Thrones. CGI’s technical changes to the form, and more notably its wholesale adoption by some of today’s most popular filmmakers, have radically altered the identity of the medium.
What’s so critical about CGI today is the way it’s being used to mimic the realism once inherent to film, forcing viewers to question whether a film presents an objective, unedited view of the world in the way it once did. Unlike the Hulk or the dragons of Game of Thrones, effects designed to flaunt their fantastical nature, CGI realism blends into the background of a scene and proves most successful when it goes unnoticed. Its aim is to pass for reality, masking its nature as a convincing fabrication emanating from a computer.
Take, for example, It Chapter Two, the sequel to 2017’s It, which famously featured an ensemble cast of young teenage actors. Although the sequel was set 27 years later, with the characters recast as adults, it included flashback sequences built around new scenes with the original young cast. But because those actors had gone through puberty in the years since filming It, their faces were digitally de-aged for their cameos. What appeared to be a simple scene with a young actor in front of a camera was in fact a doctored sequence generated in post-production.
This de-aging technology is en route to becoming a Hollywood mainstay. It was used to make Will Smith appear two decades younger in his recent film Gemini Man, to render Robert De Niro as his younger self in Martin Scorsese’s The Irishman, and even to restore Carrie Fisher to her 1970s-era self in the Star Wars film Rogue One. Again, this technology aims to deceive its viewers, allowing filmmakers to radically alter a form once defined by its objective realism. It even allows filmmakers to subvert the ultimate reality of death, bringing people like Carrie Fisher and Paul Walker back to life.
This past summer’s remake of The Lion King was, in essence, a showcase for the capabilities of CGI realism. While the movie’s talking, singing animals were more immediately obvious in their CGI composition, it was the vast, entirely software-rendered African landscape that was most shocking in its imitation of reality. Virtually nothing in the film was shot with a camera. The natural world itself is now replicable within an entirely unnatural realm.
Ultimately, this surge in the adoption of CGI realism forces viewers to question whatever they see in a video. While any film can still be made without digital alteration if its creator so chooses, what is lost in our era is the assumption that what we’re seeing really transpired in material reality. If even seemingly straightforward depictions of 13-year-olds are really 15-year-olds de-aged, and a tree in an African grassland is nothing more than an intangible fabrication, then what can we believe is real when we watch movies? How can we know that footage brought into a court case doesn’t have a digitally altered time stamp, or that it doesn’t include a figure who was never there?
This technology isn’t just in the hands of the wealthiest film studios. Photoshop and other image-editing software have put us in a position to question every picture we see on the internet. The foundation of our trust in visual media is no longer sound, and it is now prone to a new level of exploitation. Film is not merely entertainment; it shapes our understanding of current events, and so the collapse of its assumed objectivity can have serious implications for our civic lives.
Film, once celebrated for its freedom from human intervention in its depiction of reality, is now regularly subject to the same subjective manipulation that defines the art forms that came before it.
All of this amounts to the radical reinvention of the medium once uniquely suited to objective truth. In a post-truth world where news is skewed by the agendas of its presenters, and every position can be defended by a string of oxymoronic ‘false facts,’ the death of film as a facilitator of objective reality may well give rise to devastating complications. We’ve lost a medium of certainty unlike any that existed before.
This is why we must find new avenues for sustaining a grip on reality and a commitment to objective fact. While film can no longer be trusted, by virtue of its construction, to present the world as it is, it remains one of our most powerful tools for sharing personal ‘truths’ and connecting people worldwide. As CGI continues to work its way into popular cinema, it is imperative that we consume content with a discerning eye and approach the issue of visual manipulation responsibly.
//HARRY OTTENSOSER is a junior at Columbia College and Literary and Arts Editor of The Current. He can be reached at ho2262@columbia.edu.
Photo courtesy of The Outer Haven.