The Before Times: Practical Effects
Before the first polygon was ever rendered for a feature film, movies created magic the hard way. Miniatures were filmed at high frame rates so that, played back at normal speed, their motion took on the slow, massive quality of the full-sized objects they stood in for. Animatronic creatures were constructed by hand and operated by puppeteers hidden just off-camera. Makeup artists labored for hours transforming actors into aliens, monsters, and historical figures. The results were tangible, textured, and imperfect in exactly the ways that made them feel real.
Stanley Kubrick built a rotating set to simulate zero gravity in 2001: A Space Odyssey. Ridley Scott used scale models and forced perspective to create Blade Runner's dystopian skylines. Steven Spielberg wrestled with malfunctioning rubber sharks on Jaws (a production nightmare that accidentally produced a more terrifying film, because the audience's imagination filled the gaps). Practical effects were labor-intensive, expensive, and genuinely dangerous. They were also irreplaceable.
The tools of practical filmmaking built an entire industry of craftspeople (prosthetics artists, pyrotechnicians, model makers, stunt coordinators) whose work gave cinema its physical weight. When a building exploded on screen, it actually exploded. That reality carried through the lens and into the audience's nervous system in ways that would later prove difficult to replicate.
Jurassic Park Changes Everything
The story of CGI in cinema has a clean before-and-after point: 1993, and the first time a fully digital T. rex appeared on screen in Steven Spielberg's Jurassic Park. Industrial Light & Magic had already experimented with computer graphics (the stained-glass knight in Young Sherlock Holmes (1985), the liquid metal villain in Terminator 2 (1991)) but nothing had yet convinced audiences they were watching a living creature.
The dinosaurs in Jurassic Park moved with weight and musculature. Their skin shimmered with environmental light. When the T. rex turned in the rain, its wet scales caught reflections. Audiences and critics were not just impressed; they were destabilized. The boundary between what was real and what was manufactured had shifted permanently.
What made the achievement so significant was not the technology alone but its integration. Spielberg and ILM used animatronics for close-up shots requiring tactile interaction and CGI for wide shots requiring movement at scale. The seams were invisible. That approach (physical and digital working in concert) would remain the gold standard for years.
The CGI Arms Race
After Jurassic Park, every major studio wanted to do what ILM had done, and the 1990s became a decade of escalating digital ambition. James Cameron had already pushed water simulation in The Abyss (1989) and morphing effects in Terminator 2; now the whole industry raced to catch up. The Wachowskis invented "bullet time" for The Matrix in 1999, using a ring of still cameras and digital interpolation to freeze and rotate through action scenes in a way that had never been possible before.
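The ring-of-cameras idea can be sketched in miniature. A production system warps pixels along optical flow between neighbouring views; the toy version below (entirely illustrative, not the actual Matrix pipeline) settles for a plain cross-dissolve, which still shows how synthesized in-between frames turn a ring of still cameras into a smooth virtual camera move.

```python
def interpolate_frames(frame_a, frame_b, t):
    """Blend two neighbouring camera views; t=0 returns A, t=1 returns B."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# Two adjacent cameras in the ring see the same subject slightly shifted.
# Frames here are tiny grayscale grids standing in for real images.
cam_a = [[0, 0, 255, 0],
         [0, 0, 255, 0]]
cam_b = [[0, 255, 0, 0],
         [0, 255, 0, 0]]

# Synthesize three in-between views to smooth the virtual camera's sweep.
sweep = [interpolate_frames(cam_a, cam_b, t) for t in (0.25, 0.5, 0.75)]
```

At the midpoint (t = 0.5) the bright column is split evenly between its two positions, which is exactly why real systems track motion instead of blending: flow-based warping moves the feature, while a dissolve ghosts it.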
Peter Jackson's Lord of the Rings trilogy (2001-2003) introduced Massive, a proprietary software system that gave each digital soldier in a battle scene individual AI-driven behavior. The result was the Battle of Helm's Deep: ten thousand orcs and soldiers fighting with apparent autonomy across a night landscape. It was a technical and artistic landmark.
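Massive itself is proprietary and far more sophisticated, but the core idea (each extra as an autonomous agent with its own senses and decisions, rather than a keyframed sprite) can be sketched in a few lines. Everything below, names and thresholds included, is a hypothetical illustration.

```python
import random

class Agent:
    """One crowd member: a position, a side, and a tiny state machine."""
    def __init__(self, x, y, side):
        self.x, self.y = x, y
        self.side = side          # "soldier" or "orc"
        self.state = "advance"    # advance -> fight or flee

    def nearest_enemy(self, agents):
        enemies = [a for a in agents if a.side != self.side]
        return min(enemies, key=lambda a: (a.x - self.x) ** 2 + (a.y - self.y) ** 2)

    def step(self, agents):
        enemy = self.nearest_enemy(agents)
        dist2 = (enemy.x - self.x) ** 2 + (enemy.y - self.y) ** 2
        if dist2 > 4.0:
            # Far away: close the distance toward the chosen target.
            self.x += 0.1 * (enemy.x - self.x)
            self.y += 0.1 * (enemy.y - self.y)
            self.state = "advance"
        elif random.random() < 0.1:
            # Small chance of breaking; flee is tracked as a label only here.
            self.state = "flee"
        else:
            self.state = "fight"

# Two tiny opposing lines; each agent decides for itself every tick.
crowd = ([Agent(0, i, "soldier") for i in range(5)]
         + [Agent(10, i, "orc") for i in range(5)])
for _ in range(50):
    for agent in crowd:
        agent.step(crowd)
```

After the loop, every agent has independently advanced into engagement range and switched out of its advance state. Scaling the same per-agent loop to ten thousand agents with far richer brains is, conceptually, what made Helm's Deep possible.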
Then came James Cameron again. Avatar (2009) represented a decade of development in performance capture, allowing Cameron to translate the nuanced facial performances of human actors onto digital Na'vi characters and to preview the results on set through a real-time virtual camera. The film became the highest-grossing of all time partly on the strength of its world-building and partly because audiences paid premium prices for 3D glasses to see something genuinely new.
When CGI Goes Wrong
The problem with infinite creative possibility is that it invites infinite creative mistakes. As CGI became cheaper and faster through the 2010s, studios began to use it as a production shortcut rather than a tool of last resort. Green screen replaced location shooting. Digital characters replaced practical stunt work. Post-production timelines were compressed, leading to unfinished visual effects being released in wide theatrical cuts.
The results were often catastrophically visible. Audiences began to identify the telltale signs of rushed CGI: figures that seemed to float slightly above the ground, explosions without heat shimmer, faces that moved without quite matching the skull beneath. Marvel Studios, despite its enormous budgets, became a particular target for criticism as its films' visual effects grew increasingly plastic-looking.
"The danger of CGI is not that it's fake. It's that it makes you forget you should care about the difference. A real explosion has weight. It has danger. The camera operator has to decide how close to stand. That decision is in the frame whether you see it or not."
The Uncanny Valley Problem
The Polar Express (2004) and Beowulf (2007) demonstrated another CGI pitfall: the uncanny valley. Motion-captured human characters looked almost right but not quite. Eyes lacked genuine light; skin lacked genuine texture. Audiences found them deeply unsettling without quite being able to articulate why. The more realistic the attempt, the more distressing the failure.
The Practical Effects Renaissance
By the early 2010s, a counter-movement had emerged. Christopher Nolan became its most prominent advocate, insisting on in-camera effects wherever physically possible. For Interstellar, his team built practical spacecraft sets and consulted theoretical physicist Kip Thorne to render a scientifically plausible black hole. For Dunkirk, real Spitfires flew real combat formations. For Tenet, a real Boeing 747 was purchased and crashed for a single scene.
George Miller revived this philosophy for Mad Max: Fury Road (2015), filming actual vehicles in the actual Namibian desert, with CGI used to extend and enhance rather than replace. The film's propulsive, chaotic energy felt nothing like the green-screen-heavy blockbusters of the same era. It reminded audiences what kinetic practical filmmaking felt like.
- Mission: Impossible series: Tom Cruise performing real stunts (HALO jumps, motorbike cliff rides) became a brand promise
- Mad Max: Fury Road: 80% practical stunts and effects, still feels unlike anything else
- Top Gun: Maverick (2022): real F/A-18s, real G-forces. Audiences responded with massive enthusiasm
- Everything Everywhere All at Once: micro-budget ingenuity over CGI reliance
De-Aging Technology and the Ethics of Digital Likenesses
One of CGI's most consequential recent applications is de-aging: the process of digitally smoothing an actor's face to represent them at a younger age. Marvel used it extensively; Scorsese used it for three-way de-aging of De Niro, Pacino, and Pesci across decades in The Irishman (2019). The results ranged from impressive to slightly unsettling depending on the scene.
More controversially, technology now exists to recreate the likenesses of deceased performers. The digital recreation of Peter Cushing in Rogue One (2016) opened a legal and ethical debate that has not been resolved: whose consent is required to render a digital version of a human face? What rights do estates hold? These questions are increasingly urgent as the technology improves.
What AI Means for Visual Effects
Generative AI is the latest technological force reshaping VFX. Tools like Adobe's AI-assisted rotoscoping, Runway's video generation, and various neural rendering pipelines are beginning to compress tasks that once required entire departments into workflows manageable by small teams.
For independent filmmakers, this represents genuine opportunity: the ability to create sequences previously impossible at non-studio budgets. For VFX workers, it represents an existential threat. The contracts that ended the 2023 SAG-AFTRA and WGA strikes both included provisions on AI, reflecting how immediately the industry recognized the stakes.
The question CGI has always posed (what should be real?) is becoming harder to answer. When generative AI can produce a photorealistic actor, a location, or a crowd from a text prompt, the question of what we are even watching when we watch a movie becomes philosophically interesting in ways it hasn't been since cinema began.