The CGI Paradox



Key takeaways

- Modern CGI is criticized not because the technology or the artists got worse, but because of how studios now use it.
- CGI has shifted from a storytelling aid ('Jurassic Park', 'Lord of the Rings') to the main driver of the spectacle (Marvel films), with effects covering 80% or more of a recent blockbuster's runtime.
- Tighter schedules leave far less time per shot: roughly 4.5 days on 'Jurassic Park' versus a matter of hours today, which produces rushed, lower-quality work.
- The best CGI is 'invisible CGI' ('Top Gun: Maverick', 'Barbie') that blends with practical effects and serves the story without drawing attention to itself.
- The fix is for studios to treat CGI as a supporting tool and give artists adequate time, rather than mass-producing effects-driven films.

Deep dive

The quality of CGI in modern films is often criticized despite enormous technological advances. The paradox stems from a shift in how CGI is used: from a tool that enhances storytelling (as in 'Jurassic Park' and 'Lord of the Rings') to the primary driver of the spectacle (as in many Marvel films).

Early films used CGI sparingly: 'Jurassic Park' (1993) featured visual effects in only about 4% of its runtime. That share has grown dramatically, reaching 80% or more in recent blockbusters such as 'The Avengers' (2012) and 'Avengers: Endgame' (2019). Combined with increasingly tight production schedules, this reliance on CGI leaves visual effects artists far less time per shot: 'Jurassic Park' allowed roughly 4.5 days per VFX shot, while modern films often demand that a shot be finished in a matter of hours (see the back-of-envelope sketch at the end of this section). The result is rushed, lower-quality CGI despite far more powerful computers.

The core issue is neither the technology nor the artists' skill, but studios treating art and stories as products to be mass-produced, prioritizing quantity over quality. Good CGI, often called 'invisible CGI' (as in 'Top Gun: Maverick' or 'Barbie'), blends seamlessly with practical effects and goes unnoticed, serving the story rather than becoming the spectacle itself.

The fix is for studios to take a more measured approach: use CGI as a tool that supports the narrative rather than overwhelms it, and allow adequate time for the work.
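To make the schedule crunch concrete, here is a rough back-of-envelope sketch in Python. The shot counts and post-production windows below are hypothetical placeholders, chosen only so the ratios line up with the figures quoted above (~4.5 days per shot then, a matter of hours now); they are not production data from the post.

```python
# Back-of-envelope sketch of schedule pressure on VFX work.
# NOTE: the shot counts and schedule lengths here are hypothetical
# placeholders for illustration, not figures from the post; only the
# ~4.5 days/shot for 'Jurassic Park' is quoted in the text above.

def days_per_shot(schedule_days: float, vfx_shots: int) -> float:
    """Average days of schedule available per VFX shot."""
    return schedule_days / vfx_shots

# Hypothetical early-1990s-style production: a few dozen CG shots.
early = days_per_shot(schedule_days=280, vfx_shots=63)      # ~4.4 days per shot

# Hypothetical modern blockbuster: thousands of shots, a similar window.
modern = days_per_shot(schedule_days=280, vfx_shots=2500)   # ~0.11 days per shot

print(f"early-style production:  {early:.1f} days per shot")
print(f"modern-style production: {modern * 24:.1f} hours per shot")
```

Even under these generous assumptions, the per-shot budget collapses from days to a few hours once the shot count climbs into the thousands, which is the crunch the post describes.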
