
What Is CGI? Your Guide to Movie-Making Magic

By WeVideo

Colorful planetary landscape.

Every few decades, technological advancements transform the film industry, and it’s never the same again. Arguably, no innovation had as significant an impact as the introduction of sound — until the introduction of CGI. In this article, we’ll explore what CGI is, its history, and where it’s heading in the future. 

What is CGI?

Actor in motion capture suit on film set. Image via Netflix.

CGI, short for Computer-Generated Imagery, is used throughout visual media to generate 3D animated visuals, allowing creators to build entire worlds or subtle visual elements that enhance storytelling.

Created with specialist computer software, CGI has completely transformed the way films are made, from small details like a CG animal in the background of a scene to massive environments like a sprawling orc kingdom deep underground that the camera flies through.

It's allowed filmmakers to transport us to distant galaxies, bring to life monsters and creatures that once only existed in sketches, fill scenes with hundreds of extras that would have been way out of budget, and create sequences that are too dangerous or complex to achieve in reality. For many blockbusters, CGI is as imperative as the camera used.


CGI's role isn't limited to film; it’s also found in animated movies, TV shows, video game cut-scenes, and even print advertising. Buzz Lightyear from “Toy Story” — yep, he's CG. The droid in this on-set photo from “Rogue One: A Star Wars Story” alongside live actors? He, too, is CG.

Still frame from Rogue One. Image via Disney.

However, unlike turning on a camera and directing it toward a subject, which can sometimes produce excellent results with minimal input, creating CGI elements is an incredibly complex process that requires a full post-production team, often spread across multiple facilities, to bring these creations to life. 

From sculpting and texture mapping to character rigging, animation, lighting, and rendering, the process follows an entire pipeline to create realistic-looking imagery. And that doesn't even include compositing CGI elements into live-action footage if the scene isn't entirely CGI!
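The stages above form an ordered pipeline: each step takes the asset left by the previous one and adds to it. As a rough illustration (not real DCC tool code — the stage names mirror the article, everything else is hypothetical), the flow can be sketched like this:

```python
# Illustrative sketch of a simplified CGI pipeline. Each stage is a
# function that transforms an "asset" dictionary; real pipelines run
# inside tools like Maya or Blender across whole teams.

def sculpt(asset):
    asset["mesh"] = "base geometry"          # modeling/sculpting
    return asset

def texture(asset):
    asset["textures"] = ["color map", "normal map"]  # texture mapping
    return asset

def rig(asset):
    asset["skeleton"] = "joint hierarchy"    # character rigging
    return asset

def animate(asset):
    asset["keyframes"] = [0, 12, 24]         # animation keyframes
    return asset

def light_and_render(asset):
    # lighting + rendering: one output frame per keyframe here
    asset["frames_rendered"] = len(asset["keyframes"])
    return asset

PIPELINE = [sculpt, texture, rig, animate, light_and_render]

def run_pipeline(name):
    asset = {"name": name}
    for stage in PIPELINE:   # strict order: each stage needs the last
        asset = stage(asset)
    return asset

creature = run_pipeline("cg_deer")
print(creature["frames_rendered"])  # 3
```

The point of the sketch is the ordering: you can't rig before you have a mesh, or render before you have lighting, which is why a single late change can ripple through the whole pipeline.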

Let's look at where CGI started to understand how we got to this stage.

A brief overview of CGI origins

When we think of CGI, it’s easy to link it with modern films, especially the blockbusters from the 2000s onwards. But CGI has been around for roughly half a century. The first big leaps in computer-generated imagery came in films like "Westworld" (1973).

Video via Ultimate History of CGI

Even though the CGI appeared for just seconds, the leap was groundbreaking. John Whitney Jr. and Gary Demos used NASA tech to convert an image into numerical data, which the computer could manipulate differently: changing colors, shrinking, expanding, etc. The result was a POV shot from a robot’s perspective. While it looks primitive now, it was unlike anything audiences had seen before.
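The core idea behind that Westworld shot still underpins all digital imagery: once a picture is numbers, a program can recolor or resample it with plain arithmetic. A minimal sketch, using a tiny grayscale "image" (values 0–255) and block-averaging to approximate the film's coarse mosaic look:

```python
# Once an image is a grid of numbers, manipulation is just arithmetic.

def pixelate(image, block=2):
    """Downsample by averaging block x block regions -- roughly the
    chunky mosaic look of the robot POV shots."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, block):
        row = []
        for x in range(0, w, block):
            vals = [image[y + dy][x + dx]
                    for dy in range(block) for dx in range(block)]
            row.append(sum(vals) // len(vals))
        out.append(row)
    return out

def invert(image):
    """Recoloring is arithmetic on the same numbers."""
    return [[255 - v for v in row] for row in image]

img = [[0, 50, 100, 150],
       [50, 100, 150, 200],
       [100, 150, 200, 250],
       [150, 200, 250, 255]]

print(pixelate(img))  # [[50, 150], [150, 238]]
```

Shrinking, expanding, and color shifts are all variations on the same theme: rearranging or rescaling the numbers.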

From here, the film industry moved forward and never looked back. Every so often, a new film would enter the market with new ideas made possible by new technology. “Star Wars” (1977) brought a number of advancements to the table, notably the wireframe animation in the Death Star briefing scene, a technique “Alien” would use just a few years later.

Video via YouTube

From “Star Wars,” Industrial Light and Magic (George Lucas’ VFX company) was propelled forward with interest and funding, and we saw new and advanced uses of computer-generated imagery in every year that followed, such as this sequence from “Star Trek II: The Wrath of Khan.”

Video via Ultimate History of CGI

On first watch, nothing seems out of the ordinary. But that sequence of the planet being terraformed was cinema’s first entirely computer-generated sequence.

The late 1970s through the 1990s marked a golden era for CGI, with each new film contributing significant advancements and a deeper understanding of the technology. Standout moments from this era include:

  • Alien (1979): First use of a raster wireframe model for the navigation monitors during the landing sequence.
  • Tron (1982): First extensive combination of live-action and CGI, pushing the limits of computer-generated visuals.
  • Labyrinth (1986): Featured the first realistic CGI animal—a digital owl in the opening credits.
  • The Abyss (1989): The first use of digital 3D water effects and the first entirely CGI character—a watery alien creature.
  • Indiana Jones and the Last Crusade (1989): Showcased the first all-digital composite in film, used for Walter Donovan’s rapid aging scene.
  • Total Recall (1990): Motion capture was used for the first time to create CGI skeletal characters during the X-ray subway shootout.
  • Backdraft (1991): First use of photorealistic CGI fire.
  • Terminator 2: Judgment Day (1991): First realistic human movements in a CGI character and the first use of a PC to create 3D effects in a major movie.
  • Jurassic Park (1993): The first film with photorealistic CG creatures, blending animatronics and CGI to create dinosaurs.
  • Toy Story (1995): The first full-length feature film made entirely with CGI, marking Pixar's debut.
  • Titanic (1997): First major film to render effects on Linux, with groundbreaking water effects blending CG and miniature models.
  • The Matrix (1999): First use of CG interpolation for Bullet Time, blending motion capture with still-camera footage for rotating action sequences.

We can also include “The Lord of the Rings: The Two Towers” (2002) for introducing Gollum as the first photorealistic motion-captured character. Again, this CGI advancement would change how films could be made.

Is VFX the same as CGI?

Although the general audience often uses CGI and VFX interchangeably, they differ on a technical level. As mentioned earlier, CGI refers to computer-generated imagery, while visual effects (VFX) technically refers to manipulating live-action footage.

For example, removing a billboard in an establishing shot and replacing it with one from the in-world universe is a visual effect. Changing a bright, sunny sky into one filled with dark clouds is a visual effect. Removing reflections, adding telescope overlays, removing the green screen, or removing wires — all of these fall under VFX. The list goes on.


However, it's important to note that there is some overlap between the two. A great example is seen in this VFX breakdown reel from Netflix for “The Crown”:

Video via Still Watching Netflix

We see a CG deer at the very start, which a CGI artist would have created. However, a VFX artist would have worked on the shot where the deer is composited into the live-action footage. 
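The compositing step the VFX artist performs on a shot like that comes down to per-pixel math: the standard Porter-Duff "over" operation blends the CG element into the live-action plate using its alpha (transparency) channel. A hedged sketch with values in the 0.0–1.0 range (real compositing apps such as Nuke do this per channel, per pixel, plus much more):

```python
# "Over" alpha blending: the workhorse of compositing a CG element
# (like the deer) on top of live-action footage.

def over(fg, fg_alpha, bg):
    """Composite a foreground value over a background value,
    weighted by the foreground's alpha."""
    return fg * fg_alpha + bg * (1.0 - fg_alpha)

# A fully opaque CG pixel hides the plate entirely...
print(over(0.8, 1.0, 0.2))   # 0.8
# ...while a partially transparent one lets the plate show through.
print(over(1.0, 0.25, 0.0))  # 0.25
```

Soft edges on the CG element (fur, motion blur) are just intermediate alpha values, which is why a clean alpha channel matters so much to a convincing composite.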

So, while CGI is not inherently visual effects, it works closely with the VFX department.

Why are audiences turning away from CGI?

So, why are audiences increasingly against visual effects? It has reached a point where studios, actors, and even filmmakers emphasize the lack of visual effects in their films to entice viewers. 

Video via The Movie Rabbit Hole

But here's the reality: visual effects and CGI are used consistently across all forms of media — films, television shows, music videos, commercials — and more often than not, audiences have no idea they're even there. A great example is David Fincher's “The Girl with the Dragon Tattoo.” This showreel highlights the film's visual effects and demonstrates how subtle and minimal they can be: almost undetectable to the average viewer.

Video via FilmIsNow Movie Bloopers & Extras

Given that CGI has become so powerful, so lifelike, capable of transporting us to distant worlds and creating characters born from dreamscapes, why do audiences want to return to practical effects? Why do they prefer films made with yesterday's techniques over tomorrow's technologies? 

You could assume that it comes down to bad CGI. But here's the thing: it’s not that there are bad visual effects artists in the industry — quite the opposite. If you're working as a visual effects artist at a post-production facility, you’re likely highly talented at what you do. The real issue is poor time management. 

Video via Corridor Crew

One of the more popular channels covering VFX, Corridor Crew, produces weekly videos reacting to good and bad VFX. In their video reacting to the DC film “The Flash,” they say:

“Personally, this is an example of visual effect work that is suffering from the triangle of expectation: it's cheap, fast, and good, but you can only pick two. What we see in the visual effects here is basically an example of artists who don't have enough time. We know where the tech is and what the abilities of these artists are, and when we see bad visual effects like this, I feel like the default reaction is to blame the artist. 

A lot of the best stuff, like what you've seen in ‘Avatar,’ is proprietary. So, it’s not like you can just hire another VFX house to replicate that. When you're setting yourself up for this, it's less about whether you have the budget and more about whether you're working with the right team for this particular task.” 

One of the biggest culprits of this trend has been Marvel Studios. Marvel is notorious for not locking in creative decisions during production, relying instead on post-production flexibility to change elements like costumes later. But when those changes keep evolving throughout post-production, the time frame for the effects to be created and polished properly shrinks. As a result, we see shots where a character’s head appears to float inside a costume, or a scene feels more like a video game.

Bruce Banner in Hulk Buster Armor from Avengers film. Image via Marvel.

As Variety magazine reported:

“The schedule swap with ‘The Marvels’ had left the ‘Ant-Man’ sequel in a squeeze, pushing up its post-production schedule by four-and-a-half months. Marvel films are known for coming down to the wire, given Feige’s ability ‘to foam the runway and land a plane that way,’ says one executive familiar with how the company operates. But this unfinished level was unprecedented and would be noted in scathing reviews when the tentpole, with the $200 million budget, opened 11 days after the premiere. Critics weren’t the only ones dismayed. Fed up with 14-hour days and no overtime, Marvel VFX workers voted unanimously to unionize in September, sparking an industry-wide trend.” 

Unfortunately, these conditions are not limited to Marvel; they are widespread across the industry. So the next time you see bad VFX, remember: it’s usually not because the studio cheaped out on artists, but because of poor planning and management from the studios themselves.

So, where does CGI go from here? 

So, we've had incredibly lifelike imagery for the past decade. Smoke, water, and fire simulations are at an all-time high, artists are able to de-age actors with convincing results, and the visual effects industry continuously improves visuals with newer technology. Is there still room for improvement? Where does CGI go from here?

Harrison Ford in Indiana Jones. Image via Lucas Film.

The volume

Well, as of 2024, it’s not so much about pushing the boundaries of how good CGI can look, but about how it can aid in making films. One of the most notable and substantial advancements in the last five years has been the rise of virtual production stages. These are large, cutting-edge setups where tall, panoramic LED screens surround a set — sometimes in a complete 360-degree configuration. The screens display digital environments in real time, letting actors perform in front of finished backgrounds rather than a green screen, whose backgrounds would traditionally be generated and composited afterward in post-production.

Video via Industrial Light & Magic

One of the more challenging aspects of working on a green screen stage is that visual effects artists are often in a reactionary position: they must create imagery that matches what the actor is doing and make it look convincing. With virtual production stages, however, the actors can see the environment around them and incorporate it into their performances instead of relying on imagination alone.

One key benefit of this setup is the lighting. For a green screen, the stage must be uniformly illuminated for accurate key pull during post-production, which limits the type of lighting that can be used in that environment. In contrast, with LED volumes, filmmakers can use the light from the LED screens as part of the ambient lighting for the actors.


A prime example is seen in “The Mandalorian.” The title character's shiny silver armor picked up real, lifelike reflections from the environment projected on the LED screens, resulting in a far more convincing visual effect—something that would have been nearly impossible on a traditional green screen stage.

Image via Disney. Note how the sky and environment naturally reflect off the spaceship.

So, where does CGI come into play with LED volumes? It all comes down to what is displayed on the LED screens. The environments are digital scenes built with 3D software like Blender and Cinema 4D, then played back through a real-time engine. Using Unreal Engine — a real-time 3D creation tool — artists can adjust scene elements live during filming.

Video via Cinematography Database

For example, if a rock in the background doesn't quite fit the composition, the artist on set can change its size or position right then and there (as seen in the above video), instead of waiting for post-production to fix it — a much more efficient workflow. Additionally, the camera's movements are motion-tracked, so the background shifts accordingly to maintain the correct perspective and create a sense of depth. This step, typically done in post-production within a CGI environment, is now captured in camera.
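The reason the background must shift with the tracked camera is parallax: under a pinhole projection, nearby points move across the screen faster than distant ones as the camera translates. A minimal sketch (illustrative numbers, not real engine code):

```python
# Simple pinhole projection: why an LED wall must re-render the
# background as the tracked camera moves sideways.

def project(point, camera_x, focal=1.0):
    """Project a 3D point (x, y, z) for a camera at (camera_x, 0, 0)
    looking down +z. Returns screen-space (x, y)."""
    x, y, z = point
    return (focal * (x - camera_x) / z, focal * y / z)

near = (1.0, 0.0, 2.0)   # rock close to the camera
far = (1.0, 0.0, 10.0)   # mountain far away

# Move the camera 0.5 units to the right: the near rock's on-screen
# position shifts five times as much as the distant mountain's.
shift_near = project(near, 0.0)[0] - project(near, 0.5)[0]
shift_far = project(far, 0.0)[0] - project(far, 0.5)[0]
print(shift_near, shift_far)  # 0.25 0.05
```

Without this per-frame re-rendering, the wall would behave like a flat painted backdrop and the illusion of depth would collapse the moment the camera moved.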

Performance capture

Beyond LED volumes, industry leaders like James Cameron are redefining how films are created with CGI. In movies like “Avatar,” Cameron essentially places himself, his crew, and his cast in a giant playground, where the actors are fully equipped with motion capture gear. Their faces and performances are captured in real time and projected onto CGI bodies and environments they can reference on set.

Video via 20th Century Studios UK

Everything the characters do is directly based on the data captured from the actors' performances. This eliminates the need for animators to make guesses or manual changes — what you see is a direct translation of the actors' actions. Not only does this enhance the actors' performances, but it also simplifies the job of artists tasked with animating those movements. In particular, for “Avatar: The Way of Water,” where characters move underwater, the digital data is mapped and transferred directly into the CGI, creating incredibly lifelike movements.

As James Cameron calls it, this isn’t just "motion capture"—it's "performance capture." This approach gives his films realism and sets the CGI apart from other blockbusters.

How to get into CGI

How does one get into the field of CGI? Like many areas of motion art — filmmaking, editing, motion graphics, and CGI — the path into this career has shifted from where it was twenty years ago. Back then, you would often need formal education. The software and hardware used for creating CGI were not readily available for home use, and if they were, the cost of licensing such tools was often prohibitive for beginners or hobbyists.

Now, in 2024, the field looks vastly different. YouTube is flooded with tutorials, covering every step of the numerous 3D software options on the market. There are also paid courses online, like those from the School of Motion and Greyscalegorilla, where you can learn from professional tutors to level up your CGI skills. Software has become incredibly accessible as well, with subscription-based platforms like Maxon for Cinema 4D and ZBrush or free alternatives like Blender, which offer a powerful starting point for beginners.

Video via Blender Guru

The Donut tutorial — a classic. 

Formal education is certainly becoming optional for becoming a CGI artist. However, one advantage of formal education is the foundational knowledge it provides, especially when it comes to working within a professional pipeline. While a tutorial might teach you how to create a 3D octopus — something you couldn't do a week ago — it won't necessarily teach you how to work collaboratively, interpret feedback, or deliver assets to a client.

Additionally, colleges and universities are excellent for networking. They often connect with production houses and studios, allowing students to make contacts and explore job prospects once they've completed their qualifications.

However, it's worth considering the volatile state of the visual effects industry. Reports surface every month about studios facing layoffs, closures, and ongoing struggles. And as noted in this article, the crunch culture surrounding CGI work has fueled the fight to unionize the visual effects industry.

Additionally, generative AI poses a particular threat to the CGI industry. While it's not yet flexible enough to create 3D assets from scratch that can be fully animated and rendered, for low-budget or low-priority content where the client might not care as much about the polished nature of the final product, generative AI can take on those jobs. So, if you plan to work in CGI, know that you're setting sail on stormy seas.

Conclusion

In the ever-evolving world of visual effects, CGI has revolutionized the way stories are told, bringing unimaginable worlds and characters to life. Unfortunately, unlike picking up your iPhone to start making a film, CGI is much more complex. You need the hardware, the software, the skills, and plenty of patience.

However, resources like TurboSquid offer free 3D assets, which can help mitigate some of the design steps when creating your CG scene. And remember, once you're ready to render and edit your sequence, you can use WeVideo to turn it into a polished video clip.