The term Visual Effects (VFX) usually refers to post-production work that integrates CG or model photography into live-action footage. Special Effects (SFX), in turn, usually refers to practical, on-set effects such as explosions, smoke, and mechanical effects.

When I first started working in the VFX industry back in 2001, CG post-production had become the dominant force in VFX. Few productions were using miniature/model effects, and even fewer were using animatronics – to the point that, in 2004, Jim Henson’s Creature Shop packed up its London operation and moved back to Los Angeles.

MPC made strides in a number of areas, including its Alice crowd-simulation software, which was used on a number of big productions. Over the years, we’ve seen major improvements in motion capture, character rigging, CG modelling and compositing – generally all the usual fields in which CG VFX is put to use.

But as film makes way for increasingly ambitious television productions, new techniques have to be found in order to reduce costs and speed up production turnaround.

The Star Wars films have always been a major source of innovation in the world of visual and special effects. Right from the first film back in 1977, George Lucas had the foresight to start his own company, Industrial Light & Magic, to produce the complex visual effects needed to tell his story.

Those complex visual effects have mainly been restricted to film – the cost of producing something on that scale for television has been pretty prohibitive. The first attempt at a Star Wars TV special was interesting, to say the least.

Getting back to the evolution of VFX tools, the games industry was also making progress with ever more complex video games. This too required substantial R&D, and you’ll find that VFX and games share many of the same tools and techniques, with each industry complementing the other.

Fast-forward to Disney’s “live-action” The Jungle Book and the subsequent The Lion King. Director/producer Jon Favreau and his team developed brand-new techniques to help create those films. These techniques have in turn been used in the first live-action Star Wars television series, The Mandalorian.

To create the fantastical sets, landscapes and other backgrounds, the production brings the old technique of rear projection bang up to date by using giant LED screens to display highly detailed backgrounds. This allows filmmakers to shoot practically, whereas before you’d typically shoot on green or blue screens and replace the backgrounds in post-production. My biggest issue with the green/blue-screen approach is that the lighting never looks right. If a scene is set outside but shot indoors with green-screen compositing, it never looks real – the lighting is a dead giveaway. With this new virtual set system, the lighting is far more accurate and realistic, because the LED screens themselves light the actors and the set.
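To make the contrast concrete, here’s roughly what the post-production half of the old workflow boils down to: pull a matte from the screen colour, then blend a rendered background through it. This is a minimal, illustrative sketch in Python/NumPy – the key colour, tolerance and toy image plates are all my own assumptions, and real keyers (spill suppression, edge treatment, despill) are far more sophisticated:

```python
import numpy as np

def chroma_key_composite(fg, bg, key=(0, 177, 64), tol=80.0):
    """Blend a background plate through the screen-coloured areas of a foreground plate.

    fg, bg: (H, W, 3) uint8 arrays of the same shape.
    key:    approximate screen colour to key out (an illustrative value).
    tol:    RGB distance below which a pixel is treated as pure screen.
    """
    # Distance of every pixel from the key colour.
    dist = np.linalg.norm(fg.astype(np.float32) - np.asarray(key, np.float32), axis=-1)
    # 0 where the pixel is screen, ramping to 1 where it is subject.
    alpha = np.clip(dist / tol, 0.0, 1.0)[..., None]
    return (alpha * fg + (1.0 - alpha) * bg).astype(np.uint8)

# Toy plates: a grey "actor" on a flat green screen, composited over a sky gradient.
h, w = 480, 640
fg = np.zeros((h, w, 3), np.uint8)
fg[...] = (0, 177, 64)                  # the green screen
fg[180:300, 260:380] = (128, 128, 128)  # the subject
bg = np.zeros((h, w, 3), np.uint8)
bg[..., 2] = np.linspace(255, 80, h)[:, None].astype(np.uint8)  # blue gradient
out = chroma_key_composite(fg, bg)
```

Every pixel the keyer misjudges has to be fixed by hand, and none of this blending happens in camera – which is exactly the work the LED wall removes.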

The virtual set system obviously requires some VFX to be produced in pre-production to display on the giant LED screens, so this technique is bringing VFX and SFX together – blurring the line between practical and post.

What’s interesting is that the system uses Epic Games’ Unreal Engine, probably best known for the video games Unreal Tournament and Fortnite. Through Unreal Engine, filmmakers can change landscapes and environments on the fly. All of these techniques are thanks to ILM’s StageCraft virtual production team, Epic Games and Jon Favreau’s Golem Creations.
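ILM’s in-house StageCraft tooling isn’t public, but Unreal Engine’s editor scripting gives a flavour of what “on the fly” means in practice. Here’s a minimal sketch – assuming the engine’s Python Editor Script Plugin is enabled and using the UE4-era unreal.EditorLevelLibrary API – that reorients the level’s directional light (the virtual “sun”) and drops a new set piece into the scene between takes:

```python
# Runs inside the Unreal Editor with the Python Editor Script Plugin enabled.
import unreal

# Find the directional light driving the virtual set's sun.
actors = unreal.EditorLevelLibrary.get_all_level_actors()
sun = next(a for a in actors if isinstance(a, unreal.DirectionalLight))

# "Move the sun": a lower pitch drags the environment - and the light the
# LED wall throws back onto the actors - towards golden hour.
sun.set_actor_rotation(
    unreal.Rotator(roll=0.0, pitch=-15.0, yaw=120.0),
    teleport_physics=True,
)

# Spawn an extra set piece 8 m in front of the origin (engine units are cm).
rock = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.StaticMeshActor,
    unreal.Vector(800.0, 0.0, 0.0),
)
```

On the real volume this kind of change would be driven through dedicated operator tools rather than a raw script, but the underlying idea is the same: the environment is a live game level, not a locked render.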

It’s all very impressive, and I consider it such a game changer that, if they haven’t already, the entire team responsible should be given an AMPAS Scientific and Technical Award for filmmaking innovation ASAP. It’s certainly the most exciting FX technology that I’ve seen in the past 20 years.

The Mandalorian airs exclusively on Disney+, the new streaming service from Disney, and will be available to watch in the UK from the 24th of March (I can’t wait!).