GhostFrameExtractor for Nuke: Part I – Concepts and Considerations

Revolutionizing Visual Effects Workflow with Ghost Frame

If you’re just looking for the Nuke Gizmo, you can download it on Nukepedia or GitHub. (links needed) Part II, where I cover how I made the Gizmo, is here. (link)

In the ever-evolving world of filmmaking, the pursuit of realistic and immersive visual effects has led to the adoption of cutting-edge technologies like LED volumes. While these massive LED stages offer directors, cinematographers, and actors a seamless way to integrate live-action performances with digital environments, they also present new challenges for visual effects artists.

Traditionally, live-action footage delivered to comp was shot on greenscreen. We’d spend a few hours keying it out and then move on to comping in our CG or matte paintings or whatever. But with the advent of volumes, everyone wants to shoot with the backgrounds already embedded in the footage. This is great for everyone except the VFX artist. There is still plenty of work to do on each shot, and we generally need the elements separated to do it. That means roto, and lots of it. The Mandalorian is famous for shooting on a volume, which made Mando’s armor look great, and then rotoscoping him out of every frame to change up the background a little or add volumetrics.

Shooting Ghost Frame:

The core concept is deceptively simple: instead of shooting at the standard 24 frames per second, the camera captures footage at 48 frames per second, with every other frame being a dedicated “matte frame” shot against a blue or green screen. This gives on-set VFX artists instant access to both the married footage AND the matte.
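To make the frame relationship concrete, here’s a minimal sketch of the interleave math, assuming hero frames land on the even 48 fps frames and matte frames on the odd ones (the parity is an assumption; your camera or recorder may flip it):

```python
# Sketch of the ghost frame interleave, assuming hero frames on even
# 48 fps frames and matte frames on odd ones (this parity is an
# assumption -- verify against your actual recordings).

def source_frames(hero_frame_24):
    """Map a 24 fps hero frame to its 48 fps source frames."""
    hero_48 = hero_frame_24 * 2     # the hero exposure
    matte_before = hero_48 - 1      # matte exposed 1/48 s earlier
    matte_after = hero_48 + 1       # matte exposed 1/48 s later
    return hero_48, matte_before, matte_after

for f in range(1, 4):
    hero, before, after = source_frames(f)
    print(f"24 fps frame {f}: hero = 48 fps frame {hero}, "
          f"bracketing mattes = {before} and {after}")
```

Notice that no matte frame ever lands exactly on a hero exposure; each hero frame sits halfway between two matte exposures, which is exactly the problem the rest of this post is about.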

To avoid driving everyone on set completely mad, you can color just the camera frustum rather than the entire wall.

Inside Stage 23 at Trillist. You can see just the frustum flashing blue.

Sky Ferren and Alex Coppedge wanted to test live on-set compositing to speed up the workflow and give directors, DPs, actors, and producers immediate access to VFX looks. They brought ITAKI on board, along with LUX and Trillist, to shoot a spec piece written and directed by Alex.

Alex and Sky

The idea was simple: shoot some ghost frame footage and deliver, in basically real time, both the footage and the Unreal project to on-set comp artists, who would do live composites bringing in all the elements we’d normally not see until much later in the process.

This definitely sped up the workflow, and some interesting challenges arose, including timecode handling, on-set editing and clip trimming, and color matching, along with some other things that Sky and Alex address in their upcoming white paper. (replace with link when available)

Some of the footage we shot that day.

The Post Workflow:

By shooting on a volume, we already get our lighting and camera move for free. But could we also get a usable matte?

At first glance one would think, “OK, great, we have the hero plate and a matte plate, we’re good, right?” But we have one big issue. If our hero is moving, as they tend to do, then the matte frames don’t exactly match our hero frames.

All of you Nuke artists immediately figured out a solution to this problem. We can simply drop Kronos on the matte footage and slow it down by half. This interpolates new matte frames that should match the hero frames.
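Here’s what that looks like as a quick node-graph sketch in Nuke’s Python API: split the interleaved 48 fps scan into hero and matte streams with TimeWarp lookup expressions, then point Kronos at the matte stream with a half-frame offset. The Read node name and the Kronos knob names are assumptions and may differ between Nuke versions, so treat this as a starting point rather than a drop-in script.

```python
# Minimal sketch, assuming an interleaved 48 fps Read node named
# 'GhostFrameScan' with hero frames on even frames and mattes on odd
# frames (both assumptions -- adjust for your plates).
import nuke

scan = nuke.toNode('GhostFrameScan')

# Hero stream: pick every even source frame.
hero = nuke.createNode('TimeWarp')
hero.setInput(0, scan)
hero['lookup'].setExpression('frame * 2')

# Matte stream: pick every odd source frame.
matte = nuke.createNode('TimeWarp')
matte.setInput(0, scan)
matte['lookup'].setExpression('frame * 2 + 1')

# Interpolate the matte stream back onto the hero timing. The knob
# names below are assumptions -- check them against your Nuke
# version's Kronos before relying on this.
retime = nuke.createNode('Kronos')
retime.setInput(0, matte)
retime['timing2'].setValue('Frame')
retime['timingFrame2'].setExpression('frame - 0.5')  # half-frame shift
```

The half-frame offset is the whole trick: every hero frame sits exactly between two real matte exposures, so the retimer is always interpolating at the midpoint.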

But not so fast! Kronos is great for quick timing effects, but it doesn’t do a great job of actually matching the real frame.

Generated Hero Frame

Kronos interpolated matte frame

But we are on to something, so let’s look at a few different ways of interpolating new frames. For this I generated some new footage that would help us better see what’s going on: some free stock footage that Grimes released a few years ago, which I married to free stock footage of a city after it burned down from wildfires.

Grimes stock video from her video “You’ll Miss Me When I’m Not Around”

Let’s review some of the tools we have for interpolating frames.

Interpolated using OFlow

OFlow is the older, not-so-fancy brother of Kronos. It uses motion vectors but tends to be splotchy. It’s very fast and is available in all versions of Nuke.

Interpolated using Kronos

Kronos is the go-to tool for retiming in Nuke. It also uses motion vectors, but tends to smear a bit. It’s very fast, but it’s only available in NukeX and Nuke Studio.

Interpolated using ABME

ABME (Asymmetric Bilateral Motion Estimation for Video Frame Interpolation) uses machine learning to estimate motion. It does exactly one thing: slow footage down by half, which luckily is exactly what we want to do. However, it’s impossibly slow, and the quality is just not that great. It’s available in the Cattery.

Interpolated using Twixtor

Twixtor is a paid plugin specifically designed for slowing footage down. I’m using version 8, which has their new DNN model. It’s super fast and does the best job by far.

Interpolated using Boris Continuum Optical Flow

Boris has a retiming tool in their Continuum collection called “BCC Optical Flow.” I’m pretty sure it was just an afterthought, because it does a pretty poor job of retiming no matter the footage. It uses motion vectors, but sometimes it will just explode them all over your footage.

Interpolated using Boris Sapphire Retime

Boris also has a retiming tool in their Sapphire collection called “S_Retime” which is a pretty good tool. I’d say it compares to Kronos and may even do a better job with certain types of footage.

So this seems like a pretty good place to quit, and honestly, if I had installed Twixtor 8 earlier on, I might have stopped here. But I got to thinking: maybe we can do better?

We actually have the missing frame. Maybe we can extract some information from it?
