Ms. Understanding VFX

An overview of the visual effects development and work done on the short film, Ms. Understanding.

I. Introduction

I began work on Damien Krzesinski's short film Ms. Understanding around two weeks before production was scheduled. To help match the aesthetic of the film's setting, the year 2100, I suggested having the characters interact with futuristic holographic phones. We also had some very ambitious goals in mind which, as we learned later, were perhaps a bit overambitious for the time we were allotted in post-production. This article details the development of the VFX shots that made it into the premiere.

II. Preproduction

Virtual production testing

During preproduction, I tested a few different ideas for visual effects. First, with the help of The Mix at George Mason University, I created a virtual production demo using only the hardware we had on hand. The goal of this test was to see if using a virtual environment could produce photorealistic results. Unfortunately, due to the time constraints, I was unable to put a viable demo together. However, I do plan on exploring virtual production much more heavily with better hardware. 

Working with the director, we instead settled on the concept of the actors using futuristic-looking transparent phones. This was my first attempt at such an effect, but during my preproduction research I estimated that, with the help of modern tools including AI, we could safely bet on the effect making it into the film before the festival.

I knew the effect had the best chance of working if we used a prop that gave the actors something to physically interact with. We settled on screen protectors, since they are transparent and roughly the same size as a phone.

I also spent part of preproduction finding the best way to track the screen so that an image could be overlaid on it. I stumbled upon the newly released (out of beta) GeoTracker for After Effects by KeenTools. The software looked incredibly promising and worked well in a brief test, albeit not with footage of the transparent screen protector.

III. Production

As both cinematographer and VFX supervisor, I unfortunately stretched myself thin and made decisions that did not help me down the road. For one, we drew Xs in marker on laminated paper as tracking marks. This was a "we'll fix it in post" moment, but due to time constraints we weren't able to fix it before the festival. I now realize the marks would not have benefited me later anyway, since the shots were static; they ended up serving only as an odd visual quirk on screen.

Another problem I only noticed in post was that Marcus, the actor playing Boris, used both hands when interacting with the screen protector. Had I been focused on the post-production workload, I would have asked him to use only one hand, which would have eliminated much of the rotoscoping work that had to be done later.

There was no previz for what the screen or its animation would look like, so the actors were left guessing what to do. In the future, I would work with the preproduction team to plan out the motions the actors would perform.

IV. Starting VFX

No one on the post team had worked with RED footage before, myself included, which brought a whole new set of challenges to overcome. The footage was shot on the RED Komodo 6K in anamorphic 2x mode at the MQ quality setting, netting ~1.3 TB of footage that had to be distributed among the members of the post team. Proxies were created, though in hindsight I am not sure why I did not compress them further. Now reduced to ~700 GB, the footage was put on a few hard drives and handed to the team.

If I were to start over, I would have worked with the editor immediately after the shoot to create far slimmer proxies. Additionally, SSDs, though more costly, improve the workflow so much thanks to their speed that I would stress their use, especially with high-bitrate RED RAW and similar codecs.
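For reference, a slimmer proxy pass could look something like the batch-transcode sketch below, written in Python around ffmpeg. Since ffmpeg cannot decode R3D files directly, this assumes the clips were first exported to a mezzanine format it can read; the paths and naming here are hypothetical.

    import subprocess
    from pathlib import Path

    SRC = Path("footage/mezzanine")   # hypothetical: debayered exports, not raw R3D
    DST = Path("footage/proxies")
    DST.mkdir(parents=True, exist_ok=True)

    for clip in sorted(SRC.glob("*.mov")):
        out = DST / f"{clip.stem}_proxy.mov"
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-vf", "scale=1920:-2",    # downscale to 1920 wide, keep aspect ratio
            "-c:v", "prores_ks",       # ProRes encoder
            "-profile:v", "proxy",     # ProRes 422 Proxy, the lightest profile
            "-c:a", "copy",            # pass audio through untouched
            str(out),
        ], check=True)

ProRes 422 Proxy at 1080p typically lands somewhere around 20 GB per hour of footage, far lighter than the ~700 GB we ended up shipping on hard drives.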

As we had less than two months for post, I started working on what VFX I could, based on the shots I knew the editor had essentially locked. Prior to Ms. Understanding I had only worked in Adobe After Effects, so I had not taken any other options into account. As it turns out, this was not the right call, and I wasted a great deal of time trying to coax After Effects into working.

The major issue I had with After Effects was performance. As many AE users know, it will use all the memory you throw at it. Since many of the shots I was working on were only a few seconds long and I was compositing against the raw video, timeline performance was atrocious. Hardware did not matter either: I tried first with my M1 MacBook Pro, then the GMU FAVS Mac Pro (12-core Xeon W, Radeon Pro W5700X 16 GB, and 48 GB of DDR4). Neither machine could handle the footage efficiently, and performance kept holding me back.

The second issue was that the AE GeoTracker plugin, although an amazing piece of software, has to reanalyze the footage every time you perform an action on it. This led to hours of waiting on analysis, a step that must complete before you can adjust the track of an object, in this case the transparent screen.

Believe it or not, I spent almost an entire month wrestling with After Effects. I had made very little progress, and the deadline to submit to the GMU film festival was fast approaching. After another weekend of frustration with AE, I decided to give Nuke, the industry-standard compositing software, a try.

V. Nuke

Nuke is a popular VFX compositing software which I had previously known very little about. After watching Hugo's Desk, a YouTube channel run by director and VFX supervisor Hugo Guerra, I was able to get a basic grasp on how I could use the software to accomplish what I needed.

Nuke was an immediate improvement in both workflow and performance. Additionally, I now had access to the same toolset Hollywood uses. Even better, GeoTracker by KeenTools was originally built for Nuke. Unlike the AE version, the Nuke version lets you save the analysis file, which meant I only had to analyze the footage once.

There were still a few hurdles to solve, but it was finally time to begin working with Nuke.

VI. Screen Replacement Breakdown

[Image: Screengrab of the in-progress Nuke comp]

[Image: 3D model of the screen protector, made in AutoCAD]

Here's how I comped the holographic display in this shot, in broad strokes (a sketch of the node graph follows the list):

1. Track the screen protector with GeoTracker, using the AutoCAD model of the prop as the tracking geometry.
2. Attach the holographic UI element to the tracked screen so it follows the prop's motion.
3. Rotoscope Marcus's hands and fingers so they read as sitting in front of the display.
4. Merge everything back over the plate and balance the result against the original footage.
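To make that structure concrete, here is a minimal Nuke Python sketch of the node graph. It uses only stock Nuke nodes; the actual GeoTracker setup was done in the GUI, so the tracked screen is stood in for here by a simple CornerPin2D, and all file paths, frame ranges, and names are hypothetical.

    import nuke

    # Background plate and the rendered holographic UI element (hypothetical paths)
    plate = nuke.nodes.Read(file="plate/shot010.%04d.exr", first=1001, last=1120)
    ui = nuke.nodes.Read(file="elements/holo_ui.%04d.png", first=1001, last=1120)

    # Stand-in for the tracked screen: pin the UI to the screen corners.
    # In the real comp this transform came from the GeoTracker node.
    pin = nuke.nodes.CornerPin2D()
    pin.setInput(0, ui)

    # Screen the UI over the plate so it reads as an emissive hologram
    holo = nuke.nodes.Merge2(operation="screen")
    holo.setInput(0, plate)   # B input: background plate
    holo.setInput(1, pin)     # A input: pinned UI element

    # Hand roto: use the matte to bring the clean plate back over the UI
    # wherever fingers should appear in front of the display
    roto = nuke.nodes.Roto()
    hands = nuke.nodes.Keymix()
    hands.setInput(0, holo)   # B: comp with hologram
    hands.setInput(1, plate)  # A: original plate (the hands)
    hands.setInput(2, roto)   # mask: hand roto alpha

    # Render the finished shot
    out = nuke.nodes.Write(file="renders/shot010_comp.%04d.exr")
    out.setInput(0, hands)

From there it is the usual integration work, grading and grain-matching the element so it sits naturally in the plate.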

VII. Cost Breakdown

In total, it cost $81.07 to rent a powerful machine, which, given the time savings, is not bad. Keep in mind this figure does not include the Nuke license, so after the one year is up I will most likely need to invest in a Nuke Indie license at $500 a year. That is easier to stomach than $3,299, but still expensive; the versatility of what you can do with Nuke, however, might make it worth it for some.

VIII. Conclusion

Working on Ms. Understanding was a great learning experience for me and, I hope, for everyone else involved. I'm looking forward to finishing the shots that couldn't be completed in time for the festival.

I will be working towards improving my visual effects skills with Nuke, and hopefully getting into more advanced techniques like set extension using projections.

Special thanks to:

Damien Krzesinski (director) and Emilee Hayward (producer) for the opportunity to work on this short.

The instructors and staff at the George Mason Film department.

Andrew Jorgensen and Evan Bowen at GMU Film, who provided help and guidance throughout all stages of this film during the semester. They let me work outside the box and gave valuable feedback and advice.

Professor Russell Santos, who introduced me to node based VFX compositing and provided guidance during his Spring 2023 session of FAVS 399, Visual Effects at George Mason.