Sydney, Australia

Process: how I edited three short films at the same time

The first time I found myself in front of a program resembling a non-linear editor, the year was 2005 and its name was Windows Movie Maker.

Flash forward to the present and I now use more modern software, and have managed to cut three short films in the space of three months.

In this post I detail my workflow, the creative process of assembling each story, working with the directors, and the boatload of visual effects work I also performed on one of the shorts.

Hopefully this can give you some insight into how to manage your own technical workflow, and what to pay attention to when constructing stories.

Meet the films

MAAI: My Aaji and I

A granddaughter is plagued by serious regret for not having realised the consequences of her teenage angst.

  • Short, drama, 10 min 47 s
  • Location: Penrith, NSW
  • Shoot finish: 6 January 2019
  • Picture lock: 29 January 2019 (23 days)

Key crew

  • Director: Varuna Naicker
  • Producer: Marija Nikolic
  • Cinematographer: Vivian Tang
  • Cast: Shereen Nand, Uma Kali Shakti, Raj Bajpai

Technical detail

  • Camera: Blackmagic Ursa Mini 4.6K
  • Captured in: 4096 x 2160, 25 fps, ProRes 422 HQ
  • Delivery: 1920 x 1080, 25 fps, 2.35:1 Letterbox
  • Editing: Adobe Premiere Pro CC 2019

TNC: The Next Cycle

Body transfers, hologram, gunshots and blood. A cyberpunk thriller.

  • Short, sci-fi, 11 min 2 s
  • Location: Ultimo, NSW
  • Shoot finish: 11 January 2019
  • Picture lock: 31 January 2019 (20 days)

Key crew

  • Director: Harri Tran
  • Producer: Alexandra Patrick-Dunn
  • Cinematographer: Andreus ten Brink
  • Sound Designer: Furqan Cansiz
  • Cast: Talita Mollerup-Degn, Jierlyn Gregg, Jasper Bruce

Technical detail

  • Camera: Blackmagic Ursa Mini 4.6K
  • Captured in: 2048 x 1152, 50 fps, ProRes 4444
  • Delivery: 1920 x 1080, 25 fps, 2.35:1 Letterbox
  • Editing: Adobe Premiere Pro CC 2019
  • Visual Effects: After Effects CC 2019

AHTC: The Torchlight Collective

A comedy channeling The Office and Rostered On.

  • Short, mockumentary, 12 min
  • Location: Ultimo, NSW
  • Shoot finish: 22 January 2019
  • Picture lock: 5 February 2019 (14 days)

Key crew

  • Director, Producer: Ashleigh Hales
  • Cinematographer: Benjamin Gageler
  • Cast: Eli Gallagher, Adam Bowes

Technical detail

  • Camera: Sony FS5
  • Captured in: 4096 x 2160, 25 fps, Atomos Shogun, ProRes 422 HQ
  • Delivery: 1920 x 1080, 25 fps, 16:9
  • Editing: Adobe Premiere Pro CC 2019

Editing

How did you identify what your priorities in storytelling were?

In My Aaji and I (MAAI), it became clear that the change in Devika was what we most wanted to show. Her grandmother and the associated backstory are really only revealed as a catalyst to stimulate change within Devika herself. Realising that made it easier to navigate the film’s central interview scene, which had previously been scripted to delve deep into the grandmother’s life story. The grandmother’s exposition was enlightening but would likely have tested the audience’s patience.

In The Torchlight Collective (AHTC), the narrative priorities were not clear until the final cut stage. It took a few rounds of outside feedback to home in on the idea that Tom being hired as a new usher was the only story thread that mattered to our viewers. Thankfully it was still possible to explore the personalities of a few other characters (Ronald, Sam) without straying too far from the drama. The pitiful world of Larry needed very little touching up. His character shone through in the footage, and he guides the film from the first laugh to the last.

Editing The Next Cycle (TNC) posed a greater storytelling challenge. The film was shot so tightly on coverage (very limited takes) that the final product, thankfully, resembled a tight version of the script. My first draft cuts tried to protect the pace of moments like Mia’s city car ride, because they gave the audience the background to understand the rest of the world. Many of the film’s other slow, deliberate moments let the viewer go searching with their eyes and take in visual cues for what is happening. Despite my attempts to mediate the story for viewers unfamiliar with the universe, TNC in its final form largely follows what Harri Tran originally penned. Comprehending the body transfers and the general plot turned out to be a secondary task that only some viewers managed. As Harri remarked about his own first viewings of Blade Runner, it may take multiple viewings to understand, and for me it certainly did. The passion the characters hold for one another at various moments, shown through their intense stares and glances, stands out irrespective of the plot, or of the particular bodies they occupy at a given time.

What editing styles did you use?

Despite hailing from three distinct genres, the films all employed a fairly conventional narrative style. Each story had its own creative demands, but they shared a battle against the clock: at 10 to 12 minutes each, every film had to engage its audience concisely.

Continuity on set, and the need to cut significant parts of the script.

When we speak about coverage, we refer not only to the lines characters speak, but also to the repetition of their body movements, eye lines and pauses in speech. It is disorienting trying to piece together good performances when the performers change their movements from one shot to the next: certain shots or takes cannot be neatly joined while maintaining the illusion of one continuous space and time. Each film suffered its own suite of continuity issues, although MAAI definitely broke the most rules.

MAAI saw its five-page interview scene reduced to essentially one to two minutes of screen time. It took very careful slicing to join lines that were not originally said together, but were separated by minutes of diversions and other topics. Devika’s reaction shots, as well as B-roll footage of the pair interacting with photos, helped cover these joins.

To map exactly which dialogue lines were covered, and in which shots, I printed a modified version of the script and pencilled in lines to show where takes started and stopped. Looking over the pages, it was easy to see which moments had more coverage than others.

How did you balance the director’s vision with your own vision?

It was initially tough. For MAAI, I produced an assembly of all scenes before meeting director Varuna for a session. It gave me the space to work out the kinks in the footage and to learn the coverage, what could be combined and what couldn’t, so that when she asked for something, I was prepared. Editing sessions in person revealed many of Varuna’s ideas and visions, although at times it was tricky to translate them into actual editing decisions. I was also, at times, reluctant to take on suggestions. “Why don’t we try extending that first take?”, Varuna might say, while I knew that extending it any further would reveal a fumbled line or some other unusable moment. “No, that won’t work”, I might reply. A savvier me now knows the better response: simply try it. Sometimes you can only find out whether a cut works by making it. I decided to be more open and collaborative in the sessions that followed, and it paid off: Varuna and I grew more comfortable with each other, with the freedom to fail. We were able to draw the best from the footage we had to work with.

With AHTC, I took a different route and built the assembly from scratch with Ashleigh by my side. I anticipated that I wouldn’t be able to figure out her vision immediately and wanted to ease into it by working with her first. She knew her footage well and could remember where certain coughs, looks or side comments were made, and in which takes. As we cut each scene, the priorities became the looks, the jokes and the continuity of movement; but also simply bashing out a draft, getting the scenes down in order, and not agonising over details too early.

We did have occasional differences, mainly over the side characters. The script had been crafted with an ensemble cast in mind, with the pilot meant to plant the seeds for further episodes. Supporting characters in shows like The Office build the story’s environment and add realism, letting viewers discover that every worker in the company has their own element of ridiculousness. Ultimately, in a short film, some supporting characters only distracted from the narrative goal. They were generally other young ushers who spoke to the camera in talking heads, with quirky bites demonstrating their personalities. It felt brutal for Ashleigh to cut them, but there was clear value in doing so: the story became punchier and more intelligible.

With director Varuna Naicker. We were actually a boss team. (Photo: Sophia McGregor)

With director Harri Tran. (Photo: Stefan Varvaressos-Abdi)

With director Ashleigh Hales.

Workflow

How did you organise the material?

I created merged clips, organised by scene and named after their shots.

I preferred the Merged Clips workflow over Multicam or other methods in Premiere Pro. I created a single sequence for a whole day’s footage, inserted all video clips, inserted all audio, spread them apart to leave space in between, and used Premiere’s Synchronise (by audio waveform). Then, after trimming the ends, I created a Merged Clip named after the shot name on the slate. A Wacom tablet, a consistent mouse and hand position, and keyboard shortcuts for Merge, Link and Synchronise came in handy and turned a repetitive task into quite light work.

Sofi Marshall beautifully outlines a similar workflow on Frame.io; fundamentally, hers differs in using Multi-Camera Clips to automate the audio sync, which I did by hand. Timecode-based syncing was not possible on these projects, as none of them recorded sync timecode, for budgetary reasons (student films: who even knows what those long flashing numbers mean!).
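
Under the hood, syncing by waveform amounts to cross-correlating the camera’s scratch audio with the recorder audio and taking the offset where they match best. A minimal Python sketch of the principle (not Premiere’s actual implementation; the function and variable names are mine):

```python
import numpy as np

def sync_offset_seconds(camera_audio: np.ndarray, recorder_audio: np.ndarray,
                        sample_rate: int) -> float:
    """Estimate the offset (in seconds) that aligns the recorder audio with the
    camera's scratch audio. Positive means the recorder clip should be slid
    later (to the right) by that many seconds."""
    # Cross-correlate the two mono waveforms; the peak marks the shift at which
    # they overlap best. (Real tools use FFT-based correlation, since
    # np.correlate is O(n^2) on long recordings.)
    correlation = np.correlate(camera_audio, recorder_audio, mode="full")
    lag = int(np.argmax(correlation)) - (len(recorder_audio) - 1)
    return lag / sample_rate
```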

Sorting clips into scene bins reveals an almost perfectly chronological rendition of the film’s content. Scrubbing through the clips in scene view became, for me, like rapidly previewing the story, a stark difference from sorting clips by their arbitrary recording time or shoot day.

Merged clip names followed the format “2A/01 (B001/L002)”: Scene 2, Shot 2A, Take 01, Boom audio 001, Lapel audio 002. Including the audio track numbers made it easy for the sound editors to refer back to the original source audio clips when required. The Tape Name and Daily Roll columns made it easier to refer to the original source video, and to distinguish between clips that belong to the same scene but come from completely different days.
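
As an aside, the scheme is regular enough to script against. A hypothetical Python sketch of pulling the fields back out of a clip name (the field names are my own):

```python
import re

# "2A/01 (B001/L002)" -> scene 2, shot A, take 01, boom roll 001, lapel roll 002
CLIP_NAME = re.compile(
    r"(?P<scene>\d+)(?P<shot>[A-Z]+)/(?P<take>\d+) "
    r"\(B(?P<boom>\d+)/L(?P<lapel>\d+)\)"
)

def parse_clip_name(name: str) -> dict:
    match = CLIP_NAME.fullmatch(name)
    if not match:
        raise ValueError(f"Unrecognised clip name: {name!r}")
    return match.groupdict()

print(parse_clip_name("2A/01 (B001/L002)"))
# -> {'scene': '2', 'shot': 'A', 'take': '01', 'boom': '001', 'lapel': '002'}
```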

I later discovered, when delivering to the sound editors, that the forward slash was a poor choice for clip names: files produced by Premiere’s Send to Audition cut off all characters preceding the slash, turning clip names like 2A/01 (B001/L002) into file names like L002 (1).wav, which was just awful. Lesson learned: avoid slashes. It’s an obvious rule for filesystems everywhere, but it didn’t seem like it would be a problem for Premiere clip names. Perhaps 2A-01 would suffice, with the associated boom and lapel numbers stored another way (metadata fields, ideally native WAV metadata rather than XMP, so they appear in DAWs).
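
If I did it again, a small sanitising helper at naming time would have avoided the problem entirely. A hypothetical sketch:

```python
def safe_clip_name(shot: str, take: str) -> str:
    """Build a filesystem-safe clip name, e.g. safe_clip_name("2A", "01") -> "2A-01"."""
    name = f"{shot}-{take}"
    # Whitelist letters, digits, hyphen and underscore; swap anything else out.
    return "".join(c if c.isalnum() or c in "-_" else "-" for c in name)
```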

Syncing video and audio directly in the timeline.

The resulting merged clips with their name format and their new home in Scene bins.

Brief look at the sound spreadsheet.

While syncing or watching back all the rushes, I’d add markers after action is called, and at any moment where action stops or something renders the take unusable. This was invaluable on longer recordings when I needed to search for lines or coverage quickly.

Markers to indicate unusable moments in red.

I gave each revision of the film its own number during editing. After producing Edit 1 and showing it to a director, I duplicated the sequence and made my changes in Edit 2. That way, Edit 1 was preserved and could be returned to at any time. Similarly, Premiere Pro project files were incremented every few hours or so, and at least once per day, so that moving between computers wouldn’t introduce overwriting problems or destroy an arrangement of clips made an hour earlier in a previous version.
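
A small script can take the human error out of that incrementing convention. A hypothetical “save up” helper (the file naming pattern is assumed, not the project’s actual one):

```python
import re
import shutil
from pathlib import Path

def bump_version(project: Path) -> Path:
    """Copy e.g. MAAI_Edit_v07.prproj to MAAI_Edit_v08.prproj and return the new path."""
    match = re.search(r"_v(\d+)$", project.stem)
    if match is None:
        raise ValueError(f"No _vNN suffix in {project.name}")
    version = int(match.group(1)) + 1
    new_name = f"{project.stem[:match.start()]}_v{version:02d}{project.suffix}"
    new_path = project.with_name(new_name)
    shutil.copy2(project, new_path)  # the previous version stays untouched
    return new_path
```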

Moving between my workspace at home and the labs at UTS was an exercise strictly controlled by the compatibility police. Premiere Pro is by design not backwards compatible, and my CC 2019 project files were not directly editable in CC 2018 on the UTS workstations. Josh Cluderay’s downgrader tool came in handy, and the simplicity of the project structure meant there were no major hurdles in converting. Relinking media was simple, as Premiere can maintain two media paths per clip, one for OS X and one for Windows, simultaneously.
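
Such downgraders work because a .prproj file is essentially gzipped XML, and downgrading amounts to lowering the version number stamped inside it. A rough Python sketch of the idea, assuming a Version attribute near the top of the XML (the exact attribute and safe target values vary between releases, so treat this as illustrative only):

```python
import gzip
import re

def downgrade_prproj(src: str, dst: str, target_version: str = "36") -> None:
    """Rewrite the project-level version attribute so an older Premiere will open it.

    Assumes the project is gzipped XML carrying a Version="NN" attribute,
    which is broadly how community downgrader tools operate.
    """
    with gzip.open(src, "rt", encoding="utf-8") as f:
        xml = f.read()
    # Only touch the first Version attribute (the project-level one).
    xml = re.sub(r'Version="\d+"', f'Version="{target_version}"', xml, count=1)
    with gzip.open(dst, "wt", encoding="utf-8") as f:
        f.write(xml)
```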

What strategy did you have to balance three projects?

Clear dates and delegation. It was also never destined to be a time-management failure, because I knew the shoots were taking place at different points in the summer.

I set an editing ‘goal’ date and a picture lock date. The editing ‘goal’ was the day we anticipated having a complete version of the story in a shape we were happy with. We met the editing goal on all three projects. Staying on time in editing allowed the other stages of post-production to take place without intense pressure. Sound design on The Next Cycle, for example, needed three weeks of its own, and was still largely unfinished by the time of the screening.

Visual effects

What were the goals of visual effects?

They were meant to augment the virtual environment Harri was trying to create. They also served as critical plot cues, indicating to the audience which part of the story’s time continuum we were actually in. They had to be believable while still feeling like part of a futuristic world.

Elements: hologram interfaces, computer screens, glowing eyes and building augmentation.

What kind of planning went into them?

The elements were included in all of Harri’s early storyboard drawings. In December, we produced a spreadsheet outlining 19 shots that included one or more of the above elements. He knew that to create a believable hologram emanating from a character’s wrist, some kind of practical light was needed during the recording. His research into motion graphics also prompted us to use tracking dots as frequently as possible, although they turned out to be unnecessary in a lot of places, where regular tracking of moving objects or manual rotoscoping was all that was required.

A collaborative spreadsheet to manage the various VFX final products.

What resources or materials did you use, like 3D models, original drawings or additional shots?

For the computer interfaces, I made use of a lot of Creative Commons-licensed and public domain imagery from Wikimedia Commons. Images of brains and lungs were taken from medical diagrams and illustrations, with care to choose designs with sharp contrast, clear outlines and a visual style that is neither cartoony nor modern. Curves, invert, glow and opacity blend modes gave the images realism when composited on top of and around the actual video plates.
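
Blend modes are just per-pixel arithmetic. Screen, for example, brightens the plate with the graphic without ever clipping to white, which is why glows sit so naturally over video. A sketch in NumPy, assuming both layers are float arrays scaled 0-1 (names mine):

```python
import numpy as np

def screen(plate: np.ndarray, graphic: np.ndarray, opacity: float = 1.0) -> np.ndarray:
    """Screen blend: result = 1 - (1 - a) * (1 - b), mixed back by layer opacity."""
    blended = 1.0 - (1.0 - plate) * (1.0 - graphic)
    return plate + opacity * (blended - plate)
```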

The stages of opacity mode blending for an image of a brain. (Diagram credit: Frederik Ruysch, 1744)

Opacity mode blending for the lung graphic during the cerebral transfer scene, combined with particle effect stock videos. (Diagram credit: Patrick J. Lynch, 1987-2000)

The cerebral transfer interface, the computer program shown running in the background while it performs the body transfer, was one of the most critical visual effects because it is presented full screen to the audience. The design went through at least two iterations after I realised I could play with the idea of surrealist medical technology and link it symbolically to the physical art direction (the various cords and wires surrounding Adam), rather than settling for a basic, nondescript progress bar or a generic computer ‘hacker’ interface.

The first graphic interface I produced, which almost became the locked-off final shot before I realised I could do better.

Drafts for a new design, more medical and clinical, involving a database but also elements of body organs, heartbeats and clearer key text elements that don’t compete for attention.

The final interface created from the skeleton drafts.

How did you manage to integrate the visual effects with the rest of the film?

The visual effects shots began in Premiere, where I selected the clips and replaced them with After Effects compositions. This meant the exact portions of video used in the picture lock appeared as their own comps. Compositing and masking then began, and eventually the compositions were exported as ProRes 4444 files. Because they were exactly the correct duration, I could bring them back into the edit and place them over the top on a separate video layer.

Dynamic linking was impossible as I worked across my own machine and the UTS workstations. Again, CC 2018 and CC 2019 act like estranged identical twins, separated by some kind of family dispute, refusing to recognise each other’s files in spite of only minute differences in how they structure them. Rendering compositions out as ProRes 4444 files became the way to go, but involved a fair bit of manual labour. The VFX spreadsheet, with its file names, timecodes and descriptions of practical and digital effects, helped me determine exactly what needed to be done to a particular shot. For the final export, After Effects’ command-line rendering tool aerender.exe and a .bat file helped automate the task. Colour grading of the final clips in Resolve, performed by colourist Christopher Kolkaris, gave most shots a great deal of visual authenticity.
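
The .bat file simply looped aerender over each composition. The equivalent idea in Python, using aerender’s documented -project, -comp, -OMtemplate and -output flags (all paths, comp names and the output module template here are placeholders, not the actual project’s):

```python
import subprocess
from pathlib import Path

AERENDER = r"C:\Program Files\Adobe\Adobe After Effects CC 2019\Support Files\aerender.exe"
PROJECT = Path(r"D:\TNC\VFX\TNC_VFX.aep")      # hypothetical project path
OUT_DIR = Path(r"D:\TNC\VFX\renders")           # hypothetical output folder

COMPS = ["S03_hologram", "S07_transfer_ui", "S12_eyes"]  # hypothetical comp names

for comp in COMPS:
    subprocess.run([
        AERENDER,
        "-project", str(PROJECT),
        "-comp", comp,
        "-OMtemplate", "ProRes4444",  # output module template saved in AE beforehand
        "-output", str(OUT_DIR / f"{comp}.mov"),
    ], check=True)
```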

Did the other two films (MAAI & AHTC) use visual effects?

Yes, but not to the same extent. In AHTC, I edited out a camera lens smudge and added a vignette to Larry’s finger-puppet torchlight. In MAAI, I masked out the appearance of a boom mic several times, and combined two shots of Devika and her father to perfect the continuity. The method is simple and can be achieved in either Premiere or After Effects: duplicate the video clip, mask out only the section you want to use (Devika’s side, camera left), feather that mask, then watch carefully to ensure it flies by unnoticed.
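
Conceptually, the combine is a feathered alpha blend between two aligned plates. A toy NumPy sketch of a vertical seam with a soft edge (frames as height x width x 3 float arrays; names mine):

```python
import numpy as np

def split_screen(left_plate: np.ndarray, right_plate: np.ndarray,
                 split_x: int, feather_px: int = 40) -> np.ndarray:
    """Take camera-left from one plate and camera-right from the other,
    with a feathered seam so the join flies by unnoticed."""
    height, width, _ = left_plate.shape
    x = np.arange(width)
    # 1.0 left of the seam, 0.0 right of it, ramping linearly across the feather.
    mask = np.clip((split_x + feather_px / 2 - x) / feather_px, 0.0, 1.0)
    mask = mask[np.newaxis, :, np.newaxis]  # broadcast over rows and channels
    return left_plate * mask + right_plate * (1.0 - mask)
```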

(1) A combined example: both shots are overlaid. The original shows Devika having already entered and sat on the step; however, her seated position comes too early for continuity.

(2) The final corrected shot: Devika enters while her dad is still speaking.

Lessons

What were the main lessons learned across all projects?

Audiences seem to care substantially more about the development of a character than about most other considerations. Avoid developing too many characters at once if it diminishes how much we learn about any one of them.

Camera tests and equipment familiarity before the actual shoot will save time on set, because you will not be learning the gear as you go. Camera tests on TNC would have revealed that 50 fps was wholly unnecessary and that 4K or RAW would have been fantastic: the reverse of what we decided in pre-production. Even if most AE compositions could hypothetically be created from 1080p video clips, 4K originals would have assisted stabilisation, reframing and the accuracy of tracking. RAW originals would have helped correct mistakes made in camera, especially since Blackmagic’s new BRAW codec, which provides RAW at far more economical file sizes, was available to us.

What is a takeaway piece of advice?

Limit yourself and delegate tasks. If you feel you are the only person capable of performing a specific task, you will end up better off if you take time out to teach that task to someone else.

By the evening AHTC finished shooting, I was fiercely adjusting the picture lock for MAAI and TNC, and knew it wouldn’t have been productive to sit through two days of footage from a completely unrelated project and start syncing it. I brought on second-year UTS student Zac Agius, spending a couple of hours with him on the process of syncing and labelling. It took a fair bit of theory and teaching about what sync sound is, but because that contextualised each step, he was able to work independently and finish the rest without much guidance. A hands-off approach to teaching, using words and occasional pointing instead of clicking the mouse yourself, was beneficial and helped him build muscle memory.

Nobody was interested in filling the role of data wrangler on set for MAAI. I wrote a PDF guide with screenshots and provided guidance to producer Marija Nikolic and her co-producer Connor McGlynn. They both executed the wrangling flawlessly using Hedge. This saved me from having to be on set and was clearly worth the time spent on education.
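
The core promise of a tool like Hedge is the verified copy: hash the source, copy it, hash the destination, and only call the card done when the digests match. A toy Python illustration of that principle (Hedge itself uses faster checksums and does much more):

```python
import hashlib
import shutil
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash a file in 1 MB chunks so large camera files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def verified_copy(src: Path, dst: Path) -> None:
    """Copy src to dst and raise if the bytes that landed differ from the source."""
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)
    if file_digest(src) != file_digest(dst):
        raise IOError(f"Checksum mismatch copying {src} -> {dst}")
```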

What’s next?

While the director of each film is interested in refining it further, I’m off on holiday and then beginning work on a TV production as a data wrangler. Stay tuned for some workflow posts and cool tricks about how its multi-camera 4K footage will be handled.

Where can we watch the films?

My Aaji and I is considering festival routes and options over the next 6-12 months, so there won’t be any public screenings for a while.

The Next Cycle is available online now on Vimeo.

The Torchlight Collective is a concept episode of a web-series, but remains offline at this time. Episode 2 is under development.