Syncing Dailies

In 2011, hand syncing of dailies seems downright anachronistic. Doesn’t timecode make all that trivial? Yes, with digital cameras, automatic syncing is standard practice. But this inevitably involves two clocks, and that means they’re subject to drift. It doesn’t take much drift to put you out of sync by a frame or two. Production is supposed to jam (synchronize) its clocks several times a day, but in the heat of battle that doesn’t always happen. The result is that picture and sound slowly drift out of sync.
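To get a feel for how little drift it takes, here’s a back-of-the-envelope sketch. The 10 ppm clock error is an assumption for illustration (real gear varies widely), not a measured spec:

```python
# Rough sketch: how fast does clock drift push picture and sound out of sync?
# Assumes a 10 ppm frequency error between the camera and recorder clocks --
# an illustrative figure, not a spec for any particular device.

FPS = 23.976          # project frame rate
DRIFT_PPM = 10        # relative clock error, parts per million (assumed)

def frames_of_drift(hours):
    """Frames of picture/sound offset accumulated after `hours` of shooting
    without re-jamming the timecode clocks."""
    seconds = hours * 3600
    offset_seconds = seconds * DRIFT_PPM / 1e6
    return offset_seconds * FPS

for h in (1, 4, 8):
    print(f"After {h} h: {frames_of_drift(h):.2f} frames out of sync")
```

Even at this modest error rate, a full shooting day without a re-jam puts you several frames out, which is why checking slates still matters.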

In my editing rooms, we always check sync using slates, and resync if necessary. This takes time, but sync starts with dailies. If you’re in sync there, you have a shot at staying in sync further down the food chain.

Media Composer allows us to sync in two ways. First, you can use AutoSync to merge audio and video clips. If your clips are pre-synchronized, load them into the source monitor, select the video or the audio, and subclip to separate picture and sound. Then mark the slates and use AutoSync to merge them again.

Second, and even better, you can use the Perf Slip feature to sync to the nearest 1/4 frame. Perf Slip is slick and quick but it comes with some limitations. You have to turn on film options when you first create your project — even if you never plan to touch a frame of film. It only works in 24 or 23.976 projects. And it only works on subclips. It comes with a couple of other minor limitations, as well, but I used it successfully on my last Red show, and wouldn’t want to be without it.
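Perf Slip gets its quarter-frame resolution from the film model it’s built on: 35mm 4-perf film has four perforations per frame, so a one-perf slip is a quarter frame. A rough sketch of that arithmetic (my own illustration of the math, not Avid’s internals):

```python
# Sketch: what a perf slip means in time, assuming 35mm 4-perf film,
# where 1 perf = 1/4 frame. Illustration only, not Avid's implementation.

PERFS_PER_FRAME = 4

def perf_slip_ms(perfs, fps=24):
    """Milliseconds of audio slip produced by slipping `perfs` perforations."""
    frames = perfs / PERFS_PER_FRAME
    return frames / fps * 1000

for p in (1, 2, 3, 4):
    print(f"{p} perf = {perf_slip_ms(p):.1f} ms at 24 fps")
```

So a single perf is about 10 ms of slip, which is well under the threshold where most viewers notice lip sync drifting.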

Either way, you’ll have to check every slate by eye. That’s trivial, right? You just line up the visual slate closure with the sound clap and you’re all set. True, but many slates are ambiguous. How you handle them is crucial to good sync. When we worked with film there was plenty of debate among assistant editors about this. Today, it’s a lost art. Here’s my interpretation.

First, you can’t sync properly without checking at least three frames — the frame where the slate closes, the frame before it, and the frame after it. Only with that context can you understand what happened at the slate closure. There are three possible cases.

Case 1 — Normal

In the first frame, the slate is clearly open; in the second, it’s clearly closed; and in the third, it’s closed as well. That’s the standard situation: no ambiguity, no blurred images. We make the assumption that the camera is making its exposure in the middle of each frame. In frame one, the slate is open. In frame two, it’s closed. So the slate hit somewhere between those two exposures. In the images below, the waveform of the clap is lined up at the head of frame two. That’s as close as we can get.

Case 2 — Blurred but Closed

Here we see a blurred frame 2. To decide where to put the audio clap, we have to examine that blurred image carefully. Did the slate close while the shutter was open? Notice that within the blur you can see both the top and bottom of the closed slate: the shutter was open when the slate closed, and the camera captured an image of the closed slate within the blur. The audio clap goes in the middle of that frame.

Case 3 — Blurred but Open

Here, the second frame is blurred, but if we look closely, the slate remains open. The camera captured the slate in motion, but not in its fully closed position. The first closed frame is frame 3, so we sync between frames 2 and 3.

Syncing with this kind of accuracy takes work — blurred slates are always somewhat ambiguous. But if you look carefully, you can generally assign all slates to one of these three cases. If you’re syncing to the nearest frame, you won’t be able to achieve this much precision, but at least you’ll know what you’re aiming for.
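If you like to think of it procedurally, the three cases boil down to a simple rule table. This is just my own summary of the logic above (the frame labels are invented for illustration); judging what a real blurred slate actually shows still takes eyes:

```python
# Sketch of the three-frame slate logic. The three arguments describe what
# you see in the frame before the closure, at the closure, and after it.
# The result says where the head of the audio clap belongs, in frames
# measured from the head of frame 2. (My own summary, not an Avid feature.)

def clap_position(frame1, frame2, frame3):
    """Values for each frame: 'open', 'closed', 'blurred_open',
    or 'blurred_closed'."""
    if frame1 != 'open' or frame3 != 'closed':
        raise ValueError("not a clean three-frame closure -- check by eye")
    if frame2 == 'closed':           # Case 1: clean closure
        return 0.0                   # head of frame 2
    if frame2 == 'blurred_closed':   # Case 2: slate closed within the blur
        return 0.5                   # middle of frame 2
    if frame2 == 'blurred_open':     # Case 3: blurred but still open
        return 1.0                   # head of frame 3
    raise ValueError("ambiguous slate -- check by eye")

print(clap_position('open', 'blurred_closed', 'closed'))  # prints 0.5
```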

Keep in mind that in a 24-frame environment, the camera is typically exposing for about 1/50th of a second, and that the exposure occurs in the middle of a frame that’s displayed for 1/24th of a second. With that idea in mind, you should be able to sync as precisely as anyone ever did in a film editing room.
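As a sanity check on those numbers (assuming a 180-degree shutter, which is where the roughly 1/50th figure comes from):

```python
# Frame vs. exposure timing at 24 fps with a ~180-degree shutter (assumed).
frame_ms = 1000 / 24       # each frame covers ~41.7 ms of real time
exposure_ms = 1000 / 48    # shutter open ~20.8 ms (about "1/50th")
gap_ms = (frame_ms - exposure_ms) / 2   # unexposed time on each side

print(f"frame: {frame_ms:.1f} ms, exposure: {exposure_ms:.1f} ms, "
      f"centered with {gap_ms:.1f} ms on either side")
```

In other words, each image samples only the middle half of its frame’s duration, which is why assuming the exposure sits at mid-frame is a reasonable model.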

If you’re interested in more Media Composer techniques like this, check out my new book, Avid Agility. You can find out more about it here on the blog, or at Amazon.


15 Comments on “Syncing Dailies”

  1. Loren Says:

    Steve –

    Ah, the good old days of film and the blurred slate pretending to be both open and closed, proving Heisenberg’s Uncertainty Principle! I usually treated these as frameline synch- between worlds– but not always ;-)

    Great tip, as usual. You should indicate where it is in your book– this one is on Page 95.

    I think Avid should release this feature to all sorts of projects and just call it Slip Audio. For example, in FCP, you can load audio into the Viewer and slip against image down to 1/100th of a frame (called SubFrame Synchronization)- without subclipping, either!

    Query: Have you used PluralEyes for any double-system synchup where you have scratch audio available from camera? PE compares audio waveforms to synch up. Not infallible, but a lot of editors praise it as a huge timesaver. Have you tried it or found a need for it?

    – lsm

  2. Steve Says:

    I haven’t tried PluralEyes, but I just checked their web site. It’s currently in beta for MC. They have a little, well-produced video that shows how powerful it would be for musicals, lining production up with playback and doing in seconds what used to take a music editor hours and hours. But I’m not sure how much of a timesaver it would be when adjusting dialog sync by a couple of frames.

    For those of you who are curious, the video is here:

    Thanks for the heads up, Loren!

  3. Randy Lee Says:

    One potential issue that I see when using Pluraleyes — and one that isn’t often discussed — is that if you look at the waveform from the on-cam mic of DSLRs, often the waveform shows the clap as much as 2 frames before the slate actually closes. I’ve had to make similar adjustments on almost every DSLR-originated piece I’ve cut. Don’t get me wrong, Pluraleyes is great, and will get you close, but it’s by no means a substitute for doing it yourself and making sure things are done right.

  4. Steve Says:

    That was a concern that I had, as well. Some cameras don’t record in sync!

  5. Chris Jackson Says:


    Great article. I’m one of those assistants who spent the pre-Avid days hunched over the bench light-table debating the blurred slates with my fellow assistants.

    My approach to sync is slightly different than yours. Going back to the moviola days: since the film in the camera/projector gate has to actually stop still and wait for the shutter to open while the sound moves continuously, I never wanted the sound modulation to hit the sound head before the shutter opened, so this meant having the clapper modulation in the middle of the frame instead of on the frame line. It seems more distracting to the eye to hear the sound before seeing it. An editor I worked with years ago had a theory about people when speaking: with the exception of the “m” “b” & “p” sounds, people often open their mouths before the sound is heard, therefore it is better to be a hair late than a hair early with sound.

    If you are lucky enough to have multiple cameras where you can see both slates you will sometimes find that if you sync one up perfectly the other slate will be slightly off. This illustrates the difference between the always continuous moving sound versus the stopping and starting of the film through the camera gate. If I am unsure about where to put the modulation on a blurry slate I will usually go into the take and try to find a “b” or “p” sound, then I will lay that big fat blob on the waveform in the middle of the frame.

    Of course it has now occurred to me that I am talking about projects shot and/or projected on film. With digital maybe the modulation on the frame line is ideal. The debate continues!

    The important thing is for assistants to know of the perf slip function and use it, since telecine is often a full frame or two off… and usually early, in my experience.

    Oh, and your book is awesome.


  6. Steve Says:

    Thanks, Chris. For “normal” slates, there’s always been a debate about clap-in-center vs. clap-between, and it used to be a NY vs. LA thing. The logic I lay out here is a bit agnostic, but the main thing, as you say, is that somebody is checking.

  7. Chris Says:

    I’m an LA guy, but now that you mention it, it was a NY editor that showed me “proper sync” on a moviola!

  8. Alain Says:


    What I mostly do is to check the sync after the offline is finished. Saves a lot of time since you only need to check those clips that are in the edit. The ‘Find Sources’ command is great for this as it highlights the subclips used in the edit. I color code those after checking and correcting if necessary (3-5% of the shots are wrong).

    I also use the Perf Slip command for this but I never slip 1 or 2 perfs, always 1 frame (3 or 4 perfs). This is because on the features I worked on, audio is always conformed to the original multitrack files in ProTools using Pyramix or Titan. That means if you slip 1 or 2 perfs, TC wouldn’t change, so those corrections wouldn’t show up in the EDL. The waveforms of the offline and confo would be slightly different so this would just lead to confusion.

    Of course the ideal solution would be to import all tracks, and sync and edit with all of them. But I prefer to edit with the 2-track on-set premix and use the isolated tracks when necessary.

    Excellent book by the way, Steve.

  9. Steve Says:

    Thanks for the compliments.

    Regarding sub-frame syncing, if your sound editors are using Titan, then you should be able to sync any way you want. There will be an initial subframe discrepancy with the EDL, but Titan lines up waveforms and will slip the conformed tracks to match your guide track down to the sample.

    I wish sync were totally automatic and always perfect. But we aren’t there yet. The human eye is still needed!


  10. Alain Says:

    Ok learned something new then, thx !!!

  11. Matthew Gilna Says:

    For adjusting sync by whole frames on already synced clips, I’ve found the easiest way is to drop the clip into a new sequence, use the segment mode to adjust the sync, and then just auto-sync the sequence, which turns it into a subclip.

  12. Nisha Demir Says:

    Hello please help,

    I have problems with the workflow in Avid Media Composer 5.5. I shoot on a Sony EX1 with separate audio. Now I want to autosync the audio and video files.
    I worked with AMA. Is it possible to autosync with AMA, or do I need some software?

  13. […] MM:  Hmm, think I disagree – MC is only frame accurate either way?  I guess I’m saying I’d prefer auto sync as it changes the MXF timestamp relatively, enabling PT WAV to retain it’s true pos ref.  Manual sync doesn’t change the MXF orig timestamp (correct?) so PT WAV will move with it:  out of sync to nearest frame.  We feel safer that way coz our WAV now matches the AAF, but it (& the AAF) are now both fractionally out of sync.  This is all putting aside the additional ambiguities & potential for human error that are possible with manual sync. […]

  14. robgwilson Says:

    I think the only thing being left out here is how to read the TC on the Smart Slate in addition to judging by clap/waveform. I’ve found that when you sync by source timecode, often you can be left with the issue of the Smart Slate reading TC a frame later than the TC on your audio tracks. I have followed Steve’s rules, but I augment them by making sure the TC on the slate matches the TC on my audio tracks.
