Timecode for ML Users (and abusers)

Started by dfort, March 24, 2016, 06:09:06 PM



dfort

In the late '90s a friend working as a post production supervisor at Sony Pictures called to hash over a problem. He had a project that was being shot on the brand new Sony DVCAM format. The idea was to shoot four different stories with four cameras without stopping and put them all on the screen at the same time. The plan was to shoot just one take and make no cuts. The problem was that the cameras kept drifting and the timecode wouldn't match up, so they tried to get it to sync by cutting out a frame here and there--and they were already on their second week of shooting. They finally resolved it on the sixteenth take. The name of the movie?

Timecode

So if a major studio with the resources of a giant multi-national corporation had problems with timecode, there's a good chance you're going to encounter some issues too. Let's get started.


I borrowed this image from an old post on the editingstandards site, a long-abandoned site started by a couple of very talented assistant editors. I'll keep referring back to it.

Recently I've been having conversations with some Magic Lantern users and developers, and it seems that most people here already have a basic understanding of what timecode is (a standard maintained by the Society of Motion Picture and Television Engineers), how it is displayed (hours:minutes:seconds:frames) and some of the ways timecode might be used in an MLV workflow, so I won't bore you with the basics. Instead, let's jump into the areas where most people seem to be having difficulties.

Look at the two timecode displays in the upper right of the image. Notice one is counting in 24fps and is called "Record Time" while the other is counting in 30fps and is called "Tape Time." Now a quiz--what fractional second is the timecode displaying? Well, the 30fps timecode is showing 10 frames and the 24fps timecode is showing 8 frames, so 10/30 or 8/24 will give you the answer--0.33333... seconds.

This sample was originally shot on 35mm film at 24fps and transferred to NTSC 29.97fps videotape in a telecine bay using a 3:2 pulldown. The timecode on the videotape is non-drop frame. I'll get into why we don't use drop frame timecode a little later but first look at the extra number to the right of the video timecode. It is both a 1 and a 2 interlaced together. These numbers are burned into the separate video fields identified as upper/lower or field 1 and field 2. There are other ways that these fields can be displayed without having to add extra characters to the timecode display. One popular method is to use a period as the delimiter between the seconds and frames for field 1 and a colon for field 2 so in the above example one field would be 01.10 and the next 01:10.

Our cameras have a setting that allows us to shoot either 50p or 60p (actually 59.94), the "p" standing for progressive or full frames, so they are not interlaced (limited to 1280x720 on ML supported cameras and up to 1920x1080 on the 7D Mark II). So--it is understandable that if two fields make up one frame, that frame is identified as a frame in the timecode, but why isn't every progressive frame identified as a frame in the timecode? Some developers think that because we're in the digital age we should be able to change timecode to accommodate this higher framerate material, and indeed some systems do stray from the SMPTE specifications, but there are reasons why the standard that was established about 50 years ago hasn't changed yet. One reason is that the broadcast signal is still interlaced. Now I'm not an engineer and have not worked in broadcasting so you might want to run a fact check on that. The bottom line is that even when we work with 50p and 60p media, we have to stick with 25 and 30fps timecode.
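To make the frame-pairing idea concrete, here's a little Python sketch (my own illustration, not code from any ML tool) of how a 59.94/60p frame index could map onto 30fps non-drop timecode, with two progressive frames sharing each timecode frame much like the two fields in the image:

Code:
def tc_30_from_60p(progressive_frame):
    """Map a 0-based 60p frame index to 30fps non-drop timecode plus a field-style indicator."""
    tc_frame = progressive_frame // 2      # two progressive frames share one timecode frame
    field = 1 + progressive_frame % 2      # 1 or 2, like the field number burned into the image
    ff = tc_frame % 30
    ss = tc_frame // 30 % 60
    mm = tc_frame // (30 * 60) % 60
    hh = tc_frame // (30 * 60 * 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d} (field {field})"

print(tc_30_from_60p(20))   # 00:00:00:10 (field 1) -- 20/60 s, the same 1/3 second as the quiz above
print(tc_30_from_60p(21))   # 00:00:00:10 (field 2)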

dfort

Trivia question--when you're watching a movie that's being shown on a film (not digital) projector, how many times per second is the projector flashing images on the screen? The obvious answer would be 24 (unless it is an old silent movie) and that might be true with home movie projectors but that's not how it works with 35mm and 70mm film projectors. The shutter actually has two blades in a "butterfly" pattern and each film frame is flashed twice on the screen. Why? Because 24fps would cause too much annoying flicker.

This is important to understand because modern digital projectors and HD television sets also refresh at rates higher than the 29.97 NTSC or 25 PAL frame rates. Once again referring to the sample image, notice that the key number field on the lower right has an interlaced A & B. I won't go into how the NTSC film-to-video 3:2 pulldown works, there's plenty of information online that explains it, but what is important to understand is that TV sets have screen refresh rates that are multiples of the frequency of the line current, so B and D frames from 24p material will be on the screen longer than A and C frames. The thing to keep in mind is that even with 24fps media the screen refresh rate is running at 60, 120 or even 240Hz on some of the higher end displays.

What does this have to do with timecode? Quite a lot, because broadcasting in NTSC is still done at multiples of 29.97 (not a whole number if you do the math: 30/1.001, 60/1.001 or 120/1.001) and SMPTE engineers developed a timecode standard called drop frame timecode so that NTSC video timecode doesn't drift from real time at the rate of about 108 frames per hour. The way it is done is that the timecode jumps two frame numbers every minute (no frames are actually dropped, just numbers) except for every tenth minute. This makes the timecode accurate to "real" time. Well, almost--it is still off by about 86 milliseconds per day or approximately ½ second per week.
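For anyone who wants to see that numbering rule in code, here's a rough Python sketch of drop frame counting (just the arithmetic, not anything pulled from a real tool; the semicolon is the usual way drop frame is displayed):

Code:
def to_drop_frame(frame_count):
    """29.97fps drop frame timecode for a 0-based frame count."""
    fps = 30                        # nominal rate used for the frame numbers
    per_min = fps * 60 - 2          # 1798 frame numbers in a minute that drops
    per_10min = per_min * 10 + 2    # 17982 frame numbers per 10 minutes (minute 0 keeps all 1800)
    tens, rem = divmod(frame_count, per_10min)
    skipped = 18 * tens             # 2 numbers x 9 minutes per full 10-minute block
    if rem >= fps * 60:             # past minute 0 of the current block
        skipped += 2 * ((rem - fps * 60) // per_min + 1)
    n = frame_count + skipped       # put the skipped numbers back in
    return f"{n // (fps * 3600):02d}:{n // (fps * 60) % 60:02d}:{n // fps % 60:02d};{n % fps:02d}"

print(to_drop_frame(1799))    # 00:00:59;29
print(to_drop_frame(1800))    # 00:01:00;02 -- the ;00 and ;01 numbers are skipped
print(to_drop_frame(17982))   # 00:10:00;00 -- every tenth minute keeps all its numbers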

While drop frame timecode might be useful for broadcasters trying to keep a tight schedule, it introduces lots of problems in post production. In the 20+ years I've worked in editorial I have seldom seen any delivery requirement that specified drop frame timecode on videotape. In fact, for digital delivery I have never seen drop frame timecode. In addition, 24p video, even when adjusted to 23.976 for NTSC broadcast, doesn't have a drop frame timecode standard.

The message here is that unless you have a very specific reason to use drop frame timecode you should probably avoid it like the plague.

Back in the days of videotape we used to edit copies of the master tapes with timecode burned into the image area. After making an "offline" edit we would create an edit decision list (EDL) that was used to "online" the master tapes. What is important to understand is that in addition to timecode we also need the tape number in the EDL. Note once again in the image in the first post there is a tape number on the upper left side of the frame. We don't use tape these days, but in addition to timecode we need some other piece of information--file name, shot name, shoot date--something that can uniquely identify each clip. This is especially important if you want to go back to your original MLV files and extract DNG frames for color grading.

Ok--so the most obvious way to assign timecode to DNG frames is to start with frame 00000, give it a timecode value of 00:00:00.00 and keep incrementing the timecode with the frame numbers. Right now that's what is being done with MLVFS and MLP. I'm not sure if the timecode in the DNG files is being used by the editing systems, but it's a good place to start, albeit redundant because the timecode always matches the frame number in the DNG name.
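If you want to see what that incrementing looks like, here's a trivial Python version (my own sketch; the DNG name in the comment is made up):

Code:
def ndf_timecode(frame_index, fps=24):
    """Non-drop timecode for a DNG frame index, starting at 00:00:00:00."""
    ff = frame_index % fps
    ss = frame_index // fps % 60
    mm = frame_index // (fps * 60) % 60
    hh = frame_index // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# e.g. frame 100 of a hypothetical 24fps sequence (something like M24-1234_000100.dng)
print(ndf_timecode(100))   # 00:00:04:04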

There are other ways to use timecode. Back to the sample image note that the audio timecode along with the sound roll is burned into the lower left of the frame. In the days before we went all digital we used to rebuild the audio from the original tapes much the same way that videotape was "onlined" from the master tapes. In an MLV workflow it might be possible to sync an audio file from a separate recorder to the WAV file extracted from the MLV camera original and use the audio timecode from the field recorder.

It is also possible to add additional timecode tags in DNG files (up to 10 according to the Adobe specifications) so we could have a separate timecode tag that starts with the time that the clip was recorded. Of course this will not be frame accurate but it will be within about a second. This would help editorial to match the timecode of the audio recorder or even multiple cameras. Of course auto syncing technology like PluralEyes or what is built into Premiere and Final Cut Pro is much easier but I've had it fail on several occasions and would have welcomed anything that would help with the sync.
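For the curious, each of those (up to ten) DNG timecodes is an 8-byte SMPTE value. Here's a rough Python sketch of how one entry might be packed, based on my reading of the SMPTE 12M layout the DNG spec points at--double-check against the specifications before relying on it, and actually getting the bytes into a DNG would still need a TIFF/DNG-aware writer:

Code:
def pack_smpte_timecode(hh, mm, ss, ff, drop_frame=False):
    """Pack HH:MM:SS:FF into an 8-byte entry like those in the DNG TimeCodes tag
    (BCD digits; drop-frame flag assumed here to be bit 6 of the frames byte)."""
    def bcd(value):
        return ((value // 10) << 4) | (value % 10)

    frames = bcd(ff) | (0x40 if drop_frame else 0x00)
    return bytes([frames, bcd(ss), bcd(mm), bcd(hh), 0, 0, 0, 0])  # last 4 bytes: binary groups

# one entry for an edit timecode and one for a rough time-of-day stamp
print(pack_smpte_timecode(0, 0, 1, 10).hex())    # 1001000000000000
print(pack_smpte_timecode(14, 32, 7, 0).hex())   # 0007321400000000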

Other uses for timecode? How about taking four cameras set to UTC (Coordinated Universal Time, a.k.a. GMT or Greenwich Mean Time, a.k.a. Zulu Time), shooting parallel movies happening in different parts of the world, then showing them all on one screen? I'll bet you wouldn't even notice any shift in sync. Ok--that might be a crazy idea but maybe it will spark others to start thinking about timecode and whether or not it belongs in their MLV workflow.

Addendum #1 - Record Run and more on Time of Day

There's a lot more to timecode than what fits on a couple of forum posts and it was inevitable that something important got left out.

I have already covered time of day timecode but what I failed to mention was that if there was anything that could benefit from drop frame timecode it would be time of day (a.k.a. TOD). Does that mean that I'm changing my tune about avoiding drop frame timecode like the plague? Not really, at least not in the way that TOD might be implemented in cameras that don't really have built-in timecode capabilities. Remember, we're using the camera's internal clock to create the timecode in post. That means we can only be accurate to within a second at best. Forget about being frame accurate. Drop frame timecode starts to behave differently from non-drop frame after one minute, when it skips two timecode frame numbers. What do you gain from that adjustment when you are already off by perhaps 24 frames or maybe even more?

One more thought on TOD timecode: we could achieve frame-accurate timecode by shooting a timecode slate and manually entering the numbers in post. This would be especially helpful if your audio recorder can also run in sync with the timecode slate. Again, you don't really want to be using drop frame timecode in this situation and it isn't necessary to sync the timecode to "real" time. This type of timecode is called free run timecode because once it starts you just let it run.

It used to be that you needed an expensive timecode slate to do this:

But now there are apps that can run on your phone or tablet:

Record run is another method of using timecode that I forgot to mention earlier. The image I kept referring to in the first post used record run timecode in the telecine bay. Basically, if you string together all of the shots in the order they were shot, the timecode would be contiguous--in other words, unbroken timecode. This used to be important when we were using video decks that needed pre-roll in order to make an accurate edit, but it can also be used to uniquely identify all of the shots done in one day, or on one card, using timecode alone. It also helps when determining in what order the clips were shot.

To implement record run timecode in an MLV post workflow, the MLV shots would first be sorted by creation date, then the timecode applied to each DNG frame in order.
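A rough Python sketch of that idea (the folder layout and naming here are hypothetical--one folder of DNGs per clip, something like what MLVFS or MLP produce--and in practice you would write the timecode into each DNG rather than just print it):

Code:
from pathlib import Path

def ndf_timecode(frame_index, fps=24):      # same helper as in the earlier sketch
    ff = frame_index % fps
    ss = frame_index // fps % 60
    mm = frame_index // (fps * 60) % 60
    hh = frame_index // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

fps = 24
running = 0   # frames assigned so far across the whole day/card

# sort the clips by creation (modification) time so the timecode follows shooting order
clips = sorted(Path("card01").glob("M??-????"), key=lambda p: p.stat().st_mtime)

for clip in clips:
    dngs = sorted(clip.glob("*.dng"))
    print(f"{clip.name} starts at {ndf_timecode(running, fps)}")
    for i, dng in enumerate(dngs):
        tc = ndf_timecode(running + i, fps)   # contiguous "record run" timecode
        # ...write tc into this DNG's timecode tag here...
    running += len(dngs)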

Addendum #2 - Back to the Future

Digital audio was in use in the film industry years before digital video. The early digital audio workstations (DAW) would use a videotape with a timecode track to "chase" the sync. Video decks with timecode were expensive so many sound editors would use 3/4 inch U-matic or VHS videotapes and lay down a timecode track on one of the audio channels. It worked quite well as long as the tape was playing. Jogging frame by frame or shuttling the tape was out of the question.

Laying down a timecode track on a DSLR is also possible. There are devices that are designed to do this, like the Lockit Buddy and the Tentacle Sync.



There are also phone and tablet apps that generate audio timecode which can be recorded to one of the audio channels of your camera.


In post production the audio can be transformed back into timecode. This is something that is built into the Avid while other systems may require third party software.

It isn't a perfect solution. SMPTE timecode requires a sync signal to line up the timecode at the start of every video frame and that isn't possible with Canon cameras that are supported by Magic Lantern.

One more time, referring back to the image at the top of the first post--notice the multiple timecode windows. There could be another one showing drop frame timecode so we can see the actual running time in an NTSC broadcast. Multiple timecode tracks are also possible in DNG and QuickTime. According to the Adobe specifications, up to 10 timecodes are possible in DNG.

It also looks like Adobe took some liberties with the SMPTE specifications. In fact if you shoot some 50 or 60 fps video and look at it in Premiere you'll see that the frame counter goes from 00 to 59. I'm not familiar with FCPX but in FCP 7 the timecode could be displayed in either 30 or 60 frame mode according to Apple's FCP documentation. In addition, Apple allows negative timecode values in QuickTime so they also deviated from the SMPTE specifications.

So what does this mean for Magic Lantern users and developers? Should you stick to the SMPTE specifications or take liberties like Apple and Adobe? You can have it both ways--in fact you can have it ten ways.

DeafEyeJedi

Nice to learn that I happened to be one of the 'abusers' and am actually glad I'm reading this to gain a better understanding of what's at stake when it comes to TC.

Thanks for starting this thread as it's definitely much needed and worth the knowledge as we progress along.
5D3.113 | 5D3.123 | EOSM.203 | 7D.203 | 70D.112 | 100D.101 | EOSM2.* | 50D.109

arrinkiiii


Danne

Read this post of yours twice going on my third...

Quote
approximately ½ second per week
8)

DeafEyeJedi

Quote from: dfort on March 24, 2016, 06:19:50 PM
Other uses for timecode? How about taking four cameras set to UTC (Coordinated Universal Time, a.k.a. GMT or Greenwich Mean Time, a.k.a. Zulu Time), shooting parallel movies happening in different parts of the world, then showing them all on one screen? I'll bet you wouldn't even notice any shift in sync. Ok--that might be a crazy idea but maybe it will spark others to start thinking about timecode and whether or not it belongs in their MLV workflow.

I actually like this concept ... That way we can use 3-4 DSLRs all running ML and recording at the same time (maybe a few seconds off in between if I have to push the record button on each DSLR) unless I have 1-2 partners on set with me to make it more precise--but then it wouldn't matter because this concept of yours would solve this issue, correct? [emoji1]
5D3.113 | 5D3.123 | EOSM.203 | 7D.203 | 70D.112 | 100D.101 | EOSM2.* | 50D.109

flostro

It would be great to have TC in your converted files. It already works with raw2cdng and Resolve.
But I would like to get it to work with MLRawViewer. It uses ffmpeg, which has an option to set timecode, but I haven't figured out how to pass the TC info from the MLV into the hh:mm:ss:ff format that ffmpeg's -timecode option needs.
Another great thing would be to pass all the metadata to the converted files (ISO, shutter speed, lens, etc.) even if it's just burned-in text, like in the image from the first post of this thread.

dfort

The MLRawViewer developer abandoned the project: http://magiclantern.fm/forum/index.php?topic=9560.msg135613#msg135613

You should take a look at MLVFS and, if you're on a Mac, MLP; both are in active development and will save the metadata from MLV files into the DNGs. You don't need burn-ins to see the metadata. To read the metadata in MLV files, use mlv_dump with the -m and -v options; for DNG files, use exiftool.
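If you want to script that, something like this works (a Python sketch calling both tools; the file names are made up, and the tag names are the ones exiftool uses for the DNG timecode and frame rate fields):

Code:
import subprocess

# dump MLV metadata (the -m and -v options mentioned above)
subprocess.run(["mlv_dump", "-m", "-v", "M24-1234.MLV"])

# read the tags written into a converted DNG, including the timecode if it's there
subprocess.run(["exiftool", "-TimeCodes", "-FrameRate", "-DateTimeOriginal",
                "M24-1234_000000.dng"])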

Oh--and staying on topic, MLVFS and MLP save timecode to the DNG files. They both work fine with 23.976 MLV files. I haven't thoroughly tested the other video frame rates yet.

dfort

Added to second post: Addendum #1 - Record Run and more on Time of Day

Figured it is best to keep tacking on any additions to the first couple of posts rather than have them lost in the discussion.


DeafEyeJedi


Quote from: dfort on March 25, 2016, 04:37:48 PM
Figured it is best to keep tacking on any additions to the first couple of posts rather than have them lost in the discussion.

Agreed!
5D3.113 | 5D3.123 | EOSM.203 | 7D.203 | 70D.112 | 100D.101 | EOSM2.* | 50D.109

dfort

I just posted Addendum #2 - Back to the Future in the second post.

Hope this is useful information for users and developers.