Jamie Tongue
October 30th, 2013, 07:07 AM
I have a hypothetical/theoretical question about timecode. I am not asking for any practical reason, and I have no timecode-enabled equipment to test this on myself.
I'm pretty sure I understand the difference between timecode and genlock/sync: namely, that timecode alone does not force synchronised recording speeds between devices; it only stamps synchronised time numbers over the top of the recordings.
E.g., I am aware that if one jam-syncs the timecode of multiple devices at the start of shooting, then disconnects the timecode cables, after a while the timecodes (and thus the recordings themselves) can/will drift out of sync.
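To put rough numbers on that drift, here's a back-of-the-envelope sketch in Python. The frame rate and the ppm (parts-per-million) clock-error figures are purely illustrative assumptions, not the specs of any real device:

# Rough drift estimate for two free-running devices after a single jam-sync.
# The frame rate and ppm clock errors are made-up examples, not real specs.
FPS = 25.0                  # recording frame rate (frames per second)
CAMERA_PPM = 15.0           # hypothetical camera clock error, parts per million
RECORDER_PPM = -5.0         # hypothetical recorder clock error

def drift_frames(seconds: float) -> float:
    """Relative drift between the two devices, in frames, after `seconds`."""
    relative_error = (CAMERA_PPM - RECORDER_PPM) * 1e-6
    return seconds * relative_error * FPS

for minutes in (10, 30, 60):
    print(f"After {minutes:3d} min: {drift_frames(minutes * 60):+.2f} frames of drift")

At a combined 20 ppm relative error, the two devices end up nearly two frames apart after an hour, so the drift is slow but very real.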
But what happens when the timecode is constantly being fed, not just jam-synced once? Say, for example, you have a single audio recorder and a single camera, and the recorder is sending timecode to the camera continuously via cable. What happens when the camera 'tries' to drift out of sync? The timecodes cannot drift apart (as they would if only jam-synced at the start of shooting); that is the whole point of keeping the timecode cable connected, right?
I have been reading up online about this, and different websites seem to imply two different possibilities:
A) The recording of the slave device (the single camera, in my scenario above) will simply drift out of sync from the master (the single audio recorder), even though the timecodes remain matched throughout the recordings.
Or B) as the slave recording 'tries' to go out of sync by a full frame, it 'tries' to take the timecode with it, but it cannot, so it 'jump' corrects itself. E.g., as the camera 'tries' to run one frame slower than the timecode it is being fed, it forcibly corrects itself by adding a frame (and/or the opposite, of course); I've tried to sketch this idea in the toy code below. I have come across references to 'green flashes' in footage as a result of a slave camera 'trying' to drift from its incoming timecode. I guess these green flashes are the result of the camera 'adding' a frame (of green?) to force-correct itself to match the incoming timecode?
Or something else entirely?
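If B is roughly what happens, I imagine the logic would look something like this toy model. It's purely illustrative: the 30 ppm figure and the correction rule are my own assumptions, not how any real camera firmware works:

# Toy model of possibility B: a slave counts its own frames but keeps
# comparing against the incoming (master) timecode, and "jump" corrects
# by inserting a frame once it has fallen a full frame behind.
FPS = 25
SLAVE_ERROR = 30e-6                                 # hypothetical: slave runs 30 ppm slow

slave_frames = 0.0
corrections = []
for master_frame in range(1, FPS * 60 * 30 + 1):    # 30 minutes of master frames
    slave_frames += 1.0 - SLAVE_ERROR               # slave counts slightly slow
    if master_frame - slave_frames >= 1.0:          # a full frame behind the master
        slave_frames += 1.0                         # jump-correct: repeat/insert a frame
        corrections.append(master_frame)

print(f"Corrections in 30 min: {len(corrections)}, at master frame(s) {corrections}")

In this toy run the slave only falls a full frame behind once in 30 minutes, so the 'jump' corrections would be rare but visible, which would fit with the occasional flashed frames people describe.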