Genlock

Started by Ash McKenzie, July 24, 2012, 03:54:38 AM


Ash McKenzie

Just wondering: given the understanding of the sensor (FPS override etc.), is genlock/sync pulse in or out possible? Or even a custom setting of the clock?


a1ex

FPS can be slightly sped up / slowed down on the fly, so... maybe.

I'm not familiar with this protocol though; hopefully it's not too complex to decode. I also don't know what extra hardware is used.

Michael Zöller

Maybe just sync out? Depends on how stable a clock one can get from the EOS cameras...
neoluxx.de
EOS 5D Mark II | EOS 600D | EF 24-70mm f/2.8 | Tascam DR-40

a1ex

You can also fine-tune the clocks if you know the exact difference between the two cameras (not easy, but can be done right now).

Here's a tool that may help: http://www.g3gg0.de/wordpress/uncategorized/eos-timergen-tool/

3pointedit

Is it possible to tether 2 cameras to the same PC and match their frequency? I guess it would not stay that way for long? The issue is that while you can vary the frame rate dynamically, the only input is via buttons or PC(?).
550D on ML-roids

mkrjf

Never going to happen.
You might be able to implement an audio track carrying a genlock signal and feed the headphone out to the other device...

chmee

Are "we" able to read out the IR port? It's pulsed at about 36-38 kHz, so it could handle a simple VSync, including start/stop :)
phreekz * blog * twitter

daxiid

If accessing the IR port and using an external pulse generator were possible, the possibilities would be really great. A feasible RAW stereo 3D capture system comes to mind.


Theoretically it seems quite simple: if ML raw_rec gets a pulse from IR, capture one frame, and after another pulse, capture another. But my ignorance and stupidity could be endless.... ::)

dmk

Was this a dead-end?

a1ex

Not a dead end, but not a very easy task either. It's doable in my opinion, but not a priority for me. CHDK does it, and there are a few threads around with more details.

dmk

Thanks!

I'm starting to put the pieces of the puzzle together from this and other posts. Based on http://www.magiclantern.fm/forum/index.php?topic=16112.0 and http://www.magiclantern.fm/forum/index.php?topic=8912 and chmee's comments above, would the following work:

1) Write a module to parse the readout on the IR port

2) Given the target command over IR, tell the mlv_rec module to blink the LED

3) Use an Arduino or something to output IR to both cameras at a consistent frame rate around 24-30 fps. Something based on https://github.com/z3t0/Arduino-IRremote or http://sebastian.setz.name/arduino/my-libraries/multi-camera-ir-control/ might work

Is this the right idea? If IR is too flimsy, then since recording times in RAW are generally short anyway, it might be enough just to do that for start points, or maybe manually trigger a "re-sync" every couple of minutes. This assumes the above makes sense, of course.

EDIT: Is USB an alternative, or is it too slow? Something like https://github.com/felis/PTP_2.0 may come in handy...


dmk

Oh, it seems I misunderstood; the LED blink thing was just for outputting a signal for each frame grab, right?

Once recording is going (e.g. green in RAW mode), is there a way to sort of tell ML "reset the clock now"? I.e., so that when both cams get the IR signal, they'll forget about the current cycle and start over simultaneously?

dmk

Just getting my feet wet looking at the source... but how wrong am I in thinking that raw_rec_cbr_started() could be patched to wait for an external cue? Would that be enough for this to work (at least for a really good synchronized start, hoping drift won't happen past that point)?

a1ex

You can, but the timings will still not match, because of the LiveView clock (which is controlled by Canon). You can alter it (either speed it up or slow it down) with the FPS timers, but getting it right is not very easy.

Then, communication between tasks introduces additional delays: you may have to do the sync part directly in the LiveView task (vsync hook, or you may need a new one - see a1ex.bitbucket.io/ML/states/ ).

I'm not sure if enabling LiveView at the same time, or powering on the cameras at the same time, will sync the clocks. This needs to be tested. I think the second idea was already tested by others, but I'm not sure what the result was. However, since the startup process is not deterministic (lots of things happen in parallel), I don't expect the timings to be repeatable (but you have some probability of getting perfect sync).

I'm not sure how fast the half-shutter event from the remote port is received by the main CPU, since, to my knowledge, this is handled by the MPU (see MPU communication). But I think it's worth checking, especially if you place a hook in mpu_recv.

IR is probably a bit more difficult; I did some experiments a long time ago on 550D (see here), but it requires serious reverse engineering.

You may find better luck with some device where you can access the ADC directly (for example, the ambient light sensor in 5D2/5D3, see LightMeasure).

Or, you may also be able to sync the two LiveViews manually, using some flickering light. No need to code anything on this one, just play with FPS override until you get both images in sync. It can be automated as well, from the vsync hook.

dmk

Whoa... lots to digest... thanks for the in-depth feedback!

I'm still missing a few fundamentals. Is there a primer somewhere on the relationship between the recording rate and the LiveView clock? I would (naively) think that recording happens off one clock, and LiveView tries to keep up with it and drops frames as needed... but I guess I'm wrong :)

The last option sounds really good, especially if LiveView is truly a 1:1 match with what's being recorded. If it's good enough to the eye in LiveView, it's good enough to the eye on playback :D Though I'd imagine matching those on the LCD would be really hard... you'd probably need some dual-HDMI monitor that allows overlays, no?

Following up on that option: if one camera is going at 24 fps and I adjust the other to 24.5 fps until they match, won't they drift out of sync again? Can you explain a bit more about how you see automating it from the vsync hook (is the idea to grab the timing from one and print it to the screen for setting on the other)?

a1ex

The LiveView state machine is driven by the FPS override timers; see the VideoTimer page on the ML wikia. The main LV task is called Evf on recent cameras and LiveViewMgr on older models. Like many other things in Canon code, they use a state machine (EvfState/LVState). The exact mechanism is pretty complex and I don't fully understand it, but the vsync hook gets called once for every frame (the exact moment is unknown). Most CPU-intensive processing happens in other tasks. Overall, this design allows for precise frame acquisition timing, regardless of what processing we are running in the background (overlays, recording tasks, exposure tools, whatever).

Luckily, you probably don't need to understand all that; the interesting bits for this purpose are the vsync hook and the FPS timers.

LiveView is a downsized version of what's being recorded, and the so-called HD buffer is exactly what's being recorded in H.264. So, it should be fine for manual sync.

Indeed, the documentation on this is pretty sparse and incomplete.

dmk

Thanks for your help again!

Seems internal/manual will be much easier and better than an external cue. Is the following right?

Assume "sync value" is some measure such that two cameras with the same sync value can be said to have their sensor recording happening simultaneously.

Module: the config menu allows manual input of a target sync value. Upon recording, it automatically re-samples the internal sync value and adds to or subtracts from the FPS override until the internal value matches the target. Once it hits that target, it optionally turns FPS override off instantly (alternatively, this remains an ongoing process). It can output to the screen, or maybe blink the LED, to let us know when it hits the target the first time.

If that's wrong, or you have a better idea, I'd appreciate your insight. Similarly, if it's right, do you know the best method for getting that "sync value"? (E.g. (real time - vsync time) would be great!)