Through a strange (and wonderful) turn of events, I've ended up with two 5D3s in my hands, and I plan to have some fun with them for an upcoming short film. Things I'd like to play with include high dynamic range capture, stitching for double resolution, fusing lens effects, and fusing different parts of the EM spectrum. Most of these require close to pixel-level alignment of objects in the frame for good results. In this context, I'm not too worried about parallax error since most scenes would be shot at long distance, but inter-frame synchronisation between the two cameras is going to be an obstacle.
Based on examples I've seen (e.g. https://joancharmant.com/blog/sub-frame-accurate-synchronization-of-two-usb-cameras/), I think I'll need sub-millisecond synchronisation to get acceptable alignment and avoid excessive loss of detail/double images. At 25fps the frame period is 40ms, so with close to random initiation of recording I can expect anywhere from 0 to 20ms of misalignment to the nearest frame. Launching randomly (that is, random with respect to this temporal resolution), I make it that I'd need about 50 attempts to have a 90% chance of happening upon a sub-millisecond inter-frame difference between the two cameras' streams, which is not practical.
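For what it's worth, the hit-rate arithmetic can be sketched as follows (assuming a uniformly random offset folded to the nearest frame, and my 1ms tolerance):

```python
import math

frame_period_ms = 1000 / 25          # 40 ms per frame at 25 fps
max_offset_ms = frame_period_ms / 2  # offsets fold to the nearest frame: 0-20 ms
p_hit = 1.0 / max_offset_ms          # chance one random launch lands within 1 ms

# smallest n with 1 - (1 - p_hit)^n >= 0.9
attempts = math.ceil(math.log(1 - 0.9) / math.log(1 - p_hit))
print(attempts)  # 45
```

So roughly 45-50 launches for a 90% chance of a sub-millisecond gap, consistent with the estimate above.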
Two questions then arise: 1) how can I know whether I've achieved this, and 2) how can this hit rate be improved?
For 1), I can use the process described in the above link: set up a strobe synced to the frame rate and play with the duration such that I can align the resulting banding between the two cameras.
For 2), does anyone have any ideas about which of ML's features can be leveraged to get close to sub-millisecond synchronisation, such that I might only need a few attempts to get a very small gap between the two streams? I'm wondering if the pre-record option in the RAW video menu, combined with the recording trigger option, might ready the buffer such that on launching recording with a Y-split remote release cable or similar, the latency might be low enough to consistently get low single-digit millisecond offsets, from which point I could simply retrigger manually until I get something acceptable.
P.S. I'm optimistically assuming that one stream won't drift relative to the other over time, but I have no idea whether that will indeed be the case.