Got some half-decent results with this:
https://github.com/pathak22/pyflow

For each frame:
- selected a few nearby frames (choice not critical, e.g. i-3 ... i+3 is fine, but some difficult frames will need more)
- computed a warped version of each nearby frame (warped toward the current frame), using their demo script (only changed the file names)
- averaged the current frame and its warped versions => a temporally denoised frame
- from the difference image (denoised - original), kept only the vertical pattern (per-column medians) and discarded the rest (rough sketch of the whole loop below)
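
In case anyone wants to try it, here's roughly what that loop looks like, assuming pyflow's coarse2fine_flow as called in their demo script (flow parameters copied from the demo). The frame naming scheme, the +/-3 window, and the final "add the pattern estimate back to the original" step are my assumptions, not something pyflow dictates.

```python
# Rough sketch of the per-frame loop, assuming pyflow is built
# (https://github.com/pathak22/pyflow). Flow parameters are the ones from
# pyflow's demo.py; frame filenames and the +/-3 window are placeholders.
import numpy as np
from PIL import Image
import pyflow

# flow options from pyflow's demo script
ALPHA, RATIO, MIN_WIDTH = 0.012, 0.75, 20
N_OUTER, N_INNER, N_SOR = 7, 1, 30
COL_TYPE = 0  # 0 = RGB input

def load_frame(i):
    # hypothetical naming scheme for the extracted frames
    img = Image.open(f"frames/{i:05d}.jpg")
    return np.asarray(img, dtype=float) / 255.0

def remove_column_pattern(i, radius=3):
    cur = load_frame(i)
    warped = []
    for j in range(i - radius, i + radius + 1):
        if j == i:
            continue
        nb = load_frame(j)
        # warp the nearby frame onto the current one; coarse2fine_flow
        # returns (u, v, im2_warped_to_im1)
        _, _, nb_warped = pyflow.coarse2fine_flow(
            cur, nb, ALPHA, RATIO, MIN_WIDTH,
            N_OUTER, N_INNER, N_SOR, COL_TYPE)
        warped.append(nb_warped)

    # temporal average of the current frame and its motion-compensated neighbours
    denoised = np.mean([cur] + warped, axis=0)

    # the difference holds the (negated) column pattern plus whatever the
    # warps got wrong; the per-column median keeps only the vertical part
    diff = denoised - cur
    pattern = np.median(diff, axis=0, keepdims=True)  # shape (1, width, channels)

    # adding the column estimate back to the original cancels the stripes
    # without blurring moving content (this last step is implied, not stated above)
    return np.clip(cur + pattern, 0.0, 1.0)
```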
Seems to work as long as the camera is moved horizontally (otherwise it's hard to tell the difference between pattern noise and vertical objects in the image). When the camera motion stops, the pattern reappears (need to pick different frames).
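
Not part of the workflow above, but one possible way to automate the "pick different frames" part: threshold the median horizontal flow magnitude and skip candidate neighbours that haven't moved sideways enough relative to the current frame. The 1-pixel threshold is just a guess.

```python
import numpy as np
import pyflow

# same flow options as in the sketch above (from pyflow's demo.py)
ALPHA, RATIO, MIN_WIDTH = 0.012, 0.75, 20
N_OUTER, N_INNER, N_SOR = 7, 1, 30
COL_TYPE = 0

def has_horizontal_motion(cur, nb, thresh_px=1.0):
    """True if nb is displaced horizontally enough from cur to be usable."""
    u, v, _ = pyflow.coarse2fine_flow(
        cur, nb, ALPHA, RATIO, MIN_WIDTH,
        N_OUTER, N_INNER, N_SOR, COL_TYPE)
    # the column pattern only separates from real vertical detail when the
    # frames are shifted sideways, so require some horizontal flow
    return np.median(np.abs(u)) > thresh_px
```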
Can you upload a few frames before and after your test image, graded in the same way? 15 JPEGs before and 15 after should be fine, just to run a quick test and have something to compare with. I've got this, but it's hard to judge when they don't have the same grading.