Method for Getting High-Quality Photos from Crappy Lenses

Started by blade, September 30, 2013, 04:28:52 PM



1%

This would be awesome for those cheap macro rings... I'm downloading the "code" zip... but is there a functional implementation of this?

wolf


hookah

5D3, Sigma Art 35mm 1.4, Tamron 24-70mm 2.8 VC, Tokina 11-16 2.8, Canon 50mm 1.4 + 100mm 2.8 macro + 15mm

JCBEos

This is just very smart... and awesome!

My brain hurts a little after trying to understand those mathematics...

but what if we could apply this to a high-end lens, would this make a better super sharp lens???

My Youtube Channel -> comments and likes welcome ;)

g3gg0

Help us with datasheets - Help us with register dumps
magic lantern: 1Magic9991E1eWbGvrsx186GovYCXFbppY, server expenses: [email protected]
ONLY donate for things we have done, not for things you expect!

blade

Quote from: JCBEos on September 30, 2013, 09:02:05 PM
but what if we could apply this to a high-end lens, would this make a better super sharp lens???

As this is way out of my expertise, I can't say whether that's true; however, they present some shots taken with a normal lens:
Standard camera lens at f/4.5 (Canon Zoom EF 28-105 Macro) 

Hover over the picture to toggle the effect on and off... It looks great.

http://www.cs.ubc.ca/labs/imager/tr/2013/SimpleLensImaging/standard_lens_and_multispectral_results.html

It does seem to take some CPU power, though, so running it in camera seems to be a no-go. It might be combined with other post-processing like Dual ISO (again, I am out of my league to claim such a possibility as a real-world option).
eos400D :: eos650D  :: Sigma 18-200 :: Canon 100mm macro

1%

From what I gather, you would need to shoot their chart once for each lens and then post-process "normal" images with that calibration data. Seems very doable, and the results look good. I wasn't thinking in camera, but everyone is already doing PP anyway; what is one more step, especially on hazy pics you would otherwise toss?

maxotics

Yes, PP is the key.  If you can accept that, all kinds of things can be done (I doubt it would add much time to your workflow).  I used the AForge library a bit when I was doing panorama robotics, and it's amazing what it can do.  There is a fair amount of open-source code out there which could be integrated into ML PP.

AFAIK there are really only two major types of distortion, or problems: 1) different wavelengths of light travel at different speeds through glass, so they bend by unequal amounts, and 2) the image must fall on a flat plane (the sensor).  Lens makers pull every kind of mechanical trick with coatings, and then often add elements just to extend certain properties, like wide-angle, into the sensor cavity.  I don't see, theoretically, why you can't do what the article says.  In fact, I believe the camera makers are already doing something similar with pancake lenses.

Many image processors probably work mostly with 24-bit color instead of the Bayer data.  I believe you can be really effective with the Bayer data.  Also, the MLV format would allow us to save information for configuring later PP.

I would think if you locked the mirror up on many of the Canons, and made lenses that go into the cavity, and used software like that, you'd have not only a cheap RAW video solution, but a SMALL one--at almost every focal length!

Very cool stuff indeed!


SpcCb

Very interesting, indeed.
Looks like a Richardson-Lucy deconvolution.

We need a high-precision PSF for the camera + lens pair (it is sampling dependent), as could be measured in a lab.
And if the subject moves... you'd need a supercomputer to run the deconvolution iterations. ;)


PS: JCBEos > already done :) -> http://www.eos-numerique.com/forums/f67/plan-large-sur-la-californie-216558/
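SpcCb's Richardson-Lucy guess is easy to sketch in miniature. The following is a hypothetical 1-D demo (plain numpy, not the paper's actual cross-channel algorithm): blur a known signal with a known PSF, then run the classic RL multiplicative update to recover it.

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50):
    """Classic Richardson-Lucy deconvolution (1-D, numpy only)."""
    psf = psf / psf.sum()              # PSF must integrate to 1
    psf_mirror = psf[::-1]             # flipped PSF for the correction step
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Sharp test signal: two spikes; blur it with a box PSF, then recover.
truth = np.zeros(64); truth[20] = 1.0; truth[40] = 0.5
psf = np.ones(5) / 5.0
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf, iterations=200)
```

With a noiseless input like this the spikes re-sharpen nicely; real sensor data needs far more care (noise, boundary handling, and the per-channel PSFs the paper measures).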

arrinkiiii


1%

Quote from: SpcCb
We need a high-precision PSF for the camera + lens pair (it is sampling dependent), as could be measured in a lab.

So you can't compute the PSF at home?

SpcCb

Quote from: 1% on October 01, 2013, 01:21:58 AM
So you can't compute the PSF at home?
If you don't expect high-precision PSFs/results, it could be done "at home" with a special laser spot and optical analysis software like WinRoddier.
Or at night, using a star instead of the laser. Actually, I prefer to take data from a star; it's easier, even if turbulence etc. affects the readings, because the laser needs a very small spot to give good data, or it has to be placed very far away to produce a small, coherent spot.

This process is well known in astronomy to test optics or to get PSF and aberrations for post-process deconvolutions.
But I think it could look complicated for regular users (?).
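As a toy illustration of the "star as point source" idea (entirely synthetic data, not a real calibration workflow), a PSF estimate can be as simple as cropping around the brightest pixel, subtracting the background, and normalizing:

```python
import numpy as np

def psf_from_point_source(frame, half=8):
    """Estimate a PSF by cropping around the brightest pixel of a
    point-source (star) frame, removing background, normalizing."""
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    crop = frame[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    crop -= np.median(frame)           # crude background subtraction
    crop = np.clip(crop, 0, None)      # noise can go negative after subtraction
    return crop / crop.sum()           # PSF must sum to 1

# Synthetic "star": a Gaussian blob plus background level + sensor noise.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:101, 0:101]
star = 50 * np.exp(-((yy - 60) ** 2 + (xx - 45) ** 2) / (2 * 2.5 ** 2))
frame = star + 10 + rng.normal(0, 0.3, star.shape)
psf = psf_from_point_source(frame)
```

A lab measurement would of course do this per color channel and per image-field position, since the PSF of a simple lens varies strongly across the frame.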

ItsMeLenny

Chromatic aberration is caused by the fact that different colours have different wavelengths.
The cheap 50mm from Canon, despite having multiple elements, has some minor aberration.
Also, Holga makes plastic lenses, the whole idea of which is to get a shitty look.
However, I wonder if this technique would work on Holga lenses: being plastic rather than glass, they would have a different refractive index or whatnot.

SpcCb

Chromatic aberrations are caused by wavelength and by the refractive index of the lens elements inside the optic.
The higher the index, the smaller they are; and the longer the wavelength, the smaller they are.
Besides, in the real world, the longer the focal length, the easier it is to correct aberrations (because of the length of the displacement f/[angle]).

The optical equations work in all cases, but it is very hard to compute them for an optic with many elements; it depends on the optical formula.
If you look at the formula of a Canon DO lens, which contains plastic elements arranged according to the Fresnel lens principle, it is not as simple as an optic with a single BK7 lens. Same thing with high-end optics that have a low-dispersion front triplet or quadruplet plus IS and moving focus groups.
Plastic or BK7 or fluorite, and different wavelengths: that is not what makes it complicated. Those are just different values in the same equations.

But here, the approach to correcting those aberrations is different.
It's like "reverse engineering": you analyse how the optic "works" on a reference (the PSF) and compute the inverse calculation to recover the reference without aberrations.
So whether the optic has one lens or 20, the computation is the same.
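To put rough numbers on the wavelength/index point, here is a back-of-the-envelope sketch for a thin BK7 singlet, using the Cauchy dispersion approximation (the coefficients below are approximate textbook values for BK7, and the thin-lens scaling f ∝ 1/(n − 1) is a simplification):

```python
# Rough longitudinal chromatic aberration estimate for a thin BK7 singlet.
# Cauchy coefficients are approximate textbook values: n(L) = A + B / L^2,
# with the wavelength L in micrometers.
A, B = 1.5046, 0.00420

def n_bk7(wavelength_um):
    return A + B / wavelength_um ** 2

def focal_length(n, f_design=50.0, n_design=n_bk7(0.5876)):
    # Thin lens: f is proportional to 1/(n - 1); scale from the design wavelength.
    return f_design * (n_design - 1) / (n - 1)

f_blue = focal_length(n_bk7(0.4861))   # F line (blue)
f_red = focal_length(n_bk7(0.6563))    # C line (red)
shift_percent = 100 * (f_red - f_blue) / 50.0
```

For a nominal 50mm singlet this comes out to a focal shift of around 1.5% between the blue and red lines, which is exactly the longitudinal chromatic error that multi-element designs (or deconvolution) have to fight.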

g3gg0

hmm, the way i understand the PSF, it is comparable to a 2-dimensional CDMA code.
this code is the inverse function of the distortion the lens causes.

what i wonder now - what is the effect of the unavoidable sensor noise?
will it - like some single-frequency noise - get averaged out, so the noise is mostly cancelled?
or will it cause severe trouble for the recovery algorithm?
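g3gg0's noise worry is the classic failure mode of naive deconvolution: frequencies the lens attenuates are divided by a tiny |H|, so sensor noise at exactly those frequencies gets amplified enormously; regularized (Wiener-style) recovery damps those frequencies instead. A hypothetical 1-D FFT sketch (not the paper's method, which uses a more sophisticated prior):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
truth = np.where((np.arange(n) > 100) & (np.arange(n) < 156), 1.0, 0.0)
psf = np.zeros(n); psf[:9] = 1.0 / 9.0         # 9-tap box blur (circular)
H = np.fft.fft(psf)                             # lens transfer function

observed = np.fft.ifft(np.fft.fft(truth) * H).real
observed += rng.normal(0, 0.02, n)              # small sensor noise

# Naive inverse filter: divide by H; noise explodes where |H| is tiny.
naive = np.fft.ifft(np.fft.fft(observed) / H).real

# Wiener-style regularization: damp frequencies where |H| is small.
k = 0.01
wiener = np.fft.ifft(np.fft.fft(observed) * np.conj(H)
                     / (np.abs(H) ** 2 + k)).real

err_naive = np.sqrt(np.mean((naive - truth) ** 2))
err_wiener = np.sqrt(np.mean((wiener - truth) ** 2))
```

So the noise is not averaged out by the inversion itself; it has to be suppressed explicitly by the regularizer, at the cost of some lost fine detail.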

Shield

Maybe this software could save some of the crappy soft Sony a99 footage I took last year?  The a99 looked like the example on the "left".  ;D

g3gg0

i have matlab licenses, embedded coder, simulink, targetlink - but i don't use any of them :)
today we had a closer look at the sources with some colleagues.
we will perhaps try to generate C code from the scripts if we have some spare time.

i could access the licenses over VPN from home with my business laptop, but i don't have any free space for matlab on it :)

Doyle4

Saw this on a popular photo forum yesterday... I must say, as soon as I saw it I thought of the ML team, haha.

ItsMeLenny

I'd like to see it run on this video clip; I'm pretty sure they used an old lens to achieve the desired look, but I'd prefer it corrected and crisp.
Warning: Taylor Swift http://youtu.be/RzhAS_GnJIc

600duser

Astronomers take pictures in the most abysmal conditions imaginable. They figured out the future long ago.

Computational Photography for the win.....see Hubble deep field image for details !


Step 1 point

Step 2 press button

Step 3 your camera whirs quietly as it stacks a dozen or so frames at a thousand frames per second

Step 4 slowly increase the aperture of your smile as the 'quick preview image' re-crystallizes into gigapixel heaven

You could shove a crisp packet into a toilet roll and use it as a lens....for some astronomers that would be a welcome upgrade given the paltry number of photons they have to work with 24/7/365

Landscape Astrophotography
and with the right software and hardware the whole thing can be automated
http://www.youtube.com/watch?v=Rydg7JGTAbw



You can also get frame-stacking software for video. Since many cameras like the 600D can shoot video at 30 or 60 fps, if you want a good picture of something that isn't moving, a few seconds of 'shaky hand' video is all the data you need. Frame-stacking that video turns your budget camera into an EOS MK III annihilator! Heck, even a webcam can blow a $10,000 DSLR out of the water via frame stacking.
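The noise benefit of frame stacking is easy to demonstrate: averaging N frames of a static scene reduces random noise by roughly √N. A toy numpy sketch with synthetic frames (not real video, and it ignores the alignment step that shaky-hand footage would need):

```python
import numpy as np

rng = np.random.default_rng(42)
scene = np.tile(np.linspace(0, 1, 64), (64, 1))    # static test scene

def noisy_frame():
    return scene + rng.normal(0, 0.1, scene.shape)  # sensor noise, std 0.1

# Stack (average) 64 frames: noise should drop by about sqrt(64) = 8x.
stacked = np.mean([noisy_frame() for _ in range(64)], axis=0)

noise_single = np.std(noisy_frame() - scene)
noise_stacked = np.std(stacked - scene)
```

In practice the frames must be registered (aligned) first, which is where the astronomy-style stacking tools earn their keep.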

Computational photography exchanges photon capture time for money. When you combine frame stacking with focus stacking and HDR stacking, that photo-zoom scene in Blade Runner becomes a reality.

Future photographers will simply wave a wand about in the general direction of the subject and the 'puter' will do the rest.  Imagine a device the size and shape of a packet of Tic Tacs.

The future of photography ENJOY !

http://www.youtube.com/watch?v=KCO4hO7CQ6A


g3gg0

@600duser:
the thing the original poster talks about is totally different from drizzle and median stacking.
it is about lens correction. you'd better read the paper.

600duser

Did I mention the trouble with Hubble? I think I did.

Lens correction is often used by astronomers; nothing is stopping DSLR users from setting up calibration tests or even writing their own lens-correction code. I've done that myself for several projects.

It's good to see that Canon has included some basic lens-correction features with the bundled software.

Digital Photography is about to undergo a major revolution.  A few more doublings of sensor size/pixel density and CPU power and the barn doors get kicked open.

painya

@600duser
Mind sharing some of that correction code with the forum?
Good footage doesn't make a story any better.