Quality difference between the new 10/12bit raw and good old 14 raw

Started by Levas, November 27, 2016, 10:00:20 PM

Levas

Hello pixelpeepers,

I was wondering if people have already seen/tested the difference in video quality of 10-bit Magic Lantern raw vs 14-bit Magic Lantern raw.
I used 10-bit raw today in the real world. I'm used to color aliasing, and in most cases I have no problem removing it.
But the footage I shot today had color aliasing that was very hard, or almost impossible, to remove.
Could it be explained by the difference between 10-bit and 14-bit - could that cause more color aliasing?
Or was it just a coincidence, since I shot a lot of bricks today while visiting an old castle :P


dfort

Which camera, and which video mode? There's less aliasing in crop modes and on the 5D3.

Levas

Canon 6D, so I'm used to some color aliasing.
Just wondering if there's a scientific explanation for why 10-bit could create more color aliasing.
There are fewer color gradations in 10-bit... but it could also be that there were just too many bricks in my shots today.
So I'm curious whether other people see any difference between 10 and 14 bit.

a1ex

[linked to an EOSHD forum thread comparing 10-bit and 14-bit raw]

Levas

Real-life shooting, so lots of family in the video  ;D
But I have found a frame without people in it, uploaded it to:
http://drive.google.com/drive/folders/0B1BxGc3dfMDaRmtKc2tOa3dHMTA?usp=sharing

I only shot 10bit that day, so I don't have any frames from that day in 14 bit to compare it with.

Thanks for the interesting link to the forum on EOSHD.
Good to see some comparisons between 10 and 14 bit. 

a1ex

Definitely not caused by the bit depth. Trim 2 bits to get 8-bit linear raw => this DNG.

BTW, why not try some super-resolution algorithms on this footage?
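The bit-trimming step mentioned above can be sketched in a few lines of Python (the sample values are made up for illustration):

```python
import numpy as np

# Dropping the two least-significant bits of 10-bit raw samples
# leaves 8-bit linear raw data, as in a1ex's DNG experiment.
raw_10bit = np.array([0, 5, 512, 1023], dtype=np.uint16)  # 10-bit samples

raw_8bit = (raw_10bit >> 2).astype(np.uint8)  # keep the top 8 of 10 bits

print(raw_8bit.tolist())  # → [0, 1, 128, 255]
```

Any aliasing still present after this reduction clearly wasn't introduced by the extra precision, which is the point of the test.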

Levas

The color moiré/aliasing is still the same in the 8-bit DNG, so it must be caused by shooting bricks...

Do you mean you want to try some super-resolution algorithms?
I have about 45 frames of this castle before I start panning to the right...
Uploading frames 0 to 45 right now; it takes about half an hour.
Same link as before.
http://drive.google.com/drive/folders/0B1BxGc3dfMDaRmtKc2tOa3dHMTA?usp=sharing

erikbaldwinson

Quote from: Levas on November 27, 2016, 10:00:20 PM
Hello pixelpeepers,

I was wondering if people have already seen/tested the difference in video quality of 10-bit Magic Lantern raw vs 14-bit Magic Lantern raw.
I used 10-bit raw today in the real world. I'm used to color aliasing, and in most cases I have no problem removing it.
But the footage I shot today had color aliasing that was very hard, or almost impossible, to remove.
Could it be explained by the difference between 10-bit and 14-bit - could that cause more color aliasing?
Or was it just a coincidence, since I shot a lot of bricks today while visiting an old castle :P

Just recorded 14-bit, 14-bit lossless and 12-bit lossless. 14-bit looks better in every way. The differences are most noticeable with motion and colour, as opposed to pixel-peeping for sharpness and noise - although the noise is definitely lower with 14-bit as well. There is depth and strength there that just does not exist with lossless. Lossless is great though; the file-size-to-quality ratio is awesome and practical.

Levas

Are you sure 14-bit lossless doesn't look as good as 14-bit?

Because lossless means exactly that: no quality loss...
For compression with quality loss, the term 'lossy' is used.
I think the 14-bit lossless files should be identical to the normal 14-bit files.
As far as I know it uses a non-destructive algorithm to make the file size smaller.
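The defining property of lossless compression can be shown with a round-trip test. This sketch uses Python's zlib rather than the lossless JPEG codec the camera actually uses, but the property being illustrated is the same:

```python
import zlib
import numpy as np

# Lossless means the decompressed output is bit-identical to the input,
# which is why 14-bit lossless should match uncompressed 14-bit raw.
rng = np.random.default_rng(42)
raw = rng.integers(0, 2**14, size=1000, dtype=np.uint16).tobytes()

compressed = zlib.compress(raw)           # compact representation on disk
restored   = zlib.decompress(compressed)  # not a single bit changed

assert restored == raw
```

Any real difference between 14-bit and 14-bit lossless footage therefore has to come from somewhere other than the compression itself.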


a1ex

After decompression, 14-bit lossless is identical to uncompressed.

(6D is an exception - each channel differs by a constant offset, but other than that, the image data is identical here as well)

Quote from: a1ex on April 15, 2017, 01:12:36 PM
(That means, 14-bit lossless will give raw data identical to uncompressed 14-bit (100% identical, not one single bit different, contrary to some earlier report), while "12-bit lossless" should be interpreted as "14-bit to 12-bit conversion - lossy by definition - followed by lossless compression", so it should have the same number of useful levels as uncompressed 12-bit. Please note the 12-bit lossless is not identical to 12-bit uncompressed - they differ by a constant value and possibly by some round-off error, and the same is true for lower bit depths.)
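a1ex's point about "12-bit lossless" can be sketched numerically: the 14-bit data is first reduced to 12 bits (lossy by definition), and only then losslessly compressed. The reduction alone maps four adjacent 14-bit levels onto one 12-bit level:

```python
import numpy as np

# The 14-bit -> 12-bit conversion step, shown on a few low values.
raw_14bit = np.arange(0, 8, dtype=np.uint16)  # 14-bit values 0..7

raw_12bit = raw_14bit >> 2   # drop two bits: the lossy part
restored  = raw_12bit << 2   # scale back up to the 14-bit range

print(raw_12bit.tolist())  # → [0, 0, 0, 0, 1, 1, 1, 1]
print(restored.tolist())   # → [0, 0, 0, 0, 4, 4, 4, 4] - fine detail is gone
```

This is why 12-bit lossless has the same number of useful levels as uncompressed 12-bit, while 14-bit lossless loses nothing.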

erikbaldwinson

Quote from: a1ex on November 22, 2017, 10:55:41 AM
After decompression, 14-bit lossless is identical to uncompressed.

(6D is an exception - each channel differs by a constant offset, but other than that, the image data is identical here as well)

Yes, this is my mistake. The differences I am seeing are between 14-bit and 12-bit.

a1ex

Do you mind sharing a short clip that shows the difference?

If possible, also a few DNGs.

erikbaldwinson

Quote from: a1ex on November 22, 2017, 01:56:33 PM
Do you mind sharing a short clip that shows the difference?

If possible, also a few DNGs.

https://www.dropbox.com/sh/67vforsr9ovuc86/AAB_ON_F-UNEugiSXh7AnCuta?dl=0

The 14-bit shadows pull better and the overall image looks better - more depth.

a1ex

Unable to compare, please use a tripod...

erikbaldwinson

Quote from: a1ex on November 23, 2017, 09:49:32 AM
Unable to compare, please use a tripod...

Hey Alex, you're welcome to run your own test on your own terms. If you want to save time, open the DNGs in Photoshop and raise the shadows. The result is apparent. Also, 14-bit footage looks better in general.

a1ex

I'm not going to - it's your claim and your job to prove it.

The two DNGs you've posted do not even have the same brightness, so they simply cannot be compared for this case. All I've asked for was a test where this difference is visible.

erikbaldwinson

Quote from: a1ex on November 23, 2017, 08:13:49 PM
I'm not going to - it's your claim and your job to prove it.

The two DNGs you've posted do not even have the same brightness, so they simply cannot be compared for this case. All I've asked for was a test where this difference is visible.

I think I have. The difference is mostly in the quality and definition of the shadows. I watch the footage on my monitor, and the 12-bit doesn't have the definition in the shadows that the 14-bit does. The additional noise of 12-bit is obvious when you lift the shadows of both DNGs.

I think both are awesome.

a1ex

From https://www.magiclantern.fm/forum/index.php?topic=19300.msg193613#msg193613 - 10-bit lossless looks cleaner to me than 14-bit:

Quote from: a1ex on November 24, 2017, 09:52:13 AM
14-bit, 10-bit lossless black=2048, 10-bit lossless black=2047; rendered with dcraw -a -b 8:


When the brightness in the two images is different, the noise will also be different. Make sure the bit depth is the only variable that changes in your test images.
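The brightness/noise relationship a1ex describes can be illustrated with a quick simulation. This sketch models photon shot noise only (ignoring read noise), with made-up signal levels:

```python
import numpy as np

# Shot noise grows with signal level, so a brighter frame shows more
# absolute noise regardless of bit depth - exposures must match before
# comparing noise between test images.
rng = np.random.default_rng(0)

dim    = rng.poisson(lam=100,  size=100_000)  # darker flat patch
bright = rng.poisson(lam=1000, size=100_000)  # ~3.3 stops brighter

# std of a Poisson signal is sqrt(mean): roughly 10 vs 31.6 here
print(round(dim.std(), 1), round(bright.std(), 1))
```

So if two test frames differ in brightness, their noise will differ too, and the comparison says nothing about bit depth.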

6D_ML

Here is an example clearly demonstrating the difference in quality between 10- and 14-bit recording after raising blacks & shadows in post: http://www.magiclantern.fm/forum/index.php?topic=5601.msg195034#msg195034

viral kapadia

Hi, I can't convert 10-bit MLV to DNG using RAW2DNG or MLV_DUMP. I converted it with the MLV converter, but the image looks pink and noisy.

qqqavi

How do you change the bit depth?


Sent from my iPhone using Tapatalk