5D mkIII bit depth test: 12-bit lossless & 10-bit lossless show artifacts

Started by The_bald_guy, October 06, 2017, 12:30:03 AM



The_bald_guy


https://youtu.be/rbLxVeVvSts

After filming in 12-bit and 10-bit lossless and seeing artifacts pop up, I made a test with the different bit depths. I realised that 14-bit lossless is the best setting in general. You save about 51% of space on your card, with a bit rate of roughly 42 MB/s to 90 MB/s, and there are no artifacts or glitches.
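Not from the post itself, but as a rough sanity check on those numbers: a minimal sketch assuming a hypothetical 1920x1080 raw recording at 23.976 fps (the actual resolution and frame rate aren't stated above). Under those assumptions the uncompressed 14-bit rate works out to about 83 MB/s, and a ~51% saving lands near the low end of the quoted 42-90 MB/s range:

Code:
# Back-of-the-envelope check of the reported ~51% saving for 14-bit lossless.
# WIDTH, HEIGHT and FPS are assumptions, not values from the original post.
WIDTH, HEIGHT = 1920, 1080      # assumed raw frame size
FPS = 23.976                    # assumed frame rate
BIT_DEPTH = 14                  # 14-bit raw, as in the test

# Uncompressed data rate in MB/s (1 MB = 1024 * 1024 bytes)
bytes_per_frame = WIDTH * HEIGHT * BIT_DEPTH / 8
uncompressed_mb_s = bytes_per_frame * FPS / (1024 * 1024)

# Apply the ~51% space saving reported for 14-bit lossless
compressed_mb_s = uncompressed_mb_s * (1 - 0.51)

print(f"Uncompressed 14-bit: {uncompressed_mb_s:.1f} MB/s")   # ~83.0 MB/s
print(f"With ~51% saving:    {compressed_mb_s:.1f} MB/s")     # ~40.7 MB/s

The actual lossless rate varies with scene detail, which would explain why the observed bit rate spans a range (42-90 MB/s) rather than a single number.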

The ML version used was: magiclantern-crop_rec_4k.2017Sep03.5D3113


hyalinejim

Can you post stills direct from the timeline, please? It's a little difficult to spot subtle differences once YouTube compression kicks in.

The_bald_guy

You should be able to notice weird specks on contrast edges and a white layer on the sheet of paper. That white layer is the highlight recovery, but it's unusable. It's not there in 14-bit lossless. There are also glitches overall on contrast edges.