Magic Lantern Forum


Title: 5D mkIII bit depth test 12bit lossless & 10bit lossless shows artifacts
Post by: The_bald_guy on October 06, 2017, 12:30:03 AM

https://youtu.be/rbLxVeVvSts

After filming in 12bit and 10bit lossless and seeing artifacts pop up, I made a test with the different bit depths. I realised that 14bit lossless is the best setting in general: you still save 51% of space on your card, with a bit rate of 42 MB/s to 90 MB/s, and there are no artifacts or glitches.
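A rough way to sanity-check those figures (a sketch only: the 1920x1080 resolution and 24 fps frame rate are assumptions, not stated in the post) is to compare the uncompressed 14-bit data rate against what a ~51% lossless saving would leave:

def raw_data_rate_mb_s(width, height, fps, bits_per_sample):
    """Uncompressed raw data rate in MB/s (1 MB = 10**6 bytes)."""
    bytes_per_frame = width * height * bits_per_sample / 8
    return bytes_per_frame * fps / 1e6

# Assumed settings -- placeholders, the post does not state them.
uncompressed = raw_data_rate_mb_s(1920, 1080, 24, 14)   # ~87 MB/s uncompressed 14-bit
compressed   = uncompressed * (1 - 0.51)                # ~43 MB/s with the quoted 51% saving

print(f"Uncompressed 14-bit: {uncompressed:.0f} MB/s")
print(f"With ~51% lossless saving: {compressed:.0f} MB/s")

That lands near the 42 MB/s end of the quoted range; the 90 MB/s end presumably comes from scenes that compress less well or from a larger crop_rec_4k resolution.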

The ML version used was: magiclantern-crop_rec_4k.2017Sep03.5D3113

Title: Re: 5D mkIII bit depth test 12bit lossless & 10bit lossless shows artifacts
Post by: hyalinejim on October 06, 2017, 07:54:09 PM
Can you post stills directly from the timeline, please? It's a little difficult to spot subtle differences once YouTube compression kicks in.
Title: Re: 5D mkIII bit depth test 12bit lossless & 10bit lossless shows artifacts
Post by: The_bald_guy on October 09, 2017, 04:08:39 PM
You should be able to notice weird specks on contrast edges and a white layer over the sheet of paper. That white is the highlight recovery, but it's unusable, and it isn't there in 14bit lossless. Overall there are just glitches on contrast edges.
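For readers new to the bit depth options, here is a minimal Python illustration of what dropping from 14bit to 10bit does to the raw values. This is only a general picture of quantization, not a diagnosis of the specific artifacts in the video:

import numpy as np

# Reducing raw data from 14-bit to 10-bit drops the 4 least-significant bits,
# so runs of nearby sensor values collapse onto a single 10-bit code.
samples_14bit = np.arange(8000, 8016)      # 16 consecutive 14-bit sensor values
samples_10bit = samples_14bit >> 4         # same values after a 4-bit reduction

print(samples_14bit)   # 8000 ... 8015 -- all distinct
print(samples_10bit)   # sixteen identical values (500) -- that detail is gone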