Author Topic: 5D mkIII bit depth test: 12-bit lossless & 10-bit lossless show artifacts  (Read 2493 times)

The_bald_guy

  • New to the forum
  • *
  • Posts: 40


After filming in 12-bit and 10-bit lossless and seeing artifacts pop up, I ran a test comparing the different bit depths. I realised that 14-bit lossless is the best setting overall: you still save 51% of space on your card, with a bit rate of 42 MB/s to 90 MB/s, and there are no artifacts or glitches.

The ML version used was: magiclantern-crop_rec_4k.2017Sep03.5D3113
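For anyone wondering how the 51% figure and the 42 MB/s low end line up, here is a rough back-of-envelope sketch. The resolution and frame rate below are my own illustrative assumptions (1080p24), not values stated in the test:

```python
# Back-of-envelope raw data rates for ML raw video on the 5D Mark III.
# 1920x1080 @ 24 fps is an assumed example, not from the original test.
def raw_rate_mb_s(width, height, bits_per_sample, fps):
    """Uncompressed raw video data rate in MB/s (decimal megabytes)."""
    bytes_per_frame = width * height * bits_per_sample / 8
    return bytes_per_frame * fps / 1e6

uncompressed_14bit = raw_rate_mb_s(1920, 1080, 14, 24)
# A 51% saving from lossless compression would leave roughly:
compressed = uncompressed_14bit * (1 - 0.51)
print(round(uncompressed_14bit, 1), round(compressed, 1))  # → 87.1 42.7
```

With those assumed settings, uncompressed 14-bit raw comes out around 87 MB/s, and a 51% saving lands right near the 42 MB/s figure quoted above; actual compression varies with scene content, which is why the observed rate ranges up to 90 MB/s.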


hyalinejim

  • Member
  • ***
  • Posts: 131
Re: 5D mkIII bit depth test: 12-bit lossless & 10-bit lossless show artifacts
« Reply #1 on: October 06, 2017, 07:54:09 PM »
Can you post stills direct from the timeline, please? It's a little difficult to spot subtle differences once YouTube compression kicks in.

The_bald_guy

  • New to the forum
  • *
  • Posts: 40
Re: 5D mkIII bit depth test: 12-bit lossless & 10-bit lossless show artifacts
« Reply #2 on: October 09, 2017, 04:08:39 PM »
You should be able to notice weird specks on contrast edges and a white layer over the sheet of paper. That white is the highlight recovery, but it's unusable, and it's not there in 14-bit lossless. Overall, there are glitches on contrast edges.