https://youtu.be/rbLxVeVvSts
After filming in 12-bit and 10-bit lossless and seeing artifacts pop up, I made a test with the different bit depths. I realised that 14-bit lossless is the best setting in general. You save 51% space on your card with a bit rate of 42 MB/s to 90 MB/s, and there are no artifacts or glitches.
The ML version used was: magiclantern-crop_rec_4k.2017Sep03.5D3113
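For a sense of where those bit rates come from, here is a rough back-of-the-envelope sketch of raw data rate versus bit depth. The resolution, frame rate, and ~49% lossless compression ratio are assumptions for illustration only, not measurements from the video:

```python
# Rough raw-video data-rate estimate at different bit depths.
# Resolution (1920x1080), fps (24), and the ~0.49 lossless
# compression factor are illustrative assumptions.

def raw_rate_mb_s(width, height, bits, fps, compression=1.0):
    """Data rate in MB/s for raw frames at the given bit depth."""
    bytes_per_frame = width * height * bits / 8
    return bytes_per_frame * fps * compression / 1e6

for bits in (10, 12, 14):
    uncompressed = raw_rate_mb_s(1920, 1080, bits, 24)
    lossless = raw_rate_mb_s(1920, 1080, bits, 24, compression=0.49)
    print(f"{bits}-bit: {uncompressed:.0f} MB/s raw, ~{lossless:.0f} MB/s lossless")
```

With these assumed numbers, 14-bit raw comes out near 87 MB/s uncompressed, which is consistent with the upper end of the 42-90 MB/s range quoted above; the actual rate varies with scene detail because lossless compression is content-dependent.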
Can you post stills direct from the timeline, please? It's a little difficult to spot subtle differences once YouTube compression kicks in.
You should be able to notice weird specks on contrast edges and a white layer on the sheet of paper. That white layer is the highlight recovery, but it's unusable, and it's not there in 14-bit lossless. There are also glitches overall on contrast edges.