Hello, Masters,

I just did some simple math to see which values change when converting from 14-bit to 12-bit or 10-bit.

We want to find the smallest brightness value x in the 14-bit range that, after converting to 12-bit with the integer formula x * 2^12 / 2^14 (integer division, so the result is truncated), still comes out as 1 rather than 0:

int x; // the value to be found

formula:

x * (2^12) / (2^14) = 1

the result:

x = 4

So any value in the original 14-bit data smaller than 4 will be converted to 0 in the final 12-bit format. There are exactly 4 such values: 0, 1, 2, 3.
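As a quick sanity check, here is a small Python sketch (my own, not from any camera SDK) confirming which 14-bit values collapse to 0 under the truncating integer formula:

```python
def convert_14_to_12(x):
    """Truncating integer scale from 14-bit to 12-bit.

    x * 2**12 // 2**14 simplifies to x // 4, so the 4 lowest
    14-bit levels all truncate to 0.
    """
    return x * (2**12) // (2**14)

# Find every 14-bit value that maps to 0 in 12-bit
zeros = [x for x in range(2**14) if convert_14_to_12(x) == 0]
print(zeros)  # [0, 1, 2, 3]
```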

That means that in the original 14-bit data, only brightness levels 0, 1, 2, and 3 are converted to value 0 in the final 12-bit format. These 4 values make up only a tiny fraction of the 14-bit range, about 0.024% (4 / 2^14), and they all end up as total black (value = 0). The good news is that this part of the data will likely not be transferred over HDMI anyway (I heard that HDMI usually does not pass brightness levels 0 - 5 of the 0 - 255 range on to the final display equipment, like TVs and projectors).

Now the 10-bit scenario:

formula:

x * (2^10) / (2^14) = 1

x = 16

There are 16 values in total that get converted to 0 when using the integer formula x * (2^10) / (2^14). That is about 0.098% (16 / 2^14) of all the values in the 14-bit format, and these values might be ignored by the HDMI interface anyway.
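The same check for the 10-bit case (again just a sketch of the truncating conversion, nothing vendor-specific):

```python
def convert_14_to_10(x):
    """Truncating integer scale from 14-bit to 10-bit (equivalent to x // 16)."""
    return x * (2**10) // (2**14)

# The 16 lowest 14-bit levels (0..15) all truncate to 0
zeros = [x for x in range(2**14) if convert_14_to_10(x) == 0]
print(len(zeros))                  # 16
print(len(zeros) / 2**14 * 100)    # ~0.098 (percent of the 14-bit range)
```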

So, I think that using the integer formula x * (2^12) / (2^14) for 12-bit, or x * (2^10) / (2^14) for 10-bit respectively, might be a very viable conversion method.
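For completeness, both cases generalize to one helper (a sketch under my own naming, not tied to any particular RAW pipeline): the smallest source value that survives as a nonzero output is always 2^(src_bits - dst_bits).

```python
def downconvert(x, src_bits=14, dst_bits=12):
    """Truncating integer rescale from src_bits down to dst_bits."""
    return x * (2**dst_bits) // (2**src_bits)

# 2^(14-12) = 4 is the first nonzero survivor for 12-bit,
# 2^(14-10) = 16 is the first nonzero survivor for 10-bit
print(downconvert(4, 14, 12))   # 1
print(downconvert(16, 14, 10))  # 1
```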