Author Topic: Uncompressed 14-bit RAW video testing - 5D Mark III  (Read 534684 times)

allemyr

  • Member
  • ***
  • Posts: 171
  • http://www.allemyr.com/
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #800 on: June 07, 2018, 06:29:22 PM »
I understand that the internal recording of the 5D III is 4:2:0 and the HDMI output is 4:2:2

Yes, but we are discussing bit depth?

katrikura

  • New to the forum
  • *
  • Posts: 13
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #801 on: June 07, 2018, 08:05:34 PM »
I just wanted to show the difference in the quality of the HDMI output

allemyr

  • Member
  • ***
  • Posts: 171
  • http://www.allemyr.com/
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #802 on: June 07, 2018, 08:12:45 PM »
I just wanted to show the difference in the quality of the HDMI output

Yes, ok, that's true. Just to clarify, both (HDMI and internal Canon H.264) are 8-bit. But as you say, HDMI is 4:2:2 8-bit.

tupp

  • Freshman
  • **
  • Posts: 71
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #803 on: June 08, 2018, 06:13:35 AM »
This product claims to use supersampling, which results in pseudo-10-bit color sampling for improved color reproduction. Does this mean that encoding with this product will result in better quality than the original 8-bit HDMI?
Bit depth can be genuinely increased by binning together adjacent pixel groups.  However, resolution is sacrificed with such a method.

Also, such an increase in bit depth will not improve the color depth, as the color depth of an image can never be increased unless something artificial is added.  Contrary to popular belief, bit depth is not color depth.

Furthermore, artifacts in the original image (such as banding) will remain even if the bit depth is increased.
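tupp's binning point can be put into a quick numerical sketch. This is my own toy example (assuming a single-channel 8-bit image and simple 2x2 summing, neither of which the post specifies):

```python
import numpy as np

def bin2x2(img8):
    """Sum each 2x2 block of an 8-bit image: four 0..255 values
    sum to 0..1020, a 10-bit range, at half the resolution."""
    h, w = img8.shape
    blocks = img8[:h // 2 * 2, :w // 2 * 2].astype(np.uint16)
    blocks = blocks.reshape(h // 2, 2, w // 2, 2)
    return blocks.sum(axis=(1, 3))

img = np.full((4, 4), 255, dtype=np.uint8)  # a tiny all-white 8-bit image
out = bin2x2(img)
print(out.shape, int(out.max()))  # (2, 2) 1020
```

Note the trade tupp describes is visible in the shapes: the output has a 10-bit value range but a quarter of the pixels.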

allemyr

  • Member
  • ***
  • Posts: 171
  • http://www.allemyr.com/
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #804 on: June 08, 2018, 10:15:00 AM »
Bit depth can be genuinely increased by binning together adjacent pixel groups.  However, resolution is sacrificed with such a method.

Also, such an increase in bit depth will not improve the color depth, as the color depth of an image can never be increased unless something artificial is added.  Contrary to popular belief, bit depth is not color depth.

Furthermore, artifacts in the original image (such as banding) will remain even if the bit depth is increased.

Ok, but if the resolution is the same, color depth and bit depth have the same value. It is only when you downscale an image that the values differ.
I don't really get the point of binning pixels. Won't the image still use the same amount of storage? Say you go from 1080p 8-bit to binned pixels at 10-bit?

In film you always talk about bit depth, and the image resolution is known. In my world that's not wrong, since bit depth affects color depth and gives the same value if the resolution is the same.

But yes, this pixel-binning thing is possible. I understand how it affects bit depth and color depth if you alter the resolution, but why use it? Just because it's possible?

"Furthermore, artifacts in the original image (such as banding) will remain even if the bit depth is increased."
Yes, and even if the color depth is increased? If you record your 8-bit signal within a 16-bit file, it won't be better than the original 8-bit/8 bits per channel (color depth).

Kharak

  • Hero Member
  • *****
  • Posts: 809
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #805 on: June 08, 2018, 10:45:03 AM »
I think the only way of getting pseudo 10-bit from 8-bit would be to do a log conversion, e.g. Cinestyle to a log of your choosing (this can be done with Cinelog colour transforms). Then do noise reduction, particularly on the low-luminance channel, add fine grain and render to a 10-bit container. Just make sure you are working in a >8-bit space; in After Effects, set the working space to 16-bit.

I've had some really good results with this method, where I mixed H.264 with CDNG. This method completely removed banding, and in Resolve I felt that the footage could be pushed further before falling apart. Of course you are not adding any information, but with noise reduction and fine grain it interpolates the colours, in effect "creating" new colours between two values.

Oh yeah, it helps a lot if you also upscale the footage at the same time to make room for the "new" colours.
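The banding-removal idea described here can be illustrated with a toy sketch of my own (a moving average standing in for real noise reduction, Gaussian noise for fine grain; none of this is Cinelog's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# An 8-bit ramp with coarse banding: only two code values, 100 and 101.
banded = np.repeat([100.0, 101.0], 500)

# "Noise reduction" (here just a moving average) smooths the hard step
# into intermediate values that a >8-bit working space can hold...
kernel = np.ones(51) / 51
smoothed = np.convolve(banded, kernel, mode="same")

# ...and fine grain dithers the result before rendering to a 10/16-bit file.
graded = smoothed + rng.normal(0.0, 0.1, smoothed.shape)

print(len(np.unique(banded)))        # 2 distinct levels in the source
print(len(np.unique(graded)) > 2)    # True: many in-between values now exist
```

As the post says, no real information is added; the hard band edge is just replaced with plausible in-between values.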
once you go raw you never go back

allemyr

  • Member
  • ***
  • Posts: 171
  • http://www.allemyr.com/
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #806 on: June 08, 2018, 11:07:05 AM »
I think the only way of getting pseudo 10-bit from 8-bit would be to do a log conversion, e.g. Cinestyle to a log of your choosing (this can be done with Cinelog colour transforms). Then do noise reduction, particularly on the low-luminance channel, add fine grain and render to a 10-bit container. Just make sure you are working in a >8-bit space; in After Effects, set the working space to 16-bit.

I've had some really good results with this method, where I mixed H.264 with CDNG. This method completely removed banding, and in Resolve I felt that the footage could be pushed further before falling apart. Of course you are not adding any information, but with noise reduction and fine grain it interpolates the colours, in effect "creating" new colours between two values.

Oh yeah, it helps a lot if you also upscale the footage at the same time to make room for the "new" colours.

?

Yes, you can use noise reduction to reduce banding... I'm surprised every day on this forum that people give tips and have ideas about workflows like this...

Come on!

Kharak

  • Hero Member
  • *****
  • Posts: 809
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #807 on: June 08, 2018, 02:06:11 PM »
Maybe you should put a list out with everything you know and people can cross check with it before they post.

I was responding to 'pseudo 10 bit' and gave my take on how to get a pseudo 10 bit file from an 8 bit file. Banding removal was one part of that process.
once you go raw you never go back

tupp

  • Freshman
  • **
  • Posts: 71
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #808 on: June 08, 2018, 03:44:49 PM »
Ok, but if the resolution is the same, color depth and bit depth have the same value.
Well, resolution and bit depth are the two, equally-weighted factors of digital color depth.  So, if the value of one of those factors remains constant (resolution, in your case), the color depth will change with the value of the other factor (bit depth, in your case).

Likewise, a 10-bit HD image will have less color depth than a 10-bit 4K image.

Also, an 8-bit 8K image will have more color depth than a 10-bit HD image. 

I don't really get the point of binning pixels. Won't the image still use the same amount of storage? Say you go from 1080p 8-bit to binned pixels at 10-bit?
Yes.  The "storage" required is essentially the color depth of a digital image.

In film you always talk about bit depth, and the image resolution is known. In my world that's not wrong, since bit depth affects color depth and gives the same value if the resolution is the same.
Yes.  As acknowledged above, if the resolution remains constant in a digital system, the color depth increases/decreases with a change in bit depth.

However, with actual "film," there is no "bit depth" -- analog film is not digital.  So color depth of actual film comes from resolution (grain size) and from dye/emulsion properties (which, in combination, can be considered the analog "equivalent" of bit depth).

But yes, this pixel-binning thing is possible. I understand how it affects bit depth and color depth if you alter the resolution, but why use it? Just because it's possible?
In any situation in which one is down-scaling an image (such as the common 4K-to-HD conversion), it makes sense to bin the pixels to retain the color depth (and to also reduce noise and to avoid moire/aliasing).

Yes, and even if the color depth is increased?
The color depth of a digital image can never be increased, unless something artificial is added.

If you record your 8-bit signal within a 16-bit file, it won't be better than the original 8-bit/8 bits per channel (color depth).
Yes.  The image will not improve merely by putting it into a file/system of a higher bit depth.  On the other hand, in doing so you will technically have an image of a higher bit depth.
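tupp's resolution-times-bit-depth framing can be put into numbers with a toy calculation using his definition (a hypothetical helper of my own; this is not a standard industry metric):

```python
def tupp_color_depth(width, height, bits):
    # tupp's framing: color depth grows with both pixel count and
    # per-channel code values (2**bits) -- not a standard metric.
    return width * height * (2 ** bits)

hd_10bit = tupp_color_depth(1920, 1080, 10)    # 10-bit HD
uhd8k_8bit = tupp_color_depth(7680, 4320, 8)   # 8-bit 8K

print(uhd8k_8bit > hd_10bit)  # True: 8-bit 8K beats 10-bit HD in this framing
```

The 16x pixel advantage of 8K outweighs the 4x code-value advantage of 10-bit, which is the comparison made in the post above.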

ibrahim

  • Member
  • ***
  • Posts: 167
  • 5D3 / 600d
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #809 on: June 08, 2018, 05:31:14 PM »
NO! 8 bit is 8 bit.

Ok, thanks. So basically the 3.5-4K 12-bit (crop mode) successes will not be able to be transferred through the HDMI.
So there is no point buying a recording monitor that supports a 4K signal over one that supports 1080p 8-bit to 12-bit 4:2:2, such as the old Atomos Ninja 2 or the new Blackmagic Video Assist?
Two Canon 5D Mark IIIs & Canon 600d (Magic Lantern nightly builds) | Ronin-M | Rokinon 35mm T1.5 Cine AS UMC | Samyang 85mm T1.5 UMC AS Cine VDSLR II   | Canon EF 24-105mm f/4L IS USM | Canon EF 50mm f/1.8 II | etc
Dual sound system: Tascam DR-60d MKII | Audio Technica AT899 | Sennheiser MKE 600

allemyr

  • Member
  • ***
  • Posts: 171
  • http://www.allemyr.com/
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #810 on: June 12, 2018, 10:16:05 PM »
Ok, thanks. So basically the 3.5-4K 12-bit (crop mode) successes will not be able to be transferred through the HDMI.
So there is no point buying a recording monitor that supports a 4K signal over one that supports 1080p 8-bit to 12-bit 4:2:2, such as the old Atomos Ninja 2 or the new Blackmagic Video Assist?

Yes, exactly. The HDMI will always output a 1080p 8-bit 4:2:2 signal. So saving in a format above those specs would be a waste of storage. 8-bit to 12-bit? 8-bit is 8-bit :)

hjfilmspeed

  • Senior
  • ****
  • Posts: 479
  • 5D III and IV
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #811 on: July 27, 2018, 08:09:44 PM »
So I just have to say that RAW video on the 5D3 keeps getting better and better. I am still just blown away by 1920x1080p RAW on the 5D3.
My current favorite is:
1920x1080 23.976 14-bit lossless or uncompressed, with sound or proxy. Either way, so good!!!

After converting through MLV App to DNG, I bring the clips into DaVinci Resolve 15. I then use Resolve's color management to set the input and timeline to Canon Cinema Gamut / C-Log 2. After that I usually just white balance in the RAW tab on the color page and then adjust the lift, gamma, gain, contrast and saturation. And boom!!! All those epic Canon colors just burst out.

Resolve's editing is getting so much better, so I have been cutting right on the timeline with the native DNGs.

Now I am super scared of my 5D3 dying, because I am so attached to RAW video. I already opened it up and fixed a loose live-view switch. I also had the shutter replaced. I'm contemplating picking up another used 5D3 just in case.

I doubt Canon will step up their video game that much in the rumored full-frame mirrorless. I hope they do, but I am not getting my hopes up. I have more faith in the 5D IV getting ML RAW video before any Canon sub-$6000 body gets it, or gets a decent codec.

RobinF

  • New to the forum
  • *
  • Posts: 3
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #812 on: September 07, 2018, 03:08:24 PM »
Has anyone found the ultimate workflow for stretching compressed footage?

I've been filming with a 5D III in 50 fps RAW with an aspect ratio of 2.35:1 in full HD, so 1920 x 817, and editing with Premiere Pro CC 2018.
Magic Lantern squeezes the footage, which results in 1920 x 482.

MLV App does a fantastic job handling MLV files, and the option to stretch your image by 1.67 is included, but when exporting to CDNG this information is only stored as metadata in your CDNG sequence, and Premiere Pro doesn't apply this metadata to your imported clip. So when you're previewing a file in your source monitor you see a rather deformed, squeezed image, which makes it really hard to judge your clip accurately.
You can of course manually stretch your clip once it's on the timeline, but this doesn't help the source-monitor problem and is quite a lot of unnecessary work.

I'm having quite some difficulty finding a clear answer, but I cannot believe no one has figured out the optimal workflow for this?

Thanks
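Not a Premiere fix, but the de-squeeze itself is simple to sketch. A minimal, hypothetical numpy example of my own, using nearest-row mapping (real tools interpolate between rows) and assuming the 1920x482 to 1920x817 stretch described above:

```python
import numpy as np

def unsqueeze(frame, factor=817 / 482):
    """Vertically stretch a squeezed frame by `factor` using
    nearest-row mapping; MLV App rounds this ratio to 1.67."""
    h = frame.shape[0]
    new_h = round(h * factor)
    # for each output row, pick the nearest source row
    src = np.clip((np.arange(new_h) / factor).astype(int), 0, h - 1)
    return frame[src]

squeezed = np.zeros((482, 1920, 3), dtype=np.uint8)  # one squeezed frame
print(unsqueeze(squeezed).shape)  # (817, 1920, 3)
```

This only shows the geometry; in practice the stretch still has to happen inside the NLE (or on export) rather than per-frame in Python.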

allemyr

  • Member
  • ***
  • Posts: 171
  • http://www.allemyr.com/
Re: Uncompressed 14-bit RAW video testing - 5D Mark III
« Reply #813 on: September 07, 2018, 05:02:58 PM »
I'm having quite some difficulty finding a clear answer, but I cannot believe no one has figured out the optimal workflow for this?
Thanks

Hi, not too many people here use Premiere Pro directly with CDNG. Most use Adobe Camera Raw (Lightroom/After Effects/Photoshop) or MLV App, and I use DaVinci Resolve. I have a very good workflow for this, but that's in Resolve: just one click for all the slow-motion shots in a timeline mixed with 1:1-speed footage.

I'm right now reading through the raw2cdng app thread for my own purposes, and I've read there that PP doesn't take advantage of the bit depth of RAW; it treats it as low-bit-depth footage. That was a post from 2014 though, so things might have changed, but I know that when people work with RED and ARRIRAW footage they use the respective brand's raw plugin inside PP to process it.

It might be very hard to find, as I said, because very few people use PP with CDNG files. I've used Premiere Pro for many years and worked with an XML round trip between PP and Resolve, but since Resolve got great editing tools some years ago I skip PP now. I'm not saying you should change apps, just explaining why you can't find anything.

In fact, Magic Lantern isn't squeezing the footage; it's the raw feed from the camera that looks like that. I would say that ML's 1920x648 looks much better than the 1280x720 50-60 fps you get without Magic Lantern, way better!