Some (many?) Sony Vegas users have probably already figured this out...
But I've just wasted a few days trying to figure out why video from my Canon 600D (T3i) DSLR didn't seem to colour correct or export well from Sony Vegas Studio 12 - the finished video (rendered out as H264 .mp4) had crushed blacks/detail and blown highlights when played back on my PC or uploaded to Vimeo.
Why? Read on below (this contains some generalisations, but you'll get my point).
A Canon EOS DSLR's video files are Computer RGB (cRGB 0-255). I use a Canon 600D, but I believe the same applies to the 7D, 550D, etc.
h264 is YCbCr (YCC for short) - Canons don't create RGB; they're two different colour models. They're not cRGB, whatever that is. 'Computer RGB' is Sony speak, I guess.
When you view the .MOV clips straight from the EOS camera on a PC using Windows Media Player etc., all is well. Windows Media Player etc. handle playback of cRGB 0-255 video correctly.
What WMP does depends on the underlying DirectShow codec decompressing the h264, and that can vary. But generally the media player handles the YCbCr to RGB conversion, possibly hardware assisted in combination with the graphics card, and as a result levels handling can vary.
The Canon MOVs are encoded into h264 with full 8-bit levels - well, luma is; chroma is not. The feed to the h264 encoder in camera is raw JFIF YCC, that is, chroma and luma normalised over the full 8-bit range. It isn't RGB.
The MOV files are flagged 'fullrange' by the encoder, to tell the decompressing codec / media player to squeeze the h264 full range levels into 16 - 235 YCC before conversion to RGB. This is because full range 8-bit YCC doesn't fit into 0 - 255 RGB. 8bit speak.
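The 'squeeze' the flag asks for is just a linear rescale of the 8-bit code values. A minimal sketch in Python, using the standard full-to-limited range mapping (this is the generic formula, not anything specific to the Canon encoder):

```python
def squeeze_luma_full_to_limited(y):
    """Map a full-range 8-bit luma code (0-255) into limited range (16-235)."""
    return round(16 + y * 219 / 255)

def squeeze_chroma_full_to_limited(c):
    """Map a full-range 8-bit chroma code (0-255) into limited range (16-240)."""
    return round(16 + c * 224 / 255)

# Full-range black (0) and white (255) land exactly on the limited-range
# end points, so nothing is chopped off:
print(squeeze_luma_full_to_limited(0))    # 16
print(squeeze_luma_full_to_limited(255))  # 235
```

A decoder that ignores the flag skips this step, which is exactly the clipping problem described next.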
Problem is that older NLEs / codecs / media players may ignore the fullrange flag, or simply handle the source as 16 - 235 without squeezing the levels first: they chop off levels below 16 and above 235, expand the remaining levels over 0 - 255, and the contrast looks wrong - is wrong. So trusting a media player to give the right results or to extract screengrabs is not a good idea unless it's colour managed like Media Player Classic, or you test it first. VLC is a prime candidate.
Here's a test that contains a fullrange flagged file and a non fullrange flagged file: http://www.yellowspace.webspace.virginmedia.com/fullrangetest.zip
VLC, depending on settings, will not necessarily show the output correctly: if you don't see the 16 & 235 text then VLC isn't set up right. If the 16 - 235 text does show, you're good to go. Same for any media player.
Also, a media player's RGB conversion might not even use the right luma coefficients, ie: BT601 vs BT709, so pinks can go to orange. That's the typical result of transcoding BT601 flagged Canon MOVs to any other YCC codec like DNxHD, which are all assumed BT709 at HD resolution - orangey skin tones, for example. OK in camera, but not in the mangled transcode and playback. mediainfo will show what luma coefficients are used or assumed. Earlier Canons like the T2i are BT601; the 5D Mk III is BT709 anyway. Best to check before conversion to RGB.
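To see why the wrong coefficients shift colours, here's a rough sketch of a limited-range YCbCr to RGB conversion parameterised by the luma coefficients. The matrices are the standard BT601 / BT709 ones; the sample pixel values are hypothetical, just to show the two decodes disagree:

```python
def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Convert one limited-range 8-bit YCbCr pixel to 8-bit RGB
    using the given luma coefficients (kr, kb)."""
    kg = 1 - kr - kb
    yf  = (y - 16) / 219     # luma normalised to 0-1
    cbf = (cb - 128) / 224   # chroma normalised to -0.5..0.5
    crf = (cr - 128) / 224
    r = yf + 2 * (1 - kr) * crf
    b = yf + 2 * (1 - kb) * cbf
    g = (yf - kr * r - kb * b) / kg
    return tuple(max(0, min(255, round(v * 255))) for v in (r, g, b))

pixel = (150, 110, 160)  # an arbitrary pinkish sample (hypothetical values)
print(ycbcr_to_rgb(*pixel, kr=0.299,  kb=0.114))   # decoded as BT601
print(ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722))  # decoded as BT709
```

The two calls give different RGB for the same coded pixel - that mismatch is the pink-to-orange skin tone shift. (Neutral greys, where Cb = Cr = 128, decode identically either way, which is why the error shows up mainly in saturated tones.)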
If you then load the .MOV clip into the timeline of Sony Vegas whilst using a standard default 8-bit Vegas project (I don't think this applies to 32-bit projects? But those aren't the standard and I don't use them due to PC speed), the cRGB video clip is automatically converted by Sony Vegas from cRGB 0-255 into Studio RGB (16-235). There is no warning message or notification; this Studio RGB fact isn't made clear anywhere as far as I know.
The difference between 8-bit & 32-bit projects is to do with the YCC to RGB conversion that any NLE does these days, and precision.
First, the YCC to RGB conversion: as said, full range 8-bit YCC doesn't fit in 8-bit RGB. In fact 16 - 235 YCC doesn't fit completely into 8-bit RGB either - probably only 35% is transferable; the rest can generate invalid RGB values in 8-bit RGB and can appear as gamut-clipped white artifacts. But NLEs work in RGB, so what to do?
Live with it, or work in a 32-bit RGB workspace, which allows the full YCC range to be transferred and colour processed in RGB. The idea at 8-bit is that 16 - 235 YCC maps to 0 - 255 RGB - the whole 0 - 1 thing in 32-bit speak. Anything outside of 16 - 235 can create negative values, ie: below 0, and values above 1; these can be held and used at 32-bit without gamut clipping, allowing us to grade the output into the 0 - 1 RGB range for encoding back out to 16 - 235 YCC.
Display of course is still 8-bit and should be based on a 16 - 235 YCC to 0 - 255 RGB conversion, not 0 - 255 YCC to 0 - 255 RGB, which Vegas appears to do, at least in an 8-bit project. But at 32-bit it's only the display, not the processing, so there is far, far less chance of gamut clipping in 32-bit, giving us room to grade without loss from a dodgy YCC to RGB conversion.
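The 0 - 1 headroom point can be shown in a few lines. Assuming the standard mapping of 16 - 235 luma onto 0.0 - 1.0, codes outside that range land outside 0 - 1 but survive in float, whereas an 8-bit integer pipeline clamps them away (a sketch of the principle, not Vegas internals):

```python
def limited_luma_to_float(y):
    """Map a limited-range luma code to float: 16 -> 0.0, 235 -> 1.0.
    Codes outside 16-235 land outside 0-1 but are preserved in float."""
    return (y - 16) / 219

def clamp_to_8bit(v):
    """What an 8-bit integer pipeline does instead: clip and quantise."""
    return max(0, min(255, round(v * 255)))

superwhite = limited_luma_to_float(250)  # above 1.0, kept for later grading
superblack = limited_luma_to_float(5)    # below 0.0, also kept
print(superwhite, superblack)
print(clamp_to_8bit(superwhite))  # 255 -- the detail above code 235 is gone
```

In a float workspace you can pull `superwhite` back under 1.0 with a levels grade before output; in 8-bit it was clipped the moment it was converted.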
YCC encoded output from the NLE should always have 16 - 235 levels, unless it's h264 and we reflag the output 'fullrange' - that's only possible with h264, which has a VUI options extension to the specification (Annex E).
For 8-bit RGB image sequence output, as long as the correct initial luma squeeze is done in the YCC to RGB conversion and a BT601 colour matrix is used, the levels / contrast etc. will be OK in RGB for Canon MOVs. Everything falls in the 0 - 255 RGB (0 - 1) range.
Cameras that shoot 16 - 255 YCC and don't flag the source full range - ie: levels outside of 16 - 235, like the Sony NEX5n, FS100 etc. in an MTS container, which I don't think are flagged full range - make it even more important to use a 32-bit workspace for those sources.
For RGB image sequences converted at 32-bit precision in the NLE, where YCC levels go beyond 16 - 235, or for Canon MOVs when no levels squeeze has been done, an image format like EXR is required that can store RGB levels below 0 and above 1.
The 32-bit precision part of the process maps the 8-bit YCC range into floating point, which allows much greater precision in colour processing.
So what you may ask...
Well the Sony Vegas preview monitor and the playback monitor both work as standard in cRGB colour space.
There's no such thing as cRGB; it's sRGB in the monitor - sRGB / BT709 are the primaries that define the gamut. What you seem to be implying is that Vegas displays the YCC (rec709) source converted to sRGB assuming full 8-bit levels rather than 16 - 235 limited range?
So you are now viewing a Studio RGB video in a cRGB viewer. The end result is a dull, washed-out look. As a user unaware of the cause, you then alter the levels and colour to correct the washed-out look (wrongly, as there is actually nothing 'wrong' with the video clip itself, just your viewing of it in cRGB space) and you end up applying completely wrong levels/corrections as a result. Sony Vegas Studio not having any meters or histograms makes the whole situation even more confusing.
The underlying problem is the decompression of the original h264 source and the handling of that by any video card intervention Vegas may use. If the fullrange flag is respected, the YCC will be squeezed into 16 - 235 YCC, then converted to 8-bit RGB (16 - 235 to 0 - 255 RGB) for playback, and that will be a correct preview. Going back to the full range luma test files.
When you then render out the Vegas timeline as an H264 video, the resulting render has the blacks/detail all wrong (amongst other things).
The workaround: apply a 'Studio RGB to Computer RGB' level correction (there is a preset) on the main Vegas preview window. Carry out all your edits, levels and corrections in the knowledge that your preview window is now visually 'correct'. Then, when you have completely finished, remove the 'Studio RGB to Computer RGB' preset just before you render out the timeline (ie. you must remove the level correction you previously applied to the preview monitor).
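For anyone curious what that preset actually does to the numbers: it's the same linear levels stretch/squeeze in the two directions. A sketch of the standard 16 - 235 to 0 - 255 mapping (my illustration, not Vegas's actual code):

```python
def studio_to_computer(v):
    """Expand Studio RGB (16-235) to Computer RGB (0-255).
    This is what the preview preset does, per channel."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def computer_to_studio(v):
    """Squeeze Computer RGB (0-255) back into Studio RGB (16-235)."""
    return round(16 + v * 219 / 255)

print(studio_to_computer(16), studio_to_computer(235))  # 0 255
print(computer_to_studio(0), computer_to_studio(255))   # 16 235
```

So with the preset on the preview, Studio RGB black (16) displays as true black (0) and Studio RGB white (235) as full white (255), which is why the preview finally looks right. Leaving it on the output instead would double-expand the levels, hence removing it before render.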
Hopefully this info will save someone else having the same hassle.
Apologies to anyone who thinks the above is obvious and to all those out there using 'proper' software with meters etc ;-)
I've heard this too, but had to respond as all the Sony speak about cRGB etc is just plain bad.