best format to publish raw video?

Started by favvomannen, August 08, 2017, 04:08:01 PM


favvomannen

Whenever I export from raw DNGs to .mp4 (dual-pass VBR, 150-200 Mbps) via After Effects or Premiere Pro, I can still see quality degradation.

Is that just the way it is?

PaulHarwood856

Hey favvomannen,

     Are you on Mac or Windows? I recommend extracting your DNGs (Cinema DNG Lossless is best) and bringing them into After Effects. Use the Smart Import 2 script for After Effects (found here on the forum); it will load all of your clips and sync the audio files. I recommend converting to a flat profile in Adobe Camera Raw (Cinelog-C is the best I've used; you can purchase it online for $50), then converting all the sequences into ProRes 4444 XQ with Alpha if on Mac, or GoPro Cineform RGBA 12-bit if on Windows. Then grade and edit in Premiere Pro. This is called transcoding; it will be easier to edit on your computer and solves a lot of headaches with Adobe Dynamic Link. I hope this helps, and please let me know if you have any questions.

- Paul

bpv5P

Do you want to publish or archive?
1- Publishing = lossy codec
2- Archiving = lossless codec

Good practices:
1- H.264. Alternatively, use VP9 (very slow) or H.265 (not yet widely supported for streaming)
2- Lagarith Lossless

For lossy delivery, you should not use the high bitrate you're using; the algorithm is not meant to be used that way. For reference on bitrate settings, see this YouTube support page:
https://support.google.com/youtube/answer/1722171?hl=en
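As a rough sanity check on those numbers, bitrate times duration gives file size. This is just a back-of-the-envelope sketch (the helper name is illustrative, not from any tool), using roughly the 1080p bitrate from the linked YouTube page versus the 200 Mbps used in the original post:

```python
def encoded_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Approximate encoded file size in megabytes: megabits/s * seconds / 8."""
    return bitrate_mbps * duration_s / 8.0

# A 60-second 1080p clip at roughly YouTube's recommended ~8 Mbps vs 200 Mbps:
print(encoded_size_mb(8, 60))    # 60.0 MB
print(encoded_size_mb(200, 60))  # 1500.0 MB -- 25x the size for little visible gain
```

The point is that H.264's rate control is tuned for the lower range; piling on bitrate past a certain point mostly buys file size, not visible quality.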

For better results, you can encode with Lagarith, then use Handbrake:
https://handbrake.fr/

You can also use a general compression tool for long term archiving, like .7z (LZMA2):
http://7zip.org

As for ProRes or Cineform, as @PaulHarwood856 suggested, you can use them too, but they are proprietary, and ProRes cannot be officially encoded outside of Mac OSX platforms.
Another suggestion is to use DNxHR instead of Lagarith, but it's not truly lossless.

favvomannen

Neither publish nor archive. I want to give a client a USB drive with a high-quality mp4 for them to play on a PC or TV.

A 200 Mbps dual-pass mp4 is irritatingly far from the CinemaDNGs in terms of quality.

1. will film grain help when compressing to mp4 (keeping the organic feel of raw)?

2. will upscaling to 4K before compressing to 200+ Mbps help maybe, or both? (I've heard this is good for publishing on YouTube, since the 4K bitrate is higher online.)

any other tips?


PS Paul Harwood: I'm on Win10. I actually bought Cinelog-C but haven't gotten into it yet; I really like the natural colour of ACR with a slight saturation gain for a start. Really magical Canon colours. I place DNG folders in an 8-bit sequence in After Effects, then export as a totally uncompressed AVI file straight from AE.

I'm guessing it's a lot like a ProRes 4444 file?

Then I edit in Premiere, but the mp4 leaves something to be desired. It's good, but I still think it could be better. I'm sitting 2 meters from a 75-inch TV when editing and watching content; maybe that's why I see the degradation so well. Hmm.

bpv5P

Quote
1. will filmgrain help when compressing to mp4 . (keeping the organic feel of raw)?
No, it will actually increase the size of your file.
Quote
2. will upscaling to 4k before compressing to +200mbps help maby, or both? (this ive heard is good for publishing on youtube). 4k bitrate higher online.
Absolutely not. Where did you hear that?
Quote
any other tips?
Export to Lagarith. Import into a HandBrake nightly build. Encode using 2-pass at 30 Mbps (no more). Enable the sharpen filter and deblocking in the filters tab.
This will produce a smaller file, and the loss of quality will be imperceptible to basically anyone (assuming your footage is 1080p; if it's 2-4K you need a higher bitrate, and 50 Mbps is enough).
Your assumption that a 200 Mbps H.264-encoded video is losing quality is probably placebo (or nocebo).

Also, no need to buy Cinelog-C. MLVProducer with Alexa Log does basically the same, and the author could not provide evidence that his DCP is better than a free Alexa Log implementation.
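The film-grain answer above can be illustrated with a small experiment: grain is high-entropy, so any compressor needs more bits for it. This toy sketch uses zlib as a stand-in for a video codec (a real encoder behaves differently, but the entropy argument is the same):

```python
import random
import zlib

random.seed(0)

# A smooth gradient pattern: highly predictable, so highly compressible
smooth = bytes(i % 256 for i in range(100_000))
# Simulated "grain": random bytes, essentially incompressible
grainy = bytes(random.randrange(256) for _ in range(100_000))

print(len(zlib.compress(smooth)))  # a few hundred bytes
print(len(zlib.compress(grainy)))  # close to the full 100 KB
```

The same budget-per-second that easily encodes a clean image gets eaten up reproducing noise, which is why adding grain before an mp4 export raises the bitrate needed rather than hiding compression artifacts.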

PaulHarwood856

Hello favvomannen,

    I recommend transcoding with Cinelog-C to GoPro Cineform 12-bit before editing in a timeline. That way, when the final project is exported to mp4, there is less degradation in quality.

Hello bpv5P,

     I've used MLVProducer to extract Cinema DNGs (no compression) for one project. It was good, and I still recommend taking Cinema DNGs from MLVProducer and bringing into Adobe Camera Raw through After Effects. Andy600 gave you an explanation about Cinelog-C. I am glad I purchased it, and have no regrets.

- Paul

favvomannen

Ahh, I should have done that. I like that tip.

Can I just as well make a 16-bit sequence in AE and import the DNGs into that sequence (to suit my workflow), then render to .avi? It should be as good as GoPro Cineform to edit with?

The After Effects AVI is uncompressed and keeps the bitrate?

favvomannen

Quote from: bpv5P on August 08, 2017, 11:50:40 PM
No, it will actually increase the size of your file. Absolutely not. Where did you hear that? Export to Lagarith. Import into a HandBrake nightly build. Encode using 2-pass at 30 Mbps (no more). Enable the sharpen filter and deblocking in the filters tab.
This will produce a smaller file, and the loss of quality will be imperceptible to basically anyone (assuming your footage is 1080p; if it's 2-4K you need a higher bitrate, and 50 Mbps is enough).
Your assumption that a 200 Mbps H.264-encoded video is losing quality is probably placebo (or nocebo).

Also, no need to buy Cinelog-C. MLVProducer with Alexa Log does basically the same, and the author could not provide evidence that his DCP is better than a free Alexa Log implementation.

Thanks, I have my work cut out for me learning HandBrake and all these new things. Thanks very much for all the tips.

1. It's not placebo, but I'm sitting very close to a 75-inch TV while editing; I'm pixel-peeping.

2. My "thought" is that 4K downscaled to 1080p is scientifically proven to increase quality. Therefore I thought that upscaling to 4K and rendering in 4K would gain me more bitrate per second while watching it downscaled on my TV, thus more total quality per second.

(I will try this and post results, but I'm theorizing here.)

Andy600

Quote from: bpv5P on August 08, 2017, 11:50:40 PM
Also, no need to buy Cinelog-C. The MLVProducer with Alexa-Log does basically de same, and the author could not provide evidence that his DCP is better than free alexa-log implementation.

@bpv5P - When have I ever said I 'could not' provide evidence?  ::)

As I said in a previous post, I am happy to publish comparisons but these things take time to do properly and I need to slot it in around my other work and commitments. I will publish comparative test results of Log-C produced in MLVProducer vs Log-C derived using Cinelog DCPs asap so you can see for yourself which is 'better'.
Colorist working with Davinci Resolve, Baselight, Nuke, After Effects & Premier Pro. Occasional Sunday afternoon DOP. Developer of Cinelog-C Colorspace Management and LUTs - www.cinelogdcp.com

Kharak

As soon as you import your footage as 8 bit in AE, the damage is already done.

Set your Project Settings to 16 bit.
once you go raw you never go back

favvomannen

But is the damage really already done? I never get 8-bit banding in skies when importing into an 8-bit sequence. The colours are also amazing, which makes me wonder. Anyway, I don't have an HDR screen.

DeafEyeJedi

Quote from: Kharak on August 09, 2017, 01:49:34 PM
...Set your Project Settings to 16 bit.

Instead of the recommended 32-bit Float?
5D3.113 | 5D3.123 | EOSM.203 | 7D.203 | 70D.112 | 100D.101 | EOSM2.* | 50D.109

favvomannen

Again, I don't get any color banding or other typical 8-bit problems when importing DNGs into an 8-bit timeline in AE. In fact, the color is pristine when exported to uncompressed AVI.

Do I really degrade the footage?

I do see the degradation after .mp4 export, though.

bpv5P

Quote from: PaulHarwood856 on August 09, 2017, 06:44:46 AM
I still recommend taking Cinema DNGs from MLVProducer and bringing into Adobe Camera Raw through After Effects.

Why would you do that? CDNG is raw; it will have no effect.

Quote
Andy600 gave you an explanation about Cinelog-C. I am glad I purchased it, and have no regrets.

No, he did not. No evidence.

Quote from: Andy600 on August 09, 2017, 12:00:51 PM
I will publish comparative test results of Log-C produced in MLVProducer vs Log-C derived using Cinelog DCPs asap so you can see for yourself which is 'better'.

Would be really good to see it. Also, public, unaltered DNGs, so we can replicate your test...

bpv5P

Quote from: DeafEyeJedi on August 09, 2017, 06:56:21 PM
Instead of the recommended 32-bit Float?

The DNGs produced from MLV are only 14-bit. 32-bit float is only for HDR images, like the ones produced by HDRMerge or similar software.

PaulHarwood856

Hey bpv5P,

     I'm simply giving my suggestions to favvomannen. Your workflow might be different. I personally find it easier to edit a log profile (Cinelog-C is what I use) in a friendly codec, and I can still bring back highlights like with raw CDNG. GoPro Cineform RGBA 12-bit in After Effects will work great for favvomannen. And I've read online that working in 32-bit float is beneficial vs 16-bit. Again, everyone has their own workflow.

- Paul

Hey favvomannen,

     I would recommend for future projects working in After Effects in 32 bit float. I think you will really like the workflow of GoPro Cineform in Cinelog-C. It's friendly to edit with, you can pull back highlights, and Adobe works well with it. I hope this helps you out, and please let me know if you have any other questions.

- Paul


bpv5P

Quote from: PaulHarwood856 on August 10, 2017, 11:42:03 AM
And I've read online working in 32 bit float is beneficial vs 16 bit. Again everyone has their own workflow.

Ok, I don't want to be mean, but this is not religion or philosophy. 32-bit, in this specific case, has no value; this is not a "difference", you're just using it the wrong way.
You can keep doing it, though. I'm not stopping you from anything; I'm just bringing this to your attention.

Danne

32bit-float environment is recommended for certain reasons. You can find some of them here:
http://wolfcrow.com/blog/the-advantages-of-working-in-32-bit-float/

bpv5P

Quote from: Danne on August 10, 2017, 12:48:30 PM
32bit-float environment is recommended for certain reasons. You can find some of them here:
http://wolfcrow.com/blog/the-advantages-of-working-in-32-bit-float/

Ironically, the article itself agrees with me. Here:
Quote
None of this requires anything more than 12-bit maximum. The average professional requirement is 10-bit (or even 8-bit).

The article talks about VFX; since we're talking about 14-bit images, I'll mostly ignore that. But here we go:
Quote
1. Greater precision in calculations and color operations
2. A lot more colors to choose from, and results which are visible on a high-end monitor
3. Allowance for a greater tonal range, which helps in giving footage a better highlight or shadow roll-off, for example, that mimics how tones behave in the real world

You're already working in Rec.709. Your color is already limited; 32-bit will not give you "a lot more colors to choose" from. Your monitor is limited too, unless you have a professional HDR-ready monitor, and even then, most people watch videos on normal monitors.
It will also not give you "better highlight or shadow" roll-off, because your footage is 14-bit, not 32-bit HDR.

Quote
True compatibility with wide gamut color spaces, including the CIE XYZ space for DCI

Cinema works with Rec.709 or the new Rec.2020. No one works with ProPhoto RGB.

Here's the best:
Quote
Hopefully, in the future, the entire pipeline – from camera to screen – will be in 32-bit (or even 64-bit)!

64-bit? Really? Are you doing astro-cinematography? No? Then don't.

Andy600

I think there may be some confusion here about image bit depths, workspace bit depths and Adobe specific architecture.

While the DNG files are 14/12/10bit, ACR can export in 8 or 16bit and processes in half-float space. If you set ACR to 8bits you will always get a lossy image and likely some posterization. This may only become apparent when you start grading and pushing the image around. Always set ACR to 16bits when dealing with raw images and 16bit TIFFs. The setting is accessible when launching ACR from Photoshop or Bridge and is persistent.

Even though your image may be developed and stored or transferred in 8 or 16bits, setting After Effects workspace to 32bit float is important for the operation of some plugins. It is also essential for VFX and avoids clipping the signal path.

Yes, 32-bit containers are used for HDR images but, unless I'm missing something, that is not what is being discussed here. Incidentally, a 16-bit-float .EXR is perfectly capable of holding extreme HDR imagery with lossless precision over 30 stops (i.e. 10^9, well beyond what any camera is capable of and beyond any HDR images you are likely to make), and much more economically than 32-bit files. But all this is overkill unless you work in VFX and use it for interchange when sharing between artists and studios, i.e. an ACES pipeline.
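The 30-stop figure for 16-bit float can be checked from the format's own numeric limits: IEEE 754 half precision spans roughly 2^-14 to 65504 in its normalized range. A quick sketch using Python's built-in half-float packing:

```python
import math
import struct

def as_half(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision ('e' struct format)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

HALF_MAX = 65504.0          # largest finite half-float
HALF_MIN_NORMAL = 2 ** -14  # smallest normalized half-float

assert as_half(HALF_MAX) == HALF_MAX  # exactly representable
stops = math.log2(HALF_MAX / HALF_MIN_NORMAL)
print(round(stops, 1))  # 30.0 stops of normalized range (subnormals extend it further)
```

So a half-float EXR really does cover the stated "over 30 stops" of scene-linear range, at half the storage cost of full 32-bit float.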

You can also store HDR in lower bit depths through log encoding and done properly you will maintain a relationship with linear light ;)

Quote from: bpv5P on August 10, 2017, 04:00:15 AM
Why would you do that? CDNG is raw, it will take no effect.

Maybe because he wants to develop the image in ACR!?

Quote from: bpv5P
No, he did not. No evidence.

Not in this topic but there is plenty of info in the Cinelog topic. As for 'evidence' see below.

Quote from: bpv5P
Would be really good to see it. Also, public, unaltered DNG's, so we can replicate your test...

Yes, and the source MLVs (I don't think MLVProducer can develop DNG image sequences!?) and of course detailed process information. That's what I mean by doing it properly  ::)


Honestly, with your level of interest I'm beginning to wonder if you are secretly working for us using reverse psychology ??? ;D

bpv5P

Quote from: Andy600 on August 10, 2017, 01:13:51 PM
Honestly, with your level of interest I'm beginning to wonder if you are secretly working for us using reverse psychology ??? ;D

I'm not interested in your product specifically, or in playing devil's advocate here; I just don't want people to spread false information. Remember that people will read this in the future via search engines.
But if you show us the results of your comparison, and it really has noticeably better DR and color than Alexa Log, I'll advocate for Cinelog-C myself and say "I was wrong about your product" here.

Andy600

Quote from: bpv5P on August 10, 2017, 01:05:26 PM
Cinema works with Rec.709 or new Rec.2020. No one works with ProPhoto RGB.

Oh lord  ::) @bpv5P maybe you should step back and do some research before posting such nonsense.

Cinema does not use and never has used Rec709 or Rec2020 colorspaces. These are ITU HDTV and UHDTV broadcast standards. The standard cinema colorspace currently in use is DCI-P3. ProPhoto is not used for broadcast or cinema, but photographers work in ProPhoto all the time.

The article may be about VFX, but most of it applies to raw workflows because, in both, you are dealing with scene-referred linear imagery. Even when a raw image is developed, viewed and processed in a display space, maintaining a mathematical connection to linear light (a fundamental of physics), especially where color and apparent realism are important, is still very desirable.

Quote from: bpv5P
You're already working with Red.709. Your color is already limited. 32bit will not allow your to have "a lot more colors to choose". Also, your monitor is already limited too, unless you have professional HDR-prepared monitors and, even with that, most people watch videos on normal monitors.
It will also not give you "better highlight or shadow", because your footage is in 14bit, not an 32bit HDR.

This is only relative to images that have been rendered in Rec709. Raw images do not have this limitation and this is why you set your working space bit depth as high as possible to retain all the color and dynamic range of the material. This also applies to properly encoded, perceptually lossless log material.

Andy600

Quote from: bpv5P on August 10, 2017, 01:34:15 PM
I'm not interested in your product specifically or doing the "devils advocate" here, I just don't want people to spread false information. Remember that people will read this in future, through findings in search engines.

Of course. Which is why I try to remain factual, stand by what I say and acknowledge if/when I am wrong.

Quote from: bpv5P
But, if you show us the results of your comparison, and it really has noticiable better DR and color than Alexa-Log, I'll for myself advocate for Cinelog-C and say "I was wrong about your product" here.

and that I have no problem with :)

DR is only a one part of it. You should also pay attention to color ;)

bpv5P

Hi Andy600.
First, let me admit my error: in fact, cinema does not use Rec.709 as the working space, but for broadcast. My fault.

But, here we go:

Quote from: Andy600 on August 10, 2017, 01:36:28 PM
The standard Cinema colorspace currently in use is DCI-P3.
Maybe in high-end cinema ("US-American film industry", as defined by Technicolor), yes.
ACR does not use it by default.



ACR needs you to define it yourself, and most people don't.

https://helpx.adobe.com/after-effects/using/color-management.html

A quote from the adobe page above:

Quote
For best results, when working with 8-bpc color, match the working color space to the output color space. If you are rendering to more than one output color space, you should set the project color depth to 16 bpc or 32 bpc, at least for rendering for final output. The working color space should match the output color space that has the largest gamut. For example, if you plan to output to Adobe RGB and sRGB, then use Adobe RGB as your working color space, because Adobe RGB has a larger gamut and can therefore represent more saturated colors. To preserve over-range values, work in 32-bpc color for its high dynamic range.

Suggestions for working color space choices:

    SDTV NTSC or SDTV PAL is a good choice if you're making a movie for standard-definition broadcast television, including standard-definition DVD.

    HDTV (Rec. 709) is a good choice if you're making a movie for high-definition television. This color space uses the same primaries as sRGB, but it has a larger gamut, so it makes a good working space for many kinds of work.

    ProPhoto RGB with a linear tone response curve (gamma of 1.0) is a good choice for digital cinema work.

    sRGB IEC61966-2.1 is a good choice if you're making a movie for the Web, especially cartoons.

So this page shows at least some people work in ProPhoto RGB (although I would not advocate it either), yet you said "ProPhoto is not used for broadcast or Cinema".

Now to the main point. The same page says: "To preserve over-range values, work in 32-bpc color for its high dynamic range." I do not agree with that. Adobe is probably talking about VFX, and our point here is digital camera raw files.

Quote
This is only relative to images that have been rendered in Rec709. Raw images do not have this limitation and this is why you set your working space bit depth as high as possible to retain all the color and dynamic range of the material.

Most people watch in Rec.709. Displays used today are not capable of reproducing wide gamut.
The concept of maintaining DR through bit depth seems like total bullshit to me. If you have configured a working color space, you've already limited that (unless you're working with a tangential wide gamut, and that's not the case since you said DCI-P3). Bit-depth precision may calculate the colors more precisely (although it would not be perceptible), but it cannot preserve more luma DR; it's already limited to 14-bit, your greys are already there, and you can't create this information from nothing.


tl;dr you're talking about an idealistic POV on workflow, but people watch our content on realistic devices that are not HDR and not precise, and most people will give zero shit about a 0.2% shift in your red spectrum on their 4-inch iPhone screens, because most people can't even perceive this. Even on 4K displays you can barely feel it. I know because I've tested.
You all want to work with 32-bit float, wide-gamut, galactic-fucking-computer-expensive workflows, go ahead; I'm not stopping you from anything. But I'll not follow this.


P.S: Another quote, this time from:
https://forums.creativecow.net/thread/2/1026156

Quote
RED footage is interpreted as Rec 709 in After Effects by default. All transformations to/from other color spaces will be done with that in mind unless changed manually.


Andy600

Quote from: bpv5P on August 13, 2017, 01:11:30 AM
Hi Andy600.
First, let me assume my error: in fact, cinema does not use Rec.709 as the working space, but for the broadcast. My fault.

Thanks for acknowledging your error.

Just to be clear, my reply below is not intended to mock you or belittle you in any way. I simply have to correct some of your misunderstandings, as some of your opinions may be perceived by readers to be based in fact when they are incorrect.


Quote from: bpv5P
But, here we go:
Maybe high-end cinema ("US-American film industry", as defined by technicolor), yes.

DCI-P3 (by Digital Cinema Initiatives) is the standardized colorspace of digital cinema projectors and was published by SMPTE. If you are delivering media for projection in a cinema (usually as a DCP - Digital Cinema Package in XYZ colorspace) it is transformed to DCI-P3 on projection. This is the universally accepted standard around the world. Technicolor don't really come into this other than their DI dept. adhering to the standard.

You can convert Rec709 to XYZ or DCI-P3 but you will have already clipped colors.

Quote from: bpv5P
ACR does not uses it by default.

I never said it did. ACR is a raw photo developer. I (Cinelog) retask the app to do something it was not specifically designed for.

Quote from: bpv5P
ACR needs you to define it for yourself and most people don't do it.

This only applies when using ACR with Photoshop. The colorspace setting has no effect when dynamically linking ACR with After Effects (and then on to PPro if desired), which is one of the many reasons Cinelog exists ;)

Quote from: bpv5P
https://helpx.adobe.com/after-effects/using/color-management.html

Quote from: bpv5P
A quote from the adobe page above:

So, this page shows a least some people work on ProPhoto RGB (although I would also not advocate it), you've said "ProPhoto is not used for broadcast or Cinema".

Adobe state that Linear gamma/ProPhoto is a 'good choice' for digital cinema work and in the absence of anything else I would agree simply because ProPhoto has a significantly larger gamut than Rec709 (i.e. less clipping of colors).

Color clipping also occurs in ProPhoto and this is another reason for Cinelog - because Cinelog-C's scene-referred log transfer and virtual primaries keep the color information in-gamut for the transfer to After Effects and can effectively store 13.5+ f-stops in as little as 10bits. The transfer boundaries are too small for a true linear transfer which is why Adobe state 'linear gamma'.

The term 'Linear' is a much abused term. Linear (to light) and Linear (gamma) are not the same thing.

Quote from: bpv5P
Now to the main point. The same page above says that you "To preserve over-range values, work in 32-bpc color for its high dynamic range.". I do not agree with that. Adobe is probably talking about VFX, and our point here is digital camera RAW files.

Again you are assuming anything HDR has to be VFX related. This is not true.

Any digital camera from the past 10 years that produces raw images can capture HDR images relative to most current displays (with the exception of maybe a Dolby Pulsar). Your own camera can likely take images of 10+ f-stops, and in Rec709 this is HDR because the data cannot be represented linearly within that colorspace.

There are also many transient pixel-level operations in plugins that can produce over-range values - these would be clipped in an 8 or 16bit environment.

Regardless of this misunderstanding. What applies to VFX workflows generally also applies to raw image workflows because of the physical properties of light.

Technically speaking, you can work in 8 bits if your deliverable is 8 bits in the same colorspace, since any plugins or pixel manipulations in the app are handled in an internal float space, but you will always irretrievably lose information if your source material is 10/12/14-bit raw. Obviously, a 14-bit raw image will fit in a 16-bit workspace but can still clip with some operations, so switching to a 32-bit float space is desirable to avoid any data loss, and important if you are targeting other colorspaces (Adobe even says this).
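The clipping point can be shown with a minimal sketch. The function names are purely illustrative (not any real After Effects API); the point is what happens to over-range values in an integer workspace versus a float one:

```python
INT16_MAX = (1 << 16) - 1  # ceiling of a 16-bit integer workspace

def boost_int16(value: int, gain: float) -> int:
    # Integer pipeline: over-range values are clamped and lost for good
    return min(round(value * gain), INT16_MAX)

def boost_float(value: float, gain: float) -> float:
    # Float pipeline: over-range values survive until final output
    return value * gain

v = 16000  # a near-white 14-bit sensor value (14-bit max is 16383)
print(boost_float(v, 8.0))        # 128000.0 -- recoverable by a later gain-down
print(boost_int16(v, 8.0))        # 65535 -- highlight detail is gone
print(boost_float(v, 8.0) / 8.0)  # 16000.0 -- the round trip restores the original
```

An exposure push followed by a pull is a no-op in float but destroys highlights in a clamped integer space, which is the practical argument for a 32-bit float workspace even with 14-bit sources.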

Quote from: bpv5P
Most people watch in Rec.709. Displays used today are not capable of reproducing wide-gamut.


Just because most viewers may watch in Rec709 or sRGB does not mean that you, the cinematographer, DIT, colorist, editor or software engineer should not aim to maintain data integrity, color fidelity and dynamic range throughout the entire process until output. There are plenty of monitors, TVs and even smartphone screens that now support wider gamuts (including DCI), and uptake is accelerating. HDRTV is currently in a standards battle, but it will eventually become the norm, and I suspect sooner than expected.

By maintaining scene-referred masters, either raw, .EXR (ACES) or economical log you can future-proof your videos and take advantage of new HDRTV standards and displays.

Quote from: bpv5P
The concept of maintaining DR through bit depth seems totally bullshit for me. If you have configured a working color space, you've already limited that (unless you're working with tangential wide gamut, and that's not the case since you've said DCI-P3). Bit-depth precision may calculate more precisely the colors (although it would not be perceptible), but it cannot preserve more luma DR, it's already limited to 14bit, your greys is already there, you can't create this information from nothing.

What may seem like bullshit to you is the bread and butter of any reputable VFX house and colorists who have a basic understanding of color science.

Defining a gamut does not necessarily restrict dynamic range! Scene-referred colorspaces (ACES 2065-1, Log-C, Cinelog-C, SLog3/S-gamut3 etc) maintain the relationship to linear light (and thus maintain quantifiable linear dynamic range) and have linear RGB values that extend well beyond Rec709, sRGB, ProPhoto, Adobe 1998, DCI-P3 and other display-referred colorspaces.

It's not about preserving more; it's about preserving what is there and, where log colorspaces are concerned, doing so while compressing the signal in a visually lossless manner for lower storage costs, with the convenient benefit of a film-like response to light and a simple mathematical inversion to a linear-light representation of the data, i.e. identical or close-to-identical pixel values to the debayered, white-balanced raw image.
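As a toy model of that last point (a generic pure-log curve for illustration, not Cinelog's or ARRI's actual math), 14 stops of scene-linear range can round-trip through 10-bit codes with well under 1% relative error:

```python
import math

STOPS = 14       # dynamic range to preserve, in stops
CODES = 1 << 10  # a 10-bit container: 1024 code values

def log_encode(lin: float) -> int:
    """Map linear light in [2**-STOPS, 1.0] onto integer codes 0..CODES-1."""
    return round((math.log2(lin) + STOPS) / STOPS * (CODES - 1))

def log_decode(code: int) -> float:
    """Invert the encoding back to linear light."""
    return 2 ** (code / (CODES - 1) * STOPS - STOPS)

lin = 0.18  # middle grey in scene-linear terms
rt = log_decode(log_encode(lin))
assert abs(rt - lin) / lin < 0.01  # round trip within 1% relative error
```

Because the curve spends codes evenly per stop rather than per linear value, shadows and highlights get comparable relative precision, which is exactly why log encodings can hold 13+ stops in 10 bits while a linear 10-bit encoding cannot.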

A colorspace is the only way our eyes can make any visual sense of the captured data, and so the data must be brought into a colorspace (preferably a bigger colorspace than it will ever be displayed in) in the least destructive way, then transformed to a display space in order to reproduce the image as faithfully as possible in terms of perceptual color appearance and contrast, relative to the device (and its colorspace) used to display the image. In other words, there is no getting away from colorspaces.

Your eyes have a colorspace and the idea is that digitally captured data should be transformed into a colorspace resembling what your eyes see (to the best of your display's capabilities). To be clear, it's not your particular eyes but the eyes of a set of observers who participated when the standard CIE models were derived. There will always be differences between how each person sees the same image or color for a multitude of physiological and technical reasons but we use CIE standards as a foundation in color science (for instance, as a model of how the eye perceives light (CIE-1931) or a connecting space i.e. CIE-XYZ etc). This at least tries to maintain some predictability and uniformity to everything we do in the digital cinema, video and photographic arts.

DR (Dynamic range) is only one part of this process. Obviously you cannot reproduce 15 stops of DR on a display that can at best show only 6-7 stops, but you can manipulate the data in such a way as to map the higher dynamic range to something that looks perceptually correct to most viewers. This should be the very last thing you do in the processing pipeline, because once that mapping happens there is no way to get the extended data back.
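A minimal sketch of that mapping step, using the classic Reinhard global operator (one well-known tone-mapping curve, not what any product mentioned in this thread actually uses): it squeezes an unbounded scene-linear range into display range with a smooth highlight roll-off instead of a hard clip.

```python
def reinhard(x: float) -> float:
    """Classic Reinhard global tone-map: maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

# Scene-linear values spanning many stops above middle grey (0.18)
for stops_over_grey in (0, 2, 4, 8):
    lin = 0.18 * (2 ** stops_over_grey)
    print(round(reinhard(lin), 3))  # 0.153, 0.419, 0.742, 0.979 -- nothing clips
```

Note how values 8 stops over grey still land below display white: the extended range is compressed, not discarded, which is why this belongs at the very end of the pipeline.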

Quote from: bpv5P
tl;dr you're talking about an idealistic POV about workflow, but people watch our content on realistic devices, that are not HDR, not precise and most people will give zero shit about a 0,2% shift in your red spectrum on their 4-inches Iphone screens, because most people can't even perceive this. Even on 4k displays you can feel it. I know because I've tested.
You all want to work with 32-float, wide-gamut, galactic-fucking-computer-expensive-workflow, go ahead, I'm not stopping you from anything. But I'll not follow this.


That is quite a statement.

I am talking about a workflow that has been developed through years of research and is in use by anyone who actually cares about their work, regardless of whether it is for a tentpole Hollywood blockbuster or a YouTube video of your cats, seen on a phone screen or an IMAX screen.

That 0.2% 'shift' in the red spectrum you speak of may actually be the logo of a huge corporation in an advertisement across multiple media outlets. You may not see it on that phone, but what happens when the ad is viewed on something better, something calibrated, like a cinema screen? Also, there are already millions of wide-gamut 4K devices in the hands of the public. The shift becomes more and more apparent the better these devices become, and somehow these tiny, inconceivable differences start to matter.


There is a great deal of psychology involved with color perception and accurately translating color in a controlled, calibrated environment leads to less error down the line but that is a whole other topic.

Quote from: bpv5P
P.S: Another quote, this time from:
https://forums.creativecow.net/thread/2/1026156

Apart from that article being from 2012 and related to After Effects CS5, I don't understand the relevance of highlighting it.

RED footage may be brought into AE as Rec709 by default, but a) the colorspace can be changed (hell, you can even convert it to Cinelog-C if you want ;) ), b) assuming the workspace is 32-bit float, the RED footage is held in an unclipped float space so no color is lost, and c) most RED users and colorists use RED's own tools to produce intermediates anyway.