Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Andy600

1
Found this neat open source software. Might be useful for everyone working with LUTs:
https://lattice.videovillage.co/


Lattice is great but it's not open source and it's Mac only.

You should check out https://cameramanben.github.io/LUTCalc/. The online version is free but the Chrome and OSX versions are only a couple of dollars. You'll need to understand what you're doing to get the best from it but it has a comprehensive feature set and the source code is available if you wanted to go deeper.

2
Edit: Sorry, this may seem counterintuitive compared to my earlier reply, but to clarify*:


@Andy600 Thanks for the explanations. So ColorMatrix2, when inverted, describes a transform from "CameraRGB" (debayered) to XYZ with D50 white point.(?)

Not exactly. A CIE white point isn't yet assigned, but you can theoretically assume it is (or later will be) D65 white because the temperature of the illuminant under which the calibration is made is ~6500K. DNG math works in XYZ space (CIE D50 white, i.e. little x 0.3456, little y 0.3585, big Y 1.0000), so the color matrices need adapting to D50 to make the DNG WB math work. CameraRGB doesn't have a defined white point other than what the calibration illuminant dictates; there is a point in the matrix where R, G and B would all equal 1, so this is used as white.
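If it helps, here's a minimal numpy sketch of what I mean. The ColorMatrix in it is a made-up placeholder, not any real camera's calibration data - it just shows how an illuminant's XYZ maps through the forward matrix to the camera "neutral" that gets treated as white:

```python
import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    """Convert CIE xyY chromaticity to XYZ tristimulus values."""
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

# D50 white as quoted above (x 0.3456, y 0.3585, Y 1.0)
XYZ_D50 = xy_to_XYZ(0.3456, 0.3585)

# Hypothetical DNG-style forward ColorMatrix (XYZ -> CameraRGB), placeholder values
color_matrix = np.array([
    [ 0.70,  0.05, -0.10],
    [-0.30,  1.10,  0.20],
    [ 0.00,  0.10,  0.60],
])

# The camera's "neutral" for this illuminant: the CameraRGB response to the
# illuminant's XYZ, normalised here so the largest channel equals 1 - this is
# the point used as white in the DNG white balance math.
camera_neutral = color_matrix @ XYZ_D50
camera_neutral /= camera_neutral.max()
print(camera_neutral)
```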

Quote
So... when that image in XYZ is transformed to sRGB (for viewing) it will be the same as having the white balance slider set to ~5000K in a raw converter and looking at it?

That depends entirely on the raw converter and how it calculates/interprets white balance. CameraRGB is not itself neutral, and you will be viewing it on a monitor that is likely using a D65 white point (so the white point should have been adapted for it). There are also white balance multipliers to factor in, which control the slider, so the app's slider should reflect the as-shot color temp (though how the WB is interpreted and displayed will likely differ depending on which white balancing method the app uses).

Quote
And it makes a lot of sense that the sensor would respond differently at lower colour temperature, never thought about that before. But I'm a little confused as to what the temperature of ColorMatrix1 (the ones we have in ML code) is... where do I find out?

ColorMatrix1 is ~2856K (the approximate temperature of an old-school tungsten filament incandescent light bulb), a.k.a. CIE Standard Illuminant A.

For details of how DNG works, look in the Adobe DNG SDK :)

(*Might have to clarify some of this further as it's from memory. I need to refer to my notes to be sure :-\)

3
@Andy600 I have been wondering for a very long time: is ColorMatrix2 D65 white point? I have assumed this, and it seems to match(??), but I haven't tested it with an actual comparison. If you could tell me definitively whether it is D65 or D50, that would make me quite satisfied. These are Adobe matrices I think (is this right, bouncyball?)

No. It's not that simple, unfortunately. Technically speaking it's D50, but it would be observed as green on a display because of the Bayer pattern.

The color matrices describe a transform from XYZ (D50) to non-white-balanced camera RGB. The D65 part only references how the color calibration was performed, i.e. D65 is a calibration under a daylight illuminant (~6504K) and Standard A is under a tungsten illuminant (~2856K). The sensor behaves differently depending on the spectral power distribution of the light source, which is why it's a good idea to have two calibrations under different temperatures.
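The two calibrations then get blended for the actual white balance temperature. Very roughly, it's a linear interpolation in inverse colour temperature (mireds) - the real DNG SDK does more than this (it solves the white point and matrix together), and the matrices below are placeholders, but this shows the blending idea:

```python
import numpy as np

# Approximate CCTs of the two calibration illuminants
CCT_STD_A = 2856.0   # CalibrationIlluminant1 (tungsten / Standard Illuminant A)
CCT_D65   = 6504.0   # CalibrationIlluminant2 (daylight)

color_matrix_1 = np.eye(3) * 1.1   # placeholder StdA matrix (XYZ -> CameraRGB)
color_matrix_2 = np.eye(3) * 0.9   # placeholder D65 matrix  (XYZ -> CameraRGB)

def interpolate_color_matrix(cct):
    """Blend ColorMatrix1/2 for a white balance temperature 'cct' (Kelvin)."""
    cct = min(max(cct, CCT_STD_A), CCT_D65)     # clamp to the calibration range
    # Linear weight in mired (1e6 / K) space
    w = (1e6 / cct - 1e6 / CCT_D65) / (1e6 / CCT_STD_A - 1e6 / CCT_D65)
    return w * color_matrix_1 + (1.0 - w) * color_matrix_2

print(interpolate_color_matrix(3200))  # mostly the tungsten calibration
print(interpolate_color_matrix(5600))  # mostly the daylight calibration
```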

4
It's the "AlexaV3_K1S1_LogC2Video_Rec709_EE_aftereffects3d". Parameters are Photometric Scaling, LUT Dimension 65^3 mesh and Bits set to default.

Does it include a colorspace conversion?


Quote
If you don't mind, do you have any resource where we can get this matrix for tungsten?

It's in the camera_matrices.c you posted ;) (the second rows of each set are the Tungsten/StdA matrices)


Quote
Trying to save you some time, here is the camera_matrices.c:

OK, after a very quick look through, it looks as though the full set of Adobe matrix coefficients is there (in camera_matrices.c) but only the second matrix (D65) is assigned. It also shows a DxO matrix for the 5D2 ??? (is this used for everything?). The Adobe DNG SDK has everything needed for raw color all in one place, so it escapes me why devs continue to cherry-pick non-standard info from the internet. It must be a coding thing!?

Another pet peeve of mine is XYZ colorspace being assumed to have a D65 white point (as with that xyz2rgb matrix). XYZ colorspace, as referred to in Adobe DNGs, ICC profiles and most apps built on ICC, has a D50 white point, and all the math uses D50, with chromatic adaptation where necessary to change the white point. You have to be very careful not to mix up D50 and D65 matrix math or you will get the wrong colors. I'm not saying that's what's happening here, but there is a mixture of methods in use and I would have to pick through the code to see how it's working.
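For reference, this is roughly what the adaptation step looks like - a minimal numpy sketch that builds a Bradford matrix adapting XYZ (D65) values to XYZ (D50), which you'd apply before mixing a D65-referenced matrix with D50-referenced DNG/ICC math (the white point values are the usual approximations):

```python
import numpy as np

# Bradford cone response matrix
M_BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

# Approximate white points (XYZ, Y = 1)
XYZ_D65 = np.array([0.9504, 1.0, 1.0888])
XYZ_D50 = np.array([0.9642, 1.0, 0.8249])

def bradford_adaptation(src_white, dst_white):
    """Return the 3x3 matrix that adapts XYZ from src_white to dst_white."""
    src_cone = M_BRADFORD @ src_white
    dst_cone = M_BRADFORD @ dst_white
    return np.linalg.inv(M_BRADFORD) @ np.diag(dst_cone / src_cone) @ M_BRADFORD

adapt_D65_to_D50 = bradford_adaptation(XYZ_D65, XYZ_D50)
print(adapt_D65_to_D50)
```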

Quote
I just don't know how it assigns each matrix. Through MLV metadata? I've found MLV metadata not to be so reliable (in the past... I don't know if anything has changed in the past year).

You could add it manually with Exiftool, override it with a DNG profile in ACR, or ask the dev to enable it. The actual difference in color is usually very small (though it can be larger under certain lighting), but the second matrix is preferable for white balancing non-daylight sources (as with your footage, for example).

The Log-C math looks correct, but where and when does it get applied? Is it in float or int, and before or after the colorspace is assigned? This can all make a difference. AlexaLog should match Log-C in other apps (at least the gamma part, because MLVApp is limited to sRGB primaries from what I can tell).

The white balance multipliers are Canon's and wouldn't be relevant to DNGs if the app utilized the SDK. Adobe's white balancing is IMO far superior and, more importantly, neutral, but it's a bit more complicated to implement.

5
Yes, I suspected it was more complex than I thought.
MLVApp uses AlexaLog from this paper; it is indeed EI800. The ProRes color output is bt609 from ffmpeg. I think Premiere Pro reads it normally by default. The color matrix MLVApp is using came from ACR (actually you're credited in the source code for helping), so it's probably "precise" enough...

Do you know exactly which ARRI lut you are using? If you're unsure you can send it to me or tell me the parameters you selected in the LUT generator and I'll check. If it's the full transform then there's your main problem. Your footage is in BT.601 or BT.709 (there's no bt609) and you are transforming it as if it were in a much wider gamut. This will not only cause over-saturation but also hue rotation, because the primaries lie on a different axis, and, because the gamut is a lot smaller than Alexa Wide Gamut, you're also likely losing some color information when rendering to ProRes 'AlexaLog' in MLVApp.

The matrices are originally from Adobe. MLVApp looks to be writing a single matrix (D65), which should be OK for most daylight shots, but white balance accuracy would be improved a bit if it also included the tungsten matrix.

Quote
Yep. Also, the background and the hair tones are in the same shade of grey, so when I try to make the background less magenta the hair just changes with it  :'(
I'm also using Lumetri from Premiere for this and not Resolve. My fault, I can't expect very much from Lumetri. [Edit: I should just buy an 18% grey card already, I know]

You can get decent results with Lumetri. Try using secondaries and, if necessary, multiple instances of Lumetri to isolate and grade problem parts of the image (after your primary grade).

I think the #1 piece of color-related advice I would give is to always shoot a reference/target. A simple grey card can be very cheap, and once you have that in shot you have a reference for exposure and white balance. I would say it's essential for any commercial shoot, and most casual shooting really benefits from it.

Quote
After your last reply I changed it a little, but I was doing it because the skin tones just get too dark after applying (linear) contrast. I can just apply a general gain, but highlights will clip. I'm using a curve like this (you can see I'm quite aggressive in the highlights):



If you have any pro tips for me, I'll take them :)

Try adjusting the contrast pivot point lower if possible. Alternatively, try increasing overall gain/exposure then pulling down the shadows and rolling off the highlights, i.e. a classic S-curve. This should give a more natural look.
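Something like this, in very rough numpy terms (values normalised 0-1; the pivot/gain/knee numbers are just illustrative, not a recipe):

```python
import numpy as np

def pivot_contrast(x, contrast=1.2, pivot=0.35):
    """Scale contrast around a pivot; a lower pivot keeps the mids (skin)
    from being pushed down as hard."""
    return (x - pivot) * contrast + pivot

def gain_with_rolloff(x, gain=1.3, knee=0.8):
    """Apply an overall gain, then compress values above the knee toward 1.0
    instead of letting them clip (the top half of a classic S-curve)."""
    y = np.asarray(x, dtype=float) * gain
    over = np.maximum(y - knee, 0.0)
    return np.minimum(y, knee) + (1.0 - knee) * (1.0 - np.exp(-over / (1.0 - knee)))

x = np.linspace(0.0, 1.0, 6)
print(pivot_contrast(x))
print(gain_with_rolloff(x))
```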

Quote
Thanks. Indeed, the noise reduction (NeatVideo) is way too strong.

Yes. I have a rule with NR: only use it if it's really necessary, and then use as little as possible, limiting it to whichever channel(s) the noise is most apparent in, i.e. R, G, B or luminance. NeatVideo is extremely good but very easy to overcook if used as a broad stroke across everything. I also tend to limit sharpening to luminance or use high-pass filtering on skin, but that's always subjective.

Quote
Thanks a lot for helping Andy, I'm learning very much these days...

You're welcome :) I'll check out MLVApp's color when I get a bit more free time.

6
I suspect 'AlexaLog' is only the Log-C curve (1D), so if you're using an official Alexa 3D lut that transforms both the gamma and the gamut from Log-C to Rec709 you will get these types of color problems.

You need to know the gamut that the image is in before applying a specific technical 3D lut or you are only compounding your problems. If you don't know the gamut it is safer to use only a 1D lut to get from Log-C to Rec709 and then add your own color correction (before the lut).

This also assumes that 'AlexaLog' is actually using Log-C math and that there are no colorspace or levels issues in the app. You also really need to know which Log-C curve is being used because it changes relative to exposure. The default as used in most NLEs and color grading apps is EI800.
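Since EI800 keeps coming up, here's a small sketch of the Log-C (EI 800) encoding curve using the constants published in ARRI's Log C white paper - I'm quoting them from memory, so check them against the paper before relying on them:

```python
import numpy as np

# ARRI Log C (v3), EI 800 parameters
CUT, A, B, C, D, E, F = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809

def lin_to_logc_ei800(x):
    """Scene-linear -> Log-C (EI 800), output in the 0-1 code value range."""
    x = np.asarray(x, dtype=float)
    return np.where(x > CUT, C * np.log10(A * x + B) + D, E * x + F)

# 18% grey should land at roughly 0.39 on the EI 800 curve
print(lin_to_logc_ei800(0.18))
```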

Where can I find AlexaLog? I'll check it when I have some time.

Re: WB. Maybe. I didn't do any grading, only set a WB. I'm going only by what I can see on a vectorscope and there is no neutral target in the shot. The model has a pink complexion and the beautician/make-up artist is more of an olive color, so cooling the WB will tend to make the pink hues more blue and less life-like, especially under mixed lighting. If I were grading this I would certainly be using qualifiers to isolate and treat the different skin tones independently.

Try doing a basic grade without a lut and see if you still get clipping. +3 on the mid-tones is quite extreme and, yes, will likely cause some banding, especially if done after the lut. Why are you pushing the mid-tones so much?

This is purely a subjective observation and you may actually be going for that look, but I find the skin smoothing (possibly extreme noise reduction?) in 01.png to be way too much. It completely loses any texture in the skin and looks very unnatural. Try dialing back on the effects and you'll get a much better look ;)

Are you using Lumetri in Premiere or After Effects?

7
@50mm1200s

The problem is not the white balance unless you used As Shot or Auto.

I dialed in the WB at 3850K (no tint) for a reasonably neutral balance, but you won't get it precise without knowing the lighting or having a grey/white card target in the shot.




On the scopes it looks like you're also using a film lut or film-look preset? That is adding some heavy saturation to the reds and magentas. The lut/look is also clipping highlights (quite badly) in the other shots and there's some unpleasant banding in the highlights. I would suggest trying to grade the look yourself or trying a different lut/look, but I see no significant problems with the DNG.


8
My 50D is out on loan at the moment but I'll answer a couple of your questions.

1. I have not had this happen personally. Have you tried different cards? Carefully cleaning the contacts etc?

2. I think you can re-assign the record button!?

3. Dual ISO does not work for raw video in the 50D.

4. You can fine-tune shutter settings but I don't get exactly 180 degrees either. It's either slightly more or slightly less (I choose slightly less). I very much doubt you could tell any real difference in cadence and motion blur against something shot with a shutter at exactly 180 degrees. And in some 'pro' cinema cameras, although it may say 180 degrees on the settings screen, they too can be one way or the other depending on the internal clock frequency.
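The arithmetic, if you want to check your own settings (the shutter denominators here are just examples - the values actually available depend on the camera's clock):

```python
def shutter_angle(exposure_time_s, fps):
    """Shutter angle in degrees from exposure time and frame rate."""
    return 360.0 * exposure_time_s * fps

print(shutter_angle(1 / 48.0, 23.976))   # ~179.8 degrees (slightly less than 180)
print(shutter_angle(1 / 47.0, 23.976))   # ~183.6 degrees (slightly more than 180)
```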

5. It's likely not just in the mid-tones but that's where you're most likely to see it. It could be CA or white balance. Try other apps for processing first. The white balancing algorithm in some apps can cause magenta contamination (I have seen it happen with raw footage in Resolve), so that's where I would put my money. If it's happening across multiple apps then try shooting a repeatable test without any tint offset set in camera, try a different lens with a different focal length, try a UV filter - basically try everything you can think of in camera, then try all the apps again.

6. Sounds like a bug.

Re: 10-bit (and 12-bit) - it is not supported in the nightly builds (yet) but there are some working 10/12-bit builds for the 50D: https://bitbucket.org/daniel_fort/magic-lantern/downloads/.

re: ISO values - no. For that you would need to find them using the ADTG build then edit and compile your own build. It may be possible with a script but I don't have a clue there.

9
General Help Q&A / Re: DNG files playing back faster than real time
« on: April 04, 2018, 08:09:54 PM »
OK, I should have said 'container', not metadata ::).

Agree with the 'Sequence Footage' suggestion. It's set and forget most of the time.

The 'Interpret Footage' function is still used if you have different footage with various frame rates.

10
General Help Q&A / Re: DNG files playing back faster than real time
« on: April 04, 2018, 07:24:33 PM »
Right-click on some imported DNG footage and choose 'Interpret Footage'. Set the FPS and click OK, then right-click on it again and select 'Remember Interpretation'. Then select your other footage and choose 'Apply Interpretation'.

You need to manually interpret any footage that has unrecognizable or absent metadata such as most raw and JPEG image sequences.

11
Hi @saf34,

You shouldn't be bumping up exposure in ACR with Cinelog, as this adversely affects the math. If you need to offset exposure, only do it in AE as described in the user guide.

My first suggestion is to make sure you are only using Raw metering when shooting raw. If you are consistently adding 1+ stops in post to your own taste, you could try offsetting the metering by -1 stop and then exposing according to what you see on the LV screen; however, this is not best practice for achieving the best SNR and dynamic range for the chosen exposure (i.e. what Cinelog is basically designed to do). If you typically expose only using LV you will often clip highlight or shadow information unnecessarily (except where HDR situations force you to do so). It is far better to retain as much information as possible when shooting and make these kinds of aesthetic decisions in post rather than clipping it when recording and limiting your options later.

The Picture Style through which you are monitoring is a little different to Cinelog Rec709 and has dynamic controls for altering contrast, saturation and color tone, whereas Cinelog Rec709 is a fixed output (with a tone curve derived from a math function) and assumes the input is a properly metered exposure; however, both increase perceptual brightness by approximately the same amount (~1 EV). It's also worth noting that the LV screen is not calibrated to produce a Rec709 image in the same way as a display or monitor.

There is no one definitive Rec709 look, so don't be afraid to experiment. Have you tried using a different look, e.g. the Cinelog DSLR looks, which are built to better simulate Picture Styles?

12
Raw Video Postprocessing / Re: Question regarding ML RAW > ACES
« on: March 23, 2018, 11:03:29 AM »
The ACES container format (EXR) requires a camera-specific IDT because the colorspace and white balance are already defined as (or should be) linear ACES AP0 primaries. DNGs do not have a defined colorspace as such, but the color matrix/matrices and white balance multipliers describe a transform to XYZ space, and from there the ACES app can put the data into the ACES AP0 colorspace. Problems arise when the white balance matrices are not correct/accurate or the implementation of color management/ACES in the app is not handled correctly.
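For anyone curious, the path looks roughly like this. It is not an official IDT: the ColorMatrix is a placeholder, and a real pipeline also needs white balancing plus a chromatic adaptation from the DNG's D50 reference to the ACES white point - but the AP0 matrix is derived from the published AP0 primaries and ACES white chromaticities rather than hard-coded:

```python
import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def rgb_to_xyz_matrix(rx, ry, gx, gy, bx, by, wx, wy):
    """Normalised primary matrix for an RGB space from its chromaticities."""
    prims = np.column_stack([xy_to_XYZ(rx, ry), xy_to_XYZ(gx, gy), xy_to_XYZ(bx, by)])
    scale = np.linalg.solve(prims, xy_to_XYZ(wx, wy))
    return prims @ np.diag(scale)

# ACES AP0 primaries and white point (from the ACES documentation)
AP0_TO_XYZ = rgb_to_xyz_matrix(0.7347, 0.2653, 0.0, 1.0, 0.0001, -0.0770, 0.32168, 0.33767)
XYZ_TO_AP0 = np.linalg.inv(AP0_TO_XYZ)

color_matrix = np.eye(3)                      # placeholder DNG ColorMatrix (XYZ -> CameraRGB)
camera_to_xyz = np.linalg.inv(color_matrix)   # invert to go CameraRGB -> XYZ

camera_rgb = np.array([0.2, 0.3, 0.1])        # a made-up debayered pixel
aces_ap0 = XYZ_TO_AP0 @ (camera_to_xyz @ camera_rgb)
print(aces_ap0)
```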

Re: MLV App. Assuming the colorspace transforms are correct in the app (I haven't tested, so I'm not sure), the corresponding IDT for your chosen output colorspace settings should work in other apps (Resolve, Nuke, Fusion etc.). If you are creating intermediate log files you should aim to retain as much color information as possible, so choose ProRes XQ or 444.

You should note that all PC-based MLV apps are typically built with FFmpeg/LibRaw libraries, so the codecs are not official Apple ProRes and are limited to 10-bit. This may be an issue if you create content for TV broadcast.

If you really want to use ACES I would suggest converting your MLVs to DNGs for use directly in DaVinci Resolve. However, initial color accuracy will depend very much on the MLV2dng conversion and how the converter implements white balancing, color matrices etc., because they are not all the same. Keep your MLVs safe until you are satisfied with the raw2dng conversion.

13
The 50D build, however, goes a bit wonky for me. About every 30 frames or so I'll get one frame where the bottom third of the frame offsets to the left slightly. This happens in both 10-bit and 12-bit.

What build are you using and which app for MLV>DNG?

Can't reproduce it here.

14
Channel #16 has the buffer I use for UHD, but I see a bigger buffer at EDMAC #12 that I can try out. Lots of info -- need to study more.

Show EDMAC channels (EDMAC module settings) and push the joystick up. Does the channel flicker green? If it does you can't use it.





e.g. Ch2 here does flicker green but Ch1 does not.

15
Camera-specific discussion / Re: Canon 5D Mark IV
« on: December 14, 2017, 02:29:08 PM »
As an owner of a 5D4 I was waiting impatiently to see ML raw support on my camera. Unfortunately, it seems this is not going to happen.
I had a 5D3 before and used raw many times. Great times!
For me there is nothing to hold me on this forum anymore. Alex and the rest of the team did an amazing job in the past, but the technology moves on and there are new cameras that, IMO, need to get ML support. I wouldn't mind paying a good amount of money for ML support for my 5D4 and I'm sure most of us would do the same.
I don't care for 50D, 7D or 500D updates and support... they belong to the past! Every 4 years Canon announces a new 5D; it should be the first camera to receive ML support.
Never mind, it doesn't make any difference...

There is so much here that a1ex, other devs and forum contributors could take issue with, but let's just say farewell.

16
Quote from: a1ex
- without raw video and without photo raw overlays enabled (e.g. global draw off, no other modules loaded)

all logged at 50u

Without LV

With LV

Quote from: a1ex
- with raw video enabled, old method (run it on top of regular nightly or raw_video_10bit_12bit)

still to do

Quote from: a1ex
- with raw video enabled, new method (on top of dfort's modified version)

using dfort mod

Quote from: a1ex
Also, for my own curiosity: log a still photo, short exposure, outside LiveView, with 500us

500u

Quote from: a1ex
(or 1 ms if that doesn't cover the entire process).

1ms

Quote from: a1ex
Look at the blue LED to see when it logs and when it stops.

Stays on then flickers briefly before showing preview and 'logged' message.

Quote from: a1ex
Take a look at the EDMAC connections screen as well (Show EDMAC channels, scroll to the right).

OK, what am I looking for?

17
Solved lock-ups in edmac.mo and polished it a bit - changes on the "edmac" branch. Merges cleanly into raw_video_10bit_12bit, if you prefer to try it there.

PR open.

Working now :)

Not sure what you need from this but I ran the EDMAC test until it finished and also enabled logging. Uploaded the log files here.

18
@dariSSight read the last few posts and do not use any 5D2 build that uses EDMAC channels 0x01 or 0x02 for raw recording!!

More research needs to be done.

50D testers - the 50D builds using EDMAC 0x01 should be ok.

19
Does it help if you increase LOG_INTERVAL?

Nope. Tried 500ms, 1sec, 2sec, 5sec intervals with LV on/off, movie mode enabled/disabled etc and it still freezes.

21
... Whenever there's green on any of the channels, even for a very short time, that means "stay away from it".

Got it. 0x02 flashes green but 0x01 doesn't. I'm using 0x01 on the 50D so I assume it's safe!?

@dfort this needs checking on the 5D2 with the EDMAC module.

22
Edited my post because the first image was with movie record disabled ::). See the second image.

The scan says 'seems to work' for channels 2-6, 8, 10-13, 19-22 and 24-29.

Logging still freezes with LV off. Where do I set the interval?

23
@a1ex - I just compiled the edmac module. Is there anything useful I can provide you from the 50D?

movie mode was off for this:



and on for this:



0x02 looks good and, compared to the 7D, where the largest channel appears to be 0x10, it makes sense with my findings!? (unless I'm not properly understanding this :-\)

Also, the camera freezes when I press 'log EDMAC usage'.

24
@Andy600 - Do you feel lucky punk? Just kidding. Same question as above. Is 0x1 or 0x2 really better than 0x10 on the 50D or do you just want to be different from everybody else?

To put it simply, 0x01 and 0x02 work with every setting and resolution when set to mv1080, but I had a couple of bad frames with a random LV glitch when set to mv640 mode. I have tested more as @a1ex suggested and, so far, I've had no additional issues. Call it a favorable race condition, luck or whatever, but LV and recording on the 50D appear to be very stable when it's set to mv1080 with channel 0x01 or 0x02. I'm still testing.

There appears to be no difference between these channels but I can only test empirically and have nowhere near @a1ex's understanding of how these cameras work. He may well know of good reasons why those channels shouldn't be used.

0x10 has some issues with crop mode recording and glitchy, unstable LV.

I get that it would be simpler for all Digic IV cameras to use the same channels, but I think the 50D is going to be a bit different by virtue of its 'added' movie modes.

25
1584 on 50D with no crop? I see only 1568 on my camera  :-\

1584 is because I'm testing 10/12bit with the MLV-Lite module.
