Please write when it is ready to test.
Quote from: a1ex on September 27, 2017, 06:40:39 PM
I say obvious because:
For me it's not obvious at all. Except for the lack of clip warnings at ISO 160 1" (which I'm looking into), I don't see anything wrong with the QR zebras.
Quote
You can probably compare the size of the rightmost values in the histogram to those just left of it; if it is above a certain threshold, that should indicate clipping.

That may work for the "OVER" indication in the histogram, but how will you propagate it back to the image (zebras)? Sounds computationally expensive,
and that means I should come up with some other heuristic for detecting the peaks.
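For illustration only, the bin-comparison idea from the quote might look like this in Python (the function name and threshold are hypothetical, not ML's actual zebra code):

```python
import numpy as np

def looks_clipped(histogram, ratio_threshold=3.0):
    """Heuristic sketched in the discussion above: compare the rightmost
    histogram bin to the bins just left of it; a large spike at the top
    end suggests clipping. Threshold is a made-up example value."""
    top = histogram[-1]
    neighbors = max(np.mean(histogram[-4:-1]), 1)  # avoid division by zero
    return top / neighbors > ratio_threshold

# A clipped image piles pixels into the last bin:
clipped = np.array([10, 20, 30, 25, 15, 500])
smooth  = np.array([10, 20, 30, 25, 15, 12])
print(looks_clipped(clipped), looks_clipped(smooth))  # True False
```

This only yields a global "OVER" flag; as noted above, mapping it back to individual pixels (zebras) is the expensive part.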
Quote
Sure. QR is more important for photo.
The LiveView RAW overlays are not very exact, but they're not trivial to fix - let's figure out the QR ones first.
Quote from: a1ex on September 27, 2017, 05:47:35 PM
Thanks for explaining. It is possible that during the test I had the setting on "Photo only". Now I tested again with "Always" and they are horizontal and unaffected by the WB setting (histograms too).
Note the LiveView RAW zebras have horizontal lines (not diagonal), they show the same color as the clipped channel(s) (or black if all channels are clipped) and they have "square" edges for speed reasons (they operate on a very low-res image).
Quote
Does that mean you can also make it work in Play-button mode?
In QR (after taking a picture), speed is no longer an issue, so they are computed for every displayed pixel.
Quote
The Luma zebras (YUV-based) are diagonal red. The YUV RGB zebras also have horizontal lines, but thicker, and fully overexposed areas are solid black.
However, your LiveView histograms are RAW-based.
Can you upload your ML/SETTINGS directory so I can try to reproduce the issue?
Quote from: a1ex on September 27, 2017, 04:52:26 PM
Yes.
I'm unable to reproduce the black zebras - are they from Canon?
Quote
Thanks for the tip. However, sometimes it is easier to use my phone and just send the image via Bluetooth instead of moving the card back and forth between the camera and card reader.
To capture the LiveView images, you may use Debug -> Dump image buffers (a 14-bit DNG and two 8-bit YUV422).
There's a screenshot option as well (no need to photograph the camera screen). It won't capture the fast (YUV-based) zebras well (as these are computed by the display controller), but should work fine with all other kinds of zebras (including the RAW-based ones).
Quote
Your LiveView screenshots show YUV zebras.

My settings are (taken using the screenshot option):
Quote from: DeafEyeJedi on September 26, 2017, 04:39:21 PM
That would be very nice of you. Could you please also explain how you do it? I.e. are you doing it in QEMU or in camera, and how is the crop_rec_4k version to be installed?
I can do this test for you if you still insist on it @heyjoe?
Quote
I suppose it is so, but because I am very new to ML I am extra careful (maybe extra paranoid too).
The probability of bricking is extremely low, so it's almost impractical not to give it a shot either way.
Quote from: a1ex on September 26, 2017, 10:53:51 AM
Ok, I tested again (yes, RAW zebras is set to Always). It seems zebras don't change on WB change in LiveView, only in Play+Light view. So if you can make Play+Light display raw histograms and zebras/clip warnings (or at least as an option), it would be great.
In your LiveView screenshots, the histogram is raw; just double-check "Use RAW zebras" is really "Always". If that still doesn't give raw zebras, it's a bug, but I'm unable to emulate LiveView in QEMU yet. A video might be useful to figure out what's going on.
QuoteThanks for explaining.
Safety-wise, they are about the same. These systems don't have an MMU, any task can write anywhere in RAM, and Canon's code saves its settings at shutdown by... reflashing the ROM.
Quote
You are doing a great job.
The crop_rec_4k branch
... is *probably* a little safer.
Quote
I still haven't had the time to install that and try it out, but I read that not only does LiveView not work in it, but also image capture and review, without which I guess there is really nothing to test in QEMU, right?
In any case, the strongest safety net we have is the ability to emulate the firmware in QEMU
Quote
What is that?
(with the user's ROM),
Quote
How? What if my camera gets bricked and there is no way to diagnose?
so if something goes really wrong (such as camera not booting or acting weird), I should be able to look into it.
Quote
Is that something I *must* do before installing the crop_rec_4k build? Please explain as for a layman, as this whole thing with so many links to different long threads is a little overwhelming for an ML newbie.
And, of course, the ROM dumper from bootloader.
Quote
I would be happy to, but of course I am cautious not to cause any damage.
Well, somebody *has* to test this before it goes into mainline
Quote from: a1ex on September 26, 2017, 01:31:52 AM
Thanks.
Typo - reply #1 (as printed by the forum).
Quote
It seems you have put the new heuristic in code, which is great, and I am curious to test it.
Check the crop_rec_4k builds in a few hours.
Quote from: a1ex on September 25, 2017, 11:44:29 PM
Reply #1 is from Walter.
See replies #2, #11.
Quote from: a1ex on September 25, 2017, 09:20:23 PM
I have "Use RAW zebras: Photo" and the help info says "Will use RAW RGB after taking a pic". After reading your current reply I set it to "Always", but looking at the shots which I shared, Play->Light still shows different zebras and histograms.
You are looking at YUV-based zebras. Try setting "Use RAW zebras: Always".
Quote
In Histogram type I have RAW-based (RGB), for which the help info says "Will use RAW RGB in LiveView and after taking a pic".
Also, when the histogram doesn't have vertical bars (stops), it's YUV-based.
Quote
Are you suggesting to simply wait for the next version?
Good point; however, I have some ideas to detect the presence of such a spike when checking the white level (a better heuristic). On Canons, the clipping is harsh (many pixels at the maximum value), with the possible exception of a few hot pixels.
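That "harsh clipping" observation suggests a simple check; a sketch in Python (the function name, allowance, and all numbers are illustrative assumptions, not ML's code):

```python
import numpy as np

def harsh_clipping_spike(raw, white_guess, hot_pixel_allowance=16):
    """Sketch of the idea above: harsh clipping means many pixels sit
    at the maximum value, so a handful of outliers (hot pixels) should
    not be mistaken for a clipped highlight."""
    peak = raw.max()
    pixels_at_peak = np.count_nonzero(raw >= peak)
    return peak >= white_guess and pixels_at_peak > hot_pixel_allowance

rng = np.random.default_rng(0)
scene = rng.integers(2048, 12000, size=(100, 100))
scene.flat[:500] = 15283          # simulate a blown highlight
print(harsh_clipping_spike(scene, white_guess=13000))  # True
```

A few isolated hot pixels would stay under the allowance and not register as a spike, which is the point of the better heuristic.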
Quote from: a1ex on September 25, 2017, 04:41:53 PM
Obviously neither you nor I are mind readers, so it would be easier if you ask directly to avoid misunderstanding. Anyway, here is the proof:
You said:
I'm not aware of such behavior, nor can I reproduce it, so it's your duty to provide some sort of proof. That was the reason for the link.
Quote
A way to correctly ETTR (even if it would mean not using ML).
As for your main question, I'm afraid I don't understand it. What should I recommend?
Quote
I can get within that range without ML (using UniWB JPG histograms), but I am looking for something more accurate, as the JPG histograms are really bad (I have tried so many variations in Picture Style, but they still don't accurately show the saturation).
On 5D3, at full stop ISOs, the autodetected white level can be underestimated by log2(15283-2048) - log2(15283-3000-2048) = 0.37 stops (worst case), iff there are no overexposed pixels in the image. So, you may get false clipping warnings within that limit. If that's good enough for you, then just use it.
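For reference, the worst-case figure can be reproduced directly from the quoted numbers (white level 15283, black level 2048, a back-off of 3000):

```python
from math import log2

# Worst-case white-level underestimation on 5D3 at full-stop ISOs,
# using the numbers from the discussion above:
worst_case = log2(15283 - 2048) - log2(15283 - 3000 - 2048)
print(round(worst_case, 2))  # 0.37
```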
Quote
Unfortunately I am not a programmer, and if I dare to touch your code I risk damaging something. As you correctly pointed out, this would be a maintenance nightmare and there would be more and more questions. That's why, considering your much bigger expertise, I was hoping that we can diagnose the issue together and hopefully you, knowing your code, could fix it properly (or advise on a way to correctly ETTR in a different way).
Otherwise ...
Quote
For which channel is that? Compared to the previously mentioned max of 9960, this looks much closer to what RawDigger shows.
13307, 11388 (ISO 160, 1/8" - off by 0.27 EV)
Quote
I don't know Lua and I don't understand everything you do (I installed ML just 2 days ago). So please provide the steps for what you need me to do.
Do you have the patience to get a matrix of these values? You may use a Lua script if you wish.
Quote
I don't know why, but RawDigger shows different values:
FYI, the max values from your raw file are:
octave:1> a = read_raw('_MG_5911.CR2');
octave:2> prctile(a(:), [10 50 90 99 99.9 99.99 99.999 100])'
ans =
3353 8279 11886 12639 12848 12943 13006 13102
and here's how the clipping warnings would look with white level 11388 (Canon's heuristic):
Quote from: a1ex on September 25, 2017, 02:16:55 PM
https://www.chiark.greenend.org.uk/~sgtatham/bugs.html
Enough chit-chat for now.
Quote from: a1ex on September 25, 2017, 07:35:40 AM
Thanks. I will look into it.
Sorry I didn't include more links - my display is defective, so I am/was using a smartphone to mirror the main desktop (so I can type quickly, but reading or looking up links is slow).
The first post of the QEMU thread gives you a README.
Quote
I already read this. My test confirms a slight improvement in DR after installing ML. Here are some calculated values and graphs showing the difference (diff = value_with_ML - value_before_ML):
CMOS/ADTG aka ISO research: http://www.magiclantern.fm/forum/index.php?topic=10111
Quote
What do you need to make it work accurately? I would be glad to provide test data if you don't have the time for it. Just let me know what I have to do.
Hardcoding the clipping point: I know how to find it for short exposure times, but I don't know how it changes with long exposures. I only know it gets lower, from past bug reports, but I didn't do a controlled experiment to find out by how much.
The white level detection is also explained in the iso160 thread. It starts from a low guess, then scans the image for brighter pixels, then backs off a bit - all this to make sure it shows the clipping warnings no matter what the correct white level might be. Your example case is either a bug or an edge case (I didn't look yet).
In other words, I was just trying to come up with something that works on all other supported ML cameras.
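The guess-scan-back-off approach described above can be sketched like this (illustrative only; the function, the low guess, and the back-off value are placeholders, not the actual raw.c code):

```python
import numpy as np

def autodetect_white_level(raw, low_guess=10000, backoff=100):
    """Start from a conservative low guess, raise it to the brightest
    pixel found in the image, then back off a little so the clipping
    warnings still trigger whatever the true white level is."""
    brightest = int(raw.max())
    level = max(low_guess, brightest)
    return level - backoff

frame = np.full((10, 10), 9000)
frame[0, 0] = 13100                       # one nearly clipped pixel
print(autodetect_white_level(frame))      # 13000
```

Backing off slightly below the brightest value is what guarantees the warnings appear, at the cost of the false positives discussed above.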
Quote
I am curious to try how it would work on photos, but I can't find any module called iso_regs in the ML menu?
Sure, but the digital gain will get burned into the CR2 (so you'll be losing levels without gaining additional highlight detail in the raw file). You will gain more highlight detail in the JPEG preview, but I don't see that as a good enough reason to implement and maintain this "feature". Rather, the CMOS/ADTG tricks do actually capture additional highlights (effectively increasing DR), and that's currently available in the iso_regs module for 5D3 (just not very user friendly, but at least you can tweak all these sensor gains from the menu).
Quote
Thanks. So it is the light button that does the thing.
https://www.magiclantern.fm/forum/index.php?topic=8309.0 (old answer)
https://www.magiclantern.fm/forum/index.php?topic=15088.msg186141#msg186141 (updated)
Quote
I am not sure if we are talking about the same thing. The two links don't mention anything about logarithmic raw data. What I am talking about is not a way to do HDR, but to use the actual DR of the sensor: instead of having it convert the light to linearly encoded data, to have the raw data encoded logarithmically. The advantages of that would be:
The closest approximation I can come up with, besides dual ISO, would be this:
https://www.magiclantern.fm/forum/index.php?topic=19315.0
Alternating short/long exposures is doable, but not trivial. There are routines for doing arithmetic on raw buffers on Canon's image processor - documented here:
https://www.magiclantern.fm/forum/index.php?topic=13408.
If you can understand the sensor internals, you can probably change all its registers from adtg_gui. However, besides tweaking some gains at various amplifier stages, and overriding LiveView resolution to get 3K/4K/fullres LV, I wasn't able to find anything useful for controlling the clipping point beyond what's already documented in the ISO research thread. I'm not saying there isn't - I'm just saying I'm not familiar with sensor electronics, so maybe I don't know where to look.
There are some CMOS registers that appear to adjust the black sun protection, but I didn't look much into them.
So, feel free to grab adtg_gui and understand/document what some of these registers do.
Quote
I don't know why it should be a rough approximation. I am using this method as described by the libraw developers. You can check the link I gave; it describes it. I will look at your link too.
It is my understanding that log2(max/stdev) is only a rough approximation, especially at high ISOs. Rather, I prefer to read it from the SNR curve - detailed answer: https://www.magiclantern.fm/forum/index.php?topic=13787.0
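For comparison, the rough approximation under discussion is just this one-liner (illustrative numbers only; the dark-frame stdev of 8 DN is a made-up example, not a 5D3 measurement):

```python
from math import log2

# Rough DR estimate in stops from the clipping point and the
# dark-frame noise stdev, as discussed above:
white_level = 15283
black_level = 2048
dark_stdev = 8.0
dr_stops = log2((white_level - black_level) / dark_stdev)
print(round(dr_stops, 2))  # 10.69
```

The SNR-curve method mentioned above avoids relying on a single stdev figure, which is why it holds up better at high ISOs.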
Quote
I don't know how exactly RawDigger calculates the sigma values. So far I have tried two kinds of dark noise samples:
Also, how are you computing the noise stdev?
Quote
I am not familiar with Roger Clark's method and I don't know what FWC is. Links?
For DR measurements, I have some confidence in raw_diag's 2-frame SNR analysis (inspired by Roger Clark's method); however, the white level is still detected using heuristics (which may fail if the clipping is not harsh). For FWC (full well capacity) and read noise, I have a feeling finding them from the SNR curve may be an ill-conditioned problem.
Quote from: heyjoe on September 24, 2017, 03:54:19 PM
What would you recommend to someone who needs correct raw ETTR for photo using the best of what the sensor is capable of?
Quote from: a1ex on September 24, 2017, 09:29:55 PM
It would be nice to be able to do it when pressing the Play button of the camera. Is that possible to implement? Also, is there a way to have bigger histograms? These look super small. I can't really imagine how I would get visual feedback if I shoot on a bright sunny day outdoors. Maybe we need a beep to tell us "It is overexposed"?
Very easy - I've emulated the image capture process in QEMU and used your CR2 as input data.
That's exactly what I did - on the virtual camera.
Quote
I was rather wondering how they were determined before being put in your code, i.e. what physics they are based on and why they are so different from actuality.
From dbg_printf's in raw.c.
Quote
I understand it is a work in progress. Of course I would be glad to help with what I can. However, I am not a programmer per se (I only do some coding for simple scripts and the web). Is it really difficult to put in the right saturation values and recompile? Or does it need more research in order to make them really accurate?
The findings from the ISO research thread are not yet included in ML, sorry about that. The code is generic; I was hoping to cover all ~15 (soon ~20) camera models with the autodetection.
...
So yeah - it's not perfect, manpower is an issue and contributions are welcome.
Quote
Yes, I saw this thread, which took me to the wikia article. My findings are in this spreadsheet. It seems the difference is smaller than 0.1 stops.
On recent models, they do - about 0.1 stops according to my measurements.
On older models (5D2 generation), they don't - they are the same.
http://www.magiclantern.fm/forum/index.php?topic=9867.0
Quote
Is it possible to make it available for photos too?
Movie -> Image fine-tuning. Not applicable to photos.
Quote
You mean the thread about ISO 160 multiples, or? Please clarify what your idea is. I have been experimenting with RawDigger for the last few days, trying to answer my main question. If you think any test would help you make ML do what I want, of course I would be glad to help.
BTW - are you interested in experimenting with the ISO research tools and hopefully reviving that thread? Most of that stuff applies to photos, and there is a significant DR improvement that can be achieved. You may start with the raw_diag and iso-regs modules, and maybe cross-check the results with other software.
Quote from: a1ex on September 24, 2017, 12:52:05 AM
Where do these values come from?
On this image, ML assumes the clipping point at 9960 for ISO 160 and at 13200 for ISO 100.
Quote
I downloaded and installed ML 5D Mark III 123. How come a version specific for that model does not know the clipping points for it (considering also the factors you mention)?
It doesn't know the true clipping point, which is not the same across camera models
Quote
My test confirms that saturation values depend on these factors too, but the values for ISO 160 are about 11400 in RawDigger, which is quite far from 9960. Why is there such a huge difference?
and is affected by many factors, including exposure time and aperture (!), so it's trying to guess it from the brightest pixels in the image - raw.c:autodetect_white_level().
Quote
Dial ISOs from ML menu/keys. ISO 160 from ML is better than ISO 160 from Canon controls.

But my test shows that there is no difference. Could you please explain?
Quote from: DeafEyeJedi on November 17, 2016, 11:06:22 PM
Then I don't think I have ever used anything other than exFAT, because my smallest card is 32GB. BTW, I use mainly Linux and (unfortunately) sometimes Windows.
exFAT just makes things run smoother. It bypasses the 4GB limit for continuous recording. It can be done in OS X via Disk Utility. Note that any CF/SD cards larger than 128GB will retain their exFAT after formatting inside the camera. This can be useful.
Quote
Nope. Because then it won't be able to run (since the camera will be looking for the bootflag), so if you want to be able to run clean, vanilla Canon (with no ML whatsoever), then please follow the protocol I mentioned above.
Quote
Besides the fact that you are using a 5D3 -- why put yourself in a position to be bottlenecked by the SD slot's limited write speeds in camera compared to the CF slot? Hence the reason I recommended getting more CF cards, to make your workflow easier.
Quote
If you want to run a clean 5D3 (vanilla Canon) without ML involved, then please do the following:
Run firmware update (in canon menu) and let it do its thing.
Then once it starts counting down from 60 seconds -- please stand by and wait until it gets to 0 seconds.
The camera should restart on its own, and then you'll have yourself an ML-free camera. Hope this helps.
Quote
But if you don't mind me asking, why would you NOT want to use ML after all? Just curious.