Messages - heyjoe

#26
Please write when it is ready to test.
#27
Great. Looking forward to it.
#28
Quote from: a1ex on September 27, 2017, 06:40:39 PM
For me it's not obvious at all. Except for the lack of clip warnings at ISO 160 1" (which I'm looking into), I don't see anything wrong with the QR zebras.
I say obvious because:

ISO 160, 0.6s is quite underexposed: the maximum value in RD is far below saturation (~11400), roughly -0.55EV. Yet QR shows R=4 and G=2 clipping for this shot (6047).

ISO 160, 0.8s is just 0.02EV overexposed in RD (R=12%, G=6.8%, B=0%, G2=6.7%), while QR shows R=33, G=20 overexposure. Assuming ML's values are percentages too, the difference is roughly a factor of 3 (33/12 ≈ 2.8, 20/6.8 ≈ 2.9).

etc.

Quote
and that means I should imagine some other heuristic for detecting the peaks
You can probably compare the size of the rightmost histogram bins to those just left of them; if the ratio is above a certain threshold, that should indicate clipping. That may work for the "OVER" indication in the histogram, but how would you propagate it back to the image (zebras)? Sounds computationally expensive.
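Something along these lines is what I have in mind (a rough Python sketch of the idea, not ML code; the bin grouping and the threshold are made up):

import numpy as np

def looks_clipped(hist, tail_bins=4, ratio_threshold=5.0):
    """Flag clipping when the rightmost histogram bins tower over the bins just left of them."""
    hist = np.asarray(hist, dtype=float)
    tail = hist[-tail_bins:].sum()                    # pixels piled up at/near the maximum
    shoulder = hist[-2 * tail_bins:-tail_bins].sum()  # the region just left of that pile
    if shoulder == 0:
        return tail > 0                               # empty shoulder but populated tail = spike
    return tail / shoulder > ratio_threshold

The open question is still the second part: the histogram only says that some pixels clipped, not where they are, so zebras would still need a per-pixel threshold.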

Quote
The LiveView RAW overlays are not very exact, but not trivial to fix - let's figure out the QR ones first.
Sure. QR is more important for photo.
#29
Here is a proof that LV (RAW) zebras and histograms don't match those of QR (supposedly also RAW):

#30
Quote from: a1ex on September 27, 2017, 05:47:35 PM
Note the LiveView RAW zebras have horizontal lines (not diagonal), they show the same color as the clipped channel(s) (or black if all channels are clipped) and they have "square" edges for speed reasons (they operate on a very low-res image).
Thanks for explaining. It is possible that during the test I had the setting on "Photo only". Now I tested again with "Always", and the zebras are horizontal and unaffected by the WB setting (histograms too).
ETA: But LV and QR still show different overexposure warnings.

Quote
In QR (after taking a picture), speed is no longer an issue, so they are computed for every displayed pixel.
Does that mean you can also make it work in Play-button mode?

BTW, there is an issue in QR mode. If, before taking the current shot, a picture was played back and (with a few presses of the Info button) set to display Canon's JPG histograms, ML's overlays are drawn on top of all that and the view becomes a mess. Example:



The rectangle in the top left is the reduced image which Canon shows; Canon's histogram and info are displayed to the right of it and below it.
ETA: same with Play->Light.

Quote
The Luma zebras (YUV-based) are diagonal red. The YUV RGB zebras also have horizontal lines, but thicker, and fully overexposed areas are solid black.

However, your LiveView histograms are RAW-based.

Can you upload your ML/SETTINGS directory so I can try to reproduce the issue?

https://drive.google.com/open?id=0B2Mb7hSVSnnFVjV4RjFIMEZUb1U

Please explain what is needed for further testing to improve the accuracy, as it obviously still cannot be used for correct ETTR.

Also, why is the accuracy at ISO 160 worse than at ISO 100?
#31
Quote from: a1ex on September 27, 2017, 04:52:26 PM
I'm unable to reproduce the black zebras - are they from Canon?
Yes.

Quote
To capture the LiveView images, you may use Debug -> Dump image buffers (a 14-bit DNG and two 8-bit YUV422).

There's a screenshot option as well (no need to photograph the camera screen). It won't capture the fast (YUV-based) zebras well (as these are computed by the display controller), but should work fine with all other kinds of zebras (including the RAW-based ones).
Thanks for the tip. However, sometimes it is easier to use my phone and just send the image via Bluetooth, instead of moving the card back and forth between the camera and the card reader.

Quote
Your LiveView screenshots show YUV zebras.
My settings are (taken using the screenshot option):



So what did we learn from this test? Is there anything more that can be done regarding the accuracy of ML for ETTR? Or anything else to test?

It would be really great if the LV histograms were really raw, and to have raw histograms/zebras in Play mode too (not only in image review).
#32
Thanks @DeafEyeJedi!

Here is the result of my test: CR2 files.

All images are at f/2.8. The screenshots are arranged left to right: LiveView (LV), image review (IR), RawDigger (RD).

ISO 160

0.6s


0.8s


1.0s


1.3s


ISO 100

1s:


1.3s:



1.6s:


2s:


The closest to perfect ETTR shots seem to be:

ISO 160, 0.8s: RD shows overexposure of about 0.02EV, IR shows R=33, G=20 overexposure
ISO 100, 1.6s: RD shows overexposure of about 0.02EV, IR shows R=45, G=33 overexposure

Does this prove the new heuristic to be more accurate?

Disclaimer: the light in the room where I shoot may not be perfectly constant.
#33
Quote from: DeafEyeJedi on September 26, 2017, 04:39:21 PM
I can do this test for you if you still insist on it @heyjoe?
That would be very nice of you. Could you please also explain how you do it? I.e. are you doing it in QEMU or in-camera, and how is the crop_rec_4k version installed?

Quote
The probability of bricking is extremely low and almost impractical to not give it a shot either way.
I suppose so, but because I am very new to ML, I am extra careful (maybe extra paranoid too).
#34
Quote from: a1ex on September 26, 2017, 10:53:51 AM
In your LiveView screenshots, the histogram is raw; just double-check "Use RAW zebras" is really "Always". If that still doesn't give raw zebras, it's a bug, but I'm unable to emulate LiveView in QEMU yet. A video might be useful to figure out what's going on.
Ok, I tested again (yes, RAW zebras is set to Always). It seems the zebras don't change with the WB setting in LiveView, only in the Play+Light view. So if you could make Play+Light display raw histograms and zebras/clip warnings (at least as an option), that would be great.

Quote
Safety-wise, they are about the same. These systems don't have a MMU, any task can write anywhere in the RAM, and Canon code saves their settings at shutdown by... reflashing the ROM.
Thanks for explaining.

Quote
The crop_rec_4k branch
... is *probably* a little safer.
You are doing a great job.

Quote
In any case, the strongest safety net we have is the ability to emulate the firmware in QEMU
I still haven't had the time to install it and try it out, but I read that not only does LiveView not work in it, but neither do image capture and review, without which I guess there is really nothing to test in QEMU, right?

Quote
(with the user's ROM),
What is that?

Quote
so if something goes really wrong (such as camera not booting or acting weird), I should be able to look into it.
How? What if my camera gets bricked and there is no way to diagnose it?

Quote
And, of course, the ROM dumper from bootloader.
Is that something I *must* do before installing the crop_rec_4k build? Please explain it in layman's terms, as this whole thing, with so many links to different long threads, is a little overwhelming for an ML newbie :)

Quote
Well, somebody *has* to test this before it goes into mainline :D
I would be happy to, but of course I am cautious not to cause any damage.

Do I simply download and unzip  magiclantern-crop_rec_4k.2017Sep26.5D3123.zip onto the card? Do I need to run "Firmware update" after that (as when installing ML for the first time)? Please provide the steps.

Thanks.
#35
Quote from: a1ex on September 26, 2017, 01:31:52 AM
Typo - reply #1 (as printed by the forum).
Thanks.

So if I understand correctly: the raw histogram and raw clip warnings show only in LiveView and in the instant image review (but not in Play+Light).

Unfortunately that doesn't explain why the LiveView zebras are non-raw, because it contradicts what one has set in the menu options: Use RAW zebras = Always. I do see that the help info says "if possible", but it is not quite clear what that "if" depends on (and I couldn't find anything in the user guide). From a user's perspective, one should get what one has set in the menu.

I also think it would be useful to have a menu option to enforce raw histograms and clip warnings even in the Play+Light view. Can you do this?

Quote
Check the crop_rec_4k builds in a few hours.
It seems you have put the new heuristic into code, which is great, and I am curious to test it.

Do I simply download and unzip magiclantern-crop_rec_4k.2017Sep26.5D3123.zip onto the card? My concern is the warning at the top of the page:

"The following builds are works in progress, known to have rough edges.
Please test thoroughly before considering them for serious work."


Are those crop_rec_4k builds less safe (potentially able to cause damage)? Would you rather recommend waiting for the new functionality to make it into the main builds? I am just cautious not to damage my camera with something not fully tested.
#36
Quote from: a1ex on September 25, 2017, 11:44:29 PM
See replies #2, #11.
Reply #1 is from Walter.
Reply #2 is from me.
If my replies are not counted, reply #2 is from you: "CR2, please".
If only your replies are counted, then I can't relate reply #2 to anything regarding histogram/zebra discrepancies.
Consequently, I can't be sure which one is #11.
Please disambiguate.

I don't use ACR (I use libraw through 3DLC) and I am not looking to calibrate a perfect match between raw converters but:

Does what the raw converter assumes as white have anything to do with correct ETTR? To my mind, as long as we can tune the exposure during raw conversion to get a correct histogram of the output file, it should not matter what the absolute values at the input are, so all that matters is correct exposure. Or am I missing something?

ETA: You edited your answer, so I will have to add something too:
What do the histogram/zebras displayed as visual feedback on the camera LCD have to do with raw converters? When you display the zebras/histogram you don't change the values in the CR2 file; you just compare them to a threshold and decide whether they should be shown as clipped or not. Right?
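In other words, conceptually something like this (a trivial Python sketch of how I picture it, not ML code; names are mine):

def clipped_channels(r, g1, g2, b, white_level):
    """Decide, per raw channel of a Bayer cell, whether to paint a zebra over it."""
    return {name: value >= white_level
            for name, value in (("R", r), ("G1", g1), ("G2", g2), ("B", b))}

The raw values themselves are untouched; only the comparison threshold (the assumed white level) decides what gets marked.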
#37
Quote from: a1ex on September 25, 2017, 09:20:23 PM
You are looking at YUV-based zebras. Try setting "Use RAW zebras: Always".
I have "Use RAW zebras: Photo" and the help info says "Will use RAW RGB after taking a pic". After reading your current reply I set it to "Always" but looking at the shots which I shared with Play->Light still shows different zebras and histograms.

Quote
Also, when the histogram doesn't have vertical bars (stops), it's YUV-based.
In Histogram type I have RAW-based (RGB) for which the help info says "Will use RAW RGB in LiveView and after taking a pic"

Quote
Good point; however, I have some ideas to detect the presence of such a spike when checking the white level (a better heuristic). On Canons, the clipping is harsh (many pixels at the maximum value), with the possible exception of a few hot pixels.
Are you suggesting I simply wait for the next version?
#38
Quote from: a1ex on September 25, 2017, 04:41:53 PM
You said:

I'm not aware about such behavior, nor I can reproduce it, so it's your duty to provide some sort of proof. That was the reason for the link.
Obviously neither you nor I are mind readers, so it would be easier if you asked directly, to avoid misunderstanding. Anyway, here is the proof:

1) WB set to UniWB:

In LiveView:



After taking the shot:



2) WB set to 5000K:

LiveView:


After taking the shot:



As you can see:

- the zebras and histograms in image review are different for different WB
- the zebras in LiveView are different (histograms look the same) for different WB
- the zebras, histograms and values for the same WB are different in LiveView and in image review (Play > Light).

CR2 files:

https://drive.google.com/open?id=0B2Mb7hSVSnnFN3NuSkVVNjdRUTQ

Let me know if you need any more info about this.


Quote
As for your main question, I'm afraid I don't understand it. What should I recommend?
A way to ETTR correctly (even if it means not using ML).

Quote
On 5D3, at full stop ISOs, the autodetected white level can be underestimated by log2(15283-3000-2048) - log2(15283-2048) = 0.37 stops (worst case), iff there are no overexposed pixels in the image. So, you may get false clipping warnings within that limit. If that's good enough for you, then just use it.
I can get within that range without ML (using UniWB JPG histograms), but I am looking for something more accurate, as the JPG histograms are really bad (I have tried so many variations in Picture Style, but they still don't accurately show the saturation).
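Just to double-check that I read the quoted figure correctly, here is the arithmetic with the quoted 5D3 numbers (my own check in Python, nothing more):

from math import log2

white, black, margin = 15283, 2048, 3000
worst_case = log2(white - black) - log2(white - margin - black)
print(round(worst_case, 2))   # ~0.37 stops of possible false clipping warnings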

Quote
Otherwise ...
Unfortunately I am not a programmer, and if I dare to touch your code I risk damaging something. As you correctly pointed out, that would be a maintenance nightmare and there would be more and more questions. That's why, considering your much greater expertise, I was hoping we could diagnose the issue together, and hopefully you, knowing your code, could fix it properly (or advise on a different way to ETTR correctly).

BTW, I am thinking about another solution that may work for ETTR: wouldn't it be much easier to simply display a zoomed portion of the rightmost zone of the raw histogram? Then one could simply evaluate its shape (e.g. a spike) and see whether a particular channel is clipped or not. In that case you wouldn't need to detect or fix any values, and it would be camera independent. All you would have to do is display that portion of the histogram bigger and zoomed. What do you think?

Quote

13307, 11388 (ISO 160, 1/8"  - off by 0.27 EV)

For which channel is that? Compared to the previously mentioned max of 9960, this looks much closer to what RawDigger shows.
BTW in the spreadsheet which I shared (shot at ISO 160, 0.5") I notice a pattern in the saturation values (without black subtraction):

- for all ISO x160 multiples: 13519-13526
- for all other ISOs and ISO 20000: 15518-15530
- for ISO 25600: 16376

Does that mean they are pretty much fixed (and could probably be hard-coded, something like the sketch below)?
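Something like this is what I imagine by hard-coding (a hypothetical Python sketch based only on my measurements above; the membership of the x160 family is my assumption, and whether the values hold across exposure times and apertures is exactly the open question):

X160_ISOS = {160, 320, 640, 1250, 2500, 5000, 10000}   # the "x160" family from my spreadsheet

def white_level_for(iso):
    """Return a hard-coded saturation value (no black subtraction), per my measurements."""
    if iso == 25600:
        return 16376
    if iso in X160_ISOS:         # measured 13519-13526
        return 13520
    return 15520                 # all other ISOs, incl. ISO 20000 (measured 15518-15530)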

Quote
Do you have the patience to get a matrix of these values? You may use a Lua script if you wish.
I don't know Lua and I don't understand everything you do (I installed ML just 2 days ago). So please provide the steps for what you need me to do.

Quote
FYI, the max values from your raw file are:


octave:1> a = read_raw('_MG_5911.CR2');
octave:2> prctile(a(:), [10 50 90 99 99.9 99.99 99.999 100])'
ans =
    3353    8279   11886   12639   12848   12943   13006   13102


and here's how the clipping warnings would look with white level 11388 (Canon's heuristic):


I don't know why but RawDigger shows different values:



and again, according to RD this shot is a tad underexposed (considering RD's saturation of 11477), by about 0.05EV. So if such an overexposure warning is shown on the LCD screen, it would make the photographer underexpose even more, which is contrary to ETTR.
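For a third opinion outside both ML and RawDigger, something like this could reproduce the percentile check (a sketch assuming the rawpy and numpy Python packages; whether rawpy's view of the sensor data matches what either tool reports is part of the question):

import numpy as np
import rawpy

with rawpy.imread("_MG_5911.CR2") as raw:
    data = raw.raw_image_visible   # active sensor area, black level not subtracted
    print(np.percentile(data, [10, 50, 90, 99, 99.9, 99.99, 99.999, 100]))
    print("pixels within 16 counts of the maximum:", int((data >= data.max() - 16).sum()))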
#39
Quote from: a1ex on September 25, 2017, 02:16:55 PM
https://www.chiark.greenend.org.uk/~sgtatham/bugs.html

Enough chit-chat for now.

That still doesn't answer my main question, because obviously ML does not show the real raw histogram clipping. I provided all the info you asked for, so I don't see the reason for that link. The 'chit-chat' arose from the fact that my main question was (and still is) unanswered. I won't repeat it again though. Thank you for your time.
#40
Quote from: a1ex on September 25, 2017, 07:35:40 AM
Sorry I didn't include more links - my display is defective, so I am/was using a smartphone to mirror the main desktop (so I can type quickly, but reading or looking up links is slow).

First post from QEMU thread gives you a README.
Thanks. I will look into it.

Quote
CMOS/ADTG aka ISO research: http://www.magiclantern.fm/forum/index.php?topic=10111
I already read this. My test confirms a slight improvement in DR after installing ML. Here are some calculated values and graphs showing the difference (diff = value_with_ML - value_before_ML):



Quote
Hardcoding clipping point: I know how to find it for short exposure times, but I don't know how it changes with long exposures. I only know it gets lower, from past bug reports, but didn't do a controlled experiment to find out by how much.

The white level detection is also explained in the iso160 thread. It starts from a low guess, then scans the image for brighter pixels, then backs off a bit - all this to make sure it shows the clipping warnings no matter what the correct white level might be. Your example case is either a bug or an edge case (didn't look yet).

In other words, I was just trying to come up with something that works on all other supported ML camera.
What do you need to make it work accurately? I would be glad to provide test data if you don't have the time for it. Just let me know what I have to do.
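Here is my reading of that description, just to make sure I understand what data would help (a Python sketch of the idea as described, not the actual raw.c code; the starting guess and the back-off amount are made up):

import numpy as np

def autodetect_white_level(raw_image, low_guess=10000, backoff_ev=0.1):
    """Start from a low guess, raise it if brighter pixels exist, then back off a little,
    so clipping warnings still trigger even if the true white level is slightly lower."""
    level = max(low_guess, int(np.max(raw_image)))   # scan the image for brighter pixels
    return int(level / (2 ** backoff_ev))            # back off ~0.1 EV to stay on the safe side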

Quote
Sure, but the digital gain will get burned in the CR2 (so you'll be losing levels without gaining additional highlight detail in the raw file). You will gain more highlight detail in the JPEG preview, but I don't see it a good enough reason to implement and maintain this "feature". Rather, the CMOS/ADTG tricks do actually capture additional highlights (effectively increasing DR), and that's currently available in the iso_regs module for 5D3 (just not very user friendly, but at least you can tweak all these sensor gains from the menu).
I am curious to try how it would work on photos, but I can't find any module called iso_regs in the ML menu?

Quote
https://www.magiclantern.fm/forum/index.php?topic=8309.0 (old answer)
https://www.magiclantern.fm/forum/index.php?topic=15088.msg186141#msg186141 (updated)
Thanks. So it is the Light button that does the trick.

BTW, why are the zebras affected by the WB setting? This means they are not really raw (although they are set to RAW RGB). Is that a bug?

Quote
The closest approximation I can come up, besides dual ISO, would be this:
https://www.magiclantern.fm/forum/index.php?topic=19315.0

Alternating short/long exposures is doable, but not trivial. There are routines for doing arithmetic on raw buffers on Canon's image processor - documented here:
https://www.magiclantern.fm/forum/index.php?topic=13408.

If you can understand the sensor internals, you can probably change all its registers from adtg_gui. However, besides tweaking some gains at various amplifier stages, and overriding LiveView resolution to get 3K/4K/fullres LV, I wasn't able to find anything useful for controlling the clipping point beyond what's already documented in the ISO research thread. I'm not saying there isn't - I'm just saying I'm not familiar with sensor electronics, so maybe I don't know where to look.

There are some CMOS registers that appear to adjust the black sun protection, but didn't look much into them.

So, feel free to grab adtg_gui and understand/document what some of these registers do.
I am not sure we are talking about the same thing. The two links don't mention anything about logarithmic raw data. What I am talking about is not a way to do HDR, but to use the actual DR of the sensor: instead of having it convert the light to linearly encoded data, have the raw data encoded logarithmically. The advantages of that would be:


  • No need for ETTR, because we would have the same number of levels for any f-stop
  • The benefits in post-production which follow from that
  • Maybe even smaller raw files (because we don't really need 4000 levels per f-stop)

Of course that would probably also require customized raw conversion. I am getting this idea from a program called 3DLUTCreator, which you may be familiar with. It has this unique functionality: when you open a raw file in it, it

1) converts it to LogC.tiff (using Alexa's LogC formula), which is a 16-bit TIFF with UniWB and LogC-encoded data
2) works with that TIFF to color grade it

This makes it possible to encode the whole raw data range into the LogC.tiff file (which can contain up to 21EV of DR), with a fairly uniform number of levels per f-stop (*fairly* uniform, because they are not the same for each f-stop; due to the LogC formula itself, it is not an ideal logarithm).

So all that got me thinking: if the raw file itself were logarithmic, it would probably save us from all the issues arising from the linearity of today's raw data.
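To make the idea concrete, here is a toy Python sketch of what I mean by logarithmic raw encoding (my own illustration, not how any camera or 3DLUTCreator actually works; the black/white levels are just the 5D3-ish numbers mentioned earlier in this thread):

import numpy as np

def encode_log(linear, black=2048, white=15283, out_bits=12):
    """Map linear sensor values to a log code so every stop gets roughly the same number of codes."""
    signal = np.clip(linear.astype(np.float64) - black, 1.0, None)
    stops_below_sat = np.log2((white - black) / signal)    # 0 at saturation, grows toward the shadows
    total_stops = np.log2(white - black)                   # encodable range in stops
    code = (1.0 - stops_below_sat / total_stops) * (2 ** out_bits - 1)
    return np.clip(np.round(code), 0, 2 ** out_bits - 1).astype(np.uint16)

With 12 output bits over ~13.7 stops that is roughly 300 codes per stop for every stop, instead of half of all codes being spent on the top stop as with linear encoding.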

Your links reminded me of another thing though. Have you seen this:

https://www.nasa.gov/feature/revolutionary-camera-recording-propulsion-data-completes-groundbreaking-test

Quote
It is my understanding that log2(max/stdev) is only a rough approximation, especially at high ISOs. Rather, I prefer to read it from the SNR curve - detailed answer: https://www.magiclantern.fm/forum/index.php?topic=13787.0
I don't know why it should be a rough approximation. I am using this method as described by libraw's developers; you can check the link I gave, it describes the method. I will look at your link too.
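For reference, this is the computation I am doing, as I understand it from the libraw article (a Python sketch; RawDigger's exact sigma computation may well differ, and whether to subtract the black level first is one of the details I am not sure we handle the same way):

import numpy as np

def dynamic_range_stops(saturation, black_level, dark_samples):
    """Per-channel DR estimate: log2 of the usable signal range over the read-noise standard deviation."""
    read_noise = np.std(np.asarray(dark_samples, dtype=np.float64))
    return np.log2((saturation - black_level) / read_noise)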

Quote
Also, how are you computing the noise stdev?
I don't know how exactly RawDigger calculates the sigma values. So far I have tried two kinds of dark noise samples:
1) from a shot with the lens caps on (used in the test to calculate the differences in the above graphics)
2) from the dark (hidden) pixel areas (recommended in the libraw developer's article, used in the Google spreadsheet which I shared in a previous post)

The difference between 1) and 2) is not very big. 2) seems to give more uniform values across channels, while 1) seems to show a bigger difference in DR between R, G, B and G2.

Quote
For DR measurements, I have some confidence in raw_diag's 2-frame SNR analysis (a method inspired from Roger Clark's method); however, white level is still detected using heuristics (that may fail if the clipping is not harsh). For FWC and read noise, I have a feeling finding them from the SNR curve may be an ill-conditioned problem.
I am not familiar with Roger Clark's method and I don't know what FWC is. Links?

So considering everything said so far:
Quote
What would you recommend to someone who needs correct raw ETTR for photo using the best of what the sensor is capable of?

:)

I also hope you can check why I am not getting any notifications about replies in this thread. I have them turned on in my settings.
#41
Thanks for the detailed explanations. Sorry to repeat my main question, but this is the actual thing I am looking for; I hope you can share some thoughts: :)

Quote from: heyjoe on September 24, 2017, 03:54:19 PM
What would you recommend to someone who needs correct raw ETTR for photo using the best of what the sensor is capable of?

Running ML in QEMU sounds really interesting. I am using KVM/QEMU to run guest OSes on my Linux host, so I am curious to learn how to run ML in it too. Is the thread you linked to what I need to read? Or is there some step-by-step procedure?

Quote from: a1ex on September 24, 2017, 09:29:55 PM
Very easy - I've emulated the image capture process in QEMU and used your CR2 as input data.

That's exactly what I did - on the virtual camera.
It would be nice to be able to do it when pressing the camera's Play button. Is that possible to implement? Also, is there a way to have bigger histograms? These look super small. I can't really imagine how I would get visual feedback when shooting outdoors on a bright sunny day. Maybe we need a beep to tell us "It is overexposed"? :)

Quote
From dbg_printf's from raw.c.
I was rather wondering how they were determined before being put in your code, i.e. what physics they are based on and why they are so far from reality.

Quote
The findings from the ISO research thread are not yet included in ML, sorry about that. The code is generic; I was hoping to cover all ~15 (soon ~20) camera models with the autodetection.
...
So yeah - it's not perfect, manpower is an issue and contributions are welcome.
I understand it is a work in progress. Of course I would be glad to help where I can. However, I am not a programmer per se (I only do some coding for simple scripts and the web). Is it really difficult to put in the right saturation values and recompile? Or does it need more research to make them truly accurate?

Quote
On recent models, they do - about 0.1 stops according to my measurements.

On older models (5D2 generation), they don't - they are the same.

http://www.magiclantern.fm/forum/index.php?topic=9867.0
Yes, I saw this thread which took me to the wikia article. My findings are in this spreadsheet. It seems the difference is smaller than 0.1 stops.

Quote
Movie -> Image fine-tuning. Not applicable to photos.
Is it possible to make it available for photos too?

Quote
BTW - are you interested in experimenting with the ISO research tools and hopefully reviving that thread? Most of that stuff applies to photos, and there is a significant DR improvement that can be achieved. You may start with the raw_diag and iso-regs modules, and maybe cross-check the results with other software.
You mean the thread about ISO 160 multiples, or something else? Please clarify your idea. I have been experimenting with RawDigger for the last few days, trying to answer my main question. If you think any test would help you make ML do what I want, of course I would be glad to help.


BTW, is it possible to program the sensor to capture light logarithmically and record it as raw data, instead of linearly? I don't mean the log files which popular cameras output after converting the raw data, but creating/programming a truly logarithmic sensor. That would be a revolution. I have been searching for info about logarithmic sensors, but the only thing I found was about some CCTV cameras which have very high noise.

----
ETA: Why am I not getting any email notifications about updates on the thread? I have the following settings:

Turn notification on when you post or reply to a topic. = On (Instantly, for the 1st reply for replies and moderation).
#42
Thanks for looking at the file. BTW how were you able to view the histogram and overexposure indication? Walter previously said this is possible only during image review, i.e. only after taking the shot.

Quote from: a1ex on September 24, 2017, 12:52:05 AM
On this image, ML assumes the clipping point at 9960 for ISO 160 and at 13200 for ISO 100.
Where do these values come from?

Quote
It doesn't know the true clipping point, which is not the same across camera models
I downloaded and installed ML 5D Mark III 123. How come a version specific for that model does not know the clipping points for it (considering also the factors you mention)?

Quote
and is affected by many factors, including exposure time and aperture (!), so it's trying to guess it from the brightest pixels in the image - raw.c:autodetect_white_level().
My test confirms that the saturation values depend on these factors too, but the values for ISO 160 are about 11400 in RawDigger, which is quite far from 9960. Why is there such a huge difference?

My goal is to ETTR correctly, using the best of what the sensor is capable of; that's why I need correct feedback while shooting. Using this method I have found that ISO multiples of 160 give slightly better DR compared to the "native" ones. I also read your article, but I can't find a way to set those "best ISOs" which you recommend. How can I do it? (And is it applicable to photos at all?)

The article also says:
Quote
Dial ISOs from ML menu/keys. ISO 160 from ML is better than ISO 160 from Canon controls.
but my test shows that there is no difference. Could you please explain?

Unfortunately, the standard Canon histograms (yes, with UniWB) can't give me accurate feedback - I always find a discrepancy between what they show and what RawDigger shows, which results in either underexposure or overexposure. So I turned to ML, hoping to be able to expose correctly to the right. But now that I see the raw zebras are not really raw (they change when I change the WB setting), plus the difference in saturation values which you explain, I wonder what to do.

What would you recommend to someone who needs correct raw ETTR for photo using the best of what the sensor is capable of?
#44
Here is an example which shows the difference between ML and RawDigger.

The image is correctly exposed to the right without clipping, but ML shows it as overexposed.

#45
Is there anything else I need to set? It seems there is something wrong with these exposure indicators.

I see an overexposure indication for an image with a max channel value of 9200 (in RawDigger), but that is definitely far below the saturation limit (which is about 11400 for the 5D3). Why does ML show overexposure for an image which is actually underexposed (not ETTR)?

It also seems these are not really raw because when I change the WB they change too.

I hope someone can clarify.
#46
Thanks!
#47
Hi,

My question is about photo, not video.

I recently tried ML for the first time and I see it displays RAW histograms in live view. Unfortunately this doesn't help me to ETTR with studio strobes.

Is it possible to view a raw histogram, or otherwise check for raw channel over/underexposure, *after* taking the shot?

The camera is 5D3 in case that matters.
#48
Thanks for the explanations Levas. I understand what you are saying.

As soon as I have some time I will give ML a try.
#49
Thanks for explaining, Levas. I don't know why you say it is unlikely for the card to get corrupted. It has happened to many people.

BTW, what does the battery have to do with the whole thing? I.e. why should I replace it, and by "replacing" do you mean I need a new battery, or simply to remove it from the body and put it back in?


Forum related question:

I am trying to get email notifications for this thread. In profile settings I read "To receive notifications from a topic, click the "notify" button while viewing it." but there is no such button?
#50
Quote from: DeafEyeJedi on November 17, 2016, 11:06:22 PM
exFAT just makes things run smoother. Bypasses the 4GB limit for continuous recording. Can be done in OS X via Disk Utility. Note any CF/SD cards higher than 128GB will retain their exFAT after format inside camera. This can be useful.
Then I don't think I have ever used anything other than exFAT, because my smallest card is 32GB. BTW I use mainly Linux and (unfortunately) sometimes Windows :)

Quote
Nope. Because then it won't be able to run (since the camera will be looking for the bootflag) so if you want to be able to run clean-vanilla Canon (with no ML whatsoever) then please do the protocol I mentioned above.

It seems I didn't make myself clear. The idea is:

- CF card with ML
- SD card without ML
(both cards in camera)

Why shouldn't the camera be able to boot with this configuration? After all, ML is on the CF card, so what is the problem?

Quote
Beside the fact that you are using a 5D3 -- why put yourself in a position to be bottlenecked by the SD slot's limited writing speeds in camera comparing to the CF's slot hence the reason why I recommended to get more CF cards to make your workflow easier otherwise.

Because I already have these cards and I paid for them. Plus, I don't shoot video at the moment, so I am not really in a hurry for top speed :) When I need video, that may change of course.

Quote
If you want to run a clean 5D3 (vanilla canon) without ML involved then please do the follow:

Run firmware update (in canon menu) and let it do its thing.

Then once it starts counting down from 60 seconds -- please stand by and wait until it gets to 0 seconds.

Camera should restart on its own then you'll have yourself ML free camera. Hope this helps.

Hm. Are you saying that the only way to get rid of ML and boot the original firmware (which you call vanilla) is to re-flash the body's firmware? Why is that necessary, considering that the only thing which changes in the body is the boot flag? Can't one simply disable the boot flag and be done with it?

If what you say is so, that would mean that after enabling the boot flag, the camera will never be able to boot without a card with ML. Which also means that one may not be able to flash the firmware if it is not on a card with ML. And this brings back the unanswered question 2 :) I hope you can explain, as I am getting a little confused.

Quote
But if you don't mind me asking why would you NOT want to use ML after all, just curious?

I am willing TO use ML, just learning at the moment in order not to mess up anything :)