Show Posts



Topics - a1ex

1
Camera-specific discussion / Canon EOS 4000D / 3000D / Rebel T100
« on: January 26, 2019, 11:37:54 PM »
Just received a firmware dump from this model.

ROM dumper (requires an SD card formatted as FAT32):
4000D

Code: [Select]
  Magic Lantern Rescue
 ----------------------------
 - Model ID: 0x422 4000D
 - Camera model: Canon EOS 4000D / Rebel T100
 - Firmware version: 1.0.0 / 1.9.2 1B(13)
 - IMG naming: 100CANON/IMG_0213.JPG
 - Boot flags: FIR=0 BOOT=0 RAM=-1 UPD=-1
 - ROMBASEADDR: 0xFF0C0000
 - card_bootflags 106744
 - boot_read/write_sector 106f38 107030
 - 101DE4 Card init => 2
 - Dumping ROM0... 100%
 - MD5: (yours will be different)
 - Dumping ROM1... 100%
 - MD5: (yours will be different)
 - No serial flash.
 - Saving RESCUE.LOG ...

To emulate (Canon GUI working out of the box):
- pretend it's a 1300D
- apply the following ROM patch:
Code: [Select]
# extract the 64 KiB block at offset 0x10000 of ROM1 into BOOT.BIN
dd if=ROM1.BIN of=BOOT.BIN bs=64k skip=1 count=1
# write that block back into ROM1 at offset 511 * 64 KiB (0x1FF0000)
dd if=BOOT.BIN of=ROM1.BIN bs=64k seek=511
- throw away ROM0 (it's not connected)
- change flash model ID to 0x003825C2 (1300D has 0x003925C2)
- CURRENT_TASK 0x2F53C
- 0xFE1171B4 DebugMsg
- 0x3888 task_create



TODO:
- commit the emulation sources (my job)
- start porting ML (your job; follow the 1300D thread)

Have fun!

2
Recently, this blog post came to my attention: https://blog.kasson.com/nikon-z6-7/how-fast-is-the-z7-silent-shutter/

It describes a method for measuring the rolling shutter of any camera by filming a flickering light, as long as you know its frequency.

Examples of flickering lights:

- a plain old incandescent bulb in PAL land will flicker at 100 Hz (caveat: mains frequency is not exactly stable, but still reasonably good)
- some laptop monitors will start to flicker when you reduce their brightness; this frequency might be a little more stable
- many LED lights also flicker; some at hundreds of Hz (useful for us), others at a few kHz (not useful for us)
- if you've got an Arduino and an LED, you can program it to flicker at an arbitrary frequency -> pwm.ino (default: 500 Hz); a minimal equivalent sketch is shown after this list
- or, just move the camera around, looking for something that flickers
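
For reference, driving a test LED at 500 Hz takes only a few lines of Arduino code. This is not the pwm.ino linked above - just a minimal hand-rolled equivalent; the pin number is an assumption, so use whichever pin your LED is wired to.
Code: [Select]
/* Minimal 500 Hz LED flicker (not the actual pwm.ino - just an equivalent sketch). */
#define LED_PIN  13     /* assumption: adjust to your wiring */
#define FREQ_HZ  500

void setup(void)
{
    pinMode(LED_PIN, OUTPUT);
}

void loop(void)
{
    /* one full cycle lasts 1e6 / FREQ_HZ microseconds; half on, half off */
    digitalWrite(LED_PIN, HIGH);
    delayMicroseconds(500000 / FREQ_HZ);
    digitalWrite(LED_PIN, LOW);
    delayMicroseconds(500000 / FREQ_HZ);
}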

Unknown frequency?

What if you've got a flickering light, but you don't know its frequency? You can measure it with any ML-enabled camera, as we already know the sensor readout timings:
- open the FPS override submenu, without actually enabling it (i.e. select FPS override in ML menu, leave it OFF and press Q)
- look for Main Clock => camera-specific constant (5D2 and 5D3 use 24 MHz, 700D/650D/M/100D use 32 MHz and so on)
- look for Timer A => this gives line readout speed. Timer A / Main Clock = line readout time. Example: 5D2 25p => 600 / 24 MHz = 25 microseconds per line.
- look for Timer B => this gives frame rate: FPS = Main Clock / Timer A / Timer B.
- write down these values; the Octave script below will do the math for you (a worked example of these formulas also follows this list).
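
Worked example of the two formulas above, using the 5D2 25p figures (Main Clock = 24 MHz, Timer A = 600). Timer B is not quoted in this post, so the 1600 below is simply what the formula gives for an exact 25 fps - treat it as illustrative.
Code: [Select]
#include <stdio.h>

int main(void)
{
    /* 5D2 25p example from above; Timer B = 1600 is derived, not measured */
    double main_clock = 24e6;    /* Hz */
    double timer_a    = 600;
    double timer_b    = 1600;

    double line_time_us = timer_a / main_clock * 1e6;      /* microseconds per line */
    double fps          = main_clock / timer_a / timer_b;  /* frames per second */

    printf("line readout: %.2f us/line\n", line_time_us);  /* 25.00 */
    printf("frame rate:   %.2f fps\n", fps);               /* 25.00 */
    return 0;
}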

Octave script

I've prepared a small Octave script to perform this measurement: rolling.m

Requirements:
- Octave 4.x with the following packages: image and signal
- to analyze DNG files, you will also need read_raw.m and dcraw

You can either run the measurements on your own (caveat: the script may require some fiddling), or you can upload test images for me to analyze.

Sample test images

All converted from silent picture DNGs:
5D3-500hz-25p.jpg (using pwm.ino at 500 Hz)
5D3-500hz-24p.jpg (same light source)
5D2-100hz-25p-weak.jpg (mains frequency, very weak light, but still usable)

Sample output

Code: [Select]
octave rolling.m 5D3-500hz-25p.jpg 24e6 480
Using blue channel.
Pattern repeats every 100 lines (method: pwelch).
Pattern repeats every 100 lines (method: overlap).
Pattern repeats every 100 lines (method: zerocross).
Method: pwelch => 100.00 lines
Line readout clock: 50.00 kHz, i.e. 20.00 μs/line (known).
Light source frequency: 500.01 Hz (measured).

octave rolling.m 5D3-500hz-24p.jpg 24e6 440
Using blue channel.
Pattern repeats every 109 lines (method: pwelch).
Pattern repeats every 109 lines (method: overlap).
Pattern repeats every 109 lines (method: zerocross).
Method: pwelch => 108.98 lines
Line readout clock: 54.55 kHz, i.e. 18.33 μs/line (known).
Light source frequency: 500.52 Hz (measured).

octave rolling.m 5D3-500hz-25p.jpg 500
Using blue channel.
Pattern repeats every 100 lines (method: pwelch).
Pattern repeats every 100 lines (method: overlap).
Pattern repeats every 100 lines (method: zerocross).
Method: pwelch => 100.00 lines
Light source frequency: 500.00 Hz (known).
Line readout clock: 49998.86 Hz, i.e. 20.00 μs/line (measured).
Rolling shutter: 25.80 ms for 1290 lines.

octave rolling.m 5D3-500hz-24p.jpg 500
Using blue channel.
Pattern repeats every 109 lines (method: pwelch).
Pattern repeats every 109 lines (method: overlap).
Pattern repeats every 109 lines (method: zerocross).
Method: pwelch => 108.98 lines
Light source frequency: 500.00 Hz (known).
Line readout clock: 54488.46 Hz, i.e. 18.35 μs/line (measured).
Rolling shutter: 23.67 ms for 1290 lines.

octave rolling.m 5D2-100Hz-25p-weak.jpg 24e6 600
Using red channel.
Pattern repeats every 401 lines (method: pwelch).
Pattern repeats every 20 lines (method: overlap).
Pattern repeats every 21 lines (method: zerocross).
Method: pwelch => 400.68 lines
Line readout clock: 40.00 kHz, i.e. 25.00 μs/line (known).
Light source frequency: 99.83 Hz (measured).

# From the blog post: "So, the scan time is a bit over 60 milliseconds [...]"
wget https://blog.kasson.com/wp-content/uploads/2018/10/Z702693.jpg
mogrify -resize "8256x5504" Z702693.jpg
octave rolling.m -v Z702693.jpg 120
Using red channel.
Vignette fix...
Pattern repeats every 736 lines (method: pwelch).
Pattern repeats every 20 lines (method: overlap).
Pattern repeats every 733 lines (method: zerocross).
Method: pwelch => 736.36 lines
Light source frequency: 120.00 Hz (known).
Line readout clock: 88363.15 Hz, i.e. 11.32 μs/line (measured).
Rolling shutter: 62.29 ms for 5504 lines.

Accuracy?

That depends on:
- how stable your test frequency is (mains frequency is probably +/- 1%, maybe more);
- how accurate our stripe size measurement is (let's say +/- 1 pixel, so it depends on the stripe size); a rough estimate is sketched below.
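
As a rough sketch of how that stripe-size error propagates (plain arithmetic; the 100-line period and 50 kHz line clock come from the 5D3 25p sample above, the +/- 1 line error is just the assumption stated above):
Code: [Select]
#include <stdio.h>

int main(void)
{
    double stripe_lines = 100;      /* measured pattern period, in lines (5D3 25p sample) */
    double stripe_error = 1;        /* assumed measurement error, in lines */
    double line_clock   = 50e3;     /* Hz (20 us/line) */

    /* light source frequency = line clock / pattern period */
    double f_nominal = line_clock / stripe_lines;                   /* 500.0 Hz */
    double f_low     = line_clock / (stripe_lines + stripe_error);  /* ~495 Hz  */
    double f_high    = line_clock / (stripe_lines - stripe_error);  /* ~505 Hz  */

    printf("%.1f Hz (%.1f ... %.1f), i.e. about +/- 1%%\n", f_nominal, f_low, f_high);
    return 0;
}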

A few tests with 5D3 at 1080p25 with the same 500Hz test light, input files 1 2 3 4:
Code: [Select]
for f in *.DNG; do octave rolling.m $f 500 | grep "measured"; done
Line readout clock: 50123.14 Hz, i.e. 19.95 μs/line (measured).
Line readout clock: 49898.92 Hz, i.e. 20.04 μs/line (measured).
Line readout clock: 49932.19 Hz, i.e. 20.03 μs/line (measured).
Line readout clock: 50137.52 Hz, i.e. 19.95 μs/line (measured).

Not that bad.

Wanted: test images

I'm looking for some test images from recent models not yet running ML, i.e. all DIGIC 6 and newer (including, but not limited to, 80D, 750D/760D, 7D2, 5D4, 5DS, 200D, 800D, 77D, 6D2, M50, EOS R), cross-checked with images from a camera already running ML.

Test conditions:
- blank wall under flickering light, without stray objects (e.g. please don't include the light bulb)
- focus doesn't matter (the script will average every row of the image)
- lens doesn't matter at all (you can perform the experiment without a lens if you want)
- choose any shutter speed that makes the flicker obvious (usually faster shutter speeds are preferred)
- ISO and aperture are not important; just make sure the image is reasonably clean

Test set should include:
- For the camera already running ML:
    - two simple DNG silent pictures from movie mode, one at 1080p24 and another at 1080p25
    - if you don't want to install ML on the test camera, a frame extracted from H.264 video will also work (in this case I'll have to guess the captured resolution)
    - I'll use these to measure (or double-check) the frequency of the light source.
- For the camera not (yet?) running ML:
    - a video frame (extracted from video) at each resolution x frame rate setting from Canon menu
    - if the camera has an option to take completely silent pictures in LiveView, please include one of these as well (full-res JPG)

I'll use these tests to estimate the sensor readout speed and to verify some hypotheses about LiveView configuration for 80D, 5D4 and other recent models I've looked into.

This method can be used with images from non-Canon cameras, too; feel free to submit them if you are curious, just be aware this won't magically bring ML to your camera :)

3
Reverse Engineering / Low-level image capture
« on: June 24, 2018, 04:46:30 PM »
Goal: capture raw images (both photo and LiveView) without executing any of the image processing paths. Just get the raw data.

Why? To understand how it works and to have fewer variables for experimenting with crop modes, high frame rates, readout settings etc.

Idea: log all MMIO activity and replay only what's done from the Evf task and associated interrupts.

Current status: able to get periodic HEAD timer interrupts!

Log: 5D3-mv1080p25.log

Rough overview:
- image capture is controlled from the main CPU (maybe with the help of Eeko; I hope that's not the case)
- all the interactions between the CPU and its peripherals are done via MMIO and interrupts (lowest level)
- high-level interactions are done via ADTG, CMOS and ENGIO registers; on top of ENGIO we've got EDMAC, image processing modules etc
- Canon's image capture code is too complex to understand what it does, but we can trace its actions (messages, functions called, MMIO activity)
- stuff is happening in Canon's EvfState (look for state transitions in the log file)

Step by step:

1) evfInit: runs at startup, no interesting MMIO activity

2) evfActive: this starts LiveView, creates resource locks 0x50000 (STARTUP), 0x40000 (HEAD) and 0x250000 (CARTRIDGE) and powers on the image capture device:

- before using a hardware device in an embedded system, we usually have to enable some clocks; best guess:
Code: [Select]
    MEM(0xC0400008) |= 0x400000;
    MEM(0xC0202000) = 0xD3;
    MEM(0xC0243800) = 0x40;

- then we may have to power it on (SDRV_PowerOnDevice, InitializePcfgPort):
Code: [Select]
    EngDrvOut(0xC0F01008, 0x1);
    register_interrupt("IMGPOWDET", 0x52, imgpowdet_cbr, 0);
    MEM(0xC0400028) = 0x100;

    /* InitializePcfgPort */
    EngDrvOut(0xC0F18000, 0x2);
    EngDrvOut(0xC0F1800C, 0x7C7F00);
    EngDrvOut(0xC0F01010, 0x200000);

    /* probably some GPIO */
    MEM(0xC0220020) = 0x46;

    /* IMGPOWDET interrupt triggers shortly after this */

    /* these seem to be related, not sure what they do */
    msleep(10);
    MEM(0xC0220024) = 0x46;
    msleep(10);

At this point you'll get an IMGPOWDET interrupt, showing that some image capture stuff was successfully powered on.

3) evfStart: bunch of initializations, including FPS timers, raw capture resolution, ADTG, CMOS; enables HEAD1 timer

4) when HEAD1 fires -> evfPrepareChangeVdInterrupt[FrameNo:0]; runs RamClear (guess: zeroing out the image buffer); enables HEAD3

5) when HEAD3 fires -> evfChangeHdInterrupt: stops RamClear, SetLvTgNextState(0x2) (lots of ADTG and CMOS regs)

6) when HEAD1 fires again -> evfChangeVdInterrupt: re-programs HEAD3

7) when HEAD3 fires again -> evfChangeHdInterrupt: SetLvTgNextState(0x4) -> PowerSaveTiming, SetReadOutTiming, SetLiveViewHeadFor1stSR, enables HEAD2, SensorDriveModeChangeCompleteCBR

8) evfModeChangeComplete (happens right after the above)

9) when HEAD2 fires -> evfPrepareCaptureVdInterrupt[FrameNo:1], LVx1_SetPreproPath, first raw frame should be available?

To be continued. Unable to get an image yet.

4
Reverse Engineering / Front AF LED (PROP_LED_LIGHT)
« on: April 14, 2018, 08:25:27 PM »
Background: the front LED was something we had no idea how to activate (other than some unreliable hack based on red eye reduction settings).

One hint: https://bitbucket.org/hudson/magic-lantern/issues/2351/front-led-does-not-light-when-recording (this LED might be triggered by a RC-6 remote). I don't have one to try, maybe I should get one, but I don't really have any other use for it.

Anyway - while looking for a possible LED address in 6D2, I've noticed an interesting piece of code referencing these strings:
Code: [Select]
fLedOn_Bv %x %x
AFAE LED %d %d %d %d
[LED] OFF -> ON %x
[LED] ON -> OFF %x

That hints that the 6D2 might be turning on the front LED to assist autofocus, and that this LED might be controllable from software.

Going further into the low-level routine, it sends an MPU message (in other words, it might be asking the MPU - a secondary CPU - to turn on the AF LED). A closer look reveals similar strings in the 5D3 (though not as clear). On this camera, the low-level LED routine changes property 0x80050035 (size=2; the arguments appear to be the previous and requested LED state). There are some more interesting strings:
Code: [Select]
AEWB_DSTOCK_GetLedLightMode
AEWB_DSTOCK_GetLedLightEnable
AEWB_DSTOCK_GetLedLightState
AEWB_DSTOCK_GetLedLightGuideNumber
AEWB_DSTOCK_SetLedLightInfo

The front LED even has a guide number!

Where are these initialized? Breakpoint on SetLedLightInfo in QEMU:
Code: [Select]
[        AeWb:ff23c230 ] (98:01) [AEWB] aewbProperty ID=0x80030042(0x3)
...
    0xFF23C1E8(6b2df4, 6b85ac, 0, ff23c1e8)                                      at [AeWb:1796c:185c40] (pc:sp)
     0xFF23DBA0(6b2ea8 &"AEWB_DataStocker", 3, 0, 0, 0)                             at [AeWb:ff23c2d0:185be0] (pc:sp)

0x80030042 is PROP_LED_LIGHT (from known_spells.py). Value 3 means LedLightMode = 3, with all the other fields 0. Things are starting to make sense.

Manually changing property 0x80050035 doesn't seem to work. When does the Canon firmware turn on the front LED?! (other than with the RC-6 remote)

Some more: 0x80050035 appears to be 09 22 on the MPU side. 6D2 sends message 09 20 (SpecificToPartner). Changing property 0x80050035 to 0x101 sends the following message to MPU:
Code: [Select]
CA19D>    PropMgr:ff2e9f18:01:03: ###RequestPropertyLVLEDLightRequestResultCBR 9 32 1 1
CA1D5>    PropMgr:000b0d48:00:00: *** mpu_send(08 06 09 20 01 01 00), from ff122df8

From what I could tell, that's exactly what 6D2 does. Yet, the LED doesn't turn on...

5
If you have played with the broken camera on the homepage, you already knew this was coming :)

That has been one of my oldest pet peeves since the 5D Mark III was announced. I made several unsuccessful attempts to understand how it works, but as I didn't really know what I was doing, it remained a complete mystery - until some days ago, when I applied this knowledge and got a successful proof of concept. I was like - whoa, it was THAT easy?!

Background

There are four finger sensors on the rear scrollwheel - touch them lightly to get an event similar to a button press. The camera does move slightly during this, but the movement is a LOT less than with a full button press. That makes it desirable (although far from perfect) while shooting video, but also in other situations where camera movement should be avoided - such as extreme macro without a sturdy fixture, or during a timelapse, when you want to make sure you've enabled some setting in the menu and don't want to risk having to align everything in post.

It's probably a very underrated feature, since the only videos I could find on this were in Korean. However, I find it very useful - and also fun to explore.

With Canon's implementation, this feature only works while recording video (H.264), and... you have to press Q during video recording (!) in order to activate it. Why? Beats me...

Under the hood

I could not trigger events with another object (metallic or not), and I could not trigger them without moving the camera slightly. However, after registering the "press" event, you can sometimes move your finger back by 1-2 mm and the camera will still consider the "button" pressed.

This feature is controlled by the MPU - it sends events similar to the direction keys (coded much like the joystick events: regular up/down/left/right = 0x0B followed by 02/09/06/05; silent control = 0x28 followed by the same direction code). The analog side is not under our control; it's beyond what we can currently analyze, since we don't even know the processor architecture of the MPU on DIGIC 5.

Unfortunately, there's no SET event - just 4 direction keys and one (common) unpress event. Pressing two "buttons" at the same time does not give any event (so we can't, for example, use 2-finger taps to assign various functions). However, we can use gestures - slide the finger from top to right, for example, and you get two "press" events and one "unpress" at the end. That way, you can detect:

- 8 "small" gestures (45 degrees): top->right, right->bottom, bottom->right etc.
- larger gestures would work as well: top->right->bottom (180 degrees) or any other combination (a decoding sketch follows below).
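
To illustrate, here's a sketch of how these gestures could be decoded on the ML side. Only the direction codes quoted above come from actual observations; the handlers and the rest of the plumbing are hypothetical and only show the idea.
Code: [Select]
/* Direction codes as observed above: 0x28 (silent control) followed by
 * 02/09/06/05 = up/down/left/right. Everything else below is hypothetical. */
enum sc_dir { SC_UP, SC_DOWN, SC_LEFT, SC_RIGHT, SC_UNKNOWN };

static enum sc_dir sc_decode(unsigned char code)
{
    switch (code)
    {
        case 0x02: return SC_UP;
        case 0x09: return SC_DOWN;
        case 0x06: return SC_LEFT;
        case 0x05: return SC_RIGHT;
        default:   return SC_UNKNOWN;
    }
}

/* hypothetical handlers - replace with real actions (menu navigation, SET, Q, ...) */
static void handle_key(enum sc_dir d)                    { (void) d; }
static void handle_gesture(enum sc_dir a, enum sc_dir b) { (void) a; (void) b; }

/* A gesture is up to two "press" events followed by the common unpress event. */
static enum sc_dir gesture[2];
static int gesture_len = 0;

static void sc_on_press(unsigned char code)
{
    if (gesture_len < 2)
        gesture[gesture_len++] = sc_decode(code);
}

static void sc_on_unpress(void)
{
    if (gesture_len == 2)
        handle_gesture(gesture[0], gesture[1]);  /* e.g. SC_UP, SC_RIGHT = "top->right" */
    else if (gesture_len == 1)
        handle_key(gesture[0]);                  /* plain directional "press" */

    gesture_len = 0;
}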

Properties

I've mentioned before that, with Canon's implementation, this kind of control was only available while recording (and only after pressing Q). Turns out, the controls weren't tied to any of these. They are controlled by property 0x80030047 PROP_SILENT_CONTROL_STATUS (MPU message 03 46). Control values: 0 = enable, 2 = disable. Status values: 1 = enabled and 3 = disabled. There's also 0x8000004B PROP_SILENT_CONTROL_SETTING (the setting from Canon menu).
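
For experimentation, requesting this from ML code might look like the untested sketch below; the 4-byte property size is an assumption.
Code: [Select]
/* 0x80030047 PROP_SILENT_CONTROL_STATUS; control values from above:
 * 0 = enable, 2 = disable. Untested - the property size is a guess. */
static void silent_control_request(int enable)
{
    int value = enable ? 0 : 2;
    prop_request_change(0x80030047, &value, 4);
}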

Limitations

Unfortunately, this feature is guarded very well by the dragons in the MPU:

- it only works in GUI modes 41 (Q dialog), 84 (sound level adjustment) and a few others
- it does not work during regular standby or recording (so it's hard to use it to open ML menu or to change exposure parameters)
- it disables the rear scrollwheel (could not find a way around it)
- it disables the top scrollwheel in GUI mode 41, but works in 84
- GUI mode 84 has other issues, e.g. it doesn't come back cleanly to standby...
- for some reason it does not send unpress events for long touches while recording (what the...)

Proof of concept code

https://bitbucket.org/hudson/magic-lantern/branch/silent-control

It's not without side effects, sorry.

Button assignments

I'm not sure what the best way is to assign these gestures to functions, because (1) there are just 4 direction events and (2) they are very easy to trigger by mistake. I was tempted to require a gesture just to enable them (or even to use gestures for all the actions).

First PoC (0352ba0):
- direction keys for navigation
- works nice, but how do you change values?!

Second PoC (cb9c6ce):
- hold direction pads for at least 0.3 seconds: navigation
- top->right or right->top: Q in ML menu, MENU key in Canon menu (this "gesture" is closest to the Q button)
- right->down or down->right: SET (opposite corner)

Proposals welcome.

Side note

While looking into this, I've found a solution for the menu timeout issue on 6D/M/100D/70D (fix available in the lua_fix experimental build). Nobody bothered to report back, besides dfort.

Future

Does any newer Canon camera still have this feature? (5D IV maybe?)

6
Found this after looking into an issue reported by Walter. Couldn't solve it, but discovered something a little more interesting: how to re-program the bokeh 8)

Say goodbye to busy backgrounds!

Left: with Canon firmware.
Right: with Magic Lantern's new "Crème de la crème à la Edgar" feature.
Camera: 5D Mark III, using the hardware mod from the homepage.
Lens: EF-S 24mm f/2.8 STM.



Technique: https://petapixel.com/2018/02/12/get-look-smooth-trans-focus-without-stf-lens/

Download: after Arikhan uploads the NX1 raw hack.

Happy Easter egg hunting!

8
Quote
[...] photos on the homepage do not necessarily reflect the latest development.

Solved. Sorry it took so long - have fun browsing the menus on the home page :)



Now the screenshots can be auto-updated every time a new build arrives.



Under the hood:
- a script that navigates the entire ML menu and saves screenshots (it runs on Jenkins, here's a GIF)
  - the following modes are simulated: photo, photo LiveView, movie
  - for each top-level ML menu option: you can toggle the option (SET), navigate the submenu (Q) or do both
  - nothing can be changed in submenus (yet)
  - anything you have changed is discarded as soon as you go away from that menu item (to keep the number of screenshots manageable)
  - extras: Canon menu, Q menus (just navigation, without changing any option)
  - result: some pre-rendered screenshots (about 3000 images at the time of writing)
- some hardcoded navigation logic (that runs on the server)
  - simulation state is completely given by the key sequence you see in the URL
  - screenshots are served as static images like this (will be cached by your browser)
  - each screenshot is about 10 KB (LiveView screenshots are larger)
  - URL grows with each click (but you can start over from the power button)
- front-end (interpreted by the browser)
  - button overlays (the red circles) are SVG elements on top of the camera image
  - works with or without JS (looks nicer with JS, e.g. flips the movie button, LED activity, nicer transitions)
  - LED shows network activity (querying the next state, loading a screenshot)
  - keyboard works too if JS is enabled (same buttons as in QEMU); ESC to disable keybindings, ENTER to re-enable
  - browser back button works too

9
Reverse Engineering / Interrupt IDs
« on: January 23, 2018, 07:16:12 PM »
After writing these notes (in particular, the section about interrupts), I've noticed we didn't document what all these interrupts are used for. This info is interesting for emulation and for understanding how Canon code works; these IDs are not used directly in ML code.

So, here's my first attempt to list all the interrupts we know about. Sources of info:

- startup-logs or emulation logs with register_interrupt calls enabled (some have names in Canon code):
Code: [Select]
grep --text -nro " register_interrupt(.*)" startup-logs/ tests/*/gdb.log | grep -o register_interrupt.*

- interrupts declared in QEMU, model_list.c
Code: [Select]
cat qemu-2.5.0/hw/eos/model_list.c | grep -o "\..*interrupt.*=.*,"

- interrupts scattered in QEMU source: eos_trigger_int (either hardcoded IDs or arrays)
Code: [Select]
FILES=$(cat qemu-2.5.0/hw/eos/Makefile.objs | grep -E "eos/\w+.o" | grep -oE "\w+.o" | cut -d . -f 1 | sed -e 's/$/.c/' | sed -e 's!^!qemu-2.5.0/hw/eos/!')
cat $FILES | grep "eos_trigger_int\|^\w.*(" | grep -B 1 eos_trigger_int
cat $FILES | grep -zoE " [a-zA-Z_]*interrupt[a-zA-Z]*\[[^[]*] = {[^{]*}" | tr "\\n" " " | tr "\\0" "\\n"

- Omar interrupts

Results (machine output, take with a grain of salt):

Code: [Select]
0x01:
0x02:
0x03: WdtInt
0x06:
0x09: dryos_timer
0x0A: dryos_timer
0x0D: Omar
0x0E: UTimerDriver
0x0F:
0x10: OC4_14, hptimer
0x16:
0x18: hptimer
0x19: OCH_SPx
0x1A: OCH_SPx, hptimer
0x1B: OCHxEPx, dryos_timer
0x1C: OCH_SPx, Omar, hptimer
0x1D: OCHxEPx
0x1E: OCH_SPx, UTimerDriver, hptimer
0x1F: OCHxEPx
0x20: ICAPCHx
0x21: ICAPCHx
0x22: ICAPCHx
0x23: ICAPCHx
0x24: ICAPCHx
0x25: ICAPCHx
0x26: ICAPCHx
0x27: ICAPCHx
0x28: OC4_14, hptimer
0x29: OCHxEPx, sd_dma
0x2A: MREQ_ISR, mpu_mreq
0x2C: DmaAD
0x2D: DmaDA, Omar
0x2E: UTimerDriver, uart_rx
0x2F: BLTDMA, BLTDMAC0, BltDmac, dma
0x30: CFDMADriver, cf_dma
0x32: SDDMADriver, SdDmaInt, sd_dma
0x33:
0x34:
0x35: SlowMossy
0x36: SIO3_ISR, mpu_sio3
0x37: INTC_SIO4
0x38: uart_rx
0x39: OCH_SPx, uart_rx
0x3A: uart_tx
0x3C: Omar
0x3E: UTimerDriver
0x40:
0x41: WRDMAC1
0x42: ASIFAdc
0x43: ASIFDac
0x44: HDMIDET, MICDET, USBDET
0x45: VIDEODET
0x47: MICDET, VARISW3
0x49: OCHxEPx
0x4A: CFDriver, MREQ2_ICU, cf_driver, sd_driver
0x4B: SDDriver, SdConInt, sd_driver
0x4D: Omar
0x4E: UTimerDriver
0x50: EMEGENCY, EMERGENCY_ISR, MREQ_ISR, mpu_mreq
0x51: CAPREADY, CARDDOOR_ISR
0x52: IMGPOWDET, MREQ_ISR, mpu_mreq
0x53: CAPREADY, HDMIDET
0x54: DOS_ISR, EMEGENCY, HDMIDET, MICDET, USBDET, VIDEODET
0x55: ASCHK_ISR, FUNCSW, USBDET, VIDEODET
0x56: HDMIDET, IMGPOWDET, MICDET, TOEDET, USBDET, VARISW3, VIDEODET
0x57: DOSCHK_ISR, TOEDET, USBDET, VARISW3
0x58: EDmacWR0, WEDmac0, edmac
0x59: EDmacWR1, OCH_SPx, WEDmac1, edmac
0x5A: EDmacWR2, LENSIF_SEL, WEDmac2, edmac
0x5B: EDmacWR3, WEDmac3, edmac
0x5C: EDmacWR4, Omar, WEDmac4, edmac
0x5D: EDmacRD0, REDmac0
0x5E: EDmacRD1, REDmac1, UTimerDriver, edmac
0x5F: EDmacRD2, REDmac2, edmac
0x60: CompleteReadOperation
0x61: AfComplete
0x62: AfOverRun
0x63: Obinteg
0x64: JP51_INT_R, JpCore, jpcore
0x65: ADKIZ, ADMERG, IntDefectCorrect, prepro_execute
0x66: Integral, WB Integ, WbInteg
0x67: Block, WbBlock
0x68: EngInt PBVD, Engine PB VD, PB_VD, display
0x69: EngInt PBERROR, EngInt PBVD, OCHxEPx, PB_ERR, Pb error
0x6A: HEAD1, Head1, head
0x6B: HEAD2, Head2, head
0x6C: HEADERROR, HeadError
0x6D: EDmacWR5, Omar, WEDmac5, edmac
0x6E: EDmacRD3, REDmac3, UTimerDriver, edmac
0x70: HarbInt
0x74: BLTDMA, BLTDMAC1, BltDmac, dma
0x75: BltDmac, dma
0x76: BltDmac, dma
0x77: HDMIDET, USBDET
0x79: OCH_SPx
0x7A: XINT_7
0x7C: Omar
0x7E: UTimerDriver
0x80:
0x81:
0x82: CFDriver, cf_driver
0x83: WEDmac8, edmac
0x84:
0x86:
0x88:
0x89: OCHxEPx
0x8A: INT_LM, WEDmac9, edmac
0x8B: REDmac6
0x8C:
0x8D:
0x90: WEDmac6
0x91: REDmac4
0x92: REDmac5, REDmac7, edmac
0x93: CompleteOperation
0x94:
0x95: edmac
0x96: REDmac10, edmac
0x97: REDmac11, edmac
0x98: CAMIF_0
0x99: OCH_SPx
0x9A: CompleteOperation
0x9B:
0x9C: Omar, SEQ
0x9E: REDmac13, edmac
0x9F: edmac
0xA0: BltDmac, EekoBltDmac, dma
0xA1: BltDmac, EekoBltDmac, dma
0xA3: Jp57, JpCore2
0xA5: RDDMAC15, edmac
0xA8: BltDmac, CAMIF_1, dma
0xA9: BltDmac, OCHxEPx, dma
0xAA: CompleteOperation
0xB0: SSIO_SIOINT
0xB1: SDDriver, sd_driver
0xB2: OCH_SPx
0xB3: OCH_SPx
0xB8: CFDMADriver, SDDMADriver, sd_dma
0xB9: OCH_SPx
0xBA: OCH_SPx
0xBB: OCH_SPx
0xBC: Omar
0xBE: SdDmaInt0, sd_dma
0xC0: WEDmac6, edmac
0xC1: REDmac4, edmac
0xC5: SAFARI_INT
0xC6: SAFARI_INT_ERROR
0xC8: REDmac5, edmac
0xC9: Fencing_A, OCHxEPx
0xCA: INT_LM, WEDmac10, edmac
0xCB: WEDmac11, edmac
0xCD: Omar
0xCE: SerialFlash
0xD0: Fencing_B
0xD1: Fencing_C
0xD2: WEDmac12, edmac
0xD3: WEDmac13, edmac
0xD9: HEAD3, Head3, ICAPCHx, head
0xDA: WEDmac14
0xDB: WRDMAC15, edmac
0xDC: Omar
0xDD:
0xDE: SerialFlash
0xE0: HEAD4, head
0xE1: SsgStopIrq
0xE2: REDmac8, edmac
0xE3: CFDMADriver, cf_dma
0xE4: GaUSB20Hal
0xE6:
0xE8:
0xE9: ICAPCHx
0xEE: SdConInt0, sd_driver
0xF9: ICAPCHx, WEDmac7
0xFC: OCH_SPx, Omar
0xFD:
0xFE: SerialFlash, dryos_timer, serial_flash
0xFF:
0x102: RDDMAC13
0x109: ICAPCHx
0x10C: BltDmac
0x10D:
0x10E: SerialFlash
0x111: Eeko WakeUp
0x115:
0x117:
0x119: ICAPCHx
0x125:
0x127:
0x129: ICAPCHx
0x12A: mpu_mreq
0x12D:
0x137:
0x139: ICAPCHx
0x13A: CAPREADY
0x13E: xdmac
0x140: ICOCCHx
0x141: ICAPCHx
0x142: ICAPCHx
0x145:
0x147: SIO3_ISR, mpu_sio3
0x148: ICAPCHx
0x149: ICAPCHx
0x14A: ICAPCHx
0x14E: xdmac
0x150: ICAPCHx
0x151: ICAPCHx
0x152: ICAPCHx
0x157:
0x158: OCH_SPx
0x159: ICAPCHx, OCH_SPx
0x15A: OCH_SPx
0x15D: uart_rx
0x15E: xdmac
0x162: SerialFlash
0x167:
0x169: ICAPCHx
0x16D: uart_tx
0x16E: xdmac
0x171: SDDMADriver, sd_dma
0x172: SDDriver, sd_driver
0x174:
0x177:
0x178: DOS_ISR, MICDET
0x179: ICAPCHx
0x17B: SerialFlash, serial_flash
0x187:
0x189: ICAPCHx
0x18B: WdtInt

TODO:

- group by camera generation, DIGIC version etc
- other sources of info (such as strings present in the interrupt handling function, or other notes about them)
- brute-force interrupts (trigger manually) and see what the firmware is trying to do
- auto-build the above list on Jenkins (so it will be always up to date, at least with QEMU sources)

10
Following this request, I've decided to revive the old filepref module. Renamed it to img_name.mo.

Features:
- custom image file prefix (IMG_1234.JPG -> ABCD1234.JPG; from the old filepref module)
- change image file number to any value (IMG_1234.JPG -> IMG_5678.JPG; experimental, restart required)

TODO:
- timestamped file names (original request). Please don't expect it anytime soon - I don't know how to change the file number (last 4 characters) without restart. Maybe you can figure it out?
- date-stamped file names? (MMDD1234). This one might be easier; still need to find out how to reset the counter.
- continuous numbering? (12349999 -> 12350000, ABCD9999 -> ABCE0000). This one should be easy.
- customize folder number? (didn't try, but noticed the property in QEMU).

Known/possible issues:
- on 5D3, Canon file naming options must be set to default.
- might conflict with Dual ISO custom file naming (not tested).
- only tested on 5D3 and 60D.

Related:
- Image file prefix is also available to Lua (lua_fix builds)

Source: https://bitbucket.org/hudson/magic-lantern/src/img_name/modules/img_name
Binary: https://builds.magiclantern.fm/modules.html#img_name (only the first feature works; the second one requires a custom ML build)

11
Reverse Engineering / TFT SIO communication (tft_regs.mo)
« on: November 26, 2017, 03:51:30 PM »
Some notes after looking into this.

QEMU logs: zip

How I've got them:
Code: [Select]
make -C ../magic-lantern/60D_install_qemu
./run_canon_fw.sh 60D,firmware="boot=1" -d debugmsg,io
# same for 600D, 650D, 700D, 70D
In the ML menu, I selected Display -> Advanced -> Orientation -> Normal/Mirror/Reverse, then copied the console output. I had to silence a few things in QEMU to get clean logs. The important lines look like this:
Code: [Select]
[  DisplayMgr:ff0611b4 ] (82:02) SIO [3]:0xf01d

60D, 600D:
Code: [Select]
[ GuiMainTask:ff325714 ] (04:03) -->Mirror start
[  DisplayMgr:ff0611b4 ] (82:02) SIO [0]:0x1000
[  DisplayMgr:ff0611b4 ] (82:02) SIO [1]:0xbe01
[  DisplayMgr:ff0611b4 ] (82:02) SIO [2]:0xe401
[  DisplayMgr:ff0611b4 ] (82:02) SIO [3]:0xf01d
[ GuiMainTask:ff325774 ] (04:03) -->Normal start
[  DisplayMgr:ff0611b4 ] (82:02) SIO [0]:0x1001
[  DisplayMgr:ff0611b4 ] (82:02) SIO [1]:0xbe01
[  DisplayMgr:ff0611b4 ] (82:02) SIO [2]:0xe401
[  DisplayMgr:ff0611b4 ] (82:02) SIO [3]:0xf01d
[ GuiMainTask:ff325744 ] (04:03) -->Reverse start
[  DisplayMgr:ff0611b4 ] (82:02) SIO [0]:0x1000
[  DisplayMgr:ff0611b4 ] (82:02) SIO [1]:0xbe01
[  DisplayMgr:ff0611b4 ] (82:02) SIO [2]:0xe401
[  DisplayMgr:ff0611b4 ] (82:02) SIO [3]:0xf09d

700D, 650D:
Code: [Select]
cat 700D-*.log | grep -E "DisplayMgr.*SIO|-->"
[ GuiMainTask:ff4d91bc ] (04:03) -->Mirror start
[  DisplayMgr:ff128980 ] (82:01) SIO [0]:0x36
[  DisplayMgr:ff128980 ] (82:01) SIO [1]:0x140
[ GuiMainTask:ff4d921c ] (04:03) -->Normal start
[  DisplayMgr:ff128980 ] (82:01) SIO [0]:0x36
[  DisplayMgr:ff128980 ] (82:01) SIO [1]:0x100
[ GuiMainTask:ff4d91ec ] (04:03) -->Reverse start
[  DisplayMgr:ff128980 ] (82:01) SIO [0]:0x36
[  DisplayMgr:ff128980 ] (82:01) SIO [1]:0x1c0

70D (EOS M matches this):
Code: [Select]
cat 70D-*.log | grep -E "DisplayMgr.*SIO|-->"
[ GuiMainTask:ff504660 ] (04:03) -->Mirror start
[  DisplayMgr:ff134c18 ] (82:02) SIO [0]:0x602
[ GuiMainTask:ff5046c0 ] (04:03) -->Normal start
[  DisplayMgr:ff134c18 ] (82:02) SIO [0]:0x600
[ GuiMainTask:ff504690 ] (04:03) -->Reverse start
[  DisplayMgr:ff134c18 ] (82:02) SIO [0]:0x606

Experimental code (don't click me):
Code: [Select]
static void run_test()
{
    msleep(3000);

    #ifdef CONFIG_5D3_113
    void (*lcd_sio_init)() = (void *) 0xFF12D284;
    void (*lcd_sio_write)(uint32_t * data, int size) = (void *) 0xFF12D1E0;
    void (*lcd_sio_finish)(void * sio_obj) = (void *) 0xFF13BDC8;
    void ** p_lcd_sio_obj = (void **) 0x246F0;
    #endif
    // 650D 104: FF127E88, FF127D88, FF13B868, 23C48.
    // 700D 115: FF128A28, FF128928, FF13C420, 23C58.

    printf("LCD sio start\n");
    lcd_sio_init();
    lcd_sio_write((uint32_t[]) { 0x36, 0x140 }, 2);
    lcd_sio_finish(*p_lcd_sio_obj);
    printf("LCD sio finish\n");
}

5D3: the above code turns off the screen, but leaves the backlight on.
650D: wip
700D: ?

SIO initialization sequences are different (likely different TFT controllers). 650D and 700D are identical.

5D3 has an interesting SIO sequence when display brightness is set to Auto.

Code: [Select]
11949> DisplayMgr:ff12d238:82:02: SIO [0]:0x34
11981> DisplayMgr:ff12d238:82:02: SIO [1]:0x1700
119B5> DisplayMgr:ff12d238:82:02: SIO [2]:0x1808
119E9> DisplayMgr:ff12d238:82:02: SIO [3]:0x1960
11A1C> DisplayMgr:ff12d238:82:02: SIO [4]:0x35

Hypothesis: high byte is TFT register address, low byte is value (similar to ADTG, CMOS, audio).

5D3: register 0x19 appears to be gamma correction (0-63). The remaining two bits cause some flicker in saturated blue (?!)

Register 0 is set to 0x34 on TftDeepStanby; 0x35 brings back the image.

Code: [Select]
    for (int i = 0; i < 64; i++)
    {
        lcd_sio_init();
        lcd_sio_write((uint32_t[]) { 0x34, 0x1900 | (i + (rand() & 3) * 64), 0x35 }, 3);
        lcd_sio_finish(*p_lcd_sio_obj);
        msleep(50);
    }

The search space appears small (256 registers, 256 possible values), so let's brute-force it:
Code: [Select]
    for (int reg = 0; reg < 0x100; reg++)
    {
        for (int val = 0; val < 0x100; val++)
        {
            bmp_printf(FONT_LARGE, 50, 50, "%02x: %02x", reg, val);
            lcd_sio_init();
            lcd_sio_write((uint32_t[]) { 0x34, (reg << 8) | val, 0x35 }, 3);
            lcd_sio_finish(*p_lcd_sio_obj);
            msleep(50);
        }

        /* restore the display back to working condition */
        enter_play_mode();
        exit_play_qr_mode();
    }
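
For one-off experiments, the write pattern used in these loops can be factored into a small helper - an untested sketch, assuming the 5D3 stubs from run_test are hoisted to file scope:
Code: [Select]
/* Write a single TFT register, wrapped in the 0x34 / 0x35 sequence
 * used above (register 0 := 0x34 before, := 0x35 after). */
static void tft_write_reg(uint32_t reg, uint32_t val)
{
    lcd_sio_init();
    lcd_sio_write((uint32_t[]) { 0x34, ((reg & 0xFF) << 8) | (val & 0xFF), 0x35 }, 3);
    lcd_sio_finish(*p_lcd_sio_obj);
}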

Documenting other registers (either on 5D3 or on other models) is welcome. Other than trial and error, I don't have a better way to analyze them.

Other registers present (found with { 0x34, rand() & 0xFFFF, 0x35 }, but not written down):
- color adjustments (temperature?)
- mirroring, flipping
- half resolution
- scaling, translation (both H and V)

So far, the image gets back to normal when switching the display mode (such as going into Canon menu).

12
Just playing with this dataset and https://github.com/pathak22/pyflow

Quote
Do you mean you want to try some super resolution algorithms?
I have about 45 frames of this castle, before I start panning to the right...
Uploading frame 0 to 45 right now, takes about half an hour.
Same link as before.
http://drive.google.com/drive/folders/0B1BxGc3dfMDaRmtKc2tOa3dHMTA?usp=sharing

- before: dcraw M27-1337-frame_000002.dng
- after: averaged with frames 1 and 3, warped with optical flow to match frame 2





To reproduce the above result, get the files below, install the dependencies (follow comments and error messages), then type:
Code: [Select]
make -j2 M27-1337_frame_000002-a.jpg

or "make -j8" to render the entire sequence on a quad-core processor.



Makefile (use Firefox for copying the text; Google Chrome and Safari will not work!)
Code: [Select]
# experiment: attempt to reduce aliasing on hand-held footage using optical flow
# requires https://github.com/pathak22/pyflow

# replace with path to pyflow repository
FLOW=python ~/src/pyflow/flow.py

# default target: render all frames as jpg
all: $(patsubst %.dng,%-a.jpg,$(wildcard M27-1337_frame_*.dng))

# render DNGs with dcraw
%.ppm: %.dng
	dcraw $<

# helper to specify dependencies on previous or next image
# assumes the file name pattern is: prefix_000123 (underscore followed by 6 digits)
# fixme: easier way to... increment a number in Makefile?!
inc = $(shell stem="$1"; echo $${stem%_*}_$$(printf "%06d" $$((10\#$${stem//*_/}+$2))) )

# enable secondary expansion (needed below)
.SECONDEXPANSION:

# next or previous frames
%-n.png: %.ppm $$(call inc,%,1).ppm
	$(FLOW) $^ $@

%-p.png: %.ppm $$(call inc,%,-1).ppm
	$(FLOW) $^ $@

# average
%-a.png: %.ppm %-n.png %-p.png
	convert -average $^ $@

# fallback rules: first / last file will only have "next" / "previous" neighbours
# FIXME: these rules may be chosen incorrectly instead of the above in some edge cases; if in doubt, delete them and see if it helps
%-a.png: %.ppm %-n.png
	convert -average $^ $@

%-a.png: %.ppm %-p.png
	convert -average $^ $@

# convert to jpg
%.jpg: %.ppm
	convert $< $@
%.jpg: %.png
	convert $< $@

# 100% crops
%-crop.jpg: %.jpg
	convert $< -crop 400x300+900+650 $@

# careful if you have other files in this directory ;)
clean:
	rm -f *.ppm *.jpg *.png *.npy

# do not delete intermediate files
.SECONDARY:

# example:
# make -j8
# make -j2 M27-1337_frame_000002-a.jpg

flow.py:
Code: [Select]
# Modified the demo from https://github.com/pathak22/pyflow
# -> just save the warped image and the computed flow; filenames from command line

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# from __future__ import unicode_literals
import numpy as np
from PIL import Image
import time
import pyflow
import sys

try:
    print("%s %s -> %s" % (sys.argv[1], sys.argv[2], sys.argv[3]))
except:
    print("usage: %s input1.jpg input2.jpg output.npy" % sys.argv[0])
    raise SystemExit

im1 = np.array(Image.open(sys.argv[1]))
im2 = np.array(Image.open(sys.argv[2]))
im1 = im1.astype(float) / 255.
im2 = im2.astype(float) / 255.

# Flow Options:
alpha = 0.012
ratio = 0.75
minWidth = 20
nOuterFPIterations = 7
nInnerFPIterations = 1
nSORIterations = 30
colType = 0  # 0 or default:RGB, 1:GRAY (but pass gray image with shape (h,w,1))

s = time.time()
u, v, im2W = pyflow.coarse2fine_flow(
    im1, im2, alpha, ratio, minWidth, nOuterFPIterations, nInnerFPIterations,
    nSORIterations, colType)
e = time.time()
print('Time Taken: %.2f seconds for image of size (%d, %d, %d)' % (
    e - s, im1.shape[0], im1.shape[1], im1.shape[2]))

flow = np.concatenate((u[..., None], v[..., None]), axis=2)
np.save(sys.argv[3] + ".npy", flow)

import cv2
cv2.imwrite(sys.argv[3], im2W[:, :, ::-1] * 255)

Exercise for the reader: use more frames to compute the correction.

Have fun.

13
General Development Discussion / Full-screen histogram WIP
« on: October 30, 2017, 10:26:45 PM »
Something like this?








Topic split from here.

14
General Development Discussion / Automated tests for nightly builds in QEMU
« on: September 19, 2017, 10:35:56 PM »
Another pipe dream came true :) - this time, a dream of mine.

Have you noticed a bunch of screenshots on the nightly builds page?

Were you wondering what's up with them?



These screenshots are created on the build server, by emulating the very builds available for download, unmodified, in QEMU.

In other words, most of the nightly builds are no longer 100% untested when they appear on the download page :)

This is not an overnight development - it's built upon all these years of fiddling with QEMU. A short while ago I couldn't give a good answer regarding the usefulness of the emulator - now you can see it live.

Right now there are only a few tests, with OCR-based menu navigation (using tesseract):

1) navigate to Debug -> Free Memory and take a screenshot from there
2) load the Lua module and run the Hello World script
3) load the file_man module and browse the filesystem
4) play the first 3 levels of the Sokoban game (lua_fix only; example for 1200D)

TODO:
- add more tests (easy, but time-consuming)
- emulate more camera components (e.g. image playback to be able to test ML overlays)
- check code coverage
- diff the screenshots
- nicer reports

For now, have fun watching the testing script playing Sokoban in QEMU :)



Emulation log

If you are wondering what's the point of testing this game: it covers many backend items, such as menu navigation, module loading, script config files, making sure keys are not missed randomly during script execution, checking whether the camera has enough memory to run scripts - most of these are real bugs found on some camera models from the current nightly builds.

At least, these tests will catch the long-standing issue of some camera models running out of memory, thus not being able to boot. Not very funny for a build considered somewhat stable...

And the emulation is still pretty limited, so I'm just adding tests for what works :)



A while ago I've got the suggestion to use openQA, but I'm still wrapping my head around it. If you can show how it could save us from reinventing the wheel, I'm all ears.

15
Sneak preview of what I'm working on:

Code: [Select]
ML ASSERT:
a
at ../../src/stdio.c:44 (streq), task module_task
lv:0 mode:3

module_task stack: ad340 [69c60-1dd3b0]
0x0006DC34 @ 7162c:1dd400
0x0007CF7C @ 6dd3c:1dd3f0
0x00069C00 @ 7cf9c:1dd3e0
0x000AD644 @ 69c5c:1dd3b0

What's the meaning of these codes?

Code: [Select]
eu-addr2line -s -S --pretty-print -e magiclantern 0x0006DC34 0x0007CF7C 0x00069C00 0x000AD644
entry_guess_icon_type at menu.c:694
streq at stdio.c:43
ml_assert_handler at boot-hack.c:596
backtrace_getstr at backtrace.c:859

eu-addr2line -s -S --pretty-print -e magiclantern 7162c 6dd3c 7cf9c 69c5c
menu_add.part.25+0x100 at menu.c:1212
entry_guess_icon_type+0x108 at menu.c:711
streq+0x20 at stdio.c:44
ml_assert_handler+0x5c at boot-hack.c:605

Putting it all together:
Code: [Select]
menu_add (menu.c:1212) called entry_guess_icon_type (located menu.c:694)
 entry_guess_icon_type (menu.c:711) called streq (located at stdio.c:43)
  streq (stdio.c:44) called ml_assert_handler (located at boot-hack.c:605) - that's the ASSERT macro
   ml_assert_handler (boot-hack.c:605) called backtrace_getstr (located backtrace.c:859)

Heh, that backtrace went a little bit too far :)

Note: the above line numbers are valid for this changeset.

Works for Canon code too (but it's unable to figure out indirect calls):
Code: [Select]
ASSERT: FALSE
at RscMgr.c:2513, task InnerDevelopMgr
lv:0 mode:3

InnerDevelopMgr stack: ad360 [697d8-19e498]
0xUNKNOWN  @ de48:19e568
0xUNKNOWN  @ 17bbc:19e540
0x000178B4 @ ff139c38:19e528
0xUNKNOWN  @ 178e4:19e518
0xUNKNOWN  @ 1796c:19e4f8
0xFF0F2F14 @ ff301928:19e4e0
0x00001900 @ ff0f2f80:19e4d0
0x000AD664 @ 697d4:19e498

Will post more details after committing the source.

In the meantime, I'd appreciate a small script (an easy coding task) that would take a crash log as input (as in the above examples) and produce human-readable output from it (as in the "putting it all together" example). To get the debugging info required for name translation, you'll need this changeset.

17
Some early notes (5D3 1.2.3).

PROP_REBOOT (software reboot):
Code: [Select]
    int reboot = 0;
    prop_request_change(PROP_REBOOT, &reboot, 4);

0x80010001 PROP_TERMINATE_SHUT_REQ (0=request, 3=execute, 4=cancel)
Code: [Select]
08F1E>    PropMgr:000aecf0:00:00: *** mpu_send(06 04 04 07 00), from ff12298c
09101> **INT-36h*:000aed58:00:00: *** mpu_recv(06 05 02 0b 00 00), from ff2e87f8
09388>    PropMgr:ff0cdc60:8c:03: terminateChangeCBR : SHUTDOWN (0)
093AA>    PropMgr:ff0cde2c:8c:16: SHUTDOWN_REQUEST
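
By analogy with the PROP_REBOOT snippet above, a shutdown request might look like the untested sketch below; the 4-byte size is an assumption.
Code: [Select]
/* 0x80010001 PROP_TERMINATE_SHUT_REQ: 0 = request, 3 = execute, 4 = cancel */
static void request_shutdown(void)
{
    int request = 0;    /* SHUTDOWN_REQUEST */
    prop_request_change(0x80010001, &request, 4);
}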

Opening battery door:
0x80010002 PROP_ABORT
Code: [Select]
786D6> **INT-36h*:000aed88:00:00: *** mpu_recv(06 05 06 26 01 00), from ff2e87f8
78821> **INT-36h*:000aed88:00:00: *** mpu_recv(06 05 06 13 01 00), from ff2e87f8
7915B> **INT-36h*:000aed88:00:00: *** mpu_recv(06 05 04 0d 00 00), from ff2e87f8
86E81> **INT-36h*:000aed88:00:00: *** mpu_recv(06 04 02 0c 01), from ff2e87f8
8A249>    PropMgr:ff0f8f74:00:03: [SEQ] CreateSequencer (Terminate, Num = 2)
8A2CB>    PropMgr:000aeb2c:00:00: *** task_create("Terminate", 0x11, 0x1000, 0xff0f8e40, &"Sequencer"), from ff0f8ffc
...
8A1F8>  Terminate:000aeb0c:00:00: *** terminateAbort(0x200000, 0x0, 0x0, 0x200000), from ff0f8edc
8A5D7>  Terminate:000aeb0c:00:00: *** terminateAbort(0x10, 0x0, 0x0, 0x10), from ff0f8edc

Saving settings to ROM at shutdown:
Code: [Select]
8A16A>    PropMgr:ff1282a4:02:03: Compare FROMAddress (0) 0x40710e00 0xff060000 Size 2424

Code: [Select]
      RAM_DebugMsg(140, 22, "Write to FROM");
      prop_erase(0x5000000);
      prop_write(0x5000000);
      ...

When opening the battery door, I've identified prop_erase/prop_write calls to 0x3000000 (0xff21c8bc, triggered from PROP_ABORT), 0x5000000 and 0x2000000 (0xff0ce424, Terminate task).

On normal shutdown (power button or card door), terminateShutdown is used instead of terminateAbort.

Canon settings are organized like this (see PROPAD_CreateFROMPropertyHandle):

Code: [Select]
name   ROM addr   N * sector_size?  block_size?   prop_class
TUNE   0xF8B20000   23 * 0x20000      0x2E0000    0x1000000
TUNE2  0xF0020000   42 * 0x10000      0x2A0000    0x1000000
FIX    0xF8E60000    4 * 0x20000       0x80000    0
RING   0xF8F40000    2 * 0x20000        0x1000    0x2000000
RASEN  0xF8EE0000    3 * 0x20000       0x20000    0x4000000, 0x5000000, 0xE000000
LENS   0xF8E00000    3 * 0x20000       0x20000    0xB000000
CUSTOM 0xF8060000    2 * 0x20000        0x1000    0x3000000

The prop_class field is the "category" of properties stored in each block. Examples:
- RING: 0x02040002 PROP_LANGUAGE, 0x02040003 PROP_VIDEO_SYSTEM, 0x02040005 PROP_DATE_FORMAT
- FIX: 0x2 PROP_CAM_MODEL, 0x5 PROP_DCIM_DIR_SUFFIX
- LENS: 0x0B000000 PROP_OPTICAL_CORRECT_PARAM
- TUNE: 0x1010022/25...37 vertical stripe correction parameters
- TUNE2: 0x10500d1...d4 (see above)
- RASEN: unknown (0x5010002 contains '192.168.1.20', 0x5010003 contains 'Wft-canon')
- CUSTOM: some of them look like picture style parameters (probably settings for C modes)

See also http://www.magiclantern.fm/forum/index.php?topic=4729.0

18
As you have probably guessed from the latest developments (QEMU, EDMAC graphs, JPCORE, EEKO), our understanding of how LiveView works has improved considerably. Finally, all my fiddling with QEMU - at first sight with little or no purpose for everyday users - is starting to pay off.

Today, Magic Lantern proudly announces new ground-breaking features that were previously thought impossible or very hard to achieve.

We proudly present....

4K RAW Video Recording!



DOWNLOAD

Twitter announcement




On the 5D Mark III, you now have the following new resolutions:

* 1920x960 @ 50p (both 1:1 crop and full-frame - 3x3 pixel binning) - continuous*)
* 1920x800 @ 60p (same as above)  - continuous*)
* 1920x1080 @ 45p and 48p (3x3 binning)  - continuous at 45p*)
* 1920x1920 @ 24p (1:1 square crop) - continuous*)
* 3072x1920 @ 24p (1:1 crop)
* 3840x1536 @ 24p (1:1 crop) (corrupted frames at 1600)
* 4096x2560 @ 12.5p (1:1 crop) - continuous*) at 8 FPS
* 4096x1440 @ 25p (1:1 crop)
* Full-resolution LiveView: 5796x3870 at 7.4 fps (128ms rolling shutter) - continuous*) at 5 FPS!
* Full-width LiveView - decrease vertical resolution in the crop_rec submenu, all the way to 5796x400 @ 48 fps :)

The last feature complements the well-known full-resolution silent pictures - the new implementation will be usable at fast shutter speeds, without the exposure gradient - but with rolling shutter (just like regular LiveView frames).

*) Continuous recording for the above resolutions can be achieved as long as you can get a LJ92 compression ratio (compressed / 14-bit uncompressed) of about 50-55%, with preview set to Frozen LV (previously known as Hacked Preview) for an additional speed boost. Otherwise, you'll have to reduce the resolution or the frame rate.

The following table shows how the compression ratio changes with ISO and bit depth; please check the figures for your particular scene in the raw video submenu, as they can vary a lot depending on the scene content.

Bits per pixel      14  12  11  10   9   8
ISO  100 1/100     61% 53% 50% 48% 46% 43%
ISO  200 1/200     62% 54% 51% 49% 47% 44%
ISO  400 1/400     63% 54% 51% 49% 47% 45%
ISO  800 1/800     65% 55% 52% 50% 48% 46%
ISO 1600 1/1600    67% 56% 53% 50% 48% 46%
ISO 3200 1/3200    70% 57% 53% 50% 49% 47%
ISO 6400 1/6250    76% 60% 55% 52% 50% 48%
ISO 12800 1/12500  79% 63% 57% 53% 50% 49%

Credits: Greg (full-width LiveView), g3gg0 (video timer, DIGIC registers documentation and lots of other low-level insights).

Complete list of new video modes:
Code: [Select]
                                /*   24p   25p   30p   50p   60p */
    [CROP_PRESET_3X_TALL]       = { 1920, 1728, 1536,  960,  800 }, /* 1920 */
    [CROP_PRESET_3x3_1X]        = { 1290, 1290, 1290,  960,  800 }, /* 1920 */
    [CROP_PRESET_3x3_1X_48p]    = { 1290, 1290, 1290, 1080, 1080 }, /* 1920; 1080p45/48 <- 50/60p in menu */
    [CROP_PRESET_3K]            = { 1920, 1728, 1504,  760,  680 }, /* 3072 */
    [CROP_PRESET_UHD]           = { 1536, 1472, 1120,  640,  540 }, /* 3840 */
    [CROP_PRESET_4K_HFPS]       = { 2560, 2560, 2500, 1440, 1200 }, /* 4096 half-FPS */
    [CROP_PRESET_FULLRES_LV]    = { 3870, 3870, 3870, 3870, 3870 }, /* 5796 */

What else could you wish for?



FAQ

Where's the catch?

This is only a very rough proof of concept. It has not been battle-tested and has many quirks. Some of them may be easy to fix, others not so. In particular:

* It feels quite buggy. I'm still hunting the issues one by one, but it's hard, as Canon's LiveView implementation is very complex, and our understanding of how it works is still very limited.
* Write speeds are high. For example, 10-bit 4096x2500 at 15 fps requires 180 MB/s (a quick sanity check of this figure follows this list). 1080p45 should be a little more manageable at 111 MB/s.
* Canon preview is broken in most modes; you need to use the grayscale preview in the raw recording module.
* High-resolution modes (in particular, full-res LiveView) may cause trouble with memory management. This is very tricky to solve, as we only get 3 full-resolution buffers in LiveView, with restrictions on the order in which they must be freed, and lots of other quirks.
* Since these settings were pushed to the limit, the risk of corrupted frames is high. If that happens, decrease the vertical resolution a bit (from the crop_rec submenu).
* When refreshing LiveView settings, the camera might lock up (no idea why). Pressing MENU twice appears to fix it.
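
As promised above, a quick sanity check of the 180 MB/s figure (plain arithmetic, no camera code involved):
Code: [Select]
#include <stdio.h>

int main(void)
{
    /* 10-bit 4096x2500 at 15 fps, before compression */
    double width = 4096, height = 2500, bpp = 10, fps = 15;

    double bytes_per_frame = width * height * bpp / 8;              /* 12.8 MB */
    double mib_per_second  = bytes_per_frame * fps / (1024 * 1024); /* ~183 MiB/s */

    printf("%.1f MiB/s uncompressed\n", mib_per_second);
    return 0;
}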

May I fine-tune the new modes?

Yes! I've included some of the knobs on the user interface. Normally you shouldn't need to touch these buttons, but if you do, you might be able to squeeze a few more pixels.

Does it work with FPS override?

Sort of. It's not reliable at this point, so it's best not to try yet.

Overheating?

During my tests, I didn't manage to get a sensor temperature higher than 60 degrees. Your mileage may vary.


Risks?

This mod changes some low-level sensor parameters that are not well understood. They were all figured out by trial and error, and there are no guarantees about the safety of these changes.

As usual, if it breaks, it's your fault, sorry.

Will it work on other camera models?

I hope so; however, this is an area where I hope to get contributions from others (yes, from you). If these new features don't motivate you to look into it, I wonder what else will.

I'll explain how all this works in the coming days or weeks.

Is it difficult to port to other camera models?

So far, the 3x3 720p mode from crop_rec was ported to EOS M (rbrune), 700D (dfort) and 100D (nikfreak). So it shouldn't be that hard...

Will you port this to my camera model, please?

No, sorry. I have better things to do - such as, preparing the April 1st prank for next year :)

Wait a minute, didn't you say you are primarily a still photo user? Why are you even doing this?

If you look closely, the usefulness for video is fairly limited, as the write speeds (and therefore the recording times) are not practical.

But the full-resolution LiveView is - in my opinion - very useful for still photo users. Although the current implementation is not very polished (it's just a proof of concept), I hope you'll like the idea of a 7.4 FPS burst mode, 100% silent, without shutter actuations.

Right now, you can take the mlv_lite module with pre-recording and half-shutter trigger: at 10 bits per pixel, you get 5 frames pre-recorded, and saved to card as soon as you touch the half-shutter button. Or, you can capture one frame for each half-shutter press, with negative shutter lag! (since the captured frame will always be pre-recorded).

And if a burst at 7.4 fps is not enough, you may also look at the 4K modes (12-15 fps).

(I know, I know, GH4 already does this, at much higher frame rates...)

The help menu for full-res LiveView says 5796x3870, but MLV Lite only records 5784x3856. What's going on?

The raw recording modules have a couple of alignment constraints: cropping can only start from a multiple of 8 pixels, and the size of the cropped area (that goes into the MLV file) must be a multiple of 16 bytes (that is, W*bpp/8 + H mod 16 must be 0).

To capture the full resolution, you may use the silent picture module. However, this module is not the best when it comes to memory management and buffering. Currently, you'll get an impressive buffer of 2 frames in burst mode :)

But hey - it outputs lossless DNG!

What about that lossless compression routine?

It's included, although I didn't manage to test it much. There is a lot of room for improvement, but for a proof of concept, it seems to work.

update: also got lossless compression at reduced bit depths (8...12-bit).



P.S. The initial announcement was disguised as an April Fools joke, just like the original crop_rec.

Twitter announcement

From original April Fools post:
Quote

With our latest achievements in wizardry with ARM programming and DIGIC reverse engineering, we can speak of a new era of raw video recording.

On models like the 5D Mark III, the next upcoming releases will feature an improved version of our crop_rec module that delivers the following new resolutions:
 
* 1920x960 @ 50p (both 1:1 crop and full-frame - 3x3 pixel binning)
* 1920x800 @ 60p (same as above)
* 1920x1080 @ 45p and 48p (3x3 binning)
* 1920x1920 @ 24p (1:1 square crop)
* 3072x1920 @ 24p (1:1 crop)
* 3840x1600 @ 24p (1:1 crop)
* 4096x2560 @ 12.5p (1:1 crop)
* Full-resolution LiveView: 5796x3870 at 7.4 fps (128ms rolling shutter).

The last feature complements the well-known full-resolution silent pictures - the new implementation will be usable at fast shutter speeds, without the exposure gradient - but with rolling shutter (just like regular LiveView frames).

Please understand that providing the source code for those highly DIGIC optimized routines is a bit troublesome and will need some extra legal care. After this step is taken and as soon we are finished with ensuring the product quality you are used from Magic Lantern, we will upload the code to our repository.

Consider this being a huge leap towards our next mind boggling goal:

8K RAW Video Recording!



Sample DNG from 5D Mark III, to show that our proof of concept is working:

8k.dng

Stay tuned for more information!


19
Modules Development / Writing modules tutorial #1: Hello, World!
« on: March 16, 2017, 10:24:14 PM »
So far, if you wanted to write your own module, the best sources of documentation were (and probably still are) reading the source code, the forum, the old wiki, and experimenting. As a template for new modules, you probably took one of the existing modules and removed the extra code.

This is one tiny step to improve upon that: I'd like to write a series of guides on how to write your own modules and how to use various APIs provided by Magic Lantern (some of them tightly related to APIs reverse engineered from Canon firmware, such as properties or file I/O, others less so, such as ML menu).

I'll start with the simplest possible module:

Hello, World!




Let's start from scratch:
Code: [Select]
hg clone -u unified https://bitbucket.org/hudson/magic-lantern
cd magic-lantern/modules/
mkdir hello
cd hello
touch hello.c

Now edit hello.c in your favorite text editor:
Code: [Select]
/* A very simple module
 * (example for module authors)
 */
#include <dryos.h>
#include <module.h>
#include <menu.h>
#include <config.h>
#include <console.h>

/* Config variables. They are used for persistent variables (usually settings).
 *
 * In modules, these variables also have to be declared as MODULE_CONFIG.
 */
static CONFIG_INT("hello.counter", hello_counter, 0);


/* This function runs as a new DryOS task, in parallel with everything else.
 *
 * Tasks started in this way have priority 0x1A (see run_in_separate_task in menu.c).
 * They can be interrupted by other tasks with higher priorities (lower values)
 * at any time, or by tasks with equal or lower priorities while this task is waiting
 * (msleep, take_semaphore, msg_queue_receive etc).
 *
 * Tasks with equal priorities will never interrupt each other outside the
 * "waiting" calls (cooperative multitasking).
 *
 * Additionally, for tasks started in this way, ML menu will be closed
 * and Canon's powersave will be disabled while this task is running.
 * Both are done for convenience.
 */
static void hello_task()
{
    /* Open the console. */
    /* Also wait for background tasks to settle after closing ML menu */
    msleep(2000);
    console_clear();
    console_show();

    /* Plain printf goes to console. */
    /* There's very limited stdio support available. */
    printf("Hello, World!\n");
    printf("You have run this demo %d times.\n", ++hello_counter);
    printf("Press the shutter halfway to exit.\n");

    /* note: half-shutter is one of the few keys that can be checked from a regular task */
    /* to hook other keys, you need to use a keypress hook - see hello2 */
    while (!get_halfshutter_pressed())
    {
        /* while waiting for something, we must be nice to other tasks as well and allow them to run */
        /* (this type of waiting is neither very power-efficient nor time-accurate, but it's simple and works well enough in many cases) */
        msleep(100);
    }

    /* Finished. */
    console_hide();
}

static struct menu_entry hello_menu[] =
{
    {
        .name       = "Hello, World!",
        .select     = run_in_separate_task,
        .priv       = hello_task,
        .help       = "Prints 'Hello, World!' on the console.",
    },
};

/* This function is called when the module loads. */
/* All the module init functions are called sequentially,
 * in alphabetical order. */
static unsigned int hello_init()
{
    menu_add("Debug", hello_menu, COUNT(hello_menu));
    return 0;
}

/* Note: module unloading is not yet supported;
 * this function is provided for future use.
 */
static unsigned int hello_deinit()
{
    return 0;
}

/* All modules have some metadata, specifying init/deinit functions,
 * config variables, event hooks, property handlers etc.
 */
MODULE_INFO_START()
    MODULE_INIT(hello_init)
    MODULE_DEINIT(hello_deinit)
MODULE_INFO_END()

MODULE_CONFIGS_START()
    MODULE_CONFIG(hello_counter)
MODULE_CONFIGS_END()

We still need a Makefile; let's copy it from another module:
Code: [Select]
cp ../ettr/Makefile .
sed -i "s/ettr/hello/" Makefile

Let's compile it:
Code: [Select]
make

The build process created a file named README.rst. Update it and recompile.

Code: [Select]
make clean; make

Now you are ready to try your module in your camera. Just copy the .mo file to ML/MODULES on your card.

If your card is already configured for the build system, all you have to do is:
Code: [Select]
make install

Otherwise, try:
Code: [Select]
make install CF_CARD=/path/to/your/card

or, if you have such a device:
Code: [Select]
make install WIFI_SD=y

That's it for today.



To decide what to cover in future episodes, I'm looking for feedback from anyone who tried (or wanted) to write a ML module, whether or not you were successful.

Some ideas:
- printing on the screen (bmp_printf, NotifyBox)
- keypress handlers
- more complex menus
- properties (Canon settings)
- file I/O
- status indicators (lvinfo)
- animations (e.g. games)
- capturing images
- GUI modes (menu, play, LiveView, various dialogs)
- semaphores, message queues
- DryOS internals (memory allocation, task creation etc)
- custom hooks in Canon code
- your ideas?

Of course, the advanced topics are not for the second or third tutorial.
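
As a tiny teaser for the first item on that list, printing on the screen from a module looks roughly like this (untested sketch; bmp_printf and NotifyBox are the usual ML calls, but double-check the includes against bmp.h - the function name here is just for illustration):
Code: [Select]
/* Untested sketch - roughly what the "printing on the screen" episode might cover. */
#include <dryos.h>
#include <bmp.h>

static void hello_screen(void)
{
    /* transient message box; closes itself after 2 seconds */
    NotifyBox(2000, "Hello from a module!");

    /* draw directly on the overlay: medium font at (0, 0) */
    bmp_printf(FONT_MED, 0, 0, "Counter: %d", 42);
}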

20
General Development Discussion / Recording RAW and H.264 at the same time
« on: February 11, 2017, 02:34:52 PM »
I was experimenting with shooting raw video while simultaneously recording H.264 [...]

Quote
[...]it is too much of a hack[...]

Here's an attempt to make it a bit less of a hack:

http://bitbucket.org/hudson/magic-lantern/branch/raw-h264-proxy

21
Currently, focus peaking lets you choose between two image buffers: the LiveView one (720x480 when used on the internal LCD) and the so-called HD one (usually of higher resolution). Of course, the peaking results with the two options are slightly different.

To simplify the code, I'd like to use only the LiveView buffer, like most other overlays.

Is there any reason to use the high-res buffer? In other words, did any of you get better results by using it?
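
For reference, both buffers can be queried from C roughly like this (a sketch assuming the usual vram.c helpers, get_yuv422_vram / get_yuv422_hd_vram - not the actual peaking code):
Code: [Select]
/* Rough sketch: how the two buffers can be queried from an overlay. */
#include <dryos.h>
#include <vram.h>
#include <bmp.h>

static void show_buffer_sizes(void)
{
    struct vram_info * lv = get_yuv422_vram();     /* LiveView buffer (720x480 on the internal LCD) */
    struct vram_info * hd = get_yuv422_hd_vram();  /* "HD" buffer (usually higher resolution) */

    bmp_printf(FONT_MED, 0, 0, "LV: %dx%d   HD: %dx%d",
               lv->width, lv->height, hd->width, hd->height);
}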

22
General Development Discussion / Thread safety
« on: February 05, 2017, 02:12:43 AM »
While refactoring the menu code, I noticed it had become increasingly complex, so evaluating whether it's thread-safe was no longer an easy task (especially after not touching some parts of the code for a long time). The same applies to the rest of the ML code. Not being an expert in multi-threaded software, I started to look for tools that would at least point out some obvious mistakes.

I came across this page, which seems promising, but looks C++ only. This paper appears to be from the same authors (presentation here), and C is mentioned too, so adapting the example is probably doable.

Still, annotating the code by hand would be tedious, so that part could use some help from a computer. I found pycparser and wrote a script that recognizes common idioms from ML code (such as TASK_CREATE, PROP_HANDLER, menu definitions) and annotates each function with a comment saying which tasks call it.

Therefore, if a function is called from more than one task, it must be thread-safe. The script only highlights those functions that are called from more than one task (that is, those that may require attention).
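
To illustrate what such a function might look like, and one way to make it safe, here's a minimal sketch using the usual DryOS semaphore calls (create_named_semaphore / take_semaphore / give_semaphore) - not taken from the actual code:
Code: [Select]
/* Minimal sketch (not from the actual code): shared state reachable from
 * more than one task needs some form of locking. */
#include <dryos.h>

static struct semaphore * counter_sem = 0;
static int counter = 0;

/* suppose this is called from both the menu task and a property handler
 * => the script would flag it, and it does need to be thread-safe */
static void bump_counter(void)
{
    take_semaphore(counter_sem, 0);   /* 0 = wait forever */
    counter++;                        /* read-modify-write, not atomic on its own */
    give_semaphore(counter_sem);
}

static void counter_init(void)
{
    counter_sem = create_named_semaphore("counter_sem", 1);
}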

Still, I have a gut feeling that I'm reinventing the wheel. If you know a better way to do this, please chime in.

Source: https://bitbucket.org/hudson/magic-lantern/branch/thread-safety

Note: in DryOS, tasks == threads.

23
General Development Discussion / Experiment - Dynamic My Menu
« on: January 31, 2017, 09:51:00 PM »
Today I was a bit tired of debugging low-level stuff like Lua tasks or camera-specific quirks, but I still wanted to write something cool. So here's something I've wanted for a long time. The feedback back then wasn't exactly positive, so it never got implemented, but I was still missing it.

Turns out, it wasn't very hard to implement, so there you have it.

What is it?

You already know the Modified menu (where it shows all settings changed from the default value), and My Menu (where you can select your favorite items manually). This experiment attempts to build some sort of "My Menu" dynamically, based on usage counters.

How does it work?

After a short while of navigating the ML menu as you usually do, your most recently used items, as well as your frequently used ones, should appear there. As long as you don't have any items defined for My Menu, it will be built dynamically. The new menu will be named "Recent" and will keep the same icon as My Menu.



Every time you click on some menu item, the usage counter for that item is incremented. All the other items will have a "forgetting factor" applied, so the most recently used items will rise to the top of the list fairly quickly.

Clicking the same item over and over is only counted once (so scrolling through a long list of values won't give that item extra priority). Submenu navigation doesn't count; only changing a value or running an action is counted.

Time is discrete (click-based). It doesn't care if you use the camera 10 hours a day or a couple of minutes every now and then.

To get both good responsiveness to recent changes and the ability to learn your habits over a longer time, I've tried two usage counters: one for short-term and another for long-term memory. If, let's say, you need to keep toggling a small set of options during some day, it should learn that quickly. But if you no longer need those options after that special day, those menu items will be forgotten quickly, and the ones you use daily (stored in the "long-term memory") should be back soon.

So, the only difference between the "long term" and the "short term" counters is the forgetting factor: 0.999 vs 0.9. In other words, the "long term" counters have more inertia.

When deciding whether a menu item is displayed or not, the max of the two values is used, resulting in a list of the "top 11 most recently or frequently used menu items". The small gray bars in the menu are the usage counters (debug info).
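
In pseudo-C, the update rule looks roughly like this (a simplified sketch of the scheme described above; the committed code may differ in details):
Code: [Select]
/* Simplified sketch of the usage counters (the committed code may differ). */
#define SHORT_TERM_FACTOR 0.9     /* forgets quickly */
#define LONG_TERM_FACTOR  0.999   /* forgets slowly  */

struct menu_usage { float short_term; float long_term; };

static void on_menu_click(struct menu_usage * items, int count, int clicked)
{
    for (int i = 0; i < count; i++)
    {
        if (i == clicked)
        {
            /* the clicked item gets both of its counters incremented */
            items[i].short_term += 1;
            items[i].long_term  += 1;
        }
        else
        {
            /* everything else is slowly forgotten */
            items[i].short_term *= SHORT_TERM_FACTOR;
            items[i].long_term  *= LONG_TERM_FACTOR;
        }
    }
}

/* for display, rank the items by max(short_term, long_term) and keep the top 11 */
static float usage_score(struct menu_usage * item)
{
    return (item->short_term > item->long_term) ? item->short_term : item->long_term;
}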

I have no idea how well this works in practice - it's something I came up with a few hours ago, and the tuning parameters are pretty much arbitrary.

Source code committed, and if there is interest, I can prepare an experimental build as well.

24
General Development Discussion / Touch-friendly ML menu
« on: January 06, 2017, 07:02:44 PM »
Some experiments I did last summer on a 700D (which I no longer have).

I remember it worked to some extent, but had some quirks. Don't remember the exact details, but I hope it could be useful (or at least fun to tinker with).

http://bitbucket.org/hudson/magic-lantern/commits/branch/touch-menu

25
General Chat / Script for undeleting CR2 files
« on: January 01, 2017, 09:17:31 PM »
Looks like my 5D3 decided to reuse the file counters on two different cards. When sorting some photos, one CR2 just got overwritten by another image with the same name (by mistake).

How to undelete it?

Testdisk's undelete tool didn't help (the file wasn't deleted, but overwritten). PhotoRec would have probably worked, given enough time, extra HDD space and the patience to sort through the output files (not practical). I found a guide using debugfs, which didn't seem to work (too much low-level stuff I wasn't familiar with), and this article seemed promising. I knew a pretty tight time interval for the missing file (a couple of seconds, from the previous and next files in the set), so I wrote a quick Python script that scans the raw filesystem for CR2 files with an EXIF date/time in that range.

It worked for me.

It's all hardcoded for my system, but should be easy to adjust for other use cases.

Code: [Select]
# CR2 recovery script (Python 2)
# Scans the entire partition for CR2 files between two given timestamps,
# assuming they are stored in contiguous sectors on the filesystem.
# Hardcoded for 5D Mark III; adjust the device, timestamps and model string as needed.

from datetime import datetime

# time interval known to contain the missing file
d0 = datetime.strptime("2016:06:10 17:31:36", '%Y:%m:%d %H:%M:%S')
d1 = datetime.strptime("2016:06:10 17:31:42", '%Y:%m:%d %H:%M:%S')

# raw partition to scan, opened read-only in binary mode
f = open('/dev/sda3', 'rb')

nmax = 600*1024                       # number of 1 MiB blocks to scan
for k in xrange(nmax):
    p = k*100.0 / nmax                # progress, in percent
    f.seek(1024*1024*k)
    block = f.read(1024*1024)
    if "EOS 5D Mark III" in block:
        # found the EXIF camera model string; the date/time string
        # is stored at a fixed offset after it
        i = block.index("EOS 5D Mark III")
        print k, hex(i), p
        b = block[i : i+0x100]
        date_str = b[42:61]
        try: date = datetime.strptime(date_str, '%Y:%m:%d %H:%M:%S')
        except: continue
        if date >= d0 and date <= d1:
            print date
            # dump 40 MiB starting a bit before the match
            # (the CR2 header sits shortly before the EXIF strings)
            out = open("%X.CR2" % k, "wb")
            f.seek(1024*1024*k + i - 0x100)
            out.write(f.read(40*1024*1024))
            out.close()
