Messages - a1ex

#11501
Feature Requests / Re: 1080p crop mode on 60d
June 25, 2012, 08:48:12 AM
No (or at least not in the near future).
#11502
Feature Requests / Re: folder support?
June 25, 2012, 08:46:33 AM
No.

What is possible is to change the file prefix (for example, to have ABC_ instead of IMG_).
#11503
In LiveView you can get around 39; for recording, anything above 35 results in clipped frames and weird artifacts.
#11504
General Help Q&A / Re: new firmware version on 60d
June 20, 2012, 04:50:43 PM
Yes, next release will work with 60D 1.1.1.

Until then, it's safe to downgrade to 1.1.0.
#11505
With or without black bars?
#11506
No.
#11507
Share Your Videos / Re: Bulb ramping demos
June 18, 2012, 08:22:14 PM
Another bulb ramping test:

Camera: 60D
Lens: Samyang 8mm at f5.6
Intervalometer set to 30 seconds
Metering: median, reference brightness at 20%
Shutter speed: from 1/4000 to 28 seconds
ISO: from 200 to 3200 (HTP)
Image quality: sRAW

Postprocessing (a rough command sketch follows this list):
1) developed the RAW files at +1 and +3 EV
2) enfuse
3) defishing (with nona)
4) MSU deflicker
5) frame merger ( bit.ly/frame-merger )
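
For illustration, steps 1-3 could look roughly like this on the command line (a minimal sketch for a single frame; dcraw is only one option for the RAW development, and the file names and the fisheye.pto project are placeholders):

dcraw -T -W -b 2.0 IMG_0001.CR2 && mv IMG_0001.tiff ev1_0001.tiff   # develop at roughly +1 EV (2x brightness)
dcraw -T -W -b 8.0 IMG_0001.CR2 && mv IMG_0001.tiff ev3_0001.tiff   # develop at roughly +3 EV (8x brightness)
enfuse -o fused_0001.tif ev1_0001.tiff ev3_0001.tiff                # blend the two developments into one frame
nona -o defished_0001_ fisheye.pto fused_0001.tif                   # defish using a Hugin/nona project

Steps 4 and 5 (MSU deflicker and the frame merger) use their own tools.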
#11508
General Development / Re: 600D Audio Controls?
June 18, 2012, 06:55:02 PM
We have figured out what most audio registers do, but reconfiguring the chip seems to be a very difficult task. The only success so far was enabling audio remote shot, but it breaks video recording.
#11509
There is a small hope for 5D3, from the digic investigation thread.
#11510
Forum and Website / RSS feed
June 18, 2012, 10:37:36 AM
The RSS feed for the forum only contains 5 items. Can this be increased to around 20?
#11511
General Help Q&A / Re: Wind filter
June 18, 2012, 10:36:19 AM
Canon's audio controls have no effect; they are completely overridden by ML.
#11513
The problem is: how do you put all of this on the user interface?

To set the initial and final focus points, you would have to use something like rack focus (and everybody says it's very difficult to set up). You can already use it to see how many focus steps you need, but you have to stop it manually.

As it is now, you have full manual control over the process (when to start focusing, when to stop, when to go faster or slower...), but you risk moving the camera when you change some parameters. That said, it shouldn't be difficult to fix camera movement in post.

Probably the best way to do advanced timelapse shots like this would be to use Lua scripting.
#11514
1. You can press PLAY to go back to 0%.

2. This needs a bit of reverse engineering. BTW, I'm trying to finish the stable release for 5D2, so until then I'll skip the feature requests.
#11515
Camera-specific Development / Canon 1100D / T3
June 14, 2012, 04:50:54 PM
Canon 1100D / T3
Current state: Release Candidate


The main issues holding back ML are:
Blindly maintained (development happens without a physical camera in our hands)
Lack of physical buttons (for menu, for example)
Lack of RAM (only 128MB; most other Rebel models have 256MB)
Low resolution sensor: 4290x2858 => 1430x952 theoretical max in LiveView
Slow SD interface (20MB/s, not enough for raw video)
Low display resolution (1-pixel wide items may be incorrectly displayed or aliased)

Short-term Todo List:
Enable raw video and silent photo capture (almost there)
Enable Lua scripting (feedback needed -- what works and what not?)
Make sure the stability tests are passing (feedback needed)
Write down which features work and which ones don't (menu walkthrough, for testers)
Update the user guide (task common for all other camera models)
Done: Port the 550D button hack to the 1100D (AE_COMP opens the menu, but retains its original long-press functionality).
Done: Start working on the fonts and display routines

Random bonus stuff:
Done: In-camera audio meters.

I will be posting updates here as I make progress -- Nanomad

The current version compiles and runs on 1100D.
RC 3: http://nanomad.magiclantern.fm/1100D/magiclantern-v2.3.1100D.RC3.zip (use the development builds below instead)
Use the included .fir (for 1100D 1.0.5) to enable the bootflag.

Source code: http://bitbucket.org/hudson/magic-lantern/src/tip/platform/1100D.105

Main builds: http://builds.magiclantern.fm/ (some features not working)
Development builds: http://builds.magiclantern.fm/experiments.html (feedback needed)
#11516
Archived porting threads / Canon 5D Mark III
June 14, 2012, 04:45:09 PM


ML will work on the 5D Mark III. Right now it's in a very early stage; early testers reported it as almost unusable.
#11517
Share Your Videos / Bulb ramping demos
June 12, 2012, 09:23:20 PM
First try of ML bulb ramping:


Focus ramping:
#11518
Step 1. Get the source code
hg clone -u unified https://bitbucket.org/hudson/magic-lantern


Step 2. Get a compiler



2a -> pre-built toolchain from gcc-arm-embedded (preferred 5_4-2016q3, but any other version should work)

Manual toolchain setup should be straightforward on Linux, Mac and Windows Subsystem for Linux.

General instructions:

Either unzip the downloaded toolchain in your HOME directory (for example, you may get ~/gcc-arm-none-eabi-5_4-2016q3/bin/arm-none-eabi-gcc), or install a package that provides arm-none-eabi-gcc and make sure it's in your executable PATH. That way, the compiler will be picked up by our Makefiles without additional tweaking.
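
For example, on Linux the HOME-directory route could look like this (the exact tarball name depends on what you downloaded; this one is only illustrative):

cd ~
tar xjf ~/Downloads/gcc-arm-none-eabi-5_4-2016q3-20160926-linux.tar.bz2   # unpack into $HOME
export PATH=~/gcc-arm-none-eabi-5_4-2016q3/bin:$PATH                      # or add this line to ~/.bashrc
arm-none-eabi-gcc --version                                               # sanity check: the compiler is found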

Ubuntu: package gcc-arm-none-eabi (older) or gcc-arm-embedded from ppa:team-gcc-arm-embedded/ppa (latest). Or just unzip the downloaded toolchain into your HOME directory if you prefer to avoid a system-wide installation.
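
A quick sketch of the PPA route (package names as above; add-apt-repository comes from software-properties-common):

sudo add-apt-repository ppa:team-gcc-arm-embedded/ppa   # latest gcc-arm-embedded
sudo apt-get update
sudo apt-get install gcc-arm-embedded                    # or gcc-arm-none-eabi from the stock repos (older)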

Mac: the easiest way is probably this tutorial from Daniel Fort, but manual setup should work just as well.

Windows 10 WSL is known to require some tweaking (such as installing an X server for QEMU), but generally works well.

Windows (native, without WSL): you may install Cygwin (but make sure there are no spaces in your username). For Windows XP you will have to use an older version of Cygwin, but other than that, it should work without much trouble.





2b -> QEMU installation script will download the recommended toolchain and install it for you.

This method is recommended for Ubuntu, Mac and Win10 WSL; other Linux distros may require additional tweaking.

Besides setting up and compiling QEMU, this script also installs the ARM compiler and debugger, and will provide an environment to run your camera firmware and Magic Lantern in the emulator (very useful for debugging and reverse engineering - check out the guide).

Videos: for Ubuntu and Mac (todo: WSL and native Windows).



To run QEMU install script:

hg clone https://bitbucket.org/hudson/magic-lantern
cd magic-lantern
hg up qemu -C
cd contrib/qemu
./install.sh





2c -> Linaro bare-metal toolchain should also work (use ARM_ABI=eabi in Makefile.user).
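
A minimal sketch of that setting, run from the magic-lantern source root (>> creates Makefile.user if it does not exist yet):

echo "ARM_ABI=eabi" >> Makefile.user   # needed for the Linaro bare-metal toolchain, as noted above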




2d -> pre-built VM if you feel lazy (kudos to Anton2707; user/root password is 123456)

Also, nikfreak has a VM available here (with ARM-console preinstalled).

Note: these VMs are old and do not have the latest bells and whistles (QEMU etc.), but those should be easy to install on top.




2e -> Docker (experimental, feedback welcome if you decide to try it)




2f -> Build the compiler yourself.




Step 3. Start hacking


hg up unified -C
cd platform/550D.109
make clean && make zip
make install
make install_qemu


Useful branches

- unified (mainline)
- Experiments section on the download page (WIP stuff)
- Pull requests on Bitbucket (same)
- All branches (lots of unfinished work and experiments)
- crop_rec_4k (4K recording and lossless compression, very bleeding edge)
- dm-spy-experiments (logging tools for reverse engineering)
- iso-research (CMOS/ADTG ISO experiments, adtg_gui, raw_diag)
- qemu (camera firmware emulation)
- new-sound-system (experimental audio controls for DIGIC 5)
- 1200D, 1300D, 100D_merge_fw101, 70D_merge_fw112, 5Ds_experiments etc (ports in progress)
- vxworks (450D, 40D, other oldies are welcome); vxworks-dm-spy (logging)
- digic6-dumper (80D, 750D, 7D2 etc)
- recovery (experiments running from bootloader context, mostly diagnostics or ROM dumping)

Useful stuff

- Writing modules tutorial #1: Hello, World!
- How can I run Magic Lantern in QEMU? (aka QEMU guide)
- EOS firmware in QEMU - development and reverse engineering guide (must read for understanding how the firmware works)
- Notes on old wiki: http://www.magiclantern.wikia.com/wiki/For_Developers
- Using WiFi SD (FlashAir and Transcend) to avoid swapping the card back and forth
- Speed up compilation:
   export MAKEFLAGS="-j8"  # parallel build
   make install ML_MODULES="lua ettr"  # only compile the modules you are interested in
- Examine ASM code: tag a function with DUMP_ASM, then run: make dump_asm
- Stack trace in crash logs
- Submitting a pull request
#11519
* Lua scripting ( http://groups.google.com/group/ml-devel/browse_thread/thread/6f708e124d700712 ).
* PicoC scripting ( http://www.magiclantern.fm/forum/index.php?topic=3769.0 )
* Android USB controller ( http://groups.google.com/group/ml-devel/browse_thread/thread/4b0d01d64a459485 )
* Follow focus / remote control with a simple TV remote ( https://groups.google.com/group/ml-devel/browse_thread/thread/6ec6723d7da4119e )
* Custom file prefix (for example, 5D2_1234.CR2, 02JA1234.CR2 etc). Maybe also absolute file numbering from shutter count.  Done: https://bitbucket.org/hudson/magic-lantern/commits/49682329d18213d88f74a9fe516c202091f14d22
* Exposure simulation for extremely dark scenes [DONE]
* Templates for HDR and focus stacking scripts
* Very short uncompressed 422 video clips (1-2 seconds, maybe more on 5D3) Done: http://www.magiclantern.fm/forum/index.php?board=49.0

You can find some more features still to be implemented in this google doc.
#11520
Features copied from another Canon camera
Copying Canon code or functionality may carry legal risk for us. We respect Canon and love their products, and we are strict about staying on the right side of the law.

1080p 60fps, 2K, 4K, RAW video...
The best we could do was 1080p 35fps on 60D and 600D. Update: 4K works, but has major limitations.

Custom codecs
Codecs are not implemented on the general-purpose ARM processor. We can only use what Canon has already included in hardware (H.264, JPEG, LJ92) and fine-tune their parameters (such as the H.264 bit rate).

The lossless compression used for raw video is the same "codec" Canon uses for CR2. The same processing path (codenamed JPCORE) might be able to handle (M)JPEG. However, we cannot implement additional codecs (such as H.265, JPEG2000 or ProRes). Even if these might be able to run on Canon's image processing hardware, we simply don't know where to start.

Things that can be done in post
Why spend development time on things like in-camera HDR? Magic Lantern is not a replacement for Photoshop ;)

Previewing is OK (e.g. HDR preview, anamorphic preview, fisheye correction preview).

Real-time video processing (e.g. stabilization, sharpening algorithms)
We can't program the image processor. These things can only be done if the functionality is already in Canon firmware (i.e. some parameters that can be tweaked - like in the Image Effects menu).

AF microadjustment
It's not possible to control AF outside LiveView. Update: dot_tune works on cameras where AFMA is present in the Canon menu.
On other cameras it is not possible, with our current knowledge.

Image on both LCD and external monitor at the same time
Not possible (unless proven otherwise by DIGIC investigation).

AF confirmation without chipped adapters
Not possible (the camera refused all attempts to fake lens info).

Timecode
Very difficult (see http://www.magiclantern.wikia.com/wiki/Timecode ). The 5D Mark III has it.

Continuous AF in movie mode
Very difficult to do right (we couldn't).

Scrollwheel controls
It's not possible to remap them while recording. In standby, the ML menu uses a trick: it opens a Canon dialog in the background to steal wheel events from it, but this trick doesn't work while recording.

1D support
These cameras are way outside our reach. Even if we could buy them, very few 1D users would benefit from ML.
There are also legal concerns regarding Canon's pro line of cameras.




Sure, at some point some of these might become possible, but the chances are extremely small. Spending time on them is effectively searching for a needle in a haystack.




A detailed explanation by dmilligan on why Magic Lantern cannot increase a camera's FPS.

Quote from: dmilligan on May 02, 2014, 11:57:05 PM
Your question really boils down to this:
"Why can't I capture more information, by throwing away information?"

Now from a more practical standpoint:
Compression (what you refer to as "lowering the bitrate") is a difficult, computationally intensive task (it's also impossible). It is not a magical process where you throw some data in and it comes out smaller. The only way to get enough of an effective compression ratio for the incredibly huge size of a video data stream, is to just throw away some of it. The goal here being to throw out the least important information, but we are throwing away information nonetheless. The better an algorithm is at throwing away data (i.e. the better it is at figuring out what data is unimportant), typically the more complex it is. There are very easy ways to throw away data, such as reducing the resolution and line skipping, and there are very hard ways of throwing away data such as DCT

Let's now consider a (very oversimplified) pipeline that a video stream goes through in the camera:
Sensor -> Raw Data -> Image Processing (demosaic, wb, pic style, curves, etc.) -> H.264 Encoder -> Storage

When you talk of "bitrate" you are only talking about the bitrate at the very last step of this pipeline: the bitrate out of the encoder to the storage media. There are many other steps prior to this to consider. If you want a 1080p stream out of the encoder, you also need that 1080p stream to make its way through the rest of that pipeline (at 60 fps). That's where the limitation is; in fact, there are probably many. I'll just go over some of the possible ones:
1. The H.264 encoder can't handle 1080p video data coming into it at 60 fps (remember, it has to do something very complex and computationally intensive with the data and then spit out the result very quickly)
2. The image processing electronics can't handle 1080p of raw data at 60 fps
3. The internal buses that move the raw data from the sensor to the image processors can't handle that much data (1920*1080*14bit*60fps = 1.7 Gigabits per second)
4. The sensor itself isn't fast enough to sample 1080 lines at 60 fps (it takes some finite amount of time to read out each line, and they are read one by one)

I'm not saying that all of those are true, but at least one or more of them are, and that's why 60p mode is a lower resolution. Overcoming any of these obstacles is possible, but it would require more transistors (i.e. faster, more complicated electronics), which would make the camera more expensive. So without more expensive internal electronics, the only way to get enough "compression" to even be able to get our video data to the encoder is to "compress" the data starting at the sensor itself. And what's the only way to do that? Line skipping and reducing the resolution -> basically, don't read in as many pixels.
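
For what it's worth, the 1.7 Gigabits per second figure from point 3 is plain arithmetic, easy to check in a shell:

echo $((1920 * 1080 * 14 * 60))   # 1741824000 bits per second, i.e. about 1.7 Gbit/s of raw sensor data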