Messages - calypsob

#1
Feature Requests / Sequenced Intervalometer
July 20, 2018, 08:38:38 AM
It would be great to be able to run a sequenced intervalometer in ML, for example 10x 60s, 10x 120s, and 10x 10s exposures, completely automated. This would have a lot of practical applications in landscape or architectural photography, especially for photographers wanting to median-combine exposures as a smart object and then combine the masters into an HDR composite. Personally I would use this feature for astrophotography. At times the stars must be blown out to get good background signal, so I end up having to shoot extra exposures to fix the highlights, plus medium-length exposures to recover the SNR lost to dark current in the long exposures.
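Roughly, the sequencing logic I have in mind would look like this (a Python sketch only; take_bulb_exposure is a hypothetical placeholder, not a real ML function):

```python
# Minimal sketch of the requested sequencing logic, in Python.
# take_bulb_exposure() is a hypothetical stand-in for whatever bulb-trigger
# call the camera platform actually exposes; it is not a real ML API.
import time

def take_bulb_exposure(seconds):
    """Hypothetical placeholder: open the shutter for `seconds`, then close it."""
    print(f"exposing for {seconds} s")
    time.sleep(seconds)

# (count, exposure length in seconds) pairs, e.g. 10x60s, 10x120s, 10x10s
sequence = [(10, 60), (10, 120), (10, 10)]

for count, length in sequence:
    for _ in range(count):
        take_bulb_exposure(length)
        time.sleep(2)  # settle delay between frames (shutter/mirror vibration)
```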
#2
Feature Requests / Re: Read Noise E-
September 27, 2017, 09:59:53 PM
Quote from: a1ex on September 27, 2017, 06:04:37 PM
It can - try the raw_diag module.

There are a couple of methods:
- from OB areas (not reliable, just an extremely rough approximation, but works on any image)
- from one dark frame (includes both fixed and random components)
- from the difference of two dark frames (you'll get the random noise component * sqrt(2), assuming it's Gaussian)
- from the difference of two regular images (so you can estimate the noise at various signal levels - enough information for plotting a SNR curve).

I have some trust in the last method, especially for DR measurements, as long as white level (clipping point) is autodetected well (sometimes it isn't). However, fitting FWC and read noise from the SNR curve is probably an ill-conditioned problem - if you can suggest a better method, I'm all ears.

Alex, this sounds awesome.  I will need to check this out later and explore the feature some more.
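For anyone following along, the "difference of two dark frames" method above can also be done offline; here is a minimal numpy sketch, assuming the two darks are already decoded to 2-D arrays (this is not raw_diag's actual code):

```python
# Sketch of the "difference of two dark frames" estimate described above,
# assuming the two darks are already loaded as 2-D numpy arrays of raw values
# (how you decode the DNG/CR2 is up to you, e.g. via rawpy or dcraw).
import numpy as np

def read_noise_from_dark_pair(dark1, dark2):
    """Random read-noise sigma in raw units (DN); fixed pattern cancels out."""
    diff = dark1.astype(np.float64) - dark2.astype(np.float64)
    # The difference of two independent frames carries sqrt(2) times the
    # per-frame random noise, so divide the measured sigma by sqrt(2).
    return np.std(diff) / np.sqrt(2)
```

To express the result in electrons you would multiply by the sensor gain (e-/DN) for the ISO in question.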
#3
Share Your Photos / Re: Outer Space with the help of ML
September 27, 2017, 09:25:02 PM
Quote from: g3gg0 on September 27, 2017, 07:17:15 PM
wow!
really impressive shots :)

and this shot in germany? nice.

Thanks! Sorry for the confusion, these were shot on the Blue Ridge Parkway outside of Roanoke, Virginia.  The platform that I use to track the sky is a German equatorial mount (GEM) https://en.wikipedia.org/wiki/Equatorial_mount
#4
Share Your Photos / Re: Outer Space with the help of ML
September 27, 2017, 06:29:41 PM
I just wanted to add some more to this post.  I tell you, ML is worth its weight in gold for astrophotography, and I commonly suggest ML-supported Canon bodies to entry-level astrophotographers even if they are noisier than the competition.  With a high-quality German equatorial mount I am able to image without a laptop, which for me lifts a huge burden.

I use ML for several things when I acquire data on a deep-sky object.  First, live view gain and FPS override let me use live view to focus and find my targets in the sky.  When you get FPS override dialed in, it is fantastic for finding stars and also for focusing with a Bahtinov mask.  I use the ML intervalometer for acquisition of light, dark, flat, and bias frames, although I do have to stop Magic Lantern every so often to manually dither the equatorial mount.  I also use the raw histogram, though I am not convinced I am always reading the image correctly, so I have gone back to lifting the shadows 1/3 off the back of the histogram in the JPEG preview.

Here are some of my more recent images.  I use several cameras: a T3i, a T2i, and a 60D.  The T2i is modded for full spectrum, and the 60D is debayered monochrome and also full spectrum.  The 60D is used to generate independent luminance data which is later applied to color data captured by the T2i in order to bring out more resolution.  The T3i is unmodded.  All three use ML as their control platform.  I typically use the 60D and 550D in tandem along with a pair of matching Samyang 135mm f/2 lenses; these work great wide open at f/2.  I also commonly use a pair of older Pentax-K 50mm f/1.7 lenses stopped down to f/4, as well as clip-in filters designed by Astronomik, primarily on the full-spectrum bodies. Most of my images list a full acquisition description on Flickr if you are wondering how I acquired the data.


NGC 7023 Iris Nebula and LDN1148 by Wes Schwarz, on Flickr

Barnard's E B142-143 Dark Nebula "process ver. 2" by Wes Schwarz, on Flickr

North American Nebula NGC7000 by Wes Schwarz, on Flickr

The Markarians Chain and it's Galactic Neighbors by Wes Schwarz, on Flickr

RGB version M81 M82 integrated flux nebula #Explored by Wes Schwarz, on Flickr

M45 and California by Wes Schwarz, on Flickr

Andromeda and Triangulum and the Intrastellar IFN by Wes Schwarz, on Flickr

Orion and Horsehead V.II by Wes Schwarz, on Flickr

Witch Head Nebula IC2118  process Version III by Wes Schwarz, on Flickr
#5
That is a great idea; a RAW image statistics function would be very useful.  Mean, median, standard deviation, and MAD would all be superb for astrophotography and other scientific applications.
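Something along these lines, computed offline with numpy on an already-decoded raw frame (just an illustration of the statistics, not a proposed implementation):

```python
# Sketch of the kind of per-frame statistics being requested, assuming the
# raw frame is already available as a 2-D numpy array of sensor values.
import numpy as np

def raw_statistics(raw):
    raw = raw.astype(np.float64)
    median = np.median(raw)
    return {
        "mean": float(np.mean(raw)),
        "median": float(median),
        "std": float(np.std(raw)),
        # Median absolute deviation: a robust spread estimate, less sensitive
        # to hot pixels and stars than the standard deviation.
        "mad": float(np.median(np.abs(raw - median))),
    }
```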
#6
Feature Requests / Re: Dark Frame for astronomy
September 27, 2017, 05:57:16 PM
Quote from: a1ex on August 22, 2017, 01:28:54 AM
Take a look at changesets 51da5cdb8dde and 99be9612028c, and this discussion. They are not included in the current codebase, as I wasn't happy with the increased code complexity; to get started, you can just compile ML from the second changeset.

I suggest moving them to a new module (dark.mo? other name?), and maybe have it average the dark frames as well, see EekoAddRawPath. From the silent picture module, you will need the "basic" full-res capture functionality, but outside LiveView.

One issue to solve: the captured area from a ML DNG doesn't match the CR2 exactly (for some reason, Canon code crops a few pixels from the raw buffer before saving it). This will have to be adjusted for every single camera model. Besides, as this one is not a widely used feature, you can't rely on bug reports, so you'll also have to come up with some testing procedure (ideally automated).

As I'm not really into astrophotography, are you (or anyone else) willing to step up and polish the above patches?

The cropping that you mention was actually addressed in the program PixInsight. I think Canon does this because the pixels at the edge are used for column calibration.  It seems the best option here would be to allow the user to enter the sensor size and pixel size independently, so you do not need to write code for every camera model.  That being said, I am a bit put off by the OP's desired result.  Does he want to do dark frames by having the camera close its shutter, similar to LENR?
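Just to illustrate the math, here is an offline numpy sketch of the dark-frame averaging and subtraction discussed above; in camera, the module would presumably work on raw buffers via EekoAddRawPath instead:

```python
# Offline sketch of dark-frame averaging and subtraction, assuming all frames
# are already decoded to same-sized 2-D numpy arrays. This is only the math,
# not the proposed dark.mo module.
import numpy as np

def master_dark(darks):
    """Average a list of dark frames into a master dark."""
    stack = np.stack([d.astype(np.float64) for d in darks])
    return np.mean(stack, axis=0)  # np.median also works and rejects outliers

def calibrate(light, master):
    """Subtract the master dark from a light frame, clipping at zero."""
    return np.clip(light.astype(np.float64) - master, 0, None)
```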
#7
Feature Requests / [ALREADY DONE] Read Noise E-
September 27, 2017, 05:50:58 PM
I would like to figure out whether Magic Lantern can give me an evaluation of the noise in my images.  Particularly for astrophotography, if I can figure out the read noise value in electrons (e-) for an individual image, I can make a rough estimate of how many subs I will need to shoot for integration, and also how long an exposure I need in order to swamp the read noise.  This would of course need to be a reading from the raw file and not the JPEG preview, and it would need to report the R, G, and B channels independently.  Cutting the computer out of this process would be huge.
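To illustrate what I mean by independent channels, here is an offline sketch that splits a Bayer mosaic into its CFA channels and reports a standard deviation for each; the RGGB layout is an assumption and differs between camera models:

```python
# Sketch of reporting a statistic per CFA channel instead of per frame,
# assuming an RGGB Bayer layout (the actual pattern depends on the camera).
import numpy as np

def per_channel_sigma(raw):
    raw = raw.astype(np.float64)
    channels = {
        "R":  raw[0::2, 0::2],
        "G1": raw[0::2, 1::2],
        "G2": raw[1::2, 0::2],
        "B":  raw[1::2, 1::2],
    }
    return {name: float(np.std(ch)) for name, ch in channels.items()}
```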
#8
Reverse Engineering / Re: ProcessTwoInTwoOutLosslessPath
December 31, 2016, 11:47:14 PM
Will this feature only be available on the 5D Mark III?
#9
General Help Q&A / Re: New raw mode
December 31, 2016, 11:37:56 PM
Sorry, I meant .dng
#10
General Help Q&A / New raw mode
December 31, 2016, 05:03:04 AM
So can the new 14-bit dcraw mode be used on all Canons or just the 5D series? I've seen several articles with conflicting explanations now. I'd love to test this on my 60D, 600D, and 550D. From what I read it sounds like it was only implemented on the 5D3.
#11
I tried stacking the data today and got some strange results with the 50s and 49s darks combined together.  Fortunately I also had a set of staggered 49s and 50s dark frames, so I had almost an equal number of darks to calibrate together and independently.

On my first try I calibrated everything together and scaled the darks, because I had used both 50s and 49s exposures.

light_BINNING_1_integration
Calculating noise standard deviation...
* Channel #0
σR = 9.257e-005, N = 1909520 (10.59%), J = 3
* Channel #1
σG = 8.183e-005, N = 321791 (1.79%), J = 4
* Channel #2
σB = 9.191e-005, N = 338895 (1.88%), J = 4

On the second try I applied calibration frames independently to the 49s and 50s subs.  After applying independent calibration frames I stacked the 49s and 50s frames together and got much lower read noise, but it was a little higher in green?

integration
Calculating noise standard deviation...
* Channel #0
σR = 1.072e-004, N = 212831 (1.18%), J = 4
* Channel #1
σG = 8.856e-005, N = 365346 (2.03%), J = 4
* Channel #2
σB = 1.001e-004, N = 395618 (2.19%), J = 4


A1ex, I will try to figure out how to adjust the script to give consistent subs; thanks for the tip.
#12
General Help Q&A / are my intervalometer settings wrong?
November 07, 2016, 11:20:36 PM
Using my 550D with ML, I am shooting timelapses for astrophotography, let's say 300s ("5 minute") images.  When I get home I always have some data that is one second short: some frames are 299s and the rest are 300s.  Does anyone know why this happens?  It messes me up because I later need to add dark frames, and I end up having to take both 299s and 300s subs.
#13
Feature Requests / Re: USB output commands
September 24, 2016, 11:26:44 PM
I need to briefly explain the workflow here:

Camera > Bulb mode

ML intervalometer > Exposure time

> Delay between exposures (the option to shoot back-to-back or wait a set time) > during this delay a dither command would be sent to move the mount several pixels north, south, east, or west

> after the delay, the next exposure begins (see the sketch below).
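As a sketch of that loop (take_bulb_exposure and send_dither_pulse are purely hypothetical placeholders; whether ML can drive USB output at all is exactly the open question):

```python
# Sketch of the sequence described above. send_dither_pulse() is a purely
# hypothetical placeholder for whatever would nudge the mount a few pixels
# (ST4 pulse, serial command, etc.); it is not an existing ML capability.
import random
import time

def take_bulb_exposure(seconds):
    print(f"exposing for {seconds} s")
    time.sleep(seconds)

def send_dither_pulse(direction, milliseconds):
    print(f"dither {direction} for {milliseconds} ms")  # placeholder only

exposure_s = 300
frames = 20
for _ in range(frames):
    take_bulb_exposure(exposure_s)
    direction = random.choice(["north", "south", "east", "west"])
    send_dither_pulse(direction, 1500)
    time.sleep(10)  # let the mount settle before the next exposure
```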
#14
Feature Requests / USB output commands
September 24, 2016, 11:23:35 PM
Can Magic Lantern allow for USB output commands?  I know a few people who could assist me with the coding end, but I want to check whether it's possible first.  I want to send dithering commands to an equatorial mount between exposures, which would be a benefit for DSLR astrophotography.  It would only benefit a niche audience, so it's probably nightly-build material.

Below is a link to the Lacerta MGEN autoguider.  It is a guide camera with an interface unit that sends intervalometer commands to the DSLR and ST4 commands to the equatorial mount, which allow it to dither, i.e. move several pixels in each direction between exposures.  This lets you calibrate out noise between exposures when doing a median stack in post-processing.  http://www.teleskop-express.de/shop/product_info.php/language/en/info/p4173_Lacerta-MGEN-Stand-Alone-Autoguider-for-astrophotography.html
I want to avoid using a Lacerta MGEN or a laptop to dither. I don't want a guide camera at all, because my EQ mount does not need autoguiding in its current configuration; I just want my DSLR, a lens, and the ability to dither during an intervalometer run.


Here is an adapter that converts USB to ST4:  http://www.highpointscientific.com/zwo-usb-to-st4-adapter-usbst4?utm_source=google&utm_medium=cse&utm_term=ZWO-USBST4&gclid=CJb8t-_1qM8CFdgBgQodLWAMeQ

If I could tell Magic Lantern to send out dither commands, then all I would need is a USB cable.  Some mounts use USB instead of an ST4 port, so either way the stock Canon USB cable works.  Can we send these simple ST4 commands?  This would be huge for astrophotography.  I did not go into more detail on the process because I don't want to spend too much time explaining it if ML cannot send USB commands.  Anyone with some experience, please let me know and I will start contacting some coders.  Thank you.

#15
I tried again this weekend and got it to work. The camera needs to wait the same duration as the exposure before starting the next exposure with LENR on; so for a 120s exposure, you need a 120s pause for LENR to remove the dark frame, and then you immediately begin the next exposure. This worked like a dream, and because I was using a 135mm lens I did not need to autoguide or be tethered to a laptop. I manually dithered every 10 subs and ran LENR at ISO 800: almost no noise shooting the Iris Nebula in mag 21.4 skies. Can't go wrong.
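In other words, the intervalometer spacing has to cover the exposure plus the in-camera dark frame; a tiny sketch of the rule (the 5 s margin is just my own buffer, not a measured value):

```python
# The timing rule described above: with LENR on, the camera spends roughly the
# exposure length again on the internal dark frame, so the intervalometer
# spacing must cover both. The margin is an assumption, not a measured value.
def intervalometer_spacing(exposure_s, margin_s=5):
    return exposure_s * 2 + margin_s

print(intervalometer_spacing(120))  # 245 s between frame starts for 120 s subs
```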
#16
I've been messing with the ML intervalometer and cannot use LENR (long exposure noise reduction) without messing up every other frame. Is there a possible fix for this?
#17
I think I'm gunna try using 300 seconds jitter tonight.
#18
Ok, I tried it. I tried 300 second exposures and every other exposure came out at 65 seconds. Can anyone advise how I could use LENR without this occurring?
#19
General Help Q&A / LENR during intervalometer operation
October 12, 2015, 02:33:31 AM
What happens if I use long exposure noise reduction while using the intervalometer in Magic Lantern? I don't want to throw off the sequence or mess up the camera if I can avoid it. Thanks.
#20
Share Your Photos / Re: Outer Space with the help of ML
December 10, 2013, 07:06:43 AM
Thanks folks, gotta love ole ML
#21
Feature Requests / Re: af motor control and focus peak
November 16, 2013, 02:09:06 AM
Without a CLS CCD filter in the camera it is pretty easy to see stars, but with the filter it is really hard, which is why I suggested the darks: you get cleaner star and Bahtinov results that way.  I was unaware of the stepper feature, though; thanks for the link.
#22
Share Your Photos / Outer Space with the help of ML
November 14, 2013, 08:16:09 AM
I primarily use ML in the field to find stars in live view through a telescope while using a narrowband filter. Narrowband filters cut out a TON of light, so without exposure gain for live view it would be nearly impossible for me to locate stars.  Another feature I use a lot is the intervalometer, for capturing meteors during meteor showers.  Oh, and you cannot live without the dark red overlay setting; it saves your night vision!  ML rules, out under the dark skies.  Some of these shots are through my telescope with the T2i, and others are static on a regular tripod with both the T2i and T3i; the T2i is full spectrum.


veil 3 by LMNO Sunset Deluxe, on Flickr

This one is not complete; I still need to add a lot of frames to reduce the noise.

Untitled by LMNO Sunset Deluxe, on Flickr


MILKYWAY BAHAMAS PANO RAINBOW SHAPE by LMNO Sunset Deluxe, on Flickr


Gemenid Meteor Shower December 2012 by LMNO Sunset Deluxe, on Flickr


fullspectrum milky way by LMNO Sunset Deluxe, on Flickr 
#23
Feature Requests / af motor control and focus peak
November 14, 2013, 07:52:24 AM
AF microadjustment
Not possible to control AF outside LiveView.
I was kind of confused by this and I hope I am not asking a question that overlaps this statement.

I am not sure if anyone here has ever tried Backyard EOS, but it is kind of like EOS Utility, only revamped for astrophotography with Canon DSLRs.  It has a really awesome frame-and-focus mode which uses a graph with a value beneath it that gets closer to one as the image becomes sharper; the graph also shows a giant peak which gets sharper as the image does.  In addition, Backyard EOS allows control of the AF motor similar to how EOS Utility works: there are three speeds (full, medium, and low), and instead of autofocusing, you push the arrows and the AF motor inside an AF lens responds to the speed-control commands.  This would be very useful for DSLR users on an Astrotrac, Polarie, or SkyTracker, focusing in remote locations where a laptop is not convenient to bring along. The main purpose would be to assist with star focusing; it would not work as well with a wide lens as with a telephoto.

One last idea, which I believe to be far-fetched, but what the hell, might as well ask: has anyone ever investigated dark frame removal from the noisy live view image generated in low-light or high-ISO conditions?  I can do it on my computer with PHD and an Orion StarShoot CCD camera by covering the camera, letting it take 10 dark exposures, and then uncovering it.  The software subtracts the dark frames, the signal-to-noise ratio increases dramatically, and both the stars and the black background become much clearer due to the removal of noise from the image.  Like I said, it's out there, but if the camera can do it, that would be great for everyone.

While I am thinking about it, another good use for full control of the AF motor with a variable speed selection would be while using a teleconverter on a telephoto.  This usually prevents autofocus from working because the light is so low, but manual control of the AF motor would let the user decide via live view when the image is in focus before taking a picture.  Use the left and right arrows to control focus in and out, and map up/down to switch between the three speeds.
#24
Ok, you are right, sorry about that, I got things a little backwards: http://backyard.8m.net/startrail.html  I was looking at that diagram and thinking exposure time, not arcseconds.  So exposing at 0 degrees declination gives you the least exposure time before star trails.  This stuff gets confusing sometimes.
#25
As you increase declination away from 0 degrees, your maximum exposure time increases.

The "get your stars right " link is using what is called "the rule of 600" to solve for maximum exposure possible based on lens focal length.  The rule of 600 however was designed for 35mm sensors, if you are using APS-C then you need to divide 600 by 1.6 to get 375.  For crop sensor bodies you use the rule of 375.  To find your exposure time divide 375 by your focal lenght, for instance 375 seconds / 11 mm = 34.09 seconds.  This is the maximum amount of time you can expose stars before getting star trails.  I believe that the calculation on the link will give you 31 seconds at 11mm because they used the actual value of the aps-c sensor which is a little less than 1.6x smaller than a full frame 35mm sensor.  If you are in the northern hemisphere, USA, then North is going to be 0 degrees declination, the north star is actually at 89 degrees so dont rely on polaris to find zero degrees declination , use a compass instead. 

The farther you move away from Polaris, the tighter your stars will be.  I am not 100% sure, but I think as you aim towards the south the trails will come back, so east and west are going to allow you to attain maximum exposure times with a fixed tripod.

If you look at this guys work it is easy to see that the trails get bigger the farther away you get from polaris http://www.lincolnharrison.com/startrails/