Messages - SpcCb

#1
Quote from: 50mm1200s on July 19, 2018, 10:57:33 AM
Found it! They call it "Bayer drizzle". The implementation seems to be from Dave Coffin (from dcraw). The software DeepSkyStacker uses it.
Would be nice to have it on HDRMerge. I'll see if Beep6581 have more information on this.
The "Drizzle" algorithm was originally invented by Andrew Fruchter and Richard Hook for images made by the Hubble Space Telescope. I worked on it for a couple of years during my studies; it's an amazing algorithm. Several astronomy programs use it; the first on PCs was Iris, if I'm not wrong, a decade or more ago.

There are some papers on Drizzle; take a look in the Harvard library for the source -> http://adsabs.harvard.edu/abs/2002PASP..114..144F
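For the curious, the core idea of Drizzle (variable-pixel linear reconstruction) can be sketched in a few lines. This is a simplified illustration only, not the dcraw/DeepSkyStacker implementation; the function name and parameters are my own.

```python
import numpy as np

def drizzle_frame(img, dx, dy, out_shape, scale=2.0, pixfrac=0.5):
    """Accumulate one shifted frame onto a finer output grid.

    img       : 2-D input frame
    dx, dy    : sub-pixel shift of this frame (input-pixel units)
    out_shape : shape of the output grid (ny, nx)
    scale     : output pixels per input pixel
    pixfrac   : linear size of the shrunken 'drop' (0 < pixfrac <= 1)
    Returns (flux, weight) arrays, to be summed over all frames.
    """
    flux = np.zeros(out_shape)
    weight = np.zeros(out_shape)
    half = pixfrac / 2.0
    ny_in, nx_in = img.shape
    for iy in range(ny_in):
        for ix in range(nx_in):
            # drop footprint in output-grid coordinates
            x0 = (ix + 0.5 - half + dx) * scale
            x1 = (ix + 0.5 + half + dx) * scale
            y0 = (iy + 0.5 - half + dy) * scale
            y1 = (iy + 0.5 + half + dy) * scale
            for oy in range(int(np.floor(y0)), int(np.ceil(y1))):
                for ox in range(int(np.floor(x0)), int(np.ceil(x1))):
                    if 0 <= oy < out_shape[0] and 0 <= ox < out_shape[1]:
                        # overlap area between the drop and this output pixel
                        w = (min(x1, ox + 1) - max(x0, ox)) * \
                            (min(y1, oy + 1) - max(y0, oy))
                        if w > 0:
                            flux[oy, ox] += img[iy, ix] * w
                            weight[oy, ox] += w
    return flux, weight
```

Summing flux and weight over many sub-pixel-shifted frames, then dividing flux by weight, gives the drizzled image; the shrunken drops are what recover resolution beyond the input sampling.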
#2
General Help Q&A / Re: Dangerous sun (in Dual-ISO)
April 19, 2018, 10:37:40 PM
It's important to say it again: never shoot the Sun directly.
Even with UV, IR, ND, polarizing, welding or similar filters. None of them can block the extremely strong and invisible parts of the Sun's light spectrum.
Even if the mirror is down: lens and/or body parts can get very hot, and can brick or melt.
Even if the focal length is short, it depends on the diameter of the lens (it should be <5 mm [a pinhole, in fact, leaving diffraction aside] to be no 'more dangerous' than your naked eyes).
And most important: a fraction of a second of inattention, and you can get severe and permanent eye damage if you cross the light beam.

Every year I see medical reports of hundreds of injured people.
Note that 2/5 of the cases are people who suffer from chronic headaches but don't remember having looked at the Sun through a camera. It's hard to discover that you will have chronic headaches for the rest of your life... Think about that. And consider that it could be even more dramatic.

Don't play this game with your gear, or with your eyes!

***
If you want to take images of the Sun, contact an astronomy club near you. They're usually very open to visitors, and you can see what kind of special filters they use to make it safe (to your eyes, absolutely no light seems to pass through them).
#3
Quote from: Walter Schulz on July 30, 2017, 11:19:14 AM
Canon's hotpixel remap function is known to have issues. Not working sometimes. You tried several times?
Indeed, if there's a macro group of several hot pixels (more than 1), the remapping function doesn't recognize them as hot pixels, for example. However, it seems adrjork works with RAW video, and software usually handles hot pixels well, so it shouldn't be a major issue. IMHO the real problem will be dealing with the random thermal noise, especially at high ISO.
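As an illustration of how post software can deal with isolated hot pixels, here is a simplified sketch in Python. On real Bayer raw data the neighbours should be same-colour pixels (stride 2), and the threshold is camera-dependent; both are my own simplifications.

```python
import numpy as np

def remove_hot_pixels(frame, threshold=200.0):
    """Replace isolated hot pixels with the median of their 8 neighbours."""
    f = frame.astype(float)
    p = np.pad(f, 1, mode='edge')
    h, w = f.shape
    # the 8 neighbour views of every pixel
    shifts = [p[dy:dy + h, dx:dx + w]
              for dy in (0, 1, 2) for dx in (0, 1, 2)
              if (dy, dx) != (1, 1)]
    med = np.median(np.stack(shifts), axis=0)
    hot = f - med > threshold   # far above the local neighbourhood
    out = f.copy()
    out[hot] = med[hot]
    return out, hot
```

Note that a tight cluster of hot pixels defeats this, exactly as described above: the median of the neighbourhood is then itself contaminated.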
#4
Nice! I think there will be a couple of ML cameras at this solar eclipse ;)

If you plan to sync sequences in time and use live view, take care not to burn your sensor. Especially if the camera is at the prime focus of a telescope: within seconds you can reach several hundred °C at the focal plane.
And it is very hard to calculate the precise times of C* (to one-second accuracy), even if you know your exact location.
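To give an idea of why prime focus is so dangerous, here is a back-of-the-envelope estimate. The irradiance and angular diameter values are my own rough numbers, and optical losses are ignored.

```python
# back-of-envelope: how concentrated is sunlight at prime focus?
SOLAR_IRRADIANCE = 1000.0   # W/m^2 at ground level (clear sky, rough)
SUN_ANGULAR_DIAM = 0.00925  # radians (~0.53 degrees)

def focal_plane_concentration(focal_ratio):
    # solar image diameter ~= focal_length * SUN_ANGULAR_DIAM, so the
    # concentration factor depends only on the focal ratio f/D
    return (1.0 / (SUN_ANGULAR_DIAM * focal_ratio)) ** 2

for fr in (4, 8, 16):
    c = focal_plane_concentration(fr)
    print(f"f/{fr}: ~{c:.0f}x -> ~{c * SOLAR_IRRADIANCE / 1000:.0f} kW/m^2")
```

Even at a slow focal ratio the power density at the focal plane is hundreds of times ordinary sunlight, which is why the sensor heats up in seconds.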
#5
@adrjork > Note that the temperature indication on screen is not the sensor temperature; it's taken from the motherboard in the camera body. So a high temperature number on screen, or an orange or red alert, doesn't mean you are frying your sensor.

For the past couple of years I've done hardcore temperature experiments with the 5D2 & 5D3 for scientific applications: taking long exposures up to 3 h at high ISO (far worse for the sensor than filming), reaching up to 80 °C on the sensor (temperature recorded by a probe on the sensor, not the on-screen figure), and also cooling sensors down to -65 °C (most of them pop off below -48 °C), etc.

What I can say is that you will probably fry the motherboard _or the battery, with explosion risks_ before the sensor if you use your camera for a long time with live view under very hot local conditions (I'm in France, so I know the weather can be hot in Italy), with the Sun directly on it, etc. The sensor is stronger than some of the other electronic parts in the body.

By the way, it's possible to 'mark' the sensor and thus get more hot pixels or noise after running a long time under very hot local conditions, and also a camera more sensitive to thermal noise. A kind of sensor wear, if you see what I mean. But as someone said, you can easily remove a couple of hot pixels, by remapping or in post if you film in RAW.
If you want to see whether your sensor is prematurely worn, put the camera in a fridge for half an hour at 5-10 °C (vegetable drawer), then take a dark frame @ ISO 2500 with a 1/50 s exposure; you will see whether it's clean when the camera is cold. If it's really a problem you can ask Canon to replace the sensor, but don't tell them you used the camera under very hot conditions, because the manual says not to do it.
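The fridge test can be made quantitative. Here is a sketch of how one might count outliers in such a cold dark frame; the 5-sigma threshold is my own arbitrary choice.

```python
import numpy as np

def dark_frame_stats(dark, k_sigma=5.0):
    """Summarize a dark frame: robust noise level and hot-pixel count."""
    d = dark.astype(float)
    med = np.median(d)
    # robust sigma estimate from the median absolute deviation
    sigma = 1.4826 * np.median(np.abs(d - med))
    hot = d > med + k_sigma * sigma
    return {'median': med, 'sigma': sigma, 'hot_count': int(hot.sum())}
```

Comparing the hot-pixel count of a cold dark frame taken when the camera was new against one taken later gives a rough measure of this sensor wear.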
#6
Quote from: a1ex on March 23, 2017, 02:20:37 AM
Binning is simply blurring followed by downsampling (reducing the resolution by dropping pixels).

Example on how 5D3 does 3x3 binning to get 1080p Bayer raw from a full-res Bayer matrix:

function im = bin_pixels_3x3(im)
    f = 1/9 * ...
    [
        1 0 1 0 1;
        0 0 0 0 0;
        1 0 1 0 1;
        0 0 0 0 0;
        1 0 1 0 1;
    ];
    imf = imfilter(im, f);
    im = imf(1:3:end, 1:3:end);
end


First step is a 3x3 blur (averaging) on each Bayer channel, second step is keeping the central pixel from each 3x3 block (after filtering).

Also FYI, on Canon DSLRs, pixel binning is done by hardware (analog electronics), not firmware.

Reference: http://magiclantern.fm/forum/index.php?topic=16516.0
Sorry, I usually keep the two methods separate in my work, because averaging and binning are not the same thing to me.
Short explanation -> http://harvestimaging.com/blog/?p=1560
Since you spoke of binning in some cameras, I thought it was real binning _I mean pixel-signal addition_ (was there a sqrt(pixel count) noise reduction when you ran tests on that last year?), but if you say it's averaging, then it's not binning. Or it is not clear.
It makes me wonder what does the averaging... ??? Maybe it would be interesting to see the source code of the firmware function? Or the source of the ASIC/FPGA that does it?
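To illustrate the distinction, here is a toy Monte-Carlo sketch with made-up sensor numbers (not a model of any Canon ASIC): true charge binning pays the read noise once per binned block, while digital averaging pays it once per pixel.

```python
import numpy as np

rng = np.random.default_rng(1)
N_TRIALS, N_PIX = 100_000, 9        # one 3x3 block per trial
SIGNAL, READ_NOISE = 50.0, 10.0     # e- per pixel, e- RMS per read

# shot noise per pixel (Poisson), identical in both cases
photons = rng.poisson(SIGNAL, (N_TRIALS, N_PIX)).astype(float)

# (a) true charge binning: pixels summed before readout,
#     so read noise is added only once, to the summed charge
binned = photons.sum(axis=1) + rng.normal(0, READ_NOISE, N_TRIALS)

# (b) digital averaging: each pixel is read (with its own read noise)
#     and the digital values are averaged afterwards
reads = photons + rng.normal(0, READ_NOISE, (N_TRIALS, N_PIX))
averaged = reads.mean(axis=1)

snr_bin = binned.mean() / binned.std()
snr_avg = averaged.mean() / averaged.std()
print(f"SNR charge binning:    {snr_bin:.1f}")
print(f"SNR digital averaging: {snr_avg:.1f}")
```

With these numbers the binned SNR comes out clearly higher, which is exactly the level/SNR gain that pure blurring or averaging cannot reproduce when read noise matters.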
#7
Quote from: dmilligan on March 23, 2017, 01:51:57 AM
Yes. In post, simply apply a blur to your image. You will reduce noise at the expense of resolution (that's basically what pixel binning is).
(...)
I don't quite agree with that :)
Blurring pixels is far from binning pixels: for example, you lose the level gain, and thus the SNR gain at the same level.
#8
Hello ΓΝ,

It's already done internally by the stock firmware on some Canon cameras (like the 5D3). Search for << pixel binning >>.
You can also do it in post-processing; several programs can do it, even Photoshop (Filter > Other > Custom, using a convolution matrix).
Best results will come from a RAW source image, of course (because of debayering).

Have fun ;)


Edit: Harf! dmilligan was first :D
#9
Shoot Preparation / Re: Milkyway
March 10, 2017, 07:00:46 PM
Quote from: jackyes on March 10, 2017, 08:37:47 AM
My mistake, at 200mm (as some user review) the skytracker can't go longer than 1 min  :-[
Hmm... If you use a 200 mm with a big aperture (200 f/2.8?) plus a camera body, I think it will be a lot for the device in terms of weight/balance. It's a supposition, it needs to be tested, but I'm very skeptical. And if you use a lightweight 200 mm (plastic body, 200 f/4.5-5.6?), with 60 s exposures you will not see much in the images. :/ Sorry to say it, but forget using this star tracker with a focal length over 80 mm; you will be disappointed. As I said, see what you can get with a nice 50 mm, and 35/24/14 mm etc., on the Milky Way or wide fields; the iOptron is made for that. ;)

Quote from: jackyes on March 10, 2017, 08:37:47 AMBut i'm mainly interested in Wide field space, but will try some Deep space (with the limitation of the skytracker...).
Lightweight is a must for me. I plan to walk for hours before camping and start shooting ;)

I think i will buy the iOptron  :-\
OK, I see; I did the same ;) So my 2¢ advice would be to take a look at Li-ion AA batteries, plus a flexible solar panel with a charging module on your backpack, to charge the batteries during daytime hikes.
#10
Shoot Preparation / Re: Milkyway
March 10, 2017, 06:44:35 AM
Quote from: Levas on March 09, 2017, 09:33:14 AM
@SpcCb
Wow, great pic of the milky way :)

9 exposures of 300 seconds  :o

What is the mod on your 5d2, did you remove UV/IR filters in it ?
Thanks Levas ;)
Yes, 9×300 s; a moderate (number of frames)/(exposure time) ratio; I have already done more on the Milky Way :)
This 5D2 is not full spectrum; there's an Astrodon Inside filter in place of the Canon IR-cut filter, which gives 400-700 nm @ 99% transmission on a fluorite substrate. Very nice product.


Quote from: jackyes on March 09, 2017, 03:01:42 PM
@SpcCb
I will buy an Ioptron SkyTracker V2, it can handle 300mm @ 150 sec well (i'm mainly interessed in astrolandscape, i don't need such precision :P ), can be powered by 4 AA battery (~24h@20°) and is LIGHTWEIGHT (i love to go on the mountains). And it is cheap ~300€.
Do you have some hint on a good Skytracker?
You know, 150 s @ 300 mm with this kind of device is 'marketing'. I think you could use up to an 80 mm; beyond that you will see star trails, because the device has ~40″ of periodic error in tracking. Under 50 mm it will be easier, especially at the beginning. But <=80 mm will be more than enough for the Milky Way; look at my previous link, which was shot at 50 mm.

For the power source, AA batteries will maybe last a couple of hours, but that could get expensive (?). Another solution would be an external source (9-12 V, 500 mA), for example from a car lighter socket if your car is close by.

For the same budget there's the Sky-Watcher Star Adventurer travel mount: twice as heavy, but sturdier and more precise. Plus you could add an auto-guiding system later. It depends whether you want a very light device or not; for 300 € and 0.5 kg, the iOptron is the one on the market, I think.
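To see why ~40″ of periodic error rules out long focal lengths, here is a quick estimate. The 6.4 µm pixel pitch is my own assumption, roughly a 5D2 photosite; adjust for your sensor.

```python
import math

def trail_px(focal_mm, error_arcsec, pixel_um=6.4):
    """Image-plane drift in pixels caused by a given tracking error."""
    error_rad = math.radians(error_arcsec / 3600.0)
    return focal_mm * 1000.0 * error_rad / pixel_um

for f_mm in (50, 80, 300):
    print(f"{f_mm} mm: ~{trail_px(f_mm, 40):.1f} px of trailing")
```

At 50 mm the full periodic error stays within a couple of pixels, while at 300 mm it smears stars across many pixels, which matches the advice above.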
#11
Shoot Preparation / Re: Milkyway
March 09, 2017, 03:35:50 AM
Quote from: jackyes on March 08, 2017, 11:00:39 PM(...)
How you chose the correct exposure with a tracker? Do you have a rule of thumb for a beginner? ;)
There are some rules; however, the main issue will be the sky darkness if you have a tracking device, as I said. Even with special filters to reduce light pollution.
If you live in a big city, there's nothing to be done. In the countryside, it's limited. You have to go to a dark, wild place to fully exploit the tracking capabilities.

To begin, learn how the device works (you have to make a precise alignment, balance all the gear well, find a suitable mobile power source, etc.), then try short exposures wide open at a relatively high ISO (with a digital camera it's easy to get results in a couple of seconds @ ISO 1600-3200), and if all is good, increase the exposure time. If something goes wrong, go back to the beginning and try to figure out what. That way you make progress, step by step.
Besides, if there's an astronomy club in your area, it can be a big help to meet and talk with people passionate about this.
#12
Shoot Preparation / Re: Milkyway
March 08, 2017, 07:12:22 PM
Quote from: jackyes on March 08, 2017, 02:49:32 PMSo anyone has some experience with (A)ETTR, skytracker and milkyway?
Maybe only with (A)ETTR and milkyway?
IMHO, ETTR is useless for this kind of subject. How would the ETTR algorithm know where the right exposure is, on a scene with _in all probability saturated_ stars and very low levels over 90% of the surface?
It depends on what you are looking for, but usually we try to get the maximum signal, so it mainly depends on the sky quality (background level) if you are not limited by the tracking. We can say we are 'sky limited' in this case.
Of course, if you use a simple tripod and don't want trailing, you will be 'tripod limited' instead, unless the sky is very bright.
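The 'sky limited' condition can be put into numbers with a common rule of thumb; the factor k and the example values below are my own illustrative choices. A frame is roughly sky limited once the sky background contributes several times the read-noise variance.

```python
def min_sky_limited_exposure(read_noise_e, sky_rate_e_per_s, k=10.0):
    """Exposure time after which sky shot noise dominates read noise.

    Rule of thumb: the frame is roughly 'sky limited' once the sky
    background contributes k times the read-noise variance,
    i.e. t >= k * RN^2 / sky_rate.
    """
    return k * read_noise_e ** 2 / sky_rate_e_per_s

# e.g. 3 e- read noise, dark-site sky of 0.5 e-/s per pixel
print(min_sky_limited_exposure(3.0, 0.5))   # -> 180.0 s
```

Under a bright suburban sky the sky rate is far higher, so the sky-limited time drops to a few seconds and longer exposures buy you nothing but a brighter background.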

Edit: Here is an example of Milky Way photograph with tracking under a very _very_ good sky -> https://www.instagram.com/p/BQqGnkgj_su/
(5D2mod+ML - 24-70f/2.8L@50f/4 - no filters - sideral tracking - 9*300s - isoless)
#13
Camera-specific Development / Re: Canon 550D / T2i
March 07, 2017, 02:48:20 AM
Yes a1ex, I remember seeing this last year; indeed it would be interesting to see dual-ISO 'rendered' in playback. :)
In isoless_playback_fix I could not see where the problem comes from after a couple of days on it, which is why I put the 550D in the exit condition of the function. :/ I used this shortcut because I still don't have much free time to dive deeply into the code and run long experiments, and I saw that the 1100D is in the same case. Also, for the moment my eyes are on minihizostuff, to make it work on the 550D with the latest builds (maybe I will disturb you later ^.^'); after that I will see what I can do that's more 'unified'. Everything in its time, when you don't have a lot of time. :)
#14
Camera-specific Development / Re: Canon 550D / T2i
March 06, 2017, 04:59:51 AM
After some experiments on the 550D, it looks like the function isoless_playback_fix is not working well on the latest builds (and maybe on older ones; I have only used builds since February): the screen becomes aliased (blurry) when 'bright' or 'dark' appears during dual-ISO image review. That makes it hard to see anything on screen ^.^'
Based on the 7D & 1100D special conditions, I tested with:
if (is_7d || is_1100d || is_550d)
(line #437 in dual_iso.c)
It's a bit radical (it bypasses the function): no more 'bright' or 'dark' indications, but no more screen alteration either. Maybe it should stay like that until the function is properly fixed?
#15
I have already encountered similar issues while filming with the process "zoom to focus >> un-zoom to film", when I start recording too soon after un-zooming, or when I start recording just after a previous recording stopped (i.e. while the LCD screen is refreshing and the LED is blinking). Sometimes I got corrupted frames, but in most cases the audio was 'out of time' (worse than desync, and impossible to resync/reconstruct; it's as if the audio file were fragmented and placed in random order). I haven't tried all builds, but it happens with builds from 2014-02-16, 2016-10-09, and 2017-01-27, with regular RAW and MLV, so maybe the camera CPU is just not fast enough, or it's a question of buffering (?), and we have to wait a few seconds between certain actions. Since I learned that, I wait a couple of seconds before every recording start and all works fine.
#16
Aeidxst, what is your preferred language? Maybe someone could help you.
#17
Quote from: a1ex on February 11, 2017, 05:42:13 PM
Best guess (as I don't use these tools):

When you place the dark frame in a Photoshop layer, the image data is no longer linear; also, values below 0 are clipped. Therefore, a simple subtraction will probably not do the trick.

Subtracting the dark frame before debayering should do the trick (regardless of what tool you use to do that).
I agree; it's the optimal and simplest way.

However, after debayering, _theoretically_, if there's no clipping or scaling of the dynamic range, and if the dark frames and light frames get the same processing (log, etc.), it should _relatively_ match.
In all cases, if you make a *master dark frame* (the average of several frames), it will work better. And maybe a small Gaussian smoothing could help too, to correlate with the Gaussian distribution of the light frames.
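A minimal sketch of the master-dark approach on raw (pre-debayer) data, assuming the frames arrive as numpy arrays:

```python
import numpy as np

def master_dark(darks):
    """Average a stack of raw dark frames into a master dark."""
    return np.mean(np.stack([d.astype(float) for d in darks]), axis=0)

def calibrate(light, master):
    """Subtract the master dark from a raw light frame, before debayering.

    Negative values are clipped at 0, since raw data is unsigned; a real
    pipeline would instead work relative to the black level.
    """
    return np.clip(light.astype(float) - master, 0, None)
```

Averaging N dark frames reduces the random noise of the master by roughly sqrt(N), so the subtraction adds much less noise to the calibrated lights than a single dark would.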
#18
Quote from: dfort on February 08, 2017, 05:28:39 AM(...)
So all you're looking for is that one hexadecimal number to unlock this feature. The question is how do you find it? Read the code, it tells you. Now the hard part is following those instructions and come up with the right address.
(...)
I'm a bad violin player too, but if the goal is to find a hex number, would it be possible to use a sort of brute-force method, scanning all numbers for a match (on the size of the expected information?) to find the right one? I say that just in case; I imagine the high-level devs have already thought about it, or maybe there's an issue with this method.
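The easy half of that brute-force idea can indeed be sketched simply: scanning a firmware dump for a known byte pattern. Knowing *what* pattern to look for is the hard part the devs solve by reading the code; the function name here is mine.

```python
def scan_for_pattern(dump: bytes, pattern: bytes):
    """Return every offset at which `pattern` occurs in a firmware dump."""
    hits, i = [], dump.find(pattern)
    while i != -1:
        hits.append(i)
        i = dump.find(pattern, i + 1)  # continue past this hit (overlaps too)
    return hits
```

The catch is that a short pattern matches in thousands of places, so without knowing the expected size and context of the information, brute force gives far too many candidates to test safely on real hardware.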

@reddeercity > The problem here, like in many other parts of ML as you know, is that this kind of hack is so complex that only a couple of people can figure out how to do it. For other violin players it would take years to understand. Although I have 20 years of coding in 5 languages, my eyes blink in front of some parts of ML ^.^
#19
General Help Q&A / Re: err 70, 5d mark ii
February 01, 2017, 01:45:06 AM
Heh heh! :D Trust me, it was not on purpose. I'm still shaking just thinking of it. ISO research and other ADTG stuff made me crazy sometimes, when I realized I was doing it on a €2k+ production camera ^.^' But I'm cured now. *v*'

Thanks for the LCD remanence explanation; I did not know the cause, only that it happened after ugly loop freezes and forced shutdowns. BTW, it does not sound harmless.
#20
General Help Q&A / Re: err 70, 5d mark ii
January 31, 2017, 11:41:12 PM
Quote from: johnwe on January 31, 2017, 02:49:45 AM
I sent the camera to Canon, they said it needed a new PCB.

I am curious to hear if this could have anything to do with ML?

I looked at the pins to see if any were bent or missing. They all look good. I could be wrong about that.

Should I avoid ML on this camera now? I'm getting it repaired.

In some cases (unexpected software loops?), the main board can reach a high temperature, especially if you don't remove the battery quickly. So yes, maybe it is possible to fry the main board.
I have seen it happen: the camera body becomes very hot and the LCD keeps a remanent image of the last thing on screen. It's freaky.
But it's very, very rare (perhaps unheard of?) with public builds; it happens more when you try very new 'unstable' functions or advanced hacking tools, or maybe when you don't follow some procedure. In more than 5 years of using ML on cameras, I have never fried anything or had physical damage.
So, should you avoid ML on your camera now... It's your choice; if you are not comfortable with the idea, maybe you should. There is no such thing as zero risk.
#21
Camera-specific Development / Re: Canon 550D / T2i
January 31, 2017, 05:47:26 AM
For info, and for people who frequently ask about the best SD card choice: I just bought a 550D for a few bucks this week and was confronted with the jungle of SDHC cards.
After a quick tour, paying attention to write speeds rather than the stratospheric marketing read speeds, I chose the new SanDisk SDHC Extreme 90 MB/s series, apparently the cheapest UHS-I SD card that can sustain more than 25 MB/s of writing. I already have an 'Ultra' version but it's stuck at 15 MB/s (even the new 'Ultra' 80 MB/s series), and the 'Extreme Pro' series is expensive for a camera like this, where its 90 MB/s write speed is useless.
Actually this new 'Extreme' [regular] version can write up to 40 MB/s, but I'm not sure that can be sustained, so it could fall short for XXD cameras with a faster SD interface.
You can find this card (32 GB version) for $15 in web stores.
But I'm not working for SanDisk (^.^), so maybe there are equivalents from other makers... (even if I have not found any so far)
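For context, the sustained write speed needed for uncompressed raw video is easy to estimate from resolution, bit depth and frame rate; the example resolution below is illustrative, since real ML frame sizes vary by camera and mode.

```python
def raw_datarate_mb_s(width, height, fps, bits=14):
    """Continuous write speed (MB/s) for uncompressed Bayer raw video."""
    return width * height * bits * fps / 8 / 1e6

# e.g. a modest 960x540 crop at 24 fps
print(f"{raw_datarate_mb_s(960, 540, 24):.1f} MB/s")   # ~21.8 MB/s
```

This is why a card that sustains 25 MB/s is the practical floor for raw on an SD-based camera: marketing read speeds tell you nothing about it.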
#22
Thanks reddeercity, nice presentation ;)

So it works only in crop mode on the 5D2? Or are you using this mode because of aliasing in scaled mode?
OK, I saw in the other thread that you said there are some problems in full mode.
#23
Nice, xaint!

Maybe it would be interesting to share the sources, to make versions for other camera models, to test it, and also maybe to help with development.

For the FWHM algorithm, it's hard to get something relevant from the live-view feed because of the debayering and/or scaling (I tried without good results, and with a certain lack of knowledge about getting the feed at 1:1 _in crop mode_ resolution). I'm interested in how you manage that :)
IMHO it's the most interesting feature, because it runs on the imaging device and could be linked to a Robofocus to correct focus between LP frames, for example.
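For reference, here is a crude FWHM estimate on a small star patch. It's only a sketch: it assumes the background is roughly flat and the star is the brightest thing in the patch, which is exactly what a debayered/scaled live-view feed makes unreliable.

```python
import numpy as np

def star_fwhm(img):
    """Rough FWHM (pixels) of the brightest star in a small patch.

    Counts pixels above half the peak and treats them as a disc;
    crude, but robust enough for a relative focus metric.
    """
    patch = img - np.median(img)          # crude background removal
    peak = patch.max()
    above = (patch >= peak / 2.0).sum()   # area above half-maximum
    return 2.0 * np.sqrt(above / np.pi)   # diameter of the equivalent disc
```

For focusing you only need the FWHM to move in the right direction as focus changes, so even a biased estimate like this can drive a motorized focuser.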
#24
Reverse Engineering / Re: EF Lens communication protocol
November 28, 2016, 06:51:24 AM
Excellent, JP!
Maybe it will help us understand the differences between the 5D2 and 5D3 protocols.
Your website is also full of interesting stuff, like the battery mods etc.
Are you in the '79'?
#25
Feature Requests / Re: Panorama module?
October 25, 2016, 05:25:45 AM
I think it requires some geometric transformations of each image before stitching. Maybe it's a bit too heavy for DIGICs (?).
Matthew Brown and David G. Lowe of the University of British Columbia, Canada, wrote a good paper about this (PDF).

Another approach would be to use a lens-correction model for each lens (for several focal lengths in the case of a zoom), like PTLens does, and 'just' align individual frames before stitching, but I'm not sure that is a better solution.
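As an illustration of the kind of geometric transformation involved, here is a nearest-neighbour cylindrical projection sketch. Real stitchers like the Brown & Lowe method estimate homographies and blend seams, which is far heavier; this toy version only shows why per-pixel warping is costly for an in-camera processor.

```python
import numpy as np

def cylindrical_warp(img, f):
    """Project an image onto a cylinder of focal length f (in pixels).

    This is the warp typically applied before simple 1-D panorama
    stitching; nearest-neighbour sampling, no interpolation.
    """
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    yc, xc = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    theta = (xs - xc) / f                 # cylinder angle per output column
    hgt = (ys - yc) / f
    # inverse mapping: where each output pixel comes from in the source
    x_src = np.round(f * np.tan(theta) + xc).astype(int)
    y_src = np.round(f * hgt / np.cos(theta) + yc).astype(int)
    ok = (0 <= x_src) & (x_src < w) & (0 <= y_src) & (y_src < h)
    out[ok] = img[y_src[ok], x_src[ok]]
    return out
```

Even this minimal warp touches every pixel with trigonometric functions, which gives a feel for why doing it on a DIGIC, per frame, would be demanding.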