Messages - vroem

#26
Tragic Lantern / Re: Raw video on 50d and 40d
July 01, 2013, 11:15:40 PM
Quote from: CFP on June 30, 2013, 02:02:06 PM
To answer my own question: Forget that.

I just understood that it is impossible because of the sensor's bayer pattern. It would result in a monochrome picture. The only useful skipping values are 3 or 1 (no skipping). Looks like there is no way to increase the 1X mode resolution of any Canon DSLR. The only way I can imagine would be to combine 4 pixels into one cluster and skip clusters instead of pixels. But I doubt that this is possible and it might affect the image quality. Time to stop dreaming and be happy with what we have I guess ;D
In fact at least my 60D has another frame skipping algorithm, activated in 5x zoom mode: it skips 2 lines and keeps 3 consecutive lines. But apparently it gives bad aliasing. In general, for line skipping to work, the rule is that the number of consecutively skipped lines must be a multiple of 2. You can then keep any number of lines.
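The skip-a-multiple-of-2 rule above can be checked with a small sketch (a hypothetical illustration, not ML code): a Bayer sensor alternates RG rows and GB rows, and the kept rows only form a valid mosaic if their parities keep alternating.

```python
# Hypothetical sketch: row parity 0 = RG row, 1 = GB row.
# Skipping an even number of rows between kept groups preserves the
# alternating pattern; an odd skip breaks it.
def kept_row_parities(n_rows, keep, skip):
    """Parity of each sensor row we keep, when we repeatedly
    keep `keep` rows and then skip `skip` rows."""
    parities = []
    r = 0
    while r < n_rows:
        for i in range(keep):
            if r + i < n_rows:
                parities.append((r + i) % 2)
        r += keep + skip
    return parities

def alternates(p):
    return all(a != b for a, b in zip(p, p[1:]))

# 60D 5x zoom: skip 2, keep 3 -> still a valid alternating pattern.
print(alternates(kept_row_parities(20, keep=3, skip=2)))  # True
# Skipping 1 (odd) breaks the RG/GB alternation.
print(alternates(kept_row_parities(20, keep=3, skip=1)))  # False
```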
#27
General Development / Re: 4k Filming
June 30, 2013, 11:34:28 AM
I'm not a dev but here is how Canon skips lines:

It keeps one out of every 3 pixels in every direction.
Some zoom levels have other skipping techniques, but they are even worse.
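The 1-out-of-3 skipping described above can be sketched in a few lines (assuming an RGGB layout; color is encoded as a number just for illustration). Keeping every third row and column leaves the 2x2 color pattern intact:

```python
import numpy as np

def cfa_pattern(mosaic):
    """Top-left 2x2 color layout (0=R, 1=G, 2=B)."""
    return mosaic[:2, :2].tolist()

# Build a 12x12 map of which color each photosite sees (RGGB).
colors = np.empty((12, 12), dtype=int)
colors[0::2, 0::2] = 0   # red sites
colors[0::2, 1::2] = 1   # green sites on red rows
colors[1::2, 0::2] = 1   # green sites on blue rows
colors[1::2, 1::2] = 2   # blue sites

skipped = colors[::3, ::3]       # keep 1 of every 3 in each direction
print(cfa_pattern(colors))       # [[0, 1], [1, 2]]
print(cfa_pattern(skipped))      # [[0, 1], [1, 2]]  -- still RGGB
```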

Heh time for an avatar :-)
#28
Raw Video / Re: 5D3 Raw 4:2:2 or 4:4:4?
June 29, 2013, 09:52:24 AM
Of course. It applies to all raw.
#29
Raw Video / Re: 5D3 Raw 4:2:2 or 4:4:4?
June 29, 2013, 02:54:36 AM
Quote from: Videop on June 29, 2013, 01:34:19 AM
Ok, I think we are close now.. So a 22 Megapixel sensor like in the 5D3 means 22 million photosites, not 22 million groups of four (two green, one blue, one red) photosites or actual photo sensing elements?
No, every photosite generates one pixel. This is true for all digital imaging sensors.
So the 5D3 has 22M photosites = 22M pixels = 11M green pixels + 5.5M red pixels + 5.5M blue pixels.
Before demosaicing each pixel has only a 14-bit brightness value; demosaicing reconstructs the full RGB color for every pixel by interpolation.
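The pixel bookkeeping above is just the RGGB ratio applied to the total count:

```python
# For every 4 pixels in an RGGB mosaic: 2 green, 1 red, 1 blue.
total_px = 22_000_000            # photosites = pixels
green = total_px * 2 // 4        # 11,000,000
red   = total_px // 4            #  5,500,000
blue  = total_px // 4            #  5,500,000
print(green, red, blue)
assert green + red + blue == total_px
```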

Quote: But in such case wouldn't the info actually be "de-bayered"?
Anyway I'm looking forward to finding out how this is managed in 5D3.
Whatever 5D3 does, its raw video output is Bayer.
#30
Raw Video / Re: 5D3 Raw 4:2:2 or 4:4:4?
June 29, 2013, 12:12:51 AM
Quote: 14bit color depth x three primary colors x 1920x1080 pixels x 24fps = 2090Mbit/s
There. I corrected that for you.  :)
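The corrected figure checks out:

```python
# 14 bits x 3 primaries x 1920x1080 pixels x 24 fps
bits_per_second = 14 * 3 * 1920 * 1080 * 24
print(round(bits_per_second / 1e6, 1))   # ~2090.2 Mbit/s
```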

The Bayer color filter array filters one color per photosite: either red, green or blue. As a result, for every 4 pixels there are 1 red, 2 green and 1 blue. There are more greens because human vision is more sensitive to green. That's what makes bayer "compression". To get true RGB color you need a debayering (demosaicing) algorithm: for every pixel it interpolates the 2 missing colors from the neighboring pixels. Typically this is done by photo editing software. By the way: this is where moiré is generated.
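The interpolation step can be sketched as a bilinear demosaic (a minimal illustration, assuming an RGGB mosaic; real raw converters use far more sophisticated algorithms):

```python
import numpy as np

def conv3(a, k):
    """3x3 convolution with zero padding."""
    p = np.pad(a, 1)
    out = np.zeros(a.shape)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + a.shape[0], dx:dx + a.shape[1]]
    return out

def debayer_bilinear(mosaic):
    """Bilinear demosaic of an RGGB mosaic (float HxW) into HxWx3 RGB."""
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # red sites
    masks[0::2, 1::2, 1] = True   # green sites on red rows
    masks[1::2, 0::2, 1] = True   # green sites on blue rows
    masks[1::2, 1::2, 2] = True   # blue sites
    kernel = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
    rgb = np.zeros((h, w, 3))
    for c in range(3):
        known = masks[..., c].astype(float)
        # Normalized convolution: a distance-weighted average of the
        # known same-color samples in each pixel's 3x3 neighborhood.
        rgb[..., c] = conv3(mosaic * known, kernel) / conv3(known, kernel)
    return rgb
```

On a uniformly gray scene this reconstructs the same value in all three channels; on edges, the averaging of misaligned neighbors is exactly where the moiré mentioned above comes from.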

The reason cameras use a bayer filter (or a prism in 3CCD/3xCMOS) is simply that in the end all sensors are grayscale, so we need to filter each color and feed it separately to the sensor: either at photosite level (bayer) or at sensor level (3CCD).

About 5D3: I'm not sure ML devs know the exact technique it uses to downscale to Full HD.
Here is what I know:
- The output is 14-bit bayer, the same format as the sensor but 1/3rd of the resolution in both dimensions, so exactly 1/9th of the information
- 5D3 probably uses all pixel information in its downscaling (unlike line skipping which typically uses one pixel out of nine)
- Its downscaling is better than line skipping: less moiré and aliasing artifacts
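A back-of-envelope check on the 1/9th figure (assuming the 5D3's 5760x3840 stills grid):

```python
# Dividing both dimensions by 3 leaves exactly 1/9th of the samples.
h, w = 3840, 5760
oh, ow = h // 3, w // 3
print(ow, oh)                    # 1920 1280
assert oh * ow * 9 == h * w
```

The difference between the schemes is how many sensor pixels contribute to each output pixel: 1 of 9 for line skipping, all 9 for a full downscale, which is why the latter aliases less.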
#31
Raw Video / Re: 5D3 Raw 4:2:2 or 4:4:4?
June 28, 2013, 02:48:18 PM
Maybe other kinds of compression are better, but bayer is what comes off the sensor; that's why it's referred to as raw (raw meaning unmodified*). And you won't get more information by recompressing it, I'm sure you understand that.
You could however influence quality when choosing the debayering method.

You really need to learn what a bayer pattern is. But even if you don't, you should know that one of the reasons all color cameras (except for 3CCD ones) get their color from an optical bayer filter is that it's a very efficient way of compressing the data.

There I said it. A bayer filter is like an optical compressor applied before capturing.

*) You can do some transformations on raw that result in raw. An example is line skipping: by repeatedly skipping a multiple of 2 horizontal/vertical lines every n lines, you end up with another true bayer pattern. This is what happens to all ML raw videos that use the full sensor width, except on the 5D3.
#32
Raw Video / Re: 60D RAW video - it's working !!!
June 01, 2013, 12:10:27 PM
Quote from: kotik on June 01, 2013, 11:02:52 AM
In the video world there are 8-bit (amateur), 10- and 12-bit (both pro) formats. And the bits are for each color (RGB or YUV).
And video cameras/codecs have different subsampling: 4:1:1, 4:2:0, 4:2:1, 4:2:2 and 4:4:4!
I was just saying that comparing 14-bit RAW with 8-bit RGB is like comparing apples and oranges.
#33
Raw Video / Re: 60D RAW video - it's working !!!
June 01, 2013, 10:48:36 AM
ML can only produce 14-bit raw. This is the same on all cameras that ML supports, this is also the format of all EOS sensors.

14-bit RAW means 14 bits/px with a bayer pattern.
8-bit in Photoshop means 24 bits/px RGB (also called R'G'B' 4:4:4 in the video universe). Learn what it means.

Quote from: driftwood on May 31, 2013, 11:15:18 PM
OK so it is 14 bit but Photoshop only supports 8 or 16 bit. Use 16 bit to mess with. Gonna have to ensure AE is doing the same.
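The apples-and-oranges comparison above, in bits per pixel:

```python
raw_bpp = 14          # 14-bit RAW: one bayer sample per pixel
rgb_bpp = 8 * 3       # Photoshop "8-bit": 8 bits each for R, G and B
print(raw_bpp, rgb_bpp)   # 14 24
```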
#34
As said elsewhere: changing picture quality in the canon menu from RAW to JPEG (L/M/S) gives me a bigger shoot_malloc buffer. I get 100MB extra on my 60D.

#35
General Chat / Re: Magic Lantern Cinema Camera?
May 31, 2013, 07:39:13 PM
Apertus promised a crowdfunding campaign somewhere this year ...
#36
I only have a 60D so I can't test this debug build. I made a DNG with a lot of hot pixels by overexposing and using high ISO. This way I found out that Adobe's raw engine corrects dead or hot pixels, but only the ones that are not next to each other. Maybe this is a reason why results are so inconsistent...

I would suggest using a program like RawTherapee for pixel peeping; it's free and open source.

#37
Quote from: PKVariance on May 28, 2013, 11:22:24 PM
I've been through the forum as much as I can and can't seem to find it - how does one do this x5 crop zoom mode? A link to the location of this wisdom would certainly be enough.

Thanks

Press the plus button.
#38
Raw Video / Re: 60D RAW video - it's working !!!
May 25, 2013, 11:52:37 PM
Changing picture quality in the canon menu from RAW to JPEG (L/M/S) will give you a bigger shoot_malloc buffer. 276MB instead of 180MB.
#39
Heh, that clarifies the [impossible]  ;) Do you know anything more on these linear image effects? Thanks to chipworks, we now know the hdmi chip for the t4i (650D): it's the ADV7523A.
#40
Although this request is tagged as [IMPOSSIBLE], I find this a very interesting idea to explore. I'm not a developer, just a guy who's dreaming.

Of course HDMI has no support for bayered raw. In fact the format used by Canon is 8-bit Y'UV 4:2:2, which means the HDMI signal carries at least 16 bits/px compared to the 14-bit RAW. So HDMI's bandwidth would be sufficient to get a Full HD raw signal out. Again, I'm not a developer, but just for fun, let's explore the (long!) road it would take to achieve raw over HDMI:
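The 16 bits/px figure follows from how 4:2:2 shares chroma between pixel pairs:

```python
# Y'UV 4:2:2: every pixel gets its own 8-bit Y, while each 8-bit U and V
# sample is shared by two pixels.
y_bits = 8
chroma_bits = (8 + 8) / 2        # U + V bytes, split over two pixels
yuv422_bpp = y_bits + chroma_bits
print(yuv422_bpp)                # 16.0 -- comfortably above 14 bits/px raw
```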

- The first problem, I think, is that we haven't yet figured out how to feed raw bits into the HDMI buffer. It also remains to be seen whether it would accept a bitstream with progressive frames.

- Currently, the camera's built-in display turns off when connecting HDMI, so you would need to use an HDMI monitor. This means the signal should be interpretable as YUV 4:2:2. For that I came up with this raw-to-yuv conversion. An HDMI monitor interpreting this raw signal as yuv would show a monochrome bayered image with slight color noise:


                    px1            px2           
14bit raw:          rrrrrrrrrrrrrr gggggggggggggg

                    px1      px1+2    px2      px1+2   
8bit Y'UV422:       yyyyyyyy uuuuuuuu yyyyyyyy vvvvvvvv
                   
16bit yuv-like raw: rrrrrrrr 00rrrrrr gggggggg 00gggggg


Short explanation: every 14-bit raw pixel value gets split: the 8 most significant bits become the luminance (y) value of the yuv pixel, and the remaining 6 bits 'leak' into the chroma (u or v) value that follows it. We prefix the chroma value with two zero bits to fill up the byte and also to reduce the color noise resulting from the random value of the 6 remaining bits.
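A sketch of that bit split (hypothetical helper names, not ML code), showing it is fully reversible:

```python
# Two 14-bit raw samples become four bytes laid out as Y, U, Y, V.
def pack_pair(px1, px2):
    assert 0 <= px1 < (1 << 14) and 0 <= px2 < (1 << 14)
    y1 = px1 >> 6       # 8 most significant bits -> luminance byte
    u  = px1 & 0x3F     # 6 remaining bits, top 2 bits of the byte = 0
    y2 = px2 >> 6
    v  = px2 & 0x3F
    return y1, u, y2, v

def unpack_pair(y1, u, y2, v):
    """Reassemble the original 14-bit samples -- no information lost."""
    return (y1 << 6) | u, (y2 << 6) | v

pair = (0x3FFF, 0x1234)
print(unpack_pair(*pack_pair(*pair)) == pair)   # True
```

The round trip is exact, which is what makes the lossless reconversion at the end of the post possible.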

- Bit-depth downconversion (to 12 or 10 bit) is in the works, so maybe this kind of upconversion could be doable by the ARM processor, maybe not.

- Of course this HDMI signal should be recorded uncompressed, as compression would interpret the raw signal as yuv which would destroy it.

- The resulting uncompressed file can then be converted back to a standard raw file without any loss.

And voilà! A man can dream sometimes, can't he?