Messages - Luther

#276
Quote from: masc on May 23, 2019, 07:14:06 PM
The MB don't care at all. Lines of code, dependencies and coding interface cares... and this is more than huge from what I saw. (At least I didn't got how to handle it.)

Well, it's meant to be an all-in-one solution, I think. The only dependency would be GLib (and only on Windows?).
I don't think there is a simpler open implementation. I've found the OpenFX LensDistortion plugin, which works on Natron, but even though I'm not a programmer I can tell it is more complex than lensfun.
Maybe leave this feature for another time.
#277
Very nice.
#278
Quote from: timbytheriver on May 18, 2019, 06:07:39 PM
"The first point to be made is that compared to other workflows ACES will not improve the final image quality, or enable improved/better colours, or provided any other image related benefit. It is not a 'magic bullet' that somehow guarantees better end results."

This is not true. He is probably comparing ACES with ALEXA Wide Gamut. From that perspective, indeed, ACES would not offer much advantage, just as when comparing with ProPhoto.
But when comparing with sRGB/Rec.709, which is the case for MLVApp, ACES does offer big advantages in many respects. For example, when I tried to record a show with lots of colored lights, MLVApp gave out-of-gamut, muddy colors, while RawTherapee (using an extracted DNG configured to use ACES) handled it smoothly. Also, drastic color changes in MLVApp can produce color artifacts, which would probably happen less often with ACES.
Adapting ACES to MLVApp would also make it possible to export in Rec.2020, and that's the industry standard now.
Some examples here:
https://webkit.org/blog-files/color-gamut/

Couldn't test the wide gamut branch yet, but when I do I'll post some examples here.
#279
Sorry for the late reply.

Quote from: masc on May 17, 2019, 11:09:07 AM
Yes, lensfun is really nice - but very huge.

I wouldn't call 5 MB huge in this day and age, when a web browser is three times bigger than an entire operating system.

Quote
But what I found out: have a look here, you can see that it won't correct CA's for most lenses.

Hmm. That seems to be true.

Quote
And I am not sure if it would when using a Speedbooster on a EOS M (which produces a lot of CA's in another way than the lens would without).

For other lenses/adapters you could create a profile yourself with the Adobe lens profile tools:
http://rawpedia.rawtherapee.com/How_to_get_LCP_and_DCP_profiles

But I don't think that format is compatible with the XML from lensfun.

Quote from: Ilia3101 on May 17, 2019, 11:30:33 PM
""Complying" with a standard" isn't the intention of profiles. The original intent of the profiles was to give different looks. Back in the day, MLV APp only had standard and tonemapped, to give different looks. These profiles were never supposed to control anything like output colour spaces/logs or stuff like that. It was always planned to be separate but that's not how it turned out. Anyway, the BetterProcessing branch has separate controls for these things.

Understood.

Quote
Also fattal tonemapping looks really creepy.

Looking at the linked research, yes. But it can be applied at different intensities; it was adapted for RawTherapee exactly for that. Read the discussion here, it is very interesting and might even give some insights for new MLVApp features:
https://github.com/Beep6581/RawTherapee/issues/3061
#280
Quote from: masc on May 15, 2019, 08:53:34 PM
I had success in adding the chromatic abberation correction from RawTherapee into MLVApp.

Nice! Do you think it would be possible to load profiles automatically using lensfun (based on metadata)? Also, this lensfun software seems really cool; it might be of some help in MLVApp.

Quote from: masc on May 16, 2019, 09:14:47 PM
In the current official MLVApp version DR compression is done behind profile combobox. I don't expect it even to be a little similar.

The "DR Compression" in RT is actually a tonemapping operator, specifically the Fattal method:
http://www.cs.huji.ac.il/~danix/hdr/hdrc.pdf

It is a bit different from what MLVApp does in profiles. The way I see it: where the profile is used to "comply" with a standard (e.g., Log-C) in a linear way, the tonemapping tries to compress or lift the illumination dynamically.
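As I understand the linked paper, Fattal's operator works in the gradient domain rather than on pixel values: large log-luminance gradients are attenuated and a new image is solved from the modified gradient field. Sketched from the paper's notation (H is log-luminance, I the compressed result, α and β user parameters):

```latex
% attenuated gradient field: large \lVert\nabla H\rVert are scaled down when \beta < 1
G(x, y) = \nabla H(x, y)\,\Phi(x, y), \qquad
\Phi(x, y) = \frac{\alpha}{\lVert\nabla H(x, y)\rVert}
             \left(\frac{\lVert\nabla H(x, y)\rVert}{\alpha}\right)^{\beta}
% the output is recovered by solving a Poisson equation
\nabla^2 I = \operatorname{div} G
```

That per-gradient, dynamic compression is what makes it different from a fixed linear profile.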

I'll see if I can test the vid.stab and the CA correction next week. I would also like to test the ACES branch :)
#281
Quote from: Ilia3101 on May 08, 2019, 08:31:19 PM
Sorry having a bit of a break. I have to. I will definitely be back to finish at some point in June.

No problem @Ilia3101, hope you're well!
#282
Quote from: Danne on May 08, 2019, 06:34:00 PM
Gottcha!

Off topic: damn, this came to mind when I read your "Gottcha!", haha. It dropped hard here. Good to see progress, though.
#283
No ACES :(
Anyhow, nice job on this release; the new shadow/highlight tool seems cool.
#284
Quote from: Kharak on May 08, 2019, 02:12:27 PM
That crazy noise looks like a bad LUT or wrong input for the lut.

Indeed.
#285
Quote from: Walter Schulz on May 05, 2019, 01:30:17 AM
If you do not believe me the next remote session is yours!

I do believe you. I've had experiences like that with some people.
But people overestimate the difficulty of using simpler software. Let me give you an example: you want to watch some videos on YouTube. If you do that in Chrome or Firefox, it needs to load over 6 MB of JavaScript, which takes a while if you don't have a fast connection. Now with simpler software: just open the mobile version of YouTube in Links2, copy the link of the video and press F3 (a macro that calls mpv+youtube-dl, set up with AutoHotkey on Windows or xmonad/ion3 on Unix-like systems). Done. Much easier. The same goes for streams, using streamlink, or PDFs, with aria2c+mupdf.
I think most people are just too comfortable to change their habits and workflow, and do not consider other solutions.
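To illustrate, the F3 macro could be something like this (a minimal sketch, not my actual config; the script name and binding are made up, but --ytdl is a standard mpv option):

```python
# Minimal sketch of the "press F3" macro: build an mpv command for a URL.
# mpv hands web URLs to youtube-dl internally when --ytdl is enabled.
import subprocess
import sys

def build_mpv_command(url):
    """Return the argv the macro would execute for a given video URL."""
    return ["mpv", "--ytdl", url]

if __name__ == "__main__":
    # bind e.g. F3 in AutoHotkey/xmonad to: python play.py <copied url>
    subprocess.run(build_mpv_command(sys.argv[1]))
```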

Back to the topic: why have a browser that does all kinds of things? Why not have something that just does what the name says... browse? A browser should be nothing more than an HTTP client for GET/POST plus an HTML interpreter. Other stuff like CSS and JavaScript should be optional. Oh, companies want fancy animations? Then create separate software for that, using a safe language like Rust or Haskell, not JavaScript. The introduction of JavaScript was probably the most awful thing that ever happened to the internet. Who the f* thought running a binary on every web page was a good idea?
I sincerely think this complexity is ruining the internet. This has to change.
#286
Today's browsers are as big as an entire operating system. This amount of complexity is ridiculous and unnecessary; you cannot expect reliability and security from it.
I've been using the Links2 browser most of the time, and it works just fine if you have other software working alongside it (mpv, youtube-dl, subliminal, aria2c, etc.).
#287
General Chat / Re: GoT S08E03 - Cinematography
April 30, 2019, 05:21:22 PM
Quote from: timbytheriver on April 30, 2019, 05:00:02 PM
172.8 was indeed a used shutter angle in actual film cameras. If you shoot at 24fps in PAL land (50Hz mains current) it ensures flicker-free shooting with HMI lights that have either dodgy/magnetic/old ballasts – or even fluorescents. If you are shooting at 25fps you shoot 180 deg and you will avoid the flicker also.

Oh, that's interesting. Didn't know that. Makes sense, since they recorded in Ireland.
#288
General Chat / GoT S08E03 - Cinematography
April 30, 2019, 04:32:29 PM
I thought it would be appropriate to open a discussion about this here. This episode was probably the biggest in television history (in cost and people involved), but it also attracted a lot of criticism for its cinematography choices. What is your opinion?
At first, I thought:
1 - They chose a very dark set and color grading to reduce post-production costs, as it is easier to "hide" details in shadows (the fast cuts also help with this)
2 - To maintain realism: in medieval times people didn't have electric lights, let's remember that
3 - That choice was coherent with the story line: the "darkest" moment is a night battle and, at the end, the sun rises (hope)

One thing that caught my attention was the shutter speed:

I understand using 23.976 fps, as they needed to follow the NTSC standard. But why not use 180 degrees in scenes with no slow motion, instead of 172.8? And why use 90 degrees in 23.976 fps scenes? Those inconsistent values are not common practice. It might be to create a staccato sense of movement (just like in "Saving Private Ryan")...
And why not get a faster set of lenses? They seem to be using an Alexa with Cooke S4s (source: IMDb), but I personally think this was an error on their part. Using Primos (f/1.8) or even Summilux-Cs (f/1.4) they would be able to gain 1-2 stops and reduce the ISO to 800, getting more dynamic range to work with and less noise.
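To back up the numbers above (standard formulas; treating the Cooke S4 as T2.0 and the Summilux-C as T1.4 is my assumption based on published specs):

```python
import math

# shutter angle -> exposure time: t = angle / (360 * fps)
def exposure_time(angle_deg, fps):
    return angle_deg / (360.0 * fps)

# 172.8 deg at 24 fps is exactly 1/50 s, matching 50 Hz mains lighting;
# 180 deg at 25 fps gives the same 1/50 s
assert abs(exposure_time(172.8, 24) - 1/50) < 1e-9
assert abs(exposure_time(180, 25) - 1/50) < 1e-9

# aperture difference in stops: 2 * log2(N1 / N2)
def stop_difference(n1, n2):
    return 2 * math.log2(n1 / n2)

# T2.0 vs T1.4 is roughly one stop faster
print(round(stop_difference(2.0, 1.4), 2))  # -> 1.03
```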

The color grading was very strange too. Apart from the nearly crushed shadows, the palette was heavily bi-chromatic (teal-orange). This made the blood look artificial. Take a look at one shot from the BtS and one from the final show:

Apart from these points and the bad screenwriting, the show was good. In particular, the CGI nailed it.
Images from the Behind the Scenes:

#289
Quote from: masc on April 30, 2019, 02:46:32 PM
There is exactly one branch: Danne's experiment branch. https://www.magiclantern.fm/forum/index.php?topic=9741.msg208959#msg208959

Wouldn't it be better to push at least some of those changes into the main tree? Lots of questions in this thread this month about where to find the right package for the EOS M. Is it still too unstable?
#290
Quote from: Audionut on April 27, 2019, 01:06:54 PM
Depends entirely on the camera.  5DIII is good to ISO 3200.

I couldn't understand Table A2. Can you explain? From 1600 to 3200 it shows a drop in DR and double the noise, from what I got from this table...


noise    (electrons)    (e/DN)    ISO
16.86    2.655          0.157     3200
31.76    2.501          0.079     6400
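My reading of that table, assuming the first column is read noise in raw counts (DN) and the (e/DN) column is the conversion gain: their product is the input-referred noise in electrons, which matches the (electrons) column within rounding. The noise in electrons barely changes from 3200 to 6400, so the higher ISO mostly just amplifies:

```python
# rows from Table A2: ISO -> (noise in DN, gain in e-/DN), per my reading
rows = {
    3200: (16.86, 0.157),
    6400: (31.76, 0.079),
}
for iso, (noise_dn, gain) in rows.items():
    # input-referred noise in electrons = noise in DN * gain
    print(iso, round(noise_dn * gain, 2))   # 3200 -> 2.65, 6400 -> 2.51
```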
#291
Quote from: ricardopt on April 27, 2019, 09:07:43 AM
Hi Luther, english is not my native language, by aliasing do you mean "jagged lines" or the "jumpiness" in the folliage in the first 16 seconds?

Yep, exactly that. Here is an example:
https://en.wikipedia.org/wiki/Aliasing#/media/File:Aliasing_a.png

Quote from: masc on April 27, 2019, 12:29:13 PM
3x3 produces always aliasing, because of line skipping (except 5D3). The more sharpening one add e.g. in MLVApp, the "better" moiree/aliasing is visible.

My bad, I meant the 1:3 binning. Doesn't that nearly solve the aliasing issue? The EOS M is very impressive; I'm looking to replace my cameras with one too.
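A toy illustration of the difference (nothing to do with the actual ML readout code, just sampling arithmetic):

```python
# full-resolution sensor rows: alternating dark/bright lines (finest detail)
rows = [i % 2 for i in range(12)]

# 3x line skipping keeps every 3rd row: the fine pattern reappears at full
# contrast as a coarser false stripe -> a strong alias
skipped = rows[::3]                                   # [0, 1, 0, 1]

# 3x binning averages groups of 3 rows: the alias survives at only 1/3
# of the contrast, which is why binning nearly solves the problem
binned = [sum(rows[i:i + 3]) / 3 for i in range(0, 12, 3)]

print(max(skipped) - min(skipped))                    # 1 (full contrast)
print(round(max(binned) - min(binned), 2))            # 0.33
```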
#292
Quote from: ricardopt on April 27, 2019, 02:39:45 AM
25th April

So much aliasing! Didn't Danne's latest experiments solve that (3x crop mv1080p MCM rewired)? The crop is a problem, but I think the Viltrox speedbooster could solve that (?). Also, the ffmpeg aliasing script in MLVApp seems to work nicely for this kind of footage...
#293
Quote from: 70MM13 on April 27, 2019, 01:12:56 AM
i have a feeling that there will be no benefit for iso 3200, but you will have to experiment to find any setting that helps.

I think Canon's native ISO range goes from 100 to 1600. Above that it uses digital push. @a1ex could correct me if I'm wrong here.
#294
General Chat / Re: Foveon sensor (SIGMA cameras)
April 23, 2019, 01:08:03 PM
Quote from: ArcziPL on April 23, 2019, 11:51:39 AM
Despite I somehow "like it" (does it matter? or should the photo just match reality perfectly?)

It should always match reality, while holding as much information as possible. Post-production is there if you want to change the look.

Quote
- +1EV and the image from sd Quattro H already gets awful, large colored spots over the whole dark gray background. Mostly prominent close to image edges. Just with +1EV! For me, such a cam would be useless.

That's why you should always expose right  ¯\_(ツ)_/¯
Seriously now: sometimes even with artificial light and fast lenses you can't get enough light... in those cases, a Bayer CMOS seems to work better.

Quote
I can live with Bayer, for me it's a sweet spot. No wonder it's commonly adopted. No breath-holding what near future brings. :)

Indeed, we can live with Bayer, but we should always aim beyond it.
I think more important than asking "how" to create better sensors is asking "why". What do cameras need that an Alexa 65 can't give now? Signal-to-noise ratio? Dynamic range? Well, the Alexa already has great ISO performance and Kodak film stocks already have enormous DR. So what more can we push? Foveon offers a solution for aliasing, color precision and fine detail preservation... that's nice. Is Alexa/film too expensive? Then the point is to make the technology not "better" but "cheaper".
I think people should ask these questions more frequently.

Quote from: scrax on April 23, 2019, 12:32:14 PM
like it takes like 3sec to save a pic and can't be reviewed during this time...

This is just a matter of catching up with advancements elsewhere in the industry. For example, the issue you mentioned could be solved with a better ASIC (using parallel programming), better/faster compression algorithms and better flash storage:
http://parallel.princeton.edu/openpiton/
http://fadu.io/
https://github.com/facebook/zstd


BTW, I'm not saying Foveon is ready. It's clearly not and, in my opinion, never will be, because of patents.

PS: does anyone know if the AXIOM project is still going?
#295
General Chat / Re: Foveon sensor (SIGMA cameras)
April 23, 2019, 03:12:35 AM
Quote from: scrax on April 22, 2019, 02:08:40 PM
[...] and recently there is this that could be the evolution of the Foveon and maybe can be bought and developed by Canon

Nice research. I hope no one locks it down with patents. This could represent a new evolution in color precision, if the sensor ships with its spectral sensitivity table (to use as an IDT in ACES, since each camera varies in its color absorption).

Quote from: ArcziPL on April 22, 2019, 11:35:23 PM
I believe this comparison should tell all:
https://www.dpreview.com/reviews/image-comparison/fullscreen?attr144_0=sigma_sdquattroh&attr144_1=nikon_d850&attr144_2=canon_eosm50&attr144_3=canon_eos5dmkiv&attr146_0=100_6&attr146_1=100_6&attr146_2=100_6&attr146_3=100_6&normalization=compare&widget=645&x=0.1367195625070196&y=0.5167449200029397

sd Quattro H vs. D850 vs. M50 vs. 5D Mark IV. All with same subject, same exposure, ISO100 + 6EV.

You can clearly see the Foveon holds more fine detail when more light is available. At the 1/5 shutter speed, look at the grass areas.

Quote from: scrax on April 23, 2019, 12:28:26 AM
For what I do now (cave/mines photography) my tipical subject is in totally dark enviroment with tripod and a lot of flashes or old magnesium bulb. So I have direct control on scene light like in studio and can shot at low ISO's.
That's basically why I'm looking at those sensors, but also seems that they are working on a new FF sensor for 2020 with L-mount, so I think I'll wait and try to learn more about this before making a decision.

Your best bet for now would be Hasselblad... if you can afford it.
#296
Share Your Videos / Re: Anamorphic Look Test
April 23, 2019, 02:58:22 AM
I had a true anamorphic adapter (Kowa 16D), and it was a pain in the ass to focus and way too heavy. Also, lots of vignetting and not sharp at all. The image looked cool, though.
So I made my own 'filters' for less than $1 each. Just ask someone in your city to cut those cheap wood pieces and paint them black. Here are the shapes:



#298
Looks nice. I see you applied some strong denoising... are you using any sharpening? There are some crushed shadows in a few scenes, too; what LUT are you using? Have you tested ACES?
#299
General Chat / Re: Foveon sensor (SIGMA cameras)
April 22, 2019, 05:37:39 AM
Quote from: scrax on April 22, 2019, 01:58:28 AM
I'll like to try one (mainly fo B/W), but if real ISO are only 100, it is really a camera good only for landscapes (with good weather)...

For anything static (not moving) where you can use a tripod, this can be a good camera. It would be nice to test it together with HDRMerge...
I think Foveon is a great technology, much better than Bayer. The noise issue could be solved if more people were producing and researching it, but AFAIK SIGMA patented the technology and now no one can use it except them. Too bad.

You can simulate the Foveon by using a technique called Drizzle. This software (supposedly) does that:
https://github.com/LucCoiffier/DSS/
#300
This guy is a legend. Not only for Magic Lantern: his work on LinuxBoot, Heads and the BMC exploit is also amazing.
Together with Gernot Heiser and Rishiyur Nikhil, Hudson is one of the most amazing people in secure computing. Can't wait for the RISC-V/LinuxBoot/seL4 future!

Quote from: Ilia3101 on April 22, 2019, 03:40:08 AM
There's been some mentions of a magic lantern film on the forum lately, or some kind of collaborative project... a bit of this definitely needs to be included

That would be nice. Also a1ex and g3gg0, at least anonymously. I could definitely help, if needed.