Proposal: Model compatibility & Code Quality fields for README.rst

Started by Marsu42, October 11, 2013, 08:18:30 PM



Marsu42

Since nightly is the new stable, new module pull requests may be delayed until they reach production quality. Imho this hinders testing & feedback - and that is exactly what a trunk source tree is for, not providing daily updated production builds. To improve the process, I propose two additions to each module's README.rst fields:

1. Code Quality: Staging or Production. This should go along with an option in the module debug section to load only production-quality modules, so development modules don't hamper a stable ML experience - and on the other hand, new modules could be merged faster since they are clearly marked as work in progress.

2. Model compatibility: Tested / Untested / Not working. Currently, modules are implicitly expected to work with all cameras. For some modules this is difficult or impossible to achieve, so these flags (like ::Compatibility:: +60d +6d -100d -eosm - see the sketch after this list) would help to filter modules. Rationale:

* Some modules won't work on some models at all, like dot_tune on bodies without in-camera AFMA - currently they only show horrible debug messages about some core functions not being found. The new flag would let ML skip loading a module that is known not to work.

* Newer models are sometimes quite different - I painfully discovered this for button management and ISO setting when moving from the 60d to the 6d. So module authors can only code for one or two platforms and hope for the best on the rest, if they are willing to support all cameras at all. The new flag would make clear which cameras a module is tested on or intended for.
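For illustration, here is how the two fields might look in a module's README.rst (the field names and syntax are my suggestion, nothing official):

    DotTune AFMA
    ============

    :Code Quality: Staging
    :Compatibility: +60d +6d -100d -eosm

ML could then read these fields when building the module list and filter or warn accordingly.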

Audionut

Quote from: Marsu42 on October 11, 2013, 08:18:30 PM
Since nightly is the new stable,

Since when?

Modules are by design intended to be cross-compatible. If a module does not work on specific bodies, a flag does not fix this.

Marsu42

Quote from: Marsu42 on October 11, 2013, 08:18:30 PM
Since nightly is the new stable
Quote from: Audionut on October 14, 2013, 03:31:19 AM
Since when?

Since the real stable release is so old that most of the recent interesting ML features aren't included - so in practice people use the nightly for production, resulting in things like the recent "I lost my wedding video because my 5d3 crashed" situations. Without a current stable ML, maintainers *might* become more reluctant to merge experimental modules, and the compatibility/tested & code quality flags would be a way to tell stable/tested code apart from new modules.

Quote from: Audionut on October 14, 2013, 03:31:19 AM
If a module does not work on specific bodies, a flag does not fix this.

You got that right - but it would let ML refuse to load, say, dot_tune on a 60d in the first place instead of printing gibberish error messages. It would also let users see which models a module has actually been tested on, which it might work on, and which won't work at all.
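Rough sketch of the loader-side check (the compatibility string and the helper are hypothetical - this is not existing ML code):

    #include <stdio.h>
    #include <string.h>

    /* Decide whether to load a module, given its compatibility string
     * (e.g. "+60d +6d -100d -eosm") and the current camera model. */
    static int module_allowed(const char *compat, const char *camera)
    {
        char buf[128];
        snprintf(buf, sizeof(buf), "%s", compat);  /* strtok modifies its input */
        for (char *tok = strtok(buf, " "); tok; tok = strtok(NULL, " "))
            if (strcmp(tok + 1, camera) == 0)
                return tok[0] == '+';              /* '+' = load, '-' = skip */
        return 1;  /* unlisted models: load, though a warning would fit here */
    }

So module_allowed("+60d +6d -100d -eosm", "100d") returns 0 and dot_tune would simply be skipped instead of spewing errors.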

Edit: Btw, this is not theoretical - if I polish my "remember settings with shooting modes" module for merging, I can only design and test it on my own cameras. Since it sets props, using it on other cameras could be dangerous, even if it compiles just fine for them. So will it sit in the pull queue until someone has the time to extend it to the 1100d and 100d? It also has a hotkey function that ties some settings to keys - what do I do if those keys don't exist or sit elsewhere on the eosm? The function is useless in that case. I see only one clean approach here: flag the module with what it's supposed to work on and what it has actually been tested with.
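Purely as illustration (the whitelist and the camera_model lookup are placeholders, not my actual module code), a prop-setting module could bail out on untested bodies:

    #include <string.h>

    /* Only touch props on bodies the author has actually tested. */
    static const char *tested_models[] = { "60D", "6D" };

    static int model_is_tested(const char *camera_model)
    {
        for (unsigned i = 0; i < sizeof(tested_models) / sizeof(tested_models[0]); i++)
            if (strcmp(camera_model, tested_models[i]) == 0)
                return 1;
        return 0;
    }

    /* called before any prop change:
     *     if (!model_is_tested(camera_model)) return; */

That way the module could still be merged and compiled everywhere, but it only does the dangerous part where it was actually verified.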

a1ex

Modules that can't be loaded on camera X are not included in the zip, because these errors are also detected at compile time.

Of course, if you bypass these checks (e.g. you copy the modules from another camera, or if you compile yourself and ignore the warnings), you get errors.
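As a toy analogy of that packaging check (the symbol lists are made up; the real build resolves against each camera's stubs), a module is dropped from a camera's zip when one of its required symbols doesn't resolve:

    #include <stdio.h>
    #include <string.h>

    static const char *needed[]   = { "afma_set", "menu_add" };  /* what the module calls */
    static const char *exported[] = { "menu_add" };              /* what this body provides */

    /* Returns 1 if every needed symbol resolves; otherwise the
     * module is left out of this camera's zip. */
    static int module_fits_this_camera(void)
    {
        for (unsigned i = 0; i < sizeof(needed) / sizeof(needed[0]); i++)
        {
            int found = 0;
            for (unsigned j = 0; j < sizeof(exported) / sizeof(exported[0]); j++)
                found |= !strcmp(needed[i], exported[j]);
            if (!found)
            {
                printf("unresolved: %s -> module excluded\n", needed[i]);
                return 0;
            }
        }
        return 1;
    }

Here afma_set stays unresolved, so a body without AFMA never even sees the module in its zip.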

Marsu42

Quote from: a1ex on October 14, 2013, 09:06:38 AM
Modules that can't be loaded on camera X are not included in the zip, because these errors are also detected at compile time.

Thanks, since I always copy the .mo files manually, that fact had passed me by... so that solves the "not working on" problem, though I still think "tested on" and "code quality" flags would be a good idea for quicker merging & wider testing.

a1ex

Yep. What I'm thinking of is a way to automate these things from user feedback. It would be nice if it could be integrated with a page like this, to have an overview: http://nanomad.magiclantern.fm/jenkins/features.html

Rough scenario: a user selects a camera/module combination (say, one cell from that table) and says: I've tested this - works, doesn't, has some quirks, maybe some stars, stuff like that. Same for core features. If there are enough "votes" from testers, some statistics could be helpful.
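A minimal sketch of what one such report could store (all names are assumptions - none of this exists yet), with a build tag so reports can be matched to the version they were made against:

    /* One tester's report for a camera/module cell of the table. */
    enum verdict { WORKS, QUIRKS, BROKEN };

    struct feedback
    {
        char module[32];      /* e.g. "dot_tune" */
        char camera[16];      /* e.g. "60d"      */
        char build[32];       /* nightly/module version the tester used */
        enum verdict result;
        char notes[256];      /* what exactly was tested */
    };

Aggregating these per (module, camera, build) cell would give the vote counts and stars for the table.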

painya

I think the feedback form would be very helpful and I certainly would be updating it often :D
Good footage doesn't make a story any better.

Marsu42

Quote from: a1ex on October 14, 2013, 09:32:43 AM
Rough scenario: a user selects a camera/module combination (say, one cell from that table) and says: I've tested this - works, doesn't, has some quirks, maybe some stars, stuff like that. Same for core features. If there are enough "votes" from testers, some statistics could be helpful.

Good idea, let's hope enough people participate - at least we'll get a rough idea of how popular specific modules are... "works" needs a specific description though; it's not enough to have activated the module and seen that it doesn't self-destruct the camera. And there need to be version tags, so that once a module is updated, the problems it fixed become history on the feedback page.

The question is whether people will really look at the webpage; imho this information should still somehow end up in the README.rst, so people can decide in-camera whether a specific module is stable enough for their needs and tested on their camera.