Module size

Started by novovaccum, July 25, 2023, 07:39:43 PM

novovaccum

Hello,

Since I was advised not to load modules that are too large into memory, I've been wondering about the memory limitations related to modules in ML.

I've been reading through the related topics:

From what I understand, it all depends on the camera model, but also on how you are using ML: how many modules are loaded and how much memory they require, since having too many modules, or memory-intensive ones, might result in throwing ERR70.

In my case, using the 600D, I'm trying to figure out how much room is set aside for ML.

As the Canon code doesn't know about ML and thinks it owns all the memory pools, what are the known limitations (if any)?

As I am using some external (configurable) library for my module, I'm concerned about those limitations.
Looking at the size of default modules, the biggest ones such as Lua or EDMAC are over 300k.
Should I consider this to be the limit?

Once again, I know it depends mostly on how things are configured and how ML is used, but I cannot seem to wrap my head around the subject. And please let me know if this information is already available on the forum; I'll simply delete this post! :)

Thank you for your help! :)


names_are_hard

There's no simple answer; the OS has multiple different pools of memory, and we abstract this away for ML code.  You can easily test this in QEMU: make a module with a large static array of data; that way you can make a module of whatever size you want.

I quickly tested loading both the EDMAC and Lua modules on a 600D in QEMU and it worked.  I don't know of any special limit on the size of an individual module.  They go through a build process that limits them to the maximum sizes allowed for 32-bit ELF files, but those limits are far too large to be relevant.

novovaccum

Thank you again for your help on this!  :)

Quote
I quickly tested loading both the EDMAC and Lua modules on a 600D in QEMU and it worked.

I also did a couple of tests with the 600D in QEMU. I can load almost all the other modules alongside EDMAC and Lua without any issue!  :)

Quote
I don't know of any special limit on the size of an individual module. They go through a build process that limits them to the maximum sizes allowed for 32-bit ELF files, but those limits are far too large to be relevant.

32-bit ELF files would mean a maximum size of 4 GB, and that's before accounting for the OS, which surely won't let us manage anywhere near that much memory.

Concerning memory allocation, and based on the topics about memory pools, as you pointed out there are some abstractions in the ML code that pick the right memory pool depending on availability and usage. So I guess that when ML is loaded into memory (and depending on previously activated modules), ML might use malloc to get initialized, or even the AllocateMemory (MemoryManager) pool if the size exceeds malloc's limits?  Or is it mapped directly into the AllocateMemory pool, with other pools (such as shoot_malloc) used to provide more memory depending on how the modules use it?

It might be more subtle than that, but I'm just trying to get some insights to understand how things work on that part.

names_are_hard

Quote
Concerning memory allocation, and based on the topics about memory pools, as you pointed out there are some abstractions in the ML code that pick the right memory pool depending on availability and usage. So I guess that when ML is loaded into memory (and depending on previously activated modules), ML might use malloc to get initialized, or even the AllocateMemory (MemoryManager) pool if the size exceeds malloc's limits?  Or is it mapped directly into the AllocateMemory pool, with other pools (such as shoot_malloc) used to provide more memory depending on how the modules use it?

For most of these the answer is "it depends" (and also: I'm not going to spend the time looking up a definitive answer!).  There are different loading methods for ML on different cams, and more generally, when doing allocations, heuristics are used to pick which pool to use.  The sizes of the pools also vary between cams.

Module loading is handled in src/module.c, _module_load_all(), so you can look up whatever details you need to know.

For the ML loading process, see boot-d45-am.c, boot-d45.c, boot-d45-ch.c, and boot-d678.c, assuming you're working from https://github.com/reticulatedpines/magiclantern_simplified/commits/dev