@dubzeebass - Yes! I'll get a quick tut together for this.
@QuickHitRecord - Not a dumb question when you look at it from that point of view but unfortunately that's not how it works - I wish it were that easy.
Sorry for the long post... as usual

I'm no expert when it comes to sensors, but this oversimplified description should give you an idea of how things work as I understand it - a raw image from your 5D Mark III does not have color in the way we typically understand color. The sensor is monochromatic with a color filter array (CFA) sitting on top
http://en.wikipedia.org/wiki/Bayer_filter - the CFA is a repeating mosaic of 1 red, 2 green and 1 blue filters per 2x2 block of photosites. When photons pass through a red, green or blue filter they register as a voltage on the sensor, so each photosite only records a brightness value for the one color its filter lets through. What you see as color is essentially reconstructed by a debayering/demosaicing algorithm: it looks at the measurements from surrounding photosites and estimates the red, green and blue levels for each pixel, and those RGB levels dictate the hue and saturation. If you download RawDigger
http://www.rawdigger.com/ you will see a visual representation of how the raw mosaic is made up and what the raw image actually looks like.
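
To make the demosaic idea a bit more concrete, here is a rough sketch in Python (assuming numpy and scipy) of the simplest possible bilinear debayer of an RGGB mosaic - the algorithms in ACR or Resolve are far more sophisticated, this is only to show the general mechanic:

    import numpy as np
    from scipy.ndimage import convolve

    def bilinear_demosaic(raw):
        """raw: 2D array of sensor values laid out as a repeating
           R G
           G B
        mosaic. Returns an H x W x 3 RGB image (still linear light,
        no color matrix or tone curve applied yet)."""
        h, w = raw.shape
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
        g_mask = 1 - r_mask - b_mask

        # Kernels that average the nearest same-colored photosites.
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
        k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0

        r = convolve(raw * r_mask, k_rb)
        g = convolve(raw * g_mask, k_g)
        b = convolve(raw * b_mask, k_rb)
        return np.dstack([r, g, b])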
Anyway, getting back to what you asked - could Cinelog be applied directly to ML Raw footage? Yes and no - it is applied to the raw footage (in Resolve the raw data is put into a non-destructive 32-bit floating point workspace), but colorspace transforms are applied after debayering. ACR and DaVinci Resolve debayer to specific colorspaces, and Cinelog-C transforms those colorspaces, not the raw data itself - the debayer algorithm handles that part.
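
To illustrate where a transform like Cinelog-C sits in that chain, here is a hedged Python sketch - the 3x3 matrix and log curve below are made-up placeholders, not the actual Cinelog-C math, and it operates on already-debayered linear RGB:

    import numpy as np

    # Hypothetical 3x3 matrix taking the debayered camera colorspace to a
    # chosen working colorspace (values invented purely for illustration).
    M_cam_to_work = np.array([[ 1.50, -0.35, -0.15],
                              [-0.10,  1.20, -0.10],
                              [ 0.00, -0.25,  1.25]])

    def encode_log(lin, a=0.18, b=0.01):
        # Generic log-style transfer curve standing in for a real log encoding.
        return np.log2(np.maximum(lin, 1e-6) / a + b)

    def colorspace_transform(debayered_rgb):
        # Matrix first (still linear light), then the log encoding.
        shaped = debayered_rgb @ M_cam_to_work.T
        return encode_log(shaped)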
Color matching is really only an issue for multi-camera shoots, where a colorist is usually tasked with manually matching the source material from various different cameras to an approximately uniform look for the primary grade before getting into the artistic 'look'. The purpose of ACES is to tackle exactly that: it establishes a camera-agnostic workflow where an Input Device Transform (IDT) converts the input from camera A, camera B and camera C to whatever AMPAS has decided is the 'basic look/basic color' of ACES.
An ACES IDT is a script consisting of a linearization component and color matrices. Simple ones can be a single 3x3 matrix (with a chromatic adaptation matrix for a second illuminant) but the more accurate IDTs can extend to an independent matrix for each exposure index over a range of color temperatures - and all this only works if the metadata in the raw or video file is correctly described and readable.
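
Real IDTs are CTL scripts, but in Python the basic shape of a simple one might look like this (the matrix values are hypothetical, purely for illustration; a real IDT ships per-camera, per-illuminant data):

    import numpy as np

    # Hypothetical camera-native to ACES (AP0) matrix for a Daylight illuminant.
    M_cam_to_AP0_daylight = np.array([[ 0.75,  0.15,  0.10],
                                      [ 0.05,  0.90,  0.05],
                                      [ 0.02,  0.08,  0.90]])

    def linearize(encoded, black=0.0, white=1.0):
        # For a log-encoded source this would invert the camera's log curve;
        # for raw the data is already linear and only black/white scaling applies.
        return (encoded - black) / (white - black)

    def idt(encoded_rgb):
        # Linearization component, then the color matrix into ACES primaries.
        lin = linearize(encoded_rgb)
        return lin @ M_cam_to_AP0_daylight.T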
With ML Raw being 'raw' we have the ability to manipulate things a lot, but we do not have ACES IDTs for any DSLR yet. The matrices in ML DNGs transform the sensor's native RGB values to CIE XYZ colorspace (used for connecting colorspaces), but they are not used for color matching.
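
As a rough illustration of what those DNG matrices do: the ColorMatrix tag in a DNG actually maps XYZ to camera-native RGB, so it gets inverted during processing to go the other way. The values below are placeholders, not real 5D Mark III numbers:

    import numpy as np

    # Hypothetical XYZ -> camera-native matrix as stored in a DNG.
    ColorMatrix1 = np.array([[ 0.70, -0.15, -0.05],
                             [-0.30,  1.10,  0.20],
                             [ 0.00,  0.10,  0.60]])

    # Invert it to go from camera-native RGB to CIE XYZ.
    cam_to_xyz = np.linalg.inv(ColorMatrix1)

    def camera_rgb_to_xyz(rgb):
        return rgb @ cam_to_xyz.T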
Cinelog-C can get an ML Raw image into ACES colorspace, but at the moment it will still be Canon DSLR color. What is needed is a set of intermediate matrices for each camera that act as the IDT, transforming each DSLR's color to (or as close as possible to) the color of ACES, with a minimum of 2 transforms per camera for Daylight and Tungsten shots.
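
Here is a sketch of how two per-illuminant matrices might be blended for in-between white points - DNG-style processing interpolates in inverse color temperature; the matrices below are just identity placeholders:

    import numpy as np

    M_tungsten = np.eye(3)   # placeholder camera -> ACES matrix for ~2850K
    M_daylight = np.eye(3)   # placeholder camera -> ACES matrix for ~6500K

    def blend_matrices(cct, cct_lo=2850.0, cct_hi=6500.0):
        # Weight by inverse correlated color temperature (mired-like space).
        cct = np.clip(cct, cct_lo, cct_hi)
        w = (1.0 / cct - 1.0 / cct_hi) / (1.0 / cct_lo - 1.0 / cct_hi)
        return w * M_tungsten + (1.0 - w) * M_daylight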
It's a huge job, and with my limited resources I can only really do it to the level of matching reflectance from 24 color patches (i.e. a MacBeth color chart) under controlled lighting. It will not be perfect either, because I don't have a color lab with a monochromator, spectroradiometer, spectrophotometer etc. The best I can do is buy a copy of the ISET Camera Simulator for Matlab and rent the cameras.

The first job is to produce a better set of default color matrices for ML raw, because even the Adobe ones have flaws (this may be by design, but some things simply don't add up as they should) - I'm working on that bit at the moment and will then get back to developing the ACES IDTs. Even if/when I do have a set, they will still need approval for inclusion in the ACES source code, and then it's up to app devs to include them in things like Resolve. I have hope but I honestly don't know if that will even happen - probably not, but the matrices will still be usable with OpenColorIO and can be baked into LUTs so that we can edge closer to ACES color.
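
For what it's worth, the chart-matching step above boils down to a least-squares fit, something along these lines (assuming numpy, with linear patch averages as input):

    import numpy as np

    def fit_matrix(camera_rgb, target_rgb):
        """camera_rgb, target_rgb: arrays of shape (24, 3) holding the
        linear values averaged from each chart patch. Returns the 3x3
        matrix M such that camera_rgb @ M.T best approximates target_rgb
        in a least-squares sense."""
        M_T, *_ = np.linalg.lstsq(camera_rgb, target_rgb, rcond=None)
        return M_T.T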