There's one thing I've never understood.
If cameras are limited to 21 MB/s and the card speed is the bottleneck, why go the JPEG way? I mean, if the camera can capture 26 RAW frames in about 1 second, why not capture the same frames as JPEGs at 2K (to get the best resolution), so that you never go over 21 MB/s in any given second? Of course, this assumes the camera software could treat it like video instead of being limited by the burst mode for stills!
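Some back-of-envelope numbers behind this idea. All figures here are assumptions for illustration, not measured camera specs: a "2K" frame of 2048 x 1080 pixels, 14-bit RAW, and a rough 10:1 JPEG compression ratio.

```python
# Rough data-rate arithmetic; every constant below is an assumption.
WIDTH, HEIGHT = 2048, 1080      # assumed "2K" frame
BITS_PER_PIXEL_RAW = 14         # assumed RAW bit depth
FPS = 24
CARD_SPEED_MB_S = 21            # the card-speed limit from the question

raw_frame_mb = WIDTH * HEIGHT * BITS_PER_PIXEL_RAW / 8 / 1e6  # ~3.9 MB/frame
raw_rate = raw_frame_mb * FPS                                 # ~93 MB/s
jpeg_rate = raw_rate / 10                                     # assumed 10:1 ratio

print(f"RAW:  {raw_rate:.0f} MB/s vs card limit of {CARD_SPEED_MB_S} MB/s")
print(f"JPEG: {jpeg_rate:.1f} MB/s")
```

Under these assumptions the JPEG stream would indeed fit comfortably under the card's write speed, which is exactly why the bottleneck has to be somewhere other than the card.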
Also, wouldn't it be amazing if the HDMI output could be used to transfer data? Every camera would become a RED!
Sandro,
You need to spend some time on the General Development part of the forum and start reading some of the more technical posts.
The answer to your question is the CPU. There just isn't enough processing power in these cameras to convert 24 frames per second to JPEG, not even close.
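To put that in perspective, a quick sketch of the per-frame budget, again using an assumed ~2.2 MP "2K" frame (2048 x 1080) rather than any real camera's specs:

```python
# Per-frame CPU budget at 24 fps; the frame size is an illustrative assumption.
FPS = 24
MEGAPIXELS = 2048 * 1080 / 1e6          # ~2.2 MP per frame

budget_ms = 1000 / FPS                  # time available to encode each frame
throughput = MEGAPIXELS * FPS           # pixel rate the encoder must sustain

print(f"{budget_ms:.1f} ms per frame, {throughput:.0f} MP/s sustained")
```

So the hypothetical encoder would have only about 42 ms per frame, sustained indefinitely, with the camera's other tasks running at the same time. That's the budget the camera's CPU can't meet.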
As for the HDMI output: HDMI is a fixed standard, implemented by controller chips that obey that standard. It is not a faucet you can pipe anything you want through. Raw data, JPEG, and other file formats are simply incompatible with the HDMI signal and can't be transferred over it. Furthermore, you would need a recording device that could understand what kind of data is being sent, and no such device exists.
If you want a camera that is affordable, records in high resolution, and is stable and reliable with the fewest compromises, you should look into what Blackmagic has to offer.