I tried using hot pixel fixes with dcraw and never got anywhere. As dfort pointed out, the pixels aren't predictable (at least I couldn't predict them), and the ones that can fire cover almost the entire frame, if I remember correctly. For me, the problem is solved because our eyes' lack of sensitivity to color allows a significant amount of color information to be removed before we notice it; that's the whole idea behind 4:2:0 compression. At first, that answer wasn't good enough for me. I hated the idea of losing, or doubling up on, data points the way video compression does, so even when Alex or g3gg0 came out with chroma smoothing I resisted it.
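To make the 4:2:0 / chroma-smoothing point concrete, here's a rough sketch of the idea: keep a suspect pixel's luma but replace its chroma with the average of its neighbours. This is only an illustration of the principle, not Magic Lantern's or dcraw's actual focus-pixel code; the function name and the simple 3x3 average are my own.

```python
# Illustration only (not ML's algorithm): hide a bad pixel by fixing its
# chroma while keeping its luma, exploiting our weaker color acuity.
import numpy as np

def smooth_chroma_at(ycc, y, x):
    """ycc: HxWx3 float array in Y/Cb/Cr order; (y, x): an interior suspect pixel."""
    out = ycc.copy()
    neigh = ycc[y-1:y+2, x-1:x+2, 1:].reshape(-1, 2)  # 3x3 block, chroma channels only
    neigh = np.delete(neigh, 4, axis=0)               # drop the suspect pixel itself
    out[y, x, 1:] = neigh.mean(axis=0)                # luma (channel 0) is left untouched
    return out
```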
Now it doesn't bother me, because my primary love of RAW video is that it lets me set each pixel's 8-bit value (even if smoothed) from a 14-bit source. One of my biggest curiosities is why I can never get 8-bit video to have the same DR feel as RAW. Even with the EOS-M at sub-720p, the RAW has a more nuanced look than, say, 4K from a GH4 or A7 (I've tried just about every consumer camera you can think of). How can that be? The only cameras that seem to approach the RAW "nuance" start with the Canon C100 or Sony FS5, both close to $5,000. My guess, until someone with more knowledge can explain it to me, is that in-camera exposure is always a tiny bit off, and once it is written to an 8-bit (24-bit full color) space, even at 4K, there is no way to re-adjust the exposure without seriously degrading the DR feel.
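Here's a quick back-of-the-envelope test of that guess (my own sketch, ignoring gamma to keep the arithmetic simple): push exposure up one stop on a gradient that was already baked into 8 bits, versus pushing the 14-bit data first and encoding to 8 bits afterwards, then count how many distinct output levels survive.

```python
# Counting surviving tonal levels after a +1 stop push (linear values, no gamma).
import numpy as np

ramp14 = np.arange(0, 16384)              # ideal 14-bit gradient (0..16383)
ramp8  = ramp14 >> 6                      # the same gradient after an 8-bit encode

# +1 stop = multiply linear values by 2, then clip to the container's maximum
pushed_raw   = np.clip(ramp14 * 2, 0, 16383) >> 6   # push in 14-bit, encode to 8-bit after
pushed_baked = np.clip(ramp8 * 2, 0, 255)           # push the already-baked 8-bit values

print(len(np.unique(pushed_raw)),   "levels when pushed in raw")     # 256
print(len(np.unique(pushed_baked)), "levels when pushed in 8-bit")   # 129 (gaps, banding)
```

Pushing the baked 8-bit file leaves only about half of the 256 levels populated, which is at least consistent with the flatter, less nuanced feel I'm describing.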
If the above is right, everyone's effort on 10-bit is game-changing to me. Initial tests I've done, and other videos I've looked at, show that even 10-bit RAW is vastly superior to any in-camera video out there, even 4K. 10-bit not only allows higher resolutions, it also reduces file size (rough numbers below). A focus pixel removed through crude chroma smoothing will probably not be visible to anyone who puts a priority on DR. Video shot in 10-bit RAW, I wager, will provide a feel that even 8K will not. Thoughts?
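For what it's worth, here's the rough size math behind the "smaller files" claim. The 1736x976 frame size is just an assumption for illustration; actual raw frame sizes vary per camera and mode.

```python
# Rough data-rate math for uncompressed raw at different bit depths (assumed frame size).
width, height, fps = 1736, 976, 23.976

def mb_per_sec(bits_per_pixel):
    return width * height * bits_per_pixel * fps / 8 / 1e6

print(f"14-bit: {mb_per_sec(14):.1f} MB/s")   # ~71 MB/s
print(f"10-bit: {mb_per_sec(10):.1f} MB/s")   # ~51 MB/s
```

That's roughly a 29% saving (10/14 of the data), before any of the other tricks the developers are working on.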