ffmpeg provides 16-bit RGB and 4:4:4 rawvideo YCC, described as rgb48be/rgb48le and yuv444p16be/yuv444p16le, the be/le suffix indicating whether the least significant byte comes first or not. They carry whatever colour space, gamut and transfer (linear or gamma-encoded) you prefer. A bit academic really, as it comes down to what the grading app or NLE supports.
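For what it's worth, a minimal sketch of pulling one of those formats into something you can poke at, assuming ffmpeg is on the path and that `input.mov` plus the frame size are stand-ins for your own material:

```python
import subprocess
import numpy as np

# Assumed inputs -- substitute your own file and its real dimensions.
SRC, W, H = "input.mov", 1920, 1080

# Ask ffmpeg to decode to 16-bit little-endian RGB rawvideo on stdout.
proc = subprocess.run(
    ["ffmpeg", "-v", "error", "-i", SRC,
     "-frames:v", "1",              # just the first frame for this sketch
     "-pix_fmt", "rgb48le",         # 16 bits per channel, LSByte first
     "-f", "rawvideo", "-"],
    stdout=subprocess.PIPE, check=True)

# Interpret the byte stream: little-endian uint16, H x W x 3 (R, G, B).
frame = np.frombuffer(proc.stdout, dtype="<u2").reshape(H, W, 3)
print(frame.dtype, frame.shape, frame.max())
```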
My understanding is that raw isn't even in a colour space yet, and its 14 bits aren't "per channel" in the way 10-bit-per-channel YCbCr ProRes or 16-bit-per-channel RGB48 are, since each photosite holds a single value rather than three. 4:4:4:4 is with alpha, 4:4:4 is without.
When the raw data is 'developed' into more usable RGB it can be matrixed into a colour space such as sRGB, Adobe RGB, ProPhoto or XYZ, either baked in on export or interpreted on the fly in a colour-managed raw handling app; all subsequent colour processing is then done on RGB data, not on the raw.
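As a rough sketch of what that 'matrixing' amounts to: the camera-to-XYZ matrix would come from the raw converter's camera profile (the identity below is only a placeholder), followed by the standard published XYZ-to-linear-sRGB matrix and the sRGB transfer function.

```python
import numpy as np

# Hypothetical camera->XYZ matrix: in practice this comes from the raw
# converter's profile for the specific camera; identity here is a placeholder.
CAM_TO_XYZ = np.eye(3)

# Published XYZ (D65) -> linear sRGB matrix (IEC 61966-2-1).
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def develop_to_srgb(cam_rgb):
    """Matrix linear, white-balanced camera RGB (H x W x 3, 0..1) into sRGB."""
    xyz = cam_rgb @ CAM_TO_XYZ.T
    lin = np.clip(xyz @ XYZ_TO_SRGB.T, 0.0, 1.0)
    # sRGB transfer function (gamma encode).
    return np.where(lin <= 0.0031308,
                    12.92 * lin,
                    1.055 * np.power(lin, 1 / 2.4) - 0.055)
```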
Only a small proportion of the operations involved in creating a 'grade' or 'look' are actually done on raw data, essentially just the raw-to-RGB conversion at the best bit depth the app can muster. From then on most ops are done on RGB data, preferably in Lab.
The benefit of importing raw over 16bpc intermediates is control over WB, the debayer algorithm etc., and avoiding all the intermediate storage.
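To illustrate the WB point: on raw data, white balance is just per-channel gains applied to the mosaic before any debayering, which is exactly the control you give up once the file has been baked to RGB. A minimal sketch, assuming an RGGB layout and made-up gains rather than ones read from metadata:

```python
import numpy as np

def white_balance_rggb(mosaic, r_gain, b_gain):
    """Apply white-balance gains directly to an RGGB Bayer mosaic (2-D array),
    i.e. before debayering.  Gains here are illustrative, not from metadata."""
    out = mosaic.astype(np.float32).copy()
    out[0::2, 0::2] *= r_gain   # R sites (even rows, even cols in RGGB)
    out[1::2, 1::2] *= b_gain   # B sites (odd rows, odd cols)
    return out                  # green sites left at unity gain
```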
**EDIT** Posted at the same time as pavelpp. The short answer is application support: Avisynth is the only tool I know of that will handle those ffmpeg formats, and plugin support is limited, the Dither tools being one example.