Thank you for your answers and suggestions.
Planetary imaging is quite possible in video mode: the 60D(a) can record video at 640x480 pixels using 640x480 "real" pixels at the center of the CMOS. Because the pixels are read 1:1, with no resizing, image quality is excellent and you can record faint details on planetary surfaces. The drawback is that you use only 640x480 pixels of a 5184x3456-pixel sensor.
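To put numbers on it: 640x480 is about 0.3 Mpixels out of the sensor's roughly 17.9 Mpixels, and the crop covers only about 12% of the frame width (640/5184) and 14% of its height (480/3456).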
As the Sun and Moon appear far larger from Earth than the planets, astrophotographers try to use the whole sensor surface to picture the Sun or the Moon in full. Using the 640x480 video mode means taking multiple videos of parts of the surface, processing each video into a good picture, then stitching the pictures together to get the full surface. In the case of the Sun, where events (like a solar flare) evolve in minutes, pictures taken at different times look different and are therefore impossible to stitch. Using a 1080p video captures the surface in full, but the readout is scaled down to 1920x1080, so a lot of fine detail is lost.
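Rough arithmetic (assuming the 1080p mode reads the full sensor width and scales it down to 1920 pixels): 5184 / 1920 ≈ 2.7, so each 1080p pixel stands in for almost three sensor pixels in each direction, instead of the 1:1 readout of the 640x480 mode.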
So currently the trick is to shoot continuously until the buffer is full, then repeat the process several times, hoping that one of the pictures in the sequence was taken through steady air. It is a sort of low-fps, short, high-resolution burst... with random results.
A way to improve the yield of good pictures is to monitor air turbulence. It is possible to do so in liveview, but by the time your brain decides the picture is clear enough and orders your finger to press the shutter button... it's always too late. So you try to guess when the air will become steady. Experience proves that guessing is not an exact science.
But if the detection can be done electronically or via a piece of brilliant code, and the shot is taken within the next quarter of a second, then the yield of good pictures will improve greatly and will leave the astronomer's hands free for other concurrent activities.
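To make the idea concrete, here is a minimal sketch of what such a detector could look like. It is not camera firmware and not an existing Magic Lantern feature: it assumes the liveview image reaches a PC as an ordinary video source (device index 0, e.g. an HDMI capture dongle), uses OpenCV's variance-of-Laplacian as a crude stand-in for a seeing metric, and the window, margin and trigger callback are placeholders to adapt:

import cv2
import numpy as np

def sharpness(gray):
    # Variance of the Laplacian: high when fine detail is visible (steady air),
    # low when turbulence smears the image.
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def watch_and_trigger(trigger, window=30, margin=1.15):
    cap = cv2.VideoCapture(0)  # hypothetical liveview source on the PC
    history = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        s = sharpness(gray)
        history = (history + [s])[-window:]
        # Fire when the current frame is clearly sharper than the recent
        # average, i.e. during a momentary lull in the turbulence.
        if len(history) == window and s > margin * np.mean(history):
            trigger()
            break  # one shot per run, to keep the sketch simple
    cap.release()

The trigger() callback could, for example, toggle a USB remote release or call gphoto2 to fire the shutter, and evaluating one liveview-sized frame this way takes only milliseconds on a modern PC, so the shot would fire well within the quarter of a second mentioned above.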
I hope I haven't bored you too much with all these details.
