Should some of us consider changing from shooting 1920x1080 24p to 1280x720 60p? (That is of course if you can't shoot anything larger in 60p.)
The other day I saw a presentation by Randy Ubillos. He's the guy who coded the original versions of Adobe Premiere and went on to Apple to write Final Cut Pro and FCP-X. He recently retired from Apple and is giving talks about traveling and shooting movies for fun. He advocates shooting at 60fps.
This got me thinking. I come from a film background, which has been stuck in a 24fps playback world. (Ok, now we have higher frame rate playback, but many viewers hate it because the experience is like watching live television.) In any case, what I'm wondering about is the practicality of shooting 60p and finishing in 24p.
One second of 1920x1080 24p video consists of 49,766,400 pixels, while one second of 1280x720 60p video flashes 55,296,000 pixels at the viewer. Ok, I know that technically video runs at 23.98, but the DCPs and theatrical projectors I have worked with were in a studio that set everything up for a 24fps workflow, so I'll stick with that.
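Those per-second pixel counts are just width times height times frames per second; a quick sanity check:

```python
def pixels_per_second(width, height, fps):
    """Total pixels displayed per second of video."""
    return width * height * fps

print(pixels_per_second(1920, 1080, 24))  # 49766400 (1080p at 24fps)
print(pixels_per_second(1280, 720, 60))   # 55296000 (720p at 60fps)
```

So 720p at 60fps actually pushes about 11% more pixels per second at the viewer than 1080p at 24fps does.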
I'm using these numbers because these are the limits of my Canon camera running ML (Magic Lantern), but stay with me, because this also applies to 4K video at 24p vs. HD video at 60p. And it gets even more interesting with higher frame rates and smaller image sizes, like the iPhone 6's 568x320 240fps mode.
So, starting with 1280x720 60p camera-original material: first upscale it to 1920x1080 using a method that interpolates new pixels rather than doing a simple upscale, then change the frame rate using a method that involves a form of image stacking to increase the resolution. This should result in video that is fairly close in quality to video shot in 1920x1080 24p, with the advantage of having more control over shutter angle and motion blur, plus the option to conform to slow motion, albeit at a lower resolution. In addition, it should work even better when the camera or subject is moving, as opposed to shooting locked down on a tripod, for reasons better explained in this article:
http://petapixel.com/2015/02/21/a-practical-guide-to-creating-superresolution-photos-with-photoshop/

Note that image stacking is nothing new; it has been used in astrophotography to reduce noise and in macro photography to increase depth of field and resolution.
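Here is a minimal sketch of that two-step recipe, using plain NumPy on grayscale frames. The function names and the choices of bilinear interpolation for the upscale and a simple average of neighboring 60p frames for the stack are my assumptions for illustration; real superresolution tools also align frames with sub-pixel accuracy before stacking (which is why camera or subject motion helps), and this sketch skips that step:

```python
import numpy as np

def bilinear_upscale(frame, new_h, new_w):
    """Upscale a 2-D (grayscale) frame by interpolating new pixels
    bilinearly, rather than just repeating existing ones."""
    h, w = frame.shape
    ys = np.linspace(0, h - 1, new_h)       # sample positions on the
    xs = np.linspace(0, w - 1, new_w)       # original pixel grid
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                 # vertical blend weights
    wx = (xs - x0)[None, :]                 # horizontal blend weights
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def stack_60p_to_24p(frames_60p, frames_per_stack=2):
    """For each 24p output frame, average the nearest 60p frames --
    a crude image stack. Averaging N aligned frames also cuts random
    noise by roughly sqrt(N), which is the astrophotography trick."""
    n_out = int(len(frames_60p) * 24 / 60)
    out = []
    for i in range(n_out):
        center = i * 60 / 24                # position on the 60p timeline
        lo = int(round(center))
        out.append(np.mean(frames_60p[lo:lo + frames_per_stack], axis=0))
    return out
```

In practice you would run `bilinear_upscale` on every 60p frame first (720p to 1080p) and then feed the upscaled frames to `stack_60p_to_24p`; the averaging step is where the extra temporal samples get traded for spatial quality.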
Also note that while manufacturers are pushing for higher resolution, there have been many features shot in standard-definition video that have stood the test of time, so we shouldn't fear shooting at smaller image sizes. Over 10 years ago I worked on "A Day Without A Mexican," which was shot on Panasonic DVX100 cameras set to 24p (with the so-called advanced pulldown) and blown up to 35mm. We actually had several comments asking which scenes were shot on video and which were shot on film--it was all shot on video.
Besides image stacking there are other ways to make a better upscale. Adobe has a rather new Detail-preserving Upscale effect and here is a link to a paper on "Super-Resolution from a Single Image" -- thanks to a1ex for the link:
http://www.wisdom.weizmann.ac.il/~vision/single_image_SR/files/single_image_SR.pdf

Disclaimer--I don't have any formal technical training, nor did I absorb all that much from the conversations I had with engineers over my past 20+ years working as an assistant editor/editor/DI conformist, etc., so my musings might be completely impractical.