Workflow: 30fps HDR recording to 30fps HDR MPEG (Linux, bash)

Started by Maarten., July 18, 2012, 10:17:43 PM


Maarten.

This workflow generates a 30fps movie from a 30fps recording. It's pretty CPU intensive and is at the moment not that useful for long clips. The script is set to process only the first 15 frames, to test the procedure, and can of course be set to the full length. The process could be made multi-threaded if it proves to be useful :-)  If anybody has suggestions, I'm glad to hear them!

The script needs the following applications installed to work:

mplayer, for extracting the frames
mencoder, for merging the frames to a movie
align_image_stack, for aligning the dark and light frames; this is the CPU-intensive part
enfuse, for generating the HDR image
mogrify, for cropping the image; this should be handled by enfuse, but a bug prevents that from working
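
If you are on a Debian/Ubuntu-style system, something along these lines should pull them all in (the package names are a guess on my part and may differ per distribution and release; align_image_stack comes with the Hugin tools):

sudo apt-get install mplayer mencoder hugin-tools enfuse imagemagick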

This is a test movie I made with it, shot with a 50D:

http://www.youtube.com/watch?v=2_9zwnY-95Q

This is a visual of my workflow:



This is the bash script. Make sure you test it in an empty directory containing only the script and the .mov file.


#!/bin/bash

# Usage: vid2hdr.sh namevid.mov

alignFrames() {
noframes=`ls *.png | wc -l`
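# The recording alternates dark and bright exposures. Each pair (i, i+1) is
# aligned twice below, once anchored on each frame, so every input frame ends
# up with its own aligned A/B pair and the output keeps the full 30fps instead
# of dropping to 15fps.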

for i in `seq 1 2 $noframes`; do
in1=`printf "%08d.png" "$i"`
in2=`printf "%08d.png" "$((i+1))"`
out1a=`printf "%08d_A.tif" "$i"`
out1b=`printf "%08d_B.tif" "$i"`

out2a=`printf "%08d_A.tif" "$((i+1))"`
out2b=`printf "%08d_B.tif" "$((i+1))"`

align_image_stack -a first_frame $in1 $in2
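# -a sets the output prefix; the two aligned frames come out as
# first_frame0000.tif and first_frame0001.tif and are moved into ./align below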
mv first_frame0000.tif ./align/$out1a
mv first_frame0001.tif ./align/$out1b

align_image_stack -a second_frame $in2 $in1
mv second_frame0000.tif ./align/$out2b
mv second_frame0001.tif ./align/$out2a

done
}

myCropFrames() {
noframes=`ls ./align/*.tif | wc -l`
noframes=$(($noframes/2))  # divide by two: each frame has an A and a B tif
for (( n=1; n<=$noframes; n++ )); do
in1=`printf "./align/%08d_A.tif" "$n"`
in2=`printf "./align/%08d_B.tif" "$n"`
echo "cropping frame $out from $in1 and $in2 ..."
#w=1920:h=1080
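# ImageMagick crop geometry is WxH+Xoffset+Yoffset: keep a 1820x1024 window
# starting 50px from the left and 28px from the top of the 1920x1080 frame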
mogrify -crop 1820x1024+50+28 $in1    # crop, but maintain aspect ratio
mogrify -crop 1820x1024+50+28 $in2
done
}

myHdrFrames() {
noframes=`ls ./align/*.tif | wc -l`
noframes=$(($noframes/2))  # divide by two: each frame has an A and a B tif
for (( n=1; n<=$noframes; n++ )); do
in1=`printf "./align/%08d_A.tif" "$n"`
in2=`printf "./align/%08d_B.tif" "$n"`
out=`printf "./align/hdr/%08d.jpg" "$n"`
echo "creating HDR frame $out from $in1 and $in2 ..."
#enfuse -f 1820x1024+50+28 --compression=95 -o $out $in1 $in2  #Doesn't crop, bug...
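# enfuse does exposure fusion: it blends the well-exposed parts of the two
# frames directly, so no separate HDR merge or tone-mapping step is needed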
enfuse --compression=95 -o $out $in1 $in2
done
}

encodeVideo() {
vid_input_width=1820
vid_input_height=1024
vid_output_width=1280
vid_output_height=720

# 2 stage encoding with mpeg4 codec, settings from mencoder wiki
# optimal_bitrate = (40..60) * 25 * width * height / 256
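# e.g. at the high end for the full 1920x1080 source: 60 * 25 * 1920 * 1080 / 256 = 12150000,
# which is the vbitrate value used below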
opts="vbitrate=12150000:mbd=2:keyint=132:v4mv:vqmin=3:vlelim=-4:vcelim=7:lumi_mask=0.07:dark_mask=0.10:naq:vqcomp=0.7:vqblur=0.2:mpeg_quant"
#w=1920:h=1080
mencoder mf://./align/hdr/*.jpg -mf w=$vid_input_width:h=$vid_input_height:fps=30:type=jpg -oac copy -ovc lavc -vf scale=$vid_output_width:$vid_output_height -lavcopts vcodec=mpeg4:vpass=1:$opts -o /dev/null
mencoder mf://./align/hdr/*.jpg -mf w=$vid_input_width:h=$vid_input_height:fps=30:type=jpg -oac copy -ovc lavc -vf scale=$vid_output_width:$vid_output_height -lavcopts vcodec=mpeg4:vpass=2:$opts -o output.mpeg
}

cleanup() {
rm -rf align
mkdir align
mkdir align/hdr
rm *.tif
rm *.png
rm *.jpg
}

cleanup                            # Be sure this is what you want ;-)
mplayer -frames 15 -vo png:z=1 $1  # process only 15 frames, for testing
#mplayer -vo png:z=1 $1            # Process all frames, takes a while ;-)
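# (-vo png:z=1 dumps every decoded frame as a numbered PNG; z is the zlib
#  compression level, so 1 means fast writes and larger files)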
alignFrames
myCropFrames
myHdrFrames
encodeVideo
#cleanup
exit

scrax

Will try it on OS X. I've got all the requirements installed, but I need to make a test video first :)

The one posted by Ro-Man on Vimeo is similar and worked on OS X.
I'm using ML2.3 for photography with:
EOS 600DML | EOS 400Dplus | EOS 5D MLbeta5- EF 100mm f/2.8 USM Macro  - EF-S 17-85mm f4-5.6 IS USM - EF 70-200mm f/4 L USM - 580EXII - OsX, PS, LR, RawTherapee, LightZone -no video experience-

ilguercio

I've got the 50D as well, but I don't actually have Linux installed right now, and anyway I suck at it so... :(
Canon EOS 6D, 60D, 50D.
Sigma 70-200 EX OS HSM, Sigma 70-200 Apo EX HSM, Samyang 14 2.8, Samyang 35 1.4, Samyang 85 1.4.
Proud supporter of Magic Lantern.

menoc

Maarten, I have a 50D. Will this work on a Mac with OSX 7?

Maarten.

Maybe scrax can tell if it works on a Mac or not. I have very little experience with Macs, and do not own a Macintosh.

But you do need to know a little bit about bash scripting for this workflow to work for you, because it is a work in progress.


Michael Zöller

Thanks Maarten. I think this could prove to be very useful. It only uses open source tools, should be portable, and is completely automated (except maybe for setting proper in/out resolutions and codec requirements). I cannot test this right now, but I hope to at the end of the week.
neoluxx.de
EOS 5D Mark II | EOS 600D | EF 24-70mm f/2.8 | Tascam DR-40

jordancolburn

Thanks for the awesome script, I'll have to give it a try tonight. 

How important is the image alignment?  A few of the other workflows seem to do a simple split and then HDR-blend the frames without alignment.  If the motion during the time between frames (1/60 s) isn't too much, I wonder if you could save some processing time and make the script more useful for longer clips.

Thanks again!


Maarten.

Well, the alignment part is essential for two reasons. One, the quality of the HDR frame will be significantly better. Two, the frame rate of your movie is not halved. If you are shooting at 50/60 fps the problem might not be that big, but with my 50D I end up with 15fps when using the conventional way. The penalty you have to pay is the time it takes to render...

If you make it multi-threaded, it will be faster depending on the number of cores and the processing power you have, but it will still be time consuming.

On Windows anything is possible, but I haven't heard from anyone who found it usable after trying it. So maybe there is a better way.

I've stopped using the video function on my 50D though, because the battery drain is significant and I worry about burning out my camera.


Maarten.

Burning is maybe a bit exaggerated :), but the battery runs out of juice pretty quickly. Where does the power go, and why doesn't the 50D have native filming support....

ilguercio

Quote from: Maarten. on September 06, 2012, 09:14:22 PM
Burning is maybe a bit exaggerated :), but the battery runs out of juice pretty quickly. Where does the power go, and why doesn't the 50D have native filming support....
Because it would have spoiled 5DII sales.
If you had two cameras to sell, would you give the same amazing feature to the less expensive one?
There's no reason why the 50D can't do video; in fact, my battery lasted a bit more than an hour during a test.

jordancolburn

I gave it a go the other night and got some very nice looking video (a little dark, but that might have been more my source footage).  I plan on playing around with this a lot more, as well as scripting other video tasks such as creating timelapses.

scrax

I'm testing this on OS X right now, and planning to maybe add it to MLTools.

EDIT: Confirmed working on OS X too.

PhilFree

I wonder why this topic does not seem to be popular. Is there a better/faster way on Linux, or is no one using Linux for processing? I've tried the script on Linux and I am satisfied with the results. Thanks to the author of the original script. The only disadvantage is the processing speed.
I modified it to use ffmpeg instead of mplayer/mencoder, since the latter were not available on Ubuntu 15.10, and the output is now in a lossless format (FFV1). The current code looks like this:


#!/bin/bash

# Usage: vid2hdr.sh namevid.mov
# Based on http://www.magiclantern.fm/forum/index.php?topic=1477.0

alignFrames() {
noframes=`ls frames/*.png | wc -l`

for i in `seq 1 2 $noframes`; do
in1=`printf "frames/%08d.png" "$i"`
in2=`printf "frames/%08d.png" "$((i+1))"`
out1a=`printf "%08d_A.tif" "$i"`
out1b=`printf "%08d_B.tif" "$i"`

out2a=`printf "%08d_A.tif" "$((i+1))"`
out2b=`printf "%08d_B.tif" "$((i+1))"`

align_image_stack -a first_frame $in1 $in2
mv first_frame0000.tif ./align/$out1a
mv first_frame0001.tif ./align/$out1b

align_image_stack -a second_frame $in2 $in1
mv second_frame0000.tif ./align/$out2b
mv second_frame0001.tif ./align/$out2a

done
}

myCropFrames() {
noframes=`ls ./align/*.tif | wc -l`
noframes=$(($noframes/2))  # divide by two: each frame has an A and a B tif
for (( n=1; n<=$noframes; n++ )); do
in1=`printf "./align/%08d_A.tif" "$n"`
in2=`printf "./align/%08d_B.tif" "$n"`
echo "cropping frame $out from $in1 and $in2 ..."
#w=1920:h=1080
mogrify -crop 1820x1024+50+28 $in1    # crop, but maintain aspect ratio
mogrify -crop 1820x1024+50+28 $in2
done
}

myHdrFrames() {
noframes=`ls ./align/*.tif | wc -l`
noframes=$(($noframes/2))  # divide by two: each frame has an A and a B tif
for (( n=1; n<=$noframes; n++ )); do
in1=`printf "./align/%08d_A.tif" "$n"`
in2=`printf "./align/%08d_B.tif" "$n"`
out=`printf "./align/hdr/%08d.png" "$n"`
echo "creating HDR frame $out from $in1 and $in2 ..."
#enfuse -f 1820x1024+50+28 --compression=95 -o $out $in1 $in2  #Doesn't crop, bug...
#enfuse --compression=95 -o $out $in1 $in2
enfuse -o $out $in1 $in2
done
}

encodeVideo() {
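# FFV1 is a lossless intra-frame codec, so this output can be graded or
# re-encoded later without further quality loss, at the cost of a large file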
rm -f "$output_filename"
ffmpeg -start_number 1 -i ./align/hdr/%08d.png -vcodec ffv1 "$output_filename"
}

cleanup() {
rm -rf align
mkdir align
mkdir align/hdr
rm -rf frames
mkdir frames
}

output_filename=${1%.*}_hdr_ffv1.avi
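# (${1%.*} above strips the input extension, e.g. clip.mov becomes clip_hdr_ffv1.avi)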

echo === Cleaning up ===
cleanup                                     # Be sure this is what you want ;-)
echo === Extracting frames ===
ffmpeg -i $1 -t 00:00:01.0 "frames/%08d.png"  # process only first second of the video
#ffmpeg -i $1 "frames/%08d.png"               # process all frames, takes a while

echo === Aligning frames ===
alignFrames
echo === Cropping frames ===
myCropFrames                               
echo === HDRing frames ===
myHdrFrames
echo === Encoding video ===
encodeVideo
echo === Cleaning up ===
cleanup
exit


There's still a lot to improve:
- multithreading - I have no idea how to implement this (a rough sketch of one possible approach is below)
- create fewer temp files - it could keep only a few tif files and delete them as soon as the final output frame is created
- autocrop calculation - is there a way to get data from align_image_stack about the maximum offset that is encountered?
- any other approach instead of cropping - is there some temporal filter that could take data from the previous/next frame to avoid cropping?
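
On the multithreading point, here is a rough, untested sketch of one possible approach. The alignment loop is independent per frame pair, so it can be fanned out with GNU parallel (assuming it is installed). The alignPair helper and the per-pair output prefixes are my own invention for this sketch; the prefixes are needed because the fixed first_frame/second_frame temp names of the serial version would collide between concurrent jobs.


alignPair() {
    i=$1
    in1=$(printf "frames/%08d.png" "$i")
    in2=$(printf "frames/%08d.png" "$((i+1))")

    # forward pass: anchor on frame i
    align_image_stack -a "pair_${i}_fwd_" "$in1" "$in2"
    mv "pair_${i}_fwd_0000.tif" "$(printf "./align/%08d_A.tif" "$i")"
    mv "pair_${i}_fwd_0001.tif" "$(printf "./align/%08d_B.tif" "$i")"

    # reverse pass: anchor on frame i+1
    align_image_stack -a "pair_${i}_rev_" "$in2" "$in1"
    mv "pair_${i}_rev_0000.tif" "$(printf "./align/%08d_B.tif" "$((i+1))")"
    mv "pair_${i}_rev_0001.tif" "$(printf "./align/%08d_A.tif" "$((i+1))")"
}
export -f alignPair

noframes=$(ls frames/*.png | wc -l)
seq 1 2 "$noframes" | parallel -j "$(nproc)" alignPair {}


The crop and enfuse loops could be parallelised the same way, since each output frame only depends on its own A/B pair. On the autocrop point, align_image_stack also has a -C option that is supposed to crop to the area covered by all images, which might replace the fixed mogrify crop, but I have not checked how it behaves here.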

RelUnrelated

Thanks Maarten for the original script, and PhilFree for the modifications. I'm finally taking the jump into HDR video, though I've only done stills before. I'm going to take some test footage and play around with the scripts to see what works best for me. Multithreading would definitely be a plus. I've got an 8-core 4GHz CPU in my Linux box, and using more of its power would help!