Messages - dyfid

#51
Raw Video Postprocessing / Re: GPU/CUDA acceleration
August 10, 2014, 01:11:04 PM
Quote from: Midphase on August 09, 2014, 08:30:59 PM
I think for raw post, you really need a fairly fast machine with a dedicated GPU (like a MacBook Pro Retina), and Resolve which IMHO is still the fastest way to convert CDNG to Quicktime or whatever other formats.

Only as a guide and a quick test: on Resolve 11 I get 36fps on the Mac encoding to ProRes (any flavour), which includes forcing full-quality resizing from 1280x544 (550D DNGs) to 1920x1080 letterboxed and full-quality debayer, both at encode time. I get 15fps on the Mac with the same settings but going to QT H.264. I'd encode without the letterbox really.

The DNGs are straight from mlv_dump, with no CDNG conversion first. 23.976 source frame rate.
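
For reference, the extraction is just a plain mlv_dump call, something like this (M01-0001.MLV is only an example file name):

mlv_dump.exe --dng -o M01-0001_ M01-0001.MLV

That writes a numbered DNG sequence straight out of the MLV (plus a .wav if the clip has audio), with no CinemaDNG conversion in between.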

On the exact same machine (it's a Hackintosh) under Windows 8.1 I get 20fps going to H.264 from DNGs, with the same forced settings at encode time. There are no ProRes options on Windows, so it would have to be letterboxed 1920x1080 in DNxHD; I'd then use x264 from the DNxHD intermediate.

Doing the same with 550D H.264 source files instead of DNGs (obviously no forced high-quality resizing or debayer), I only get 15fps encoding to H.264. In other words, working from DNGs rather than decompressing/recompressing H.264 gives faster encode times, even with the high-quality debayer and resizing overhead of the DNGs.

Machine spec is a quad-core 3.9GHz, 32GB quad-channel RAM and a 4GB GTX 770. Mavericks on SSD, Windows 8.1 on HDD. A budget machine spec really. I was pulling the H.264 and DNGs from an external USB hard drive this time, but would use my RAID for a proper project.
#52
10-bit output from the Mini Monitor or a similar PCIe or Thunderbolt device is not really the most important part, IMHO. The strength of a Mini Monitor type device is that it bypasses the graphics card output interfaces and therefore avoids any levels scaling and OS-level ICC colour management, which needs to be kept out of the chain because Resolve doesn't use ICC profiles.

Resolve requires 3D LUTs for final calibration of the display, created by measuring the patches sent to Resolve's display output. If an ICC display profile is active in that chain, the 3D LUT calibration will be screwed up by the active ICC profile, which is even more of an issue on a Mac. Also, if the monitors are used for applications that do use ICC profiles for colour management, such as Lightroom or Photoshop, the ICC profile (its 1D gamma curve part) can load at boot or login as normal for the ICC-based apps without affecting Resolve, IF a Mini Monitor is used to bypass the active ICCs.

Not directly related to a dual monitor setup, but there are also factors like insufficient refresh rate support from computer monitors, or automatic switching between refresh rates depending on project settings, which a dedicated display fed from the Mini Monitor would support, along with native screen resolution rather than scaling to fit the GUI. Many computer monitors lack sufficient hardware controls to correct RGB separation and grey scale and to pull the display into Rec.709 before final calibration with the 3D display LUT in Resolve's Monitor LUT project options. That's fine if the computer monitor is decent and calibrates well.

But it can be better to use a couple of cheap 20-22" monitors for the GUI, fed by the graphics card, and spend the money instead on a larger dedicated display fed by a Mini Monitor over HDMI, even a decent 32-42" LED TV as the dedicated display if finances are tight, upgrading later.

TVs generally support more refresh rates and generally have better hardware calibration controls, although screen uniformity can be an issue and onboard TV colour management can be pretty poor, screwing with RGB separation, unless you pick a decent make and a screen size large enough to more easily see noise in the image at native resolution with no scaling artifacts. That's after switching all the motion smoothing and noise reduction off.

#53
Raw Video Postprocessing / Re: GPU/CUDA acceleration
August 08, 2014, 07:02:10 PM
The rendering part could be GPU accelerated, i.e. creating the actual image frame data from the DNGs; GPU debayer is perhaps not that common. But going from image frame to ProRes isn't rendering, it's encoding, which taxes the hardware differently and is most likely CPU-bound, so multiple cores and multithreaded encoders are going to perform better.

So the restriction of 1fps on your MacBook could be slow debayer to image frame, or a non-multithreaded ProRes encoder.
#54
Quote from: Francis Frenkel on August 07, 2014, 12:27:23 PM
Is it possible to use 2 monitors (with windows)?
One to "work" with tools, and one to vizualise (play back)

I got only one Graphic card... is it possible now ?

If yes : how ?

Francis

Best to avoid using the graphics card interfaces at all for the display you 'grade' on, and instead use a Blackmagic Mini Monitor: the PCIe card on Windows, or the UltraStudio Mini Monitor (Thunderbolt) on a Mac.

You can feed your GUI monitor or monitors with the DVI, HDMI or DisplayPort outputs from your graphics card (or multiple graphics cards), and even add a small monitor for the scopes.
#55
Quote from: reddeercity on July 26, 2014, 02:21:00 AM
If you are getting gamma shift and you have used your scopes to keep it in the Rec.709 Color space (16-235) The most likely problem is color correction on a un-calibrated monitor usually in full color (0-255) and gamma of 1.8 (default on Mac)

Whether you use restricted or full range for monitoring/scopes doesn't matter, as long as the display chain in Resolve is set to expect full or restricted accordingly; it could be limited-range RGB over DVI/SDI/HDMI, or 4:2:2 or 4:4:4, as long as the display can handle and accept that signal. It's quite acceptable, and preferred by many, to monitor full range, as long as limited-range encoding is chosen in Resolve's Deliver tab when encoding for final output (or full range if going to an intermediate). So full-range RGB on a Mac is no problem if that's the default for an Apple display.

Gamma defaults to 2.2 for me on Mavericks, not 1.8, and I've not been in there messing with those settings; it's pointless anyway.

Quote
Now if you have adjusted your display with a colorimeter or at least with the Mac Monitor calibrated software

1. Get the display into an acceptable state (as close to the 709 standard as possible) for profiling and 3D view LUT creation by using the display's hardware controls. If the display doesn't have decent controls, it may well be of limited use for calibration if it's drifting, and only really useful for the GUI.

2. Ensure no ICC profiles are interfering with the display's gamma. This is difficult to kill on a Mac.

EVZML is using Resolve so:

3. Use a probe (i1Display Pro) and suitable calibration software (HCFR to report, DispcalGUI to profile and build the LUT) to control the patch generator in Resolve. The probe reads patches from within the Resolve GUI, or preferably patches on an external display fed by a Blackmagic Mini Monitor to bypass any levels scaling through the video card interfaces (DVI, HDMI, DisplayPort).

4. Build a 3D LUT for the monitor based on Rec.709 primaries, D65 and BT.1886 gamma with a 2.4 power curve.

5. Put the 3D LUT (.cube) in Resolve's 3D Monitor LUT slot and set the Viewer LUT to the Monitor LUT (a minimal .cube sketch is shown below).

You now have a calibrated monitor and a colour-managed app. Resolve doesn't use ICC profiles, so all of that is redundant, including OS X's calibrate-by-eye guff.
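
To show how simple the monitor LUT itself is, a .cube file is just text. Below is a trivial hand-written identity example (2 points per axis, so it changes nothing), whereas a real calibration LUT out of DispcalGUI would use something like 33 or 65 points per axis with measured values:

TITLE "Example monitor LUT"
LUT_3D_SIZE 2
0.0 0.0 0.0
1.0 0.0 0.0
0.0 1.0 0.0
1.0 1.0 0.0
0.0 0.0 1.0
1.0 0.0 1.0
0.0 1.0 1.0
1.0 1.0 1.0

Drop a file like that into Resolve's LUT folder and it should show up in the 3D Monitor LUT list.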
   
Quote
Then turn off Automatic gamma adjustment in the ProRes Codec option , I assume your intermediate mov is ProRes .
I use Apple compressor for all my H264 and never had a gamma problem between PC & Mac , it's the same h264 I upload to Vimeo & YouTube.

Resolve's QT H.264 output on the Mac is incorrect; on Windows it isn't. But one thing I agree with you on is that going to an intermediate codec and then encoding H.264 etc. from that is the better route. Me, I use x264 and apps like Handbrake for that.

Personally, I wouldn't judge anything based on web, Vimeo and YouTube output viewed through a browser on a PC or Mac.
#56
Quote from: Levas on July 25, 2014, 03:33:52 PM
Isn't this issue caused by quicktime and the quicktime H.264 codec.
Quicktime always shows video's brighter then they are, uses higher gamma I believe.

Partly true: QT H.264 on the Mac is encoded incorrectly, while on Windows QT H.264 is encoded correctly. But again, it's important that the display chain is proven correct by at least basic calibration. There are a number of links in the display chain where levels and gamma can be mishandled, even giving results that look right after being mishandled, but only because the rest of the chain is also mishandling them and pushing them back; excessive banding can give that away.

Quote
Try this, import a jpg in quicktime, and now open the same jpg in finder.
The shadows in quicktime are brighter...

On Mavericks? That sucks. With QT becoming defunct and AV Foundation taking over (which is what FCPX is based on now, I believe), QT is only being used for codecs not supported by AV Foundation, with those being re-encoded to be compatible, which appears to be what's happening here. I wonder how good AVF will be.
#57
Quote from: jimmyD30 on July 25, 2014, 01:16:00 PM
I think a lot of us are dealing with this issue to some degree. If really going for professional work product, then calibrated external monitor with external vectorscope is necessary, but expensive.

The cost to calibrate a monitor or even an HDTV is not that much compared to the expense of software, camera upgrades and all the rest.

It's more about the time to read, learn and put calibration into practice. It's £160 for an X-Rite i1Display Pro probe and, if possible, £95 for a Blackmagic Mini Monitor interface, either Thunderbolt for Mac or PCIe for PC. Download HCFR and DispcalGUI, both free and very capable; DispcalGUI will also build a high-precision monitor LUT for Resolve, without which even Resolve is not colour managed.

The calibration, profiling and view LUT building will clear up many of the OP's issues above, apart from the various codec handling differences with QT between Windows and Mac, but at least we learn what is working right and what is just wrong, and avoid it or fix it.

Quote
Some additional things you can do without that stuff is to include watching on HDTV via HDMI (especially if TV is your final medium), but even different HDTVs will give you different looks. Also try setting color values using waveforms, histograms, and vectorscope with your computer monitor according to where your video will be viewed, web, tv, theater, etc., as each uses a different color space.

Why not just calibrate the display, whether TV or monitor, to as close to 709 as it's capable of, with a gamma and brightness that suit the room viewing conditions: 80cd/m2 for dim lighting, 100cd/m2 for brighter rooms? (709 has no specified gamma, unlike sRGB.) Scopes are for analysis and colour correction; grading is about aesthetics by eye.

The video standard for HD is 709, whether for a web browser, media streamer, theatre or wherever. SD resolutions are more dependent on where in the world you are, but even then generally still use 709 primaries. The only alternative to 709 for HD and higher resolutions, for the time being, would be P3 for cinema, but then really only when grading with a projector.
#58
There's a multitude of possible reasons for this, and issues between Mac and PC playback are very common. They split into three areas: display chain, media player handling and encoding.

For example: differences in levels handling and colour space caused by different physical interfaces (DVI and HDMI), hardware acceleration on or off, a calibrated display chain versus winging it, media player quirks, and wrongly encoded video levels.

A good place to start is to calibrate your screens and nail the display chain, make sure Resolve is properly set up, and encode to restricted range, so you know the output from Resolve is correct. Then go through the same calibration process with a media player that works properly and can be colour managed, and slowly work through the problems, eliminating them as you go.
#59
It's going to be a combination of many things in your display setup and player.

It sounds like your computer monitor is way out of whack gamma-wise, so you are driving up colour values to get anything like what you want, whereas a commercial film will be mastered on accurately calibrated displays.

You can bet your TV is out as well.

Really, you need to calibrate your displays; this will confirm many things in your setup, like DVI and HDMI levels and colour space handling, expected levels ranges, and whether your player is messing with it too.

Invest £160 in an X-Rite i1Display Pro probe, or borrow one if you can, then use HCFR and DispcalGUI (both free) to sort it all out. Plus patience and a bit of reading up.
#60
Quote from: Andy600 on July 20, 2014, 04:24:08 PM
@dyfid - you make some very valid points. A lut box is worth the investment if your monitor cannot load luts.

Only if you bypass the OS-level ICC profiles and colour management via something like a Mini Monitor, or prevent the ICC loading in the first place if using the graphics card output; otherwise the LUT box will probably not be getting a 'clean' signal.

LUT boxes are £300; I'd rather invest that in the 32"+ screen, use a BM Mini Monitor and a monitor LUT in Resolve to feed a dedicated 32" display, and use the GUI monitors for apps like Lightroom where sRGB would be the target. I can see the benefit of a LUT box if you're jumping between display targets.
#61
Quote from: morsafr on July 19, 2014, 07:55:18 PM
Thank you so much Andy, the result is impressive, very realistic colors + the benefits of Cinelog-C :)

Just one complementary question: what to change in the workflow if I want a sRGB output gamma instead of REC709?

For stills? Perhaps do the gamma tweak in an output node or outside of Resolve? Video would not be output with sRGB gamma.

Quote
I have several preset modes on my DELL U2711. I'm using the preset mode called "Custom color" to calibrate the monitor to its native gamut (perfect for photography).

Whilst 'Custom Color' would put your monitor into its native 'wider' gamut, the bottom line is that the purpose of that is to make the monitor more receptive to achieving 100% 709 coverage in the calibration and profiling process; there's absolutely no point in the wider gamut if it falls short of any other video or cinema standard. You need to rein it in to 709 and produce a monitor/viewer 3D LUT for Resolve from your profiling/LUT-creation software.

Resolve doesn't use ICC profiles. If you allow your OS to load an ICC profile at boot or login, which will only adjust gamma with a 1D LUT anyway (ICC colour management within the applications handles the rest), your Resolve session won't be colour managed for your display. If you choose to use a 3D LUT for monitoring in Resolve, you really don't want the ICC profile getting in the way adjusting gamma, because the monitor 3D LUT may well adjust gamma again and lead to unwanted issues with banding and jacked-up black levels.

If you can't prevent the ICC loading at boot or login (i.e. if you're on a Mac), then you would need to feed a monitor directly from a BM UltraStudio Mini Monitor, bypassing the graphics card output. Even on Windows or Linux a BM Mini Monitor is preferred. At worst, if you can't prevent the ICC profile affecting output, you would need to profile your monitor via a patch generator inside Resolve, either with Calman, Lightspace or DispcalGUI, so that the ICC profile's gamma mucking around is accounted for in your 3D LUT for monitoring in Resolve.

Quote
I also noticed a preset mode called "sRGB". Should I switched to it, create a new ICC profile with my i1 Display Pro and use this combination as my starting point for Resolve (and switching back to the other preset mode/ICC profile for Lightroom)?

Thank you so much for taking the time to answer all our questions!

Monitor presets are worthless unless you are regularly sending the monitor back for recalibration, because monitors drift and change over time; I'd be surprised if they're even accurate enough from the factory. Preferably, it's best to calibrate the monitor using whatever preset makes the display most receptive to achieving 709, and that would probably be 'Custom Color', plus whatever tweaks the monitor will allow with RGB gain and offset, backlight, contrast and brightness. Then profile it with the i1Display Pro.

But this only gets you close to the 709 gamut, which is only part of the goal. RGB separation (or more importantly, the minimum of RGB separation along the display curve), screen uniformity in terms of grayscale, native 1080p resolution without scaling, and monitor refresh rates (24, 25, 30, 50i and 60i) also play a part, for video that is. If your monitor only does 60Hz, duplicate frames are added and frame dropping is going to happen for all frame rates other than 30 and 60fps, messing with those smooth pans and motion in general.

This isn't directly relevant to you, as you're using Lightroom and doing photography too, but for others it's worth considering that it can be a waste of cash to buy an expensive 'wide' gamut 60Hz monitor thinking everything is wonderful and calibrated. It's better to buy a couple of budget monitors for the GUI, a BM Mini Monitor and a decent 32"+ LED TV (for refresh rates and native resolution) to go with the i1Display Pro, then at a later date invest in a better 32"+ display depending on finances. The Mini Monitor is also supported in the major NLEs etc., so it's not specific to Resolve.
#62
General Development / Re: SRM job memory buffers
July 12, 2014, 01:47:38 AM
Can anyone confirm what MB/s they're getting out of a 700D shooting raw with the SRM code and FW 1.1.3?

I now only get 20MB/s, the same as a 550D, whereas before, with FW 1.1.1 and the 700D Feb 14th alpha on the exact same card (SanDisk Extreme Pro 95MB/s), I was getting 40MB/s and continuous 720p. I can't get anywhere near that now.
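
As a rough sanity check on those numbers, assuming 1280x720 14-bit raw at roughly 24fps: 1280 x 720 x 14 / 8 is about 1.6MB per frame, times 24 is about 39MB/s, which is why around 40MB/s gave continuous 720p while 20MB/s has no chance without dropping resolution or bit depth.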
#63
General Development / Re: SRM job memory buffers
July 11, 2014, 12:49:46 PM
Quote from: mk11174 on July 11, 2014, 11:25:41 AM
Well there you go, that is your problem, you have to update canon firmware to 113, the new ports are 113 not 111, so go to canon and update your camera, then get the nightly and reflash with the FIR file in the nightly and your good to go with ML, and if you want the SRM just use the zip in this thread.

That was it,  :-[. 154MB is reserved, but it's now writing at 20MB/s instead of the 40MB/s I got with 1.1.1 and the Feb 14th ML, so I'm not getting many frames; I'm sure it was continuous at 720p before? SanDisk Extreme 95MB/s, exFAT via EOSCard.
#64
General Development / Re: SRM job memory buffers
July 11, 2014, 11:01:11 AM
Quote from: mk11174 on July 11, 2014, 10:12:29 AM
Of course, I already took care of the 700D, will upload in a bit for you.  :)

SRM version for 700D https://bitbucket.org/mk11174/magic-lantern/downloads/magiclantern-Nightly.2014Jul11.700D113.zip

By the way, I just checked the recent Nightly from here http://builds.magiclantern.fm/#/ and it boots fine. The nighties do not contain the patches for the SRM_Memory though, only these test builds in this thread have the srm_memory patch.

Thanks for the SRM build, but hmm, still no good. With the Feb 14th build on the card the 700D boots; when I put your build on the card, the battery flashes on/off as soon as the card is detected and it won't boot, and I have to pull the battery with the card out to reset it before it will boot again at all, even with no card in. The Canon firmware is 1.1.1, but the Feb 14th build still works fine. What am I doing wrong?
#65
General Development / Re: SRM job memory buffers
July 11, 2014, 09:39:28 AM
Anyone with a 700D build? The recent nightlies won't boot, but the original 14th Feb alpha boots every time: http://www.magiclantern.fm/forum/index.php?topic=12617.new#new. Everything's OK with the nightly 550D builds with SRM, so I think I'm doing it right.
#66
I've been using the Feb 14th alpha build on a 700D and decided to try the latest nightly builds for the 700D to try SRM, but the nightlies won't boot. I've tried numerous times: reformatted the SD card with EOSCard, reinstalled with the 700d.fir, and swapped between the Feb 14th and nightly ML folder and autoexec.bin on the same SD card. Feb 14th boots every time, the nightlies never. What's up?

[SOLVED] Needed to update firmware from 1.1.1 to 1.1.3  :-[
#67
Quote from: chmee on April 19, 2014, 01:20:51 AM
ok :) blackmagic is putting a xml-stream into the wav-file, thats not gentle, but possible to realize. thnks btw to dyfid
http://www.magiclantern.fm/forum/index.php?topic=7122.msg104365#msg104365

Thanks for the shout out, my post got buried in that thread.

I'm really miffed about the Magic Lantern DNG format. I can add the Exif 'Timecodes' metadata to Magic Lantern DNGs simply enough, using the Exif time and date, but Resolve won't read the Timecodes entry, so Resolve won't sync the MLV wav and DNGs by timecode. BUT if I do the same with a Digital Bolex DNG, adding the 'Timecodes' Exif data from the time and date, Resolve does read the Timecodes Exif entry for the Digital Bolex DNGs. SubIFD vs IFD0, I think.

Hoping Resolve 11 brings some help.

For the <BLACKMAGIC> iXML entries, I could really do with a wav file from a BM camera with loads of metadata added via the camera, to establish the markup tags; running it through BWFMetaEdit will extract an XML file with them all listed.

The iXML injection via BWFMetaEdit also works with audio from, or within, H.264 MOVs as well.
#68
Has there been any discussion about writing Magic Lantern DNG metadata in iXML format into the MLV wav file (sidecar) for something like Blackmagic Resolve's extensive metadata use?

So far I've tested a 3-line batch script which dumps the contents of the MLVs into named folders and then uses the open source CLI utility BWFMetaEdit, found here: http://sourceforge.net/projects/bwfmetaedit/files/binary/bwfmetaedit/1.3.0/, to write the contents of an iXML file into the .wav, which acts as audio plus sidecar.

rem Make a folder named after each MLV clip
For %%a in (*.MLV) do mkdir "%%~na"
rem Dump each MLV to a DNG sequence (and its .wav) inside the matching folder
For %%a in (*.MLV) do mlv_dump.exe -f 10 --dng --cs3x3 -o "./%%~na/%%~na_" %%a
rem Write the iXML file's contents into each extracted .wav
For %%a in (*.MLV) do bwfmetaedit.exe "./%%~na/%%~na_.wav" --in-iXML="my_iXML.xml"

Contents of the iXML file:

<?xml version="1.0" encoding="UTF-8"?>
<BWFXML>
  <IXML_VERSION>1.5</IXML_VERSION>
  <PROJECT>Test</PROJECT>
  <NOTE>Blackmagic Metadata Write Test</NOTE>
  <BLACKMAGIC-KEYWORDS>magic,lantern,test</BLACKMAGIC-KEYWORDS>
</BWFXML>

There are a lot of camera-related metadata options for the usual things, like ISO, WB, shutter, camera model and firmware version, as well as more project-related tags, which I could work through, testing the tag names and writing the data to the wav.

So is writing the DNG Exif data into the wav sidecar something a dev would consider implementing? This would also open things up for anyone wanting to develop an in-camera metadata entry GUI to write, say, project and multicam type data via ML.

If not, then I suppose someone writing an MLV extractor could do it: extract the Exif data with ExifTool, write it to an XML file, write that into the wav, and so on (see the rough sketch below).
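
Something along these lines is what I have in mind, as a rough sketch only; the reshaping step in the middle is the part that doesn't exist yet, and the file names are just examples:

exiftool -X frame_000000.dng > exif.xml
rem exiftool's -X switch dumps the tags as RDF/XML; that would then need reshaping into the iXML tag layout
bwfmetaedit.exe clip_.wav --in-iXML="exif_ixml.xml"

The bwfmetaedit call is the same one as in the batch script above, just pointed at the reshaped XML.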

It's then simple in Resolve to import the media with "Add Folders & Sub Folders To Media Pool (Create Bins)", giving each MLV folder a separate bin containing the DNG sequence and wav, then choose "Auto Sync Audio Based On Timecode" to link the wav to the DNG sequence, resulting in the wav sidecar's metadata being displayed in Resolve's Metadata panel for any DNG or wav selected.