fastcinemadng

Started by katrikura, February 22, 2017, 12:17:51 PM



megapolis

QuoteAs said above, no ProRes 12 bits. Only Avfoundation.
As far as I know, ProRes is a DCT-based compression algorithm which is, to a certain extent, quite similar (though not identical) to JPEG. We have implemented a 12-bit JPEG codec on GPU and it's super fast:
https://www.fastcompression.com/products/jpeg/12-bit-jpeg-codec.htm
Please note that the libjpeg library also has a 12-bit JPEG option, and its performance is sufficient for realtime decoding on modest CPUs.
By the way, Blackmagic CDNG RAW 3:1 also uses that compression algorithm.
Could that be a solution?

Quotej2k is interesting because, if it's wrapped in MXF, it can be used for editing in all NLEs.
I think that J2K is too heavy to be comparable with ProRes: J2K is too slow, though it can offer better quality. To get smooth J2K playback at full resolution you need a very powerful PC; there is no chance on a laptop. I agree that J2K in MXF can be used in any NLE, but how many users really do that? For that workflow we would need a multi-GPU PC. We are going to add such a feature, hopefully this year, but two GPUs will be a must. Our J2K codec on GPU is ready.

QuoteCdng is good and all the options you offer are excellent, but this format is sadly not very well handled by FCPX / Premiere / Avid.
Our software can smoothly process and play CDNG/RAW sequences at full resolution at 4K and above. Can you do that smoothly with FCPX/Premiere/Avid, with high-quality debayering and not in 1/2 or 1/4 mode? At the moment, external FFmpeg is the only way to use our software with any NLE. Interoperability with FFmpeg works well.

QuoteThen, where can we use it except with DaVinci Resolve?
At the moment that software is mostly used by customers who are fine with exporting JPEG sequences. In that case everything is done on the GPU and it is very fast. This is called an ingest task for realtime multi-camera systems, and the same approach is used in 3D and VR applications. Please have a look at how it works:
http://ir-ltd.net/introducing-the-aeon-motion-scanning-system/

I agree that export to ProRes or DNx would be necessary for our software, but right now this is not possible, which is a shame. You can, however, use it with any NLE via FFmpeg.
We also have very positive feedback about our MG debayer algorithm, which is why many of our users do their preprocessing in Fast CinemaDNG Processor.

QuoteI think ProRes + J2K GPU encoding + GoPro CineForm GPU is a very future-proof set of editing codecs, as it can be used cross-platform.
It looks interesting, but we are not sure this is a standard approach. Implementing such an algorithm on GPU would take at least a year (we are a small company), so before starting such a development we need to understand why it's worth doing. Unfortunately, that is not yet clear to us.

theBilalFakhouri

Hi @megapolis

It would be nice if we could export a .wav audio file alongside the Motion JPEG .avi for MLV, so we could have audio playback too after exporting to Motion JPEG.

Thanks!

12georgiadis

Quote from: megapolis on August 03, 2018, 08:34:52 AM
As far as I know, ProRes is a DCT-based compression algorithm which is, to a certain extent, quite similar (though not identical) to JPEG. We have implemented a 12-bit JPEG codec on GPU and it's super fast:
https://www.fastcompression.com/products/jpeg/12-bit-jpeg-codec.htm
Good to know! But ProRes also supports 12 bit. It needs AVFoundation to work, not FFmpeg. MLV App also uses AVFoundation to get 12-bit ProRes. That could be an interesting implementation.


OK for J2K.
Your VR project example is interesting, but it's a specific approach. Here is what I have been trying to achieve for two years:

A traditional digital cinema workflow:

1) You shoot MLV, which are the raw files, right?
2) The common cinema workflow is an offline/online workflow. The good news is that the 5D Mark III can record both MLV and an H.264 proxy. Thanks to Danne's Switch app, it's possible to stamp the same timecode on the proxy file as on the MLV and to correct the proxy's offset (it doesn't start at the same time as the MLV recording). This way we can edit offline (with the proxy) and conform/relink to the MLV online later. That means no transcoding and direct editing with the H.264 proxy. For FCPX users, background proxies can improve playback speed.
3) When the edit is finished, you send an FCPXML to DaVinci Resolve and use MLVFS to generate CDNG on the fly, readable by Resolve. The conform is perfect thanks to the TC, just like any other cinema camera workflow (Sony, RED, ARRI...).

What is the benefit?
1) Direct editing in any modern NLE.
2) Saves tons of storage space and time!
3) With a standard, straightforward workflow, more professionals from the cinema world could jump into the Magic Lantern adventure, shoot features with old DSLRs and an open-source workflow, and share their tips and experience with the community.

What are the cons?
1) Poor performance during grading (because MLVFS runs on the CPU).
2) No interface for Switch (command line only), and no GPU rendering or playback.

What could be improved with Fast CinemaDNG?
1) Generate a ProRes proxy via FFmpeg with a script that stamps the same TC on the proxy as on the MLV. Most cameras cannot record a proxy alongside the MLV (all except the 5D Mark III), plus a script for the 5D Mark III that trims the black offset frames from the proxy and stamps the same TC on the proxy as on the MLV (see the sketch at the end of this post).
2) A system to generate CDNG on the fly with GPU rendering to speed up the Resolve workflow? Or, if that's not possible, generate CDNG that keeps the same TC and stays in sync with the proxy file (H.264 or ProRes proxy).

More info on the H.264 proxy + MLV workflow here: https://www.magiclantern.fm/forum/index.php?topic=15108.msg189532#msg189532
See the whole thread.
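A minimal sketch of what such a script could look like, assuming the MLV's start timecode and the length of the black lead-in are already known (for example, read from the MLV metadata by another tool); the file names and values below are only illustrative:

```python
# Hypothetical helper: trim the black lead-in from a 5D Mark III H.264 proxy and
# stamp the MLV's start timecode onto it, so the NLE can relink proxy and raw later.
# Assumptions: ffmpeg is on PATH; start_tc and black_offset_s come from elsewhere
# (e.g. MLV metadata) -- they are NOT computed here.
import subprocess

def stamp_proxy(proxy_in: str, proxy_out: str, start_tc: str, black_offset_s: float) -> None:
    subprocess.run([
        "ffmpeg", "-y",
        "-ss", f"{black_offset_s:.3f}",  # skip the black frames recorded before the MLV started
        "-i", proxy_in,
        "-c", "copy",                    # re-wrap only; with stream copy the cut snaps to the nearest keyframe
        "-timecode", start_tc,           # same start TC as the MLV
        proxy_out,
    ], check=True)

# Example values (made up):
stamp_proxy("MVI_1234.MOV", "MVI_1234_proxy.mov", "14:25:03:12", 0.48)
```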


megapolis

@theBilalFakhouri

QuoteI was using a build from about three months ago without problems in playback, but with the latest one I can't play back the file until I press the mute button, and when the sound is on the playback gets stuck. Can you confirm that on Windows? (I am on Windows 8.1.)

QuoteIt would be nice if we could export a .wav audio file alongside the Motion JPEG .avi for MLV, so we could have audio playback too after exporting to Motion JPEG.

In the latest release you can see bug fixes both for audio sync and for .wav audio with MJPEG:
https://www.fastcinemadng.com/download/download.html

megapolis

@12georgiadis
QuoteWhat could be improved with Fast CinemaDNG?
1) Generate a ProRes proxy via FFmpeg with a script that stamps the same TC on the proxy as on the MLV. Most cameras cannot record a proxy alongside the MLV (all except the 5D Mark III), plus a script for the 5D Mark III that trims the black offset frames from the proxy and stamps the same TC on the proxy as on the MLV.
2) A system to generate CDNG on the fly with GPU rendering to speed up the Resolve workflow? Or, if that's not possible, generate CDNG that keeps the same TC and stays in sync with the proxy file (H.264 or ProRes proxy).

I think it could be done, but we need a more sophisticated specification for the full task. Could you please prepare it in more detail, from the very beginning to the end? We will probably be able to generate a proxy .mp4, a .wav and a full-frame .dng series at the same time.
Here are some questions:
1. If we choose new start and stop positions for the CDNG series, which timecode do we need to write into the CDNG file at the start position? Is it zero or not? The same question for the CDNG file at the stop position: which timecode should be there?
2. If we choose the correct start and stop positions in the CDNG series, what do we have to do with the audio file? Should it be truncated according to the start and stop positions?
3. I don't think we need to do anything about the black frames, because we will choose the start and stop positions manually. Is that correct?

theBilalFakhouri

@megapolis

I have just downloaded the latest version, but the problem is still there and I can't see a .wav beside the Motion JPEG. Maybe you uploaded the wrong version?

megapolis

Sorry, we've fixed that bug for CDNG series, not for MLV. If you do a fast export from CDNG to MJPEG, the .wav file is also exported to the same folder. We'll try to do the same for MLV tomorrow.

theBilalFakhouri

Oh thanks for this nice software! :D I am waiting ;D

megapolis

@theBilalFakhouri
We've fixed the bugs with MLV audio and added a .wav file to the MJPEG export option:
https://www.fastcinemadng.com/download/download.html
If you see any bugs, please let me know.

theBilalFakhouri

Thank you! It's working now. I will do my best to support this nice software soon.

megapolis

@12georgiadis
QuoteWhat could be improved with Fast CinemaDNG?
1) Generate a ProRes proxy via FFmpeg with a script that stamps the same TC on the proxy as on the MLV. Most cameras cannot record a proxy alongside the MLV (all except the 5D Mark III), plus a script for the 5D Mark III that trims the black offset frames from the proxy and stamps the same TC on the proxy as on the MLV.
2) A system to generate CDNG on the fly with GPU rendering to speed up the Resolve workflow? Or, if that's not possible, generate CDNG that keeps the same TC and stays in sync with the proxy file (H.264 or ProRes proxy).

Here you can download a draft version:
https://yadi.sk/d/TTvMJIEKzCTVT

Please note that for FFmpeg you need to add the following to the command line:
-timecode 00:00:00.00 if fps = 29.97
-timecode 00:00:00:00 if fps = 59.94
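For example (just a sketch, not an official command: the frame-sequence pattern, frame rate and output name are assumptions), a JPEG sequence exported from Fast CinemaDNG could be wrapped into a small H.264 proxy carrying that timecode; a .mov wrapper is used here so the timecode track is definitely written:

```python
# Sketch: encode a frame sequence exported from Fast CinemaDNG into an H.264 proxy
# and stamp the timecode as described above. Paths, pattern and fps are assumptions.
import subprocess

fps = "29.97"
timecode = "00:00:00.00"   # '.' separator for 29.97 fps, ':' for 59.94, as noted above

subprocess.run([
    "ffmpeg", "-y",
    "-framerate", fps,
    "-i", "M22-1234_%06d.jpg",                          # exported frame sequence (assumed naming pattern)
    "-c:v", "libx264", "-crf", "18", "-pix_fmt", "yuv420p",
    "-timecode", timecode,
    "M22-1234_proxy.mov",                               # .mov wrapper keeps the timecode track
], check=True)
```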

As we understand it, you will do the following:
Export a CDNG series with Fast CinemaDNG
Export to mp4 or anything else via FFmpeg (from Fast CinemaDNG)
Do the rest in DaVinci Resolve or Final Cut.

Please let us know whether this is correct or not.

12georgiadis

Hello Megapolis,

thank you, and sorry for the late reply, I'm on vacation  8)

Quote from: megapolis on August 21, 2018, 11:57:14 AM
@12georgiadis
Here you can download such a draft version:
https://yadi.sk/d/TTvMJIEKzCTVT


Thank you. I'll do the test in the middle of September.

Quote from: megapolis on August 21, 2018, 11:57:14 AM

Please note that for FFmpeg you need to add the following to the command line:
-timecode 00:00:00.00 if fps = 29.97
-timecode 00:00:00:00 if fps = 59.94

As we understand it, you will do the following:
Export a CDNG series with Fast CinemaDNG
Export to mp4 or anything else via FFmpeg (from Fast CinemaDNG)
Do the rest in DaVinci Resolve or Final Cut.

Please let us know whether this is correct or not.


Not exactly.
1) I shoot MLV raw and simultaneously record an H.264 proxy (mp4). To do that, the Canon has to start the H.264 recording first and then the MLV raw, so there is a timing difference. That difference shows up as black frames at the start of the mp4 proxy. The sound is recorded on the mp4 proxy only.
2) In Switch, I use a script that duplicates the H.264 file, resets the TC to 00:00:00:00 on the new duplicate, cuts the black lead-in and exports the audio (see the sketch after this list). It takes 10 minutes at most (depending on the volume of files).
3) I edit directly with the proxies in Premiere / FCPX etc. (offline editing). I mean, I'm editing 10 minutes after the offload. No need to transcode, no time wasted. My last documentary was 30 hours of MLV files and I edited it on a laptop ;-) If the laptop is lagging, I activate ProRes proxy background transcoding, and I can switch between the mp4 proxy and the ProRes proxy in FCPX if necessary. It's automatic and gives better performance; at the end I switch back to the mp4 proxy (it's one button).
4) I finish the edit and conform by exporting an XML/FCPXML or AAF, depending on the NLE. I don't export any video or audio files.
5) I generate CDNG on the fly with MLVFS. They have TC 00:00:00:00, the same as the proxy mp4.
6) I open Resolve, put my CDNG files in a bin, put the exported audio in a bin, import the XML/FCPXML/AAF, relink automatically thanks to the TC plus the exact same filenames, and it's conformed. My editing timeline is back, but now with the CDNG files.
7) I can grade, export the master, etc.
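As a small illustration of the audio part of step 2, the proxy's sound track can be pulled out to a .wav with one FFmpeg call (a sketch; the filenames are assumptions):

```python
# Sketch for step 2: extract the sound recorded on the H.264 proxy into a .wav file
# that can later be dropped into a Resolve bin. Filenames are assumptions.
import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-i", "MVI_1234_proxy.mov",
    "-vn",                       # drop the video stream
    "-c:a", "pcm_s16le",         # uncompressed PCM in a WAV container
    "MVI_1234.wav",
], check=True)
```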

I don't want to generate permanent CDNG files. I already have the MLV files to store, and that's a lot of storage. And it's so slow! Having to transcode before editing takes too long for nothing, and I don't want to double the storage. That makes no sense.
The problem is that MLVFS is slow because it runs on the CPU. The idea is to have a GPU solution for on-the-fly CDNG, because Resolve won't read MLV.
And your software has good settings, good corrections and good CDNG performance, so it could be a very good solution, also because MLVFS doesn't have all the options that you have.


megapolis

QuoteThe problem is that MLVFS is slow because it runs on the CPU. The idea is to have a GPU solution for on-the-fly CDNG, because Resolve won't read MLV. And your software has good settings, good corrections and good CDNG performance, so it could be a very good solution, also because MLVFS doesn't have all the options that you have.

I don't think that MLVFS is the slowest processing module. If you disable focus dot removal and chroma smoothing, the only tasks for MLVFS are to find the frame data block on disk, generate the DNG header and send it to the application. These stages are not CPU-hungry; their speed is limited by HDD performance instead.

DNG processing is slow because of the demosaic and denoise stages. High-quality demosaic and denoise algorithms are CPU-hungry. We've managed to develop extremely fast, high-quality demosaic and denoise GPU filters; that's why DNG processing in Fast CinemaDNG is faster than in Resolve. It's also the reason why boosting MLVFS with a GPU would be pointless for us: the heaviest tasks are performed by the rendering application, not by MLVFS.

Fast CinemaDNG can produce a synchronized proxy, CinemaDNG and audio from MLV. But you would still have to process the DNGs in Resolve, which wouldn't be as fast as mp4, H.264 or ProRes processing.

12georgiadis

Quote from: megapolis on August 23, 2018, 02:20:59 PM
I don't think that MLVFS is the slowest processing module. If you disable focus dot removal and chroma smoothing, the only tasks for MLVFS are to find the frame data block on disk, generate the DNG header and send it to the application. These stages are not CPU-hungry; their speed is limited by HDD performance instead.

DNG processing is slow because of the demosaic and denoise stages. High-quality demosaic and denoise algorithms are CPU-hungry. We've managed to develop extremely fast, high-quality demosaic and denoise GPU filters; that's why DNG processing in Fast CinemaDNG is faster than in Resolve. It's also the reason why boosting MLVFS with a GPU would be pointless for us: the heaviest tasks are performed by the rendering application, not by MLVFS.

Fast CinemaDNG can produce a synchronized proxy, CinemaDNG and audio from MLV. But you would still have to process the DNGs in Resolve, which wouldn't be as fast as mp4, H.264 or ProRes processing.

OK, very interesting answer. I have another proposal:
If you add a feature to import XML/FCPXML/AAF, we can import the timeline information at the end of the edit and detect, by clip name alone, which clips were used in the edit. This way we process only the edited files and still save a lot of storage (for a documentary, we often use 1/30 of the rushes or less). Plus it's automatic, a sort of conform/consolidate combo. Do you think it's possible?
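To illustrate the idea, listing the clips an FCPXML timeline actually references takes only a few lines of XML parsing. This is a rough sketch: the 'asset-clip' element and 'name' attribute follow recent FCPXML versions but should be verified against a real export, AAF or other XML flavours would need their own parsers, and the file names are made up:

```python
# Rough sketch: collect the clip names referenced by an FCPXML timeline so that only
# those MLVs get processed. 'asset-clip' / 'name' follow recent FCPXML versions and
# should be checked against a real export; file names are assumptions.
import xml.etree.ElementTree as ET

def clips_used(fcpxml_path: str) -> set:
    root = ET.parse(fcpxml_path).getroot()
    names = set()
    for clip in root.iter("asset-clip"):   # clips actually placed on the timeline
        name = clip.get("name")
        if name:
            names.add(name)
    return names

for name in sorted(clips_used("edit_v1.fcpxml")):
    print(name)   # e.g. 'M22-1234' -> process only M22-1234.MLV
```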



megapolis

QuoteOK, very interesting answer. I have another proposal:
If you add a feature to import XML/FCPXML/AAF, we can import the timeline information at the end of the edit and detect, by clip name alone, which clips were used in the edit. This way we process only the edited files and still save a lot of storage (for a documentary, we often use 1/30 of the rushes or less). Plus it's automatic, a sort of conform/consolidate combo. Do you think it's possible?

We understand that with MLVFS you have just one source, which is finally processed in Resolve, so you spend less HDD storage, but processing is slow due to both MLVFS and Resolve.

As we see it, you would like a fast MLVFS to solve your problem. Even with such an MLVFS, DNG processing would still not be very fast because of Resolve. Unfortunately, that feature is outside the scope of our software; we process both DNG and MLV directly. For your task you can temporarily save DNG series and process them quickly with our software.

12georgiadis

Quote from: megapolis on August 24, 2018, 11:39:09 AM
We understand that with MLVFS you have just one source, which is finally processed in Resolve, so you spend less HDD storage, but processing is slow due to both MLVFS and Resolve.

As we see it, you would like a fast MLVFS to solve your problem. Even with such an MLVFS, DNG processing would still not be very fast because of Resolve. Unfortunately, that feature is outside the scope of our software; we process both DNG and MLV directly. For your task you can temporarily save DNG series and process them quickly with our software.
@megapolis,
That's a good summary of my previous hypothesis. I perfectly understood that Resolve is the problem and that your software is the fastest at processing DNG/CDNG directly.
There is still another way to bypass Resolve's problem: if you add a feature that imports XML/AAF/FCPXML and reconnects to the MLVs, it would be amazing.
1) If I shoot 100 hours of footage and end up with a 1-hour film, do you think it's a good idea to process all the MLV files to DNG before the edit? No, especially if you have an H.264 proxy (for now only on the 5D Mark III, but it could be generalized if lossless 8...11-bit recording is done on DIGIC 4/5/6 cameras). This way you can edit with the proxy and save storage and time!
2) So the reliable solution is to process, directly in your software, only the clips that ended up in the timeline after the offline edit with the H.264 proxy. To reference the clips used in the timeline automatically, we use XML/AAF/FCPXML. It avoids having to reference the clips by hand, one by one, and then import them into your software. If you add an import feature for XML/AAF/FCPXML, we can get all the used clips into the batch list automatically. A one-hour timeline can contain up to 400 clips, so it's better to do it automatically, as it has been done in cinema since the 1990s with the analog roll / video proxy clips and EDL/Avid.

Ideally, we could pre-grade these clips and export in ProRes cine-log to get good performance in Resolve.




megapolis

QuoteThat's a good summary of my previous hypothesis. I perfectly understood that Resolve is the problem and that your software is the fastest at processing DNG/CDNG directly.

If that is the case, why would you process DNG/CDNG in Resolve at all? You can process the MLV in Fast CinemaDNG and output to any intermediate codec to work with further. That is the way to bypass MLV reading and DNG debayering in Resolve. Yes, you will need more HDD space (though not too much, thanks to compression), but you will no longer work with MLV/DNG in Resolve. Why don't you consider that approach?

QuoteThere is still another way to bypass Resolve's problem: if you add a feature that imports XML/AAF/FCPXML and reconnects to the MLVs, it would be amazing.

I agree that this is one more possible way to solve the problem, but it's not simple. We would need to dig into the XML/AAF/FCPXML formats, and at the moment we don't have that experience.

Quote1) If I shoot 100 hours of footage and end up with a 1-hour film, do you think it's a good idea to process all the MLV files to DNG before the edit? No, especially if you have an H.264 proxy (for now only on the 5D Mark III, but it could be generalized if lossless 8...11-bit recording is done on DIGIC 4/5/6 cameras). This way you can edit with the proxy and save storage and time!

I still think it's a viable idea, as long as there is a reason to preview all your footage at maximum image quality and full resolution, not just with the H.264 proxy.

Quote2) So the reliable solution is to process, directly in your software, only the clips that ended up in the timeline after the offline edit with the H.264 proxy. To reference the clips used in the timeline automatically, we use XML/AAF/FCPXML. It avoids having to reference the clips by hand, one by one, and then import them into your software. If you add an import feature for XML/AAF/FCPXML, we can get all the used clips into the batch list automatically. A one-hour timeline can contain up to 400 clips, so it's better to do it automatically, as it has been done in cinema since the 1990s with the analog roll / video proxy clips and EDL/Avid.
Ideally, we could pre-grade these clips and export in ProRes cine-log to get good performance in Resolve.

I agree that this is the ideal case, but I can't tell you when it could be ready. We are currently also working on a server version of Fast CinemaDNG; it will work with multiple clips, multiple GPUs and scripting. Your suggested workflow is not included yet, but I think it could be possible.

12georgiadis

Quote from: megapolis on August 26, 2018, 09:38:08 AM
If that is the case, why would you process DNG/CDNG in Resolve at all? You can process the MLV in Fast CinemaDNG and output to any intermediate codec to work with further. That is the way to bypass MLV reading and DNG debayering in Resolve. Yes, you will need more HDD space (though not too much, thanks to compression), but you will no longer work with MLV/DNG in Resolve. Why don't you consider that approach?
Thank you for your answer, Megapolis. That's what I meant: edit the H.264 in an NLE; export XML/AAF/FCPXML to preserve the in/out of each clip, the order and the clip names; import the XML etc. into Fast CinemaDNG; conform (offline H.264 to online MLV); get the list of clips used in the edit; apply pre-corrections to the MLV (debayering, chroma smoothing, dot removal, vertical stripes, dark frame...); export to ProRes cine-log; conform from XML/AAF/FCPXML; reconnect the editing timeline to the ProRes; color grade; import the mix; and export the master to any deliverable.

Quote from: megapolis on August 26, 2018, 09:38:08 AM
I agree that this is one more possible way to solve the problem, but it's not simple. We would need to dig into the XML/AAF/FCPXML formats, and at the moment we don't have that experience.

I still think it's a viable idea, as long as there is a reason to preview all your footage at maximum image quality and full resolution, not just with the H.264 proxy.

Most of the time, editors and directors edit offline and don't need maximum quality for previewing. A colorist, on the other hand, needs the best quality, and most of them work in Resolve, so we need to do a conform. That's why I suggested doing the transcode from MLV to ProRes after the edit (to save storage and time) by importing an XML into Fast CinemaDNG and then reconnecting it in Resolve. Resolve is a finishing tool: we grade, assemble the mix and export the masters. All of this needs to be done automatically.

Quote from: megapolis on August 26, 2018, 09:38:08 AM

I agree that this is the ideal case, but I can't tell you when it could be ready. We are currently also working on a server version of Fast CinemaDNG; it will work with multiple clips, multiple GPUs and scripting. Your suggested workflow is not included yet, but I think it could be possible.
Could you explain more about the advantages of a server version?
As for the rest, it would be more than necessary to include it. It's essential for a good cinema workflow: offline/online is simply the standard, as are XML/AAF/FCPXML. We cannot transcode everything, even compressed; that doubles the online storage. It's fine if you only generate intermediates for the files in the timeline: not the best for a production, but acceptable in terms of storage cost. If you need some example workflow cases, I can send them to you. I have worked on features and shorts with some very original and exotic workflows, but in every case, for time and money reasons, offline/online was used.


12georgiadis

@megapolis:
For cameras without an H.264 proxy, the ideal workflow would be:
1) Shoot MLV.
2) Import into Fast CinemaDNG and export all footage to ProRes Proxy (36 Mbit); this is one step more than with the 5D Mark III (see the sketch after this list).
3) Edit in the NLE.
4) Export the edit as XML/AAF/FCPXML.
5) Import the XML/AAF/FCPXML into Fast CinemaDNG, pre-correct, and export ProRes cine-log (444 or above).
6) Conform in Resolve with the ProRes cine-log, grade and export the master.
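As a rough reference for step 2 (just a sketch, not a Fast CinemaDNG feature: filenames, frame rate and timecode are assumptions, and prores_ks profile 0 targets ProRes Proxy bitrates that vary with resolution rather than a fixed 36 Mbit), FFmpeg can already turn an exported frame sequence into a ProRes Proxy file:

```python
# Sketch for step 2: encode an exported frame sequence to ProRes Proxy (prores_ks
# profile 0) with a matching timecode, as an editing-friendly i-frame proxy.
# Filenames, frame rate and timecode are assumptions.
import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-framerate", "25",
    "-i", "M22-1234_%06d.jpg",                # frame sequence exported beforehand
    "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
    "-timecode", "00:00:00:00",
    "M22-1234_proxy.mov",
], check=True)
```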




megapolis

@12georgiadis
QuoteCould you explain more about the advantages of a server version?

That solution is not ready yet, so it's not worth talking about its advantages. We are working on software that will run as a server, without a GUI, to process large quantities of frames on one or several GPUs. Since there can be many different workflows, we need to develop scripting to be able to cope with different tasks. This concerns not only MLV/DNG but also many more solutions for massive image processing.
We can already process MLV/DNG over the network, so there is no need to copy raw data to your local PC if your network is fast enough.

QuoteIf you need some example workflow cases, I can send them to you. I have worked on features and shorts with some very original and exotic workflows, but in every case, for time and money reasons, offline/online was used.

Thanks for your suggestion. We will need your examples, and it would be better to start with a standard, non-exotic workflow.

QuoteFor cameras without an H.264 proxy, the ideal workflow would be:
1) Shoot MLV.
2) Import into Fast CinemaDNG and export all footage to ProRes Proxy (36 Mbit); this is one step more than with the 5D Mark III.
3) Edit in the NLE.
4) Export the edit as XML/AAF/FCPXML.
5) Import the XML/AAF/FCPXML into Fast CinemaDNG, pre-correct, and export ProRes cine-log (444 or above).
6) Conform in Resolve with the ProRes cine-log, grade and export the master.

There is also the option of creating MP4 with our software and FFmpeg at step #2; then you won't have any problems with storage.

12georgiadis

Quote from: megapolis on August 26, 2018, 09:51:50 PM
@12georgiadis
That solution is not ready yet, so it's not worth talking about its advantages. We are working on software that will run as a server, without a GUI, to process large quantities of frames on one or several GPUs. Since there can be many different workflows, we need to develop scripting to be able to cope with different tasks. This concerns not only MLV/DNG but also many more solutions for massive image processing.
We can already process MLV/DNG over the network, so there is no need to copy raw data to your local PC if your network is fast enough.

Thanks for your suggestion. We will need your examples, and it would be better to start with a standard, non-exotic workflow.

There is also the option of creating MP4 with our software and FFmpeg at step #2; then you won't have any problems with storage.
MP4 is a GOP (IBP) codec; it's not optimized for editing. ProRes Proxy is 36 Mbit, just a little more than the DV bitrate, and it's widely used. The mp4 proxy is OK because it's the only option possible with a 5D Mark III, but if we can choose, an i-frame-based codec is definitely better for performance.



12georgiadis

Quote from: megapolis on August 26, 2018, 09:51:50 PM
@12georgiadis
That solution is not ready yet, so it's not worth talking about its advantages. We are working on software that will run as a server, without a GUI, to process large quantities of frames on one or several GPUs. Since there can be many different workflows, we need to develop scripting to be able to cope with different tasks. This concerns not only MLV/DNG but also many more solutions for massive image processing.
We can already process MLV/DNG over the network, so there is no need to copy raw data to your local PC if your network is fast enough.

Thanks for your suggestion. We will need your examples, and it would be better to start with a standard, non-exotic workflow.


OK for the server solution.
For the workflows, give me a contact e-mail and I'll send you some PDFs.





12georgiadis





[Workflow diagram images] One example from my previous film.

megapolis

@12georgiadis
Thanks a lot for your info. You can use my email from https://www.fastcinemadng.com/contacts/contacts.html
or you can send me a PM.

bouncyball

QuoteFast CinemaDNG Processor for Linux Ubuntu 16.04, 64-bit - expected at August 2018

August has gone ;)