Messages - maxotics

#101
Tragic Lantern / Re: Tragic Lantern for EOS M
November 11, 2013, 09:42:30 PM
What is the most stable build for RAW shooting (assuming there is no shutter bug, or that it wouldn't be noticed by anyone shooting only video)?

Users, please post any bugs you experience, or features you believe are necessary for a stable VIDEO build.

1%, your thoughts?  For example, what features are on your to-do list?  I can add a page to the shooter's guide that has your "release" notes, so to speak.  I'll maintain it for you, if you like.

I will be releasing a Windows PDR console app soon.  I'd like to update my download files from maxotics.  Does this make sense to everyone?
#102
@Marsu42.  All your comments are valid.  Net-net, I think we'd like to see the same outcome.
#103
Feature Requests / Re: $300 offered to developer
November 11, 2013, 07:38:32 PM
Quote from: 1% on November 11, 2013, 07:23:43 PM
There are so few people to develop, that's part of the problem. As a handful of people, how do we write code + documentation with the time available, then answer questions from people who didn't read the docs by choice.

Absolutely 1%.  If anyone knows how hard you, g3gg0, Andy and Alex work, it's me ;)  Although you would need to change your work priorities a little bit, nothing is possible, or worth doing, unless others start to put the same care into creating a stable build as you put into developing new features.  We need more programmers, we need documentation, and we need some sort of better recognition system.

5D3Shooter put up $200.  Mva offered $100.  The interest is there--the generosity of spirit.  We offered you a lens a few weeks back.  The question is, can enough people doing ML agree to work together to create a plan and timeline to bring the fruits of your labor to the many more people who, I believe, would love to have it?

The question is, what would you want out of it, if you were to devote 10% of your time to creating a stable build?  Or what would it take for someone to take the OP up on his original offer?  (Yes, BM has problems; pain is unavoidable in every pursuit.  I still believe we can do a lot better.)
#104
Quote from: Andy600 on November 11, 2013, 06:06:18 PM
@maxotics - sorry but I think your frustration is unwarranted
Just my 2c ;)

That's the problem with all my frustrations, they're all unwarranted ;)  Seriously, thanks for your 2c!  I don't plan on leaving.  My point is only that ML is going to reach, by my estimate, only 10% of its actual potential as a real-world solution to film-makers' needs.  Does it really have anything to do with commercialism?  I'm not getting paid for what I share with ML either.  But I'd like to see as many people as possible get something good out of their EOS-M, for example.  Instead of working on code or shooting, I spent time on that shooter's manual because that's what users need.  I don't like doing it, but I recognize that the more people who use the EOS-M, the more developers will work on it, so the irony many devs don't see is that going backwards often propels you forward.

Quote from: Andy600 on November 11, 2013, 06:06:18 PM
ML is basically a skunkworks and a committee will not work in that environment without seriously hindering development times.

That's why most development efforts that seek a larger audience do both.  There are trade-offs.  I do believe I am not alone in this.  There are many people who would rather have stability than new features.  What gets developed, and how fast, is a philosophical question.  What would you rather have, new features that bring in 2-5 people, or stability that brings in hundreds?  So I'd put it another way: focusing only on development hinders acceptance, and low acceptance ends in loneliness and failure ;)  (if I may wax poetic).

I can shoot RAW on both my 50D and EOS-M.  What I can't do is point someone to a stable build, release notes, and a thread of known bugs in the stable release.  There are people who could do it (if properly motivated), but we all need to recognize their effort.  And that brings us back to the OP's original post.  He offered $200 to make the camera more useful to film-makers like himself.  Most of what he got back was, again, in my book, woefully short-sighted.

For my part, I will do my best with the EOS-M.  I can only hope others follow my (and others like yourself) lead at some point.


#105
Quote from: engardeknave on November 11, 2013, 05:02:55 PM
What's being done to make ML a real world solution to Arkanoid? I don't understand why everyone doesn't drop their own petty interests and focus on my vastly more important ones.

Well, that's my question.  I don't believe anyone is being purposely petty.  Just as most people who loot during a black-out aren't typically thievish.  What I do believe is that the current "culture" of ML is not conducive to productivity in a professional sense; that is, to software delivered to the "customer" that can be depended upon to meet their requirements.  Yes, I know everyone works for free and it's not my right to say how ML should proceed.  I can only give my opinion.  And my opinion is that ML is not a real-world solution to RAW video and never will be until people gather around and

1. Create some sort of mission statement (goal)
2. Create some sort of organization (committee, board, working-group)
3. Work to create a supportive environment for both the lead devs and those who do the dirty work.

Are there downsides to the above?  Yes.  I HATE committees.  We can't have a perfect world. 

I'd like to see a dev take the OP up on his offer.  Why?  Because it would show some VALUE back to the end-user!  (I couldn't care less about that feature, btw.)  Again, my opinion is that ML is going to remain a card-trick curiosity until it starts working for the people who want to use this technology in THEIR way.

#106
Feature Requests / Re: $300 offered to developer
November 11, 2013, 04:15:38 PM
Quote from: engardeknave on November 11, 2013, 03:43:42 PM
Get to work.

Here is my shooter's guide for the EOS-M on this forum:

http://www.magiclantern.fm/forum/index.php?topic=8825.msg82944#msg82944

I have done a tremendous amount of research into hybrid focus pixels and have posted it.  I have written scripts.  Here:

http://www.magiclantern.fm/forum/index.php?topic=7908.msg70298#msg70298

I am about to post an open-source project that fixes Pink Dots, written in C#.

So back to my question, what is being done about making ML a "real world" solution to RAW video?  And I don't mean new features.  I mean a plan and timeline to get to a stable version.
#107
You are getting the reaction I feared you would get.  It is why Plato said that, of all the forms of government, democracy is the worst.  ML is the Tragedy of the Commons.  ML is anything you want it to be... as long as you don't expect people to work together for a common goal.  Each dev works on what they want.  If you question them, others reprimand you.  There is NO PLAN or TIMELINE that I can see to make ML RAW stable.  I don't want to stereotype, but I will.  The devs are like kids saying they'll clean up their room.  Talk, talk, talk.  ML RAW works well enough now for a stable version to be created.  No one wants to put in the time because, again, the Tragedy of the Commons.  While one dev worked on documentation, others would work on cool new features that get the compliments and recognition.

In basketball terms, most ML devs do not share the ball!

People who value teamwork are not valued and eventually leave.

The devs have exceptional hacking skills, but just as exceptionally bad social and teamwork skills.

Money irritates many here because with money comes responsibility.

I found my time with ML a rich and rewarding learning experience.  It seems it will end there.  The Black Magic Pocket Cinema Camera is now in stock, and why mess with Canon cameras to get what others have professionally designed and built into a $1,000 camera?

5D3shooter, I THANK YOU for trying to move ML toward a more usable version, in the way you were comfortable doing.
#108
Quote from: 3pointedit on November 11, 2013, 03:29:41 AM
So these 'boundary skipped sensels' are being targeted in OLPFs? I guess that they get an averaged value of light at that location so that there are no transient peaks that would confuse the sensel groupings. Duplicating the previous line would result in jagged or missing edges. Can any information be reconstructed from those corrupted sensel sites?

I guess the easiest solution would be interframe substitution, but that would be slow and require movement across the sensels.

I don't believe the camera devs have any control over which sensels they read from.  They intercept the RAW stream that goes to the LiveView.  I've thought about analyzing nearby pixels for lines, but not about what I think you're suggesting, which is an interesting idea: that the corrupted sensels may "inform" us about what kind of image surrounds them.
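
To make that concrete, here's a rough sketch of the obvious first attempt -- purely illustrative, not the actual Pink Dot Remover code or any ML algorithm: treat the corrupted sensels as a known list of coordinates and replace each one with the average of its nearest same-color neighbours in the Bayer mosaic (which sit two pixels away).

Code:
# Illustrative sketch only -- not the actual Pink Dot Remover algorithm.
import numpy as np

def interpolate_bad_sensels(mosaic, bad_sites):
    """mosaic: 2-D array of RAW sensel values (not de-bayered).
    bad_sites: list of (row, col) positions of corrupted sensels."""
    h, w = mosaic.shape
    fixed = mosaic.copy()
    for y, x in bad_sites:
        # Same-colour neighbours in a Bayer pattern sit two pixels away.
        neighbours = [int(mosaic[ny, nx])
                      for ny, nx in ((y - 2, x), (y + 2, x), (y, x - 2), (y, x + 2))
                      if 0 <= ny < h and 0 <= nx < w]
        fixed[y, x] = round(sum(neighbours) / len(neighbours))
    return fixed

# Quick check on a toy 6x6 mosaic with one corrupted sensel:
toy = np.arange(36, dtype=np.uint16).reshape(6, 6)
toy[3, 3] = 4095                                      # pretend focus pixel
print(interpolate_bad_sensels(toy, [(3, 3)])[3, 3])   # 21, from its neighbours

Plain averaging like this is exactly the kind of thing that leaves soft spots; anything smarter (edge-aware interpolation, or your "what surrounds them" idea) would have to build on top of it.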

This problem is not very high on my list anymore because ML, in general, is too disorganized and unprofessional.  This problem would require a team effort, and that's not ML's strong suit.  I don't mean that in a mean way.  It just is what it is.  The Black Magic Pocket Cinema cameras seem to be in supply now, and that camera, and similar ones, make ML worth some effort, but not too much ;)
#109
Tragic Lantern / Re: Tragic Lantern for EOS M
November 11, 2013, 01:45:13 AM
1%, he wrote "just yesterday the GPU temperature sensor on my mac died"  :) :) :)  I'm thinking he meant mac as in MacBook Pro.  EDIT: Oh, now I'm the dummy!!!!  You meant the temperature sensor!  I hope... ;)
#110
Tragic Lantern / Re: Tragic Lantern for EOS M
November 11, 2013, 01:12:44 AM
Quote from: 1% on November 10, 2013, 11:42:00 PM
If its just the sensor you can either underclock or run the fan at full speed... if the fan died then bigger problems :(

1%, you have cameras on the brain.  I think you meant CPU, not sensor ;)
#111
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 09, 2013, 02:39:24 AM
delete
#112
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 08, 2013, 09:56:53 PM
Quote from: gary2013 on November 08, 2013, 09:35:48 PM
Max, how do you extend and retract the 22mm? It is a fixed lens. I think he means to say zoomed in and out.

Gary

The 22mm actually extends out a few millimeters when you turn the camera on, and retracts when you turn it off.  I noticed this when I put on the .20 adapter (to get a lower focal length for crop mode).  If you look at the center part of the lens when you turn the camera on/off you'll see it too!  The 18-55mm does not do this.
#113
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 08, 2013, 05:25:23 PM
Quote from: Stefano on November 08, 2013, 05:23:51 PM
It's not currently sold in the US, pity because it's razor sharp

What's REALLY nice about it, for RAW video, is that it has image stabilization.  And in crop mode you risk very little moiré or aliasing.
#114
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 08, 2013, 05:19:18 PM
You really want UHS-I, writing at least 40 MB/s (that's 40 megaBYTES per second), for RAW video at any rate.

Yes, you just want FPS override OFF.  Otherwise the screen goes crazy in my experience.

It's confusing because many cards say 40 Mb/s, which is megaBITS, and that is really 40/8 = 5 MB/s.
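
As a back-of-the-envelope check (a sketch with assumed numbers: 14-bit Bayer samples, no container overhead), the conversion and the required rate look roughly like this:

Code:
# Back-of-the-envelope card-speed math; assumes 14-bit Bayer samples
# and ignores container/metadata overhead.

def megabits_to_megabytes(mbit_per_s):
    # Card makers often quote megaBITS per second; divide by 8 for MB/s.
    return mbit_per_s / 8.0

def raw_rate_mb_s(width, height, fps, bits_per_sample=14):
    # Approximate sustained write rate needed for uncompressed RAW video.
    bytes_per_frame = width * height * bits_per_sample / 8.0
    return bytes_per_frame * fps / 1e6

print(megabits_to_megabytes(40))     # 5.0 MB/s -- nowhere near enough
print(raw_rate_mb_s(1280, 720, 24))  # ~38.7 MB/s -- hence the ~40 MB/s cards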

None of this is to say you're not right about the lens!  Few have it unfortunately.
#115
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 08, 2013, 05:10:35 PM
Hi Stefano, what speed is your best SD card?  Thanks!

Also, it's not about 24fps, specifically, it's about having the FPS override set to anything but 30fps in photo mode.

BTW, have you shot any raw video with that lens?  At 1280x720 in crop mode it would be an effective 44mm (approx.) and should look really nice.  They don't sell that lens here in the U.S.  :-[
#116
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 08, 2013, 03:49:23 PM
I had that theory a while ago! ;)  My "speculation" is that you won't have the shutter bug if you

1. Use the 95 MB/s SanDisk card
2. DO NOT use USB
3. If you go into photo mode, remember to TURN OFF your 24FPS, or whatever, video mode setting.

I was hoping someone might have an answer to the guy's question about time-lapse shooting in the main thread.  I haven't done it in a while, but I agree with him, it would be an awesome camera to get back into it with!
#117
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 07, 2013, 05:10:11 PM
I have the 18-55.  I think you mean zoomed out/in?  That would be interesting.
#118
Quote from: 1% on November 07, 2013, 04:27:09 PM
but the problem is you're asking for near the impossible; most cameras can't deal with writing the video alone, much less an audio stream with it. g3gg0 did this with MLV and it has issues just from the metadata being written.

Sorry, I'm not making myself clear.  If you could read a simple voltage spike from, say, the mic input, or the flash, etc., then you can read 0s and 1s, right?  I don't know how high a frequency you'd need, but let's say it's 10 times the frame rate for easy sampling, so you'd need to read at 240 bits per second.  Okay, so you're reading 0s and 1s from whatever source, and in the MLV file you're writing either a 0 or a 1 for each frame, which gives you a 24-bit signature for each second.

At the same time, the recorder is also recording those 0s and 1s (a click track) to, say, channel 4, or to channel 2 if you can live with mono.

Now, you hack the recorder to plug in a cable that sends out these spikes 240 times per second (which the camera samples down to 1 bit per frame).  You also program the recorder to send out a specific "smoke signal" once it's started, so it sends batches of 12 0s and 1s for the first few seconds, but in a way that you can match up later.

So when you load the MLV file, the software builds a signature from the encoded click track.  It also builds a signature from the audio file, and then matches them up.  Once they're matched up, other software would know where the audio starts, cut it to that point, and put the start point in the MLV file, which the NLE (thinking big here) would ultimately use to match up the video and audio tracks.

I'm extemporizing here, but that's the general idea.  You mount the hacked audio recorder on a cold-shoe, and maybe, if the dev is a real genius, the camera sends a signal to start and stop the audio recorder and then we're really farting through silk ;)
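
To make the matching step concrete, here is a minimal sketch of the signature idea -- illustrative only, not tied to the real MLV format.  It assumes the per-frame sync bits have already been pulled out of the MLV file and the recorder's click-track channel has already been decoded to bits, and it simply slides one over the other to find the best-fitting offset.

Code:
# Hypothetical sketch of the click-track signature matching described above.
# Assumes sync bits (1 per frame) were already extracted from the MLV file
# and the recorder's click-track channel was already decoded to bits.

def best_offset(mlv_bits, audio_bits):
    # Return the offset (in frames) where the MLV signature best matches
    # the recorder's click track.
    best, best_score = 0, -1
    for offset in range(len(audio_bits) - len(mlv_bits) + 1):
        window = audio_bits[offset:offset + len(mlv_bits)]
        score = sum(a == b for a, b in zip(mlv_bits, window))
        if score > best_score:
            best, best_score = offset, score
    return best

# Toy example: the recorder started about 3 frames before the camera.
audio_bits = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
mlv_bits = audio_bits[3:9]                 # what the camera "saw", 1 bit per frame
print(best_offset(mlv_bits, audio_bits))   # 3 -> trim 3 frames' worth of audio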
#119
Tragic Lantern / Re: EOS M Alpha shutter-bug discussion
November 07, 2013, 04:25:45 PM
The 22mm extends and retracts, though there is no shutter bug.  The 18-55 doesn't extend/retract (that I can see).  My question is, doesn't the camera automatically retract the lens when you turn it off?  How can you turn the camera on with it extended?
#120
Quote from: 1% on November 07, 2013, 04:04:28 PM
Well right now clapper board works for me and syncs up perfectly. I can shoot raw on 7D or 50D and then record audio on 600D or 6D, etc.

Me too!

Maybe my point is that, though the devs are maxed out, stuff like this can get done, IMHO, if people show enough interest by putting in the time to organize it and make it happen.  You don't need to be a dev to do project management, or raise money, or administer bug-reporting software, etc.  If a user asks you for something you can get done in a day or two, and it seems interesting, you do it.  But if they ask you to do something that would take weeks, why do it?  You get the same thanks for the first feature as for the second.  And we can't forget, the dev is also doing stuff HE wants.

So if users want audio done this way, they need to band together so there is more reward and challenge, AND they need to make sure the dev isn't doing all the work.
#121
I did some quick Googling and see that someone has done some minor hacks to a Teac.  An Arduino project is possible, though the sample rate would probably be very low.  Maybe some other 16/32-bit DIY board?  I think the ultimate solution is not to record camera audio, but to link up an audio recorder and the camera through some sort of timecode or click track.  Either:

A.) A hacked audio recorder spits out timecode that is burned into the MLV while shooting, via USB, the mic input, the flash connector...? (thoughts, anyone?)
B.) The camera passes a click track to the audio recorder that maybe starts with an easy-to-sync pattern, like da, da, dum, dum, da dum, da dum, etc. (a rough sketch of detecting such a pattern follows below)
C.) MLV software on the back end syncs up both streams using the timecode or click track, etc.
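
For option B, the "easy to sync" pattern could be something with a sharp correlation peak, e.g. a Barker code -- that choice and all the numbers below are my own illustration, not anything from existing ML or MLV code: generate the pattern as a click track, then find it again in the recording by brute-force cross-correlation.

Code:
# Illustrative sketch of option B: emit a start pattern with a sharp
# correlation peak, then locate it in the recorded audio afterwards.
# Pattern choice, sample counts and amplitudes are made-up example values.

BARKER_13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]   # 13-chip Barker code

def make_click_track(pattern, samples_per_chip=100):
    # Expand each +1/-1 chip into a short burst of samples.
    track = []
    for chip in pattern:
        track.extend([float(chip)] * samples_per_chip)
    return track

def find_pattern(recording, pattern_track):
    # Return the sample offset where the pattern correlates best.
    n = len(pattern_track)
    best, best_score = 0, float("-inf")
    for offset in range(len(recording) - n + 1):
        score = sum(r * p for r, p in zip(recording[offset:offset + n], pattern_track))
        if score > best_score:
            best, best_score = offset, score
    return best

click = make_click_track(BARKER_13)
recording = [0.0] * 2500 + click + [0.0] * 1000   # recorder started early
print(find_pattern(recording, click))             # 2500 -> lead-in to trim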

Any of the master devs here could do this, if time and money were no object :)

For it to happen, someone would have to project-manage the thing: get a time commitment from g3gg0 on the MLV stuff, then from one of the camera devs, Alex or 1% or both, and then from another dev to work on the audio recorder part.  And finally, we can't forget about $$$ so each dev can have a complete setup for testing.  Also, you want testers, documentation, and some junior devs to assist the master devs.

#122
Quote from: Toffifee on November 07, 2013, 08:16:16 AM
What's the news on audio implementation?  :)
Haven't seen any comments about it for a while

I think it's because no one wants to make the sacrifices needed to make MLV a success.

It doesn't matter how good MLV is as a technical spec, or how much potential it has; it requires politics (buy-in) from other devs and users to be a success.  I don't see it happening any time soon, because the simple fact is MLV requires precious bandwidth from the camera.  It's a trade-off.  Do I think it's worth it?  Yes.  But even my few attempts to help MLV gain traction have met with failure, because the devs don't want to spend time on anything but some new, cool feature (even devs who should want MLV to succeed).  Users keep asking for new things and the devs can't say no, or group together and prioritize.  Because they work for free, I can't say anything without attracting a lot of sycophantic support for the devs--which is misdirected.

On a bright note, ML is a hack that should be dead (because of its steep development learning curve), yet it lives on, and grows.  So who knows, maybe someone will implement an audio solution tomorrow.
#123
Tragic Lantern / Re: 50D and 40D Raw video
November 07, 2013, 01:44:40 PM
Thanks, now I just need to bookmark this! :)
#124
Tragic Lantern / Re: 50D and 40D Raw video
November 07, 2013, 05:00:17 AM
Quote from: 1% on November 06, 2013, 10:47:38 PM
Yea, the first time I shoot after turning off dialog timers it's slowish. Instead of a card warmup I shoot till I see 75-80 and then delete the raw file. After that it's good until power off.

Thanks for the tip 1%.  I'm going to try that next time!
#125
Quote from: gary2013 on November 07, 2013, 04:42:25 AM
Bugs the hell out of me and it keeps making me seriously think about giving up on all these DSLRs for video recording. If anyone can crack these three things, that would be right up there at the top with all the other huge improvements since ML started.

I couldn't agree more.  First, to answer the question above, there is line skipping both horizontally AND vertically.  One of the obstacles to solving this problem is that there is little software that works with RAW video in a workbench way.  You can view RAW images in Photivo, for example, which I used above, but not video.  Part of what I'm doing is creating a viewer, based on g3gg0's MLV player, that would make it easy for me, or anyone, to test algorithms for fixing this on RAW video that is NOT debayered.  Again, one of the reasons I don't believe this has been fixed is that most people only have access to video that is already de-bayered, and that's too late!  You need to create and test algorithms against RAW images, analyze them for success, and only then de-bayer to check the result.
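
To illustrate that workflow -- fix and measure on the un-debayered data first, de-bayer only at the end -- here is a small hypothetical harness; the data, the dot coordinates, and the stand-in fix are all placeholders, not part of any existing tool.

Code:
# Hypothetical workflow sketch: score a dot-fixing algorithm in the RAW
# (un-debayered) domain before any de-bayering happens.
import numpy as np

def error_at_dots(frame, reference, dot_sites):
    # Mean absolute error at the known bad sensel positions.
    return float(np.mean([abs(int(frame[y, x]) - int(reference[y, x]))
                          for y, x in dot_sites]))

# Toy data: a flat "clean" reference frame and a corrupted copy.
reference = np.full((8, 8), 2048, dtype=np.uint16)
corrupted = reference.copy()
dot_sites = [(2, 2), (4, 4)]
for y, x in dot_sites:
    corrupted[y, x] = 4095                    # simulated "pink dot" sensels

# Plug any candidate fix in here (e.g. the neighbour-averaging sketch earlier).
fixed = corrupted.copy()
for y, x in dot_sites:
    fixed[y, x] = fixed[y, x - 2]             # crude stand-in fix for the demo

print(error_at_dots(corrupted, reference, dot_sites))   # 2047.0 before the fix
print(error_at_dots(fixed, reference, dot_sites))       # 0.0 after the fix
# Only once these RAW-domain numbers look good would you de-bayer the fixed
# frames and inspect the result visually (e.g. in a viewer like Photivo).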