wifi monitoring via USB

Started by carlo.licenziato, November 21, 2012, 11:07:19 AM

carlo.licenziato

A new idea for wifi monitoring via USB

DSLR ---- Android 4 micro PC (for example http://www.nexus-lab.com/2012/06/14/micro-pc-android-4-chiavetta-usb-offerta-offerum/) running HDSLR control software ---- tablet over WiFi

Who is the crazy person who can make this happen??

  ;)


SDX

I'm currently working on something like this, based on a Raspberry Pi and a simple AP (a TP-Link WA500G, to be precise). I'll use it for a Skycam I'm building.

I don't recommend using those Android sticks. It's way easier to develop for Unix or Windows.
What you might want to take a look at:

- CHDK PTP: should have some liveview streaming functionality.
- gPhoto: well documented.

No matter what, you'll need to write something yourself to stream it. I haven't had any luck finding a lib that does that in a satisfyingly simple way. I hope to find something that uses a common protocol like RTP/RTSP, NMSH, IceCast or an HTTP stream. I must admit that I haven't taken a look at them all yet.

Maybe someone can help?

EDIT: pjsip, jrtplib

jplxpto


SDX

Looks great, I'll dig into that.

I just found this one here: http://www.live555.com/liveMedia/
It looks really promising. The only thing is that all the examples use a video file as a source. Also, the documentation is written with video files as the source in mind. I'll have to figure out how to "feed" it with my own image data, and how that would work out with codecs.



SDX

Wow, why haven't I found that before, that's brilliant!

I can already see several solutions:
- Feed ffmpeg through a named pipe with YUV video data (a rough sketch follows after this list).
     This might be the solution with the lowest latency. Since I couldn't figure out yet what the image data from CHDK PTP looks like, I don't know how easy it will be to convert the data, or whether that will even be necessary.
- Put the image from the camera directly into a named pipe and use -loop_input on the ffmpeg side.
     Since ffmpeg can convert a hell of a lot of different formats, this would be a fairly easy solution. Also, I wouldn't have to take care of anything related to the framerate. The latency shouldn't play a big role in this case either.
- Same solution as above, just with files instead of pipes.
     High latency, especially on an RPi, since everything goes through the SD card. But at least it's a plan B.
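
Something along these lines is what I have in mind for the first option. Completely untested, just a sketch: the frame size, pixel format, bitrate and UDP target are made up, and get_live_frame() is a placeholder for whatever actually pulls the YUV data out of CHDK PTP.

#!/usr/bin/env python
# Sketch of solution 1: push raw YUV frames into a named pipe and let ffmpeg
# encode and stream them. All numbers are placeholders.
import os
import subprocess
import time

PIPE_PATH = "/tmp/lv.yuv"
WIDTH, HEIGHT, FPS = 720, 480, 10   # assumed liveview geometry

if not os.path.exists(PIPE_PATH):
    os.mkfifo(PIPE_PATH)

# ffmpeg reads raw frames from the FIFO and streams them as MPEG-TS over UDP.
ffmpeg = subprocess.Popen([
    "ffmpeg",
    "-f", "rawvideo", "-pix_fmt", "yuv420p",
    "-s", "%dx%d" % (WIDTH, HEIGHT), "-r", str(FPS),
    "-i", PIPE_PATH,
    "-vcodec", "mpeg4", "-b", "1000k",
    "-f", "mpegts", "udp://192.168.0.2:5000",
])

def get_live_frame():
    # Placeholder: a solid grey YUV420p frame. The real version would return
    # one frame of liveview data from the camera.
    return b"\x80" * (WIDTH * HEIGHT * 3 // 2)

with open(PIPE_PATH, "wb") as pipe:
    while True:
        pipe.write(get_live_frame())
        time.sleep(1.0 / FPS)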

I'm quite happy with this, since it saves me quite a lot of coding work.

Now to the getting-image-data-from-the-camera part.
It appears to be very hard to figure out how this could work. While browsing through the source of chdkptp I found some indications that it actually could be YUV data (YAY!). But I haven't fully understood the code yet, so I'm not sure.
Also, I'm not sure if CHDKPTP will work with our EOS cameras. PTPcam does, but that doesn't mean that CHDKPTP does!? There is no real documentation about anything here  :(


A random link
I'll keep you up to date


EDIT: If I chose to go for the third solution, everything would be quite easy. I could use chdkptp's dumpframes to save the frames into a file. Almost too easy to be true. I'll give it a try and see what the latency is like (a quick timing sketch follows after the help text below). But since I currently have access to neither my own computer nor the RPi, things will have to wait.
dumpframes   [options] [file]: - dump camera display frames to file
file: optional output file name, defaults to chdk_<pid>_<date>_<time>.lvdump
options:
   -count=<N> number of frames to dump
   -wait=<N>  wait N ms between frames
   -novp      don't get viewfinder data
   -nobm      don't get ui overlay data
   -nopal     don't get palette for ui overlay
   -quiet     don't print progress
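
Something like this should do for a first latency check once I'm at the RPi again. Untested; the chdkptp options (-c to connect, -e to run one CLI command) are written from memory of the wiki and may need adjusting, and the .lvdump output is chdkptp's own dump format, so something will still have to unpack it before ffmpeg can use it.

#!/usr/bin/env python
# Rough latency check for the dumpframes idea: grab a single display frame
# into a file and time the round trip.
import subprocess
import time

start = time.time()
# -c connects to the camera, -e runs one chdkptp CLI command (assumed syntax).
subprocess.call('./chdkptp -c -e"dumpframes -count=1 frame.lvdump"', shell=True)
print("one frame took %.3f s" % (time.time() - start))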


EDIT2: The lack of documentation doesn't make things easier. Look what I found in the PTPcam source: live_view_get_data(); Is there a feature in PTPcam I'm not aware of?

jplxpto

Quote from: SDX on November 25, 2012, 12:43:02 AM
Wow, why haven't I found that before, that's brilliant!

I'm glad you liked my links :)
Which ptpcam are you using?

https://bitbucket.org/hudson/magic-lantern/src/005f779e5a01/contrib/ptpcam?at=unified  ???

Please read this:

Some information about PTP & GDB
http://www.magiclantern.fm/forum/index.php?topic=3401.msg16879#msg16879


I've used ptpcam with my 40D. I can download photos and do a memory dump.

jplxpto

Just now I saw which ptpcam version you are using. :)


Michael Zöller

neoluxx.de
EOS 5D Mark II | EOS 600D | EF 24-70mm f/2.8 | Tascam DR-40

SDX

EOS Movie Record uses the official SDK from Canon. Non-open :(

I think we have collected some knowledge now. I'm done with the work on the physical part (pictures will follow).

Just so everything is clear:
- ptpcam http://chdk.wikia.com/wiki/PTP_Extension#ptpcam
- chdkptp http://chdk.wikia.com/wiki/PTP_Extension#chdkptp

I know ptpcam works well with our EOS DSLRs, but currently chdkptp looks more useful to me (not that ptpcam wouldn't work with some modifications). First, I'll check how well chdkptp works. Modifying the dumpframes function to output to a pipe instead of a file shouldn't be that hard. Or is it? (One untested idea for that is sketched below.)
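
Actually, on Linux it might not even need a code change: a named pipe looks like an ordinary file to whoever writes to it, so just pointing dumpframes at a FIFO could already do the trick. Untested sketch, with the chdkptp options assumed from its CLI help:

#!/usr/bin/env python
# Untested idea: let dumpframes write "to a file" that is really a FIFO, so
# the frames never touch the SD card. The reader side is just a placeholder
# (dd discarding bytes); in the real setup it would be whatever unpacks the
# lvdump stream for ffmpeg.
import os
import subprocess

FIFO = "/tmp/lv.lvdump"
if not os.path.exists(FIFO):
    os.mkfifo(FIFO)

reader = subprocess.Popen(["dd", "if=" + FIFO, "of=/dev/null"])

# chdkptp dumps 100 display frames into the FIFO (assumed -c/-e syntax).
subprocess.call('./chdkptp -c -e"dumpframes -count=100 ' + FIFO + '"',
                shell=True)
reader.wait()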

I'll keep you up to date.

SDX

When looking at the official file page of chdkptp I found this:
There are three different downloads for binaries: Windows, Linux and Raspbian! Yes, chdkptp should run without any problems on the Raspberry Pi's Debian. Great, perfect, fantastic!

EDIT: also very useful: http://chdk.wikia.com/wiki/Chdkptp_in_headless_linux_Dockstar_-_remote_control (sort of the same thing we are doing here)

EDIT2: since I don't have my computer here right now, and I can't install libusb on this one, things will have to wait until Tuesday.

jplxpto

Quote from: SDX on November 25, 2012, 08:28:47 PM
When looking at the official file page of chdkptp I found this:
There are three different downloads for binaries: Windows, Linux and Raspbian! Yes, chdkptp should run without any problems on the Raspberry Pi's Debian. Great, perfect, fantastic!

EDIT: also very useful: http://chdk.wikia.com/wiki/Chdkptp_in_headless_linux_Dockstar_-_remote_control (sort of the same thing we are doing here)

EDIT2: since I don't have my computer here right now, and I can't install libusb on this one, things will have to wait until Tuesday.

Good work...

SDX

Finally, time to start.
Unfortunately I quite quickly ran into the first problems. No matter what I do with both ptpcam and chdkptp, I just get an unexpected return value, 0x2005. On my SD card I found an autoexec.bin I compiled a while ago, when I played around with PTP on the ML side. It didn't work either; only the return value differed (0x2002). So the problem appears to be coming either from the camera or from libusb (yes, currently testing under Windows). I have connected and used my camera successfully through ptpcam on Windows before (under Win7; now running Win8, which might be a source of the problem as well).

Good news!
This stuff is very, very interesting: a Python binding for gPhoto2. It also comes with an example of how to access liveview data and use it in pygame.
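
From the wrapper's pygame example, the liveview loop should boil down to something like this. I'm assuming the binding is the one called piggyphoto and writing its calls (camera(), capture_preview()) from memory, so treat them as assumptions; instead of pygame this pipes the preview JPEGs straight into ffmpeg as an MJPEG stream.

#!/usr/bin/env python
# Sketch: grab liveview previews via the gPhoto2 Python binding and feed them
# to ffmpeg as MJPEG on stdin. API names and the UDP target are assumptions.
import subprocess
import piggyphoto  # the gPhoto2 binding mentioned above

cam = piggyphoto.camera()   # connect to the first camera gPhoto finds

ffmpeg = subprocess.Popen([
    "ffmpeg",
    "-f", "image2pipe", "-vcodec", "mjpeg", "-r", "10", "-i", "-",
    "-f", "mpegts", "udp://192.168.0.2:5000",
], stdin=subprocess.PIPE)

while True:
    cam.capture_preview("/tmp/preview.jpg")   # one liveview frame as JPEG
    with open("/tmp/preview.jpg", "rb") as f:
        ffmpeg.stdin.write(f.read())          # hand the JPEG to ffmpeg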

I'll now try to get it running on the RPi. I had just hoped to be able to avoid doing the tests and development on the RPi directly.

jplxpto

Quote from: SDX on November 27, 2012, 11:10:55 PM
Finally, time to start.
Unfortunately I quite quickly ran into the first problems. No matter what I do with both ptpcam and chdkptp, I just get an unexpected return value, 0x2005. On my SD card I found an autoexec.bin I compiled a while ago, when I played around with PTP on the ML side. It didn't work either; only the return value differed (0x2002). So the problem appears to be coming either from the camera or from libusb (yes, currently testing under Windows). I have connected and used my camera successfully through ptpcam on Windows before (under Win7; now running Win8, which might be a source of the problem as well).


I've had these errors before

http://www.magiclantern.fm/forum/index.php?topic=1512.msg15327#msg15327

SDX

Okay, status. I compiled both ptpcam and chdkptp on my Raspberry Pi. It took me some time to figure out which libs they need, but the rest was quite straightforward. Both ptpcam and chdkptp work fine, as they should. I compiled ML with PTP enabled; that works fine as well.

Now the problems: chdkptp's "dumpframes", the function I hoped would do the job, doesn't work - incompatible API. ptpcam -c doesn't work either, same issue.
The only chance we have now is gPhoto, and I think I'll go for the Python wrapper. I'll take a look at how to effectively get image data from Python to ffmpeg. Unfortunately there is no ffmpeg wrapper for Python that could do the job (pyffmpeg can't stream).
I think I might go for pipes, but I'm not sure about the performance. Don't pipes use the filesystem? I would rather use another kind of IPC, but pipes are all ffmpeg supports when it comes to that. I don't want to modify ffmpeg and compile it myself in order to use e.g. shared memory. I don't think that would be worth the effort; I can live with those few ms more of latency. (A quick way to sanity-check the pipe overhead is sketched below.)
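
For what it's worth, as far as I understand it the data written to a pipe only moves through kernel buffers; for a named pipe, only its name lives on the filesystem. A throwaway sketch (untested, frame size made up) to check whether the pipe overhead matters at all on the RPi:

#!/usr/bin/env python
# Tiny throughput check: push a few hundred frame-sized blocks through an
# anonymous pipe into a do-nothing reader and time it.
import subprocess
import time

FRAME = b"\x80" * (720 * 480 * 3 // 2)   # one fake YUV420 frame, ~500 KB
N = 300

reader = subprocess.Popen(["dd", "of=/dev/null", "bs=1M"],
                          stdin=subprocess.PIPE)
start = time.time()
for _ in range(N):
    reader.stdin.write(FRAME)
reader.stdin.close()
reader.wait()
elapsed = time.time() - start
print("%d frames in %.2f s, %.1f ms per frame" % (N, elapsed, 1000 * elapsed / N))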

I know none of these solutions is state of the art, not to say outdated, but currently I just want this to work somehow.
As an alternative, I can always use a video grabber and connect it to the composite out of the camera ^^

jplxpto

Quote from: SDX on November 29, 2012, 12:47:26 AM
I know none of these solutions is state of the art, not to say outdated, but currently I just want this to work somehow.
As an alternative, I can always use a video grabber and connect it to the composite out of the camera ^^

One of my colleagues used a frame grabber with ffmpeg to create an H.264 stream.
He was the one who introduced me to ffmpeg ... I think he used pipes at an early stage.
He created a video surveillance system; the customers use a VLC plugin / libs.

jplxpto

Quote from: SDX on November 29, 2012, 12:47:26 AM
Now the problems: chdkptp's "dumpframes", the function I hoped would do the job, doesn't work - incompatible API. ptpcam -c doesn't work either, same issue.

I think it should not be too hard to merge the ptpcam in the hudson repository with the one you're testing.

The PTP code used in ML does not support "dumpframes", but we can work on that ...
a1ex and g3gg0 can help us with this ... they also have experience with PTP in CHDK/ML

SDX

Some information from a1ex or g3gg0 would be very much appreciated.

I tried to install the gPhoto wrapper, but failed miserably. Files aren't where they are expected to be, so the previously linked documentation doesn't help that much. I just can't figure out who made it. Someone called alexdu. Is that our a1ex? I should ask that person.

I just tried a frame grabber, but the result did not satisfy me. The delay from camera to computer was ~1 second. And that was at best; it even got up to many, many seconds at some points in time (we're talking 20 to 30 seconds here). That is in fact quite strange, since I have used my video grabber quite a lot. I was aware of the 1 second delay, but not of that. Adding the latency that would come from streaming, it will barely be usable.
Another alternative would be to put a little webcam next to the main camera and stream that guy. But that simply doesn't feel right.

I'll now try to play a bit with gPhoto. It doesn't have to be the Python wrapper; I can perfectly well live with that. Time to find gPhoto's documentation and start reading!

wolf

Maybe this project helps you. They use libgphoto.
http://entangle-photo.org/

jplxpto

Quote from: SDX on November 29, 2012, 06:01:23 PM
Some information from a1ex or g3gg0 would be very much appreciated.

I tried to install the gPhoto wrapper, but failed miserably. Files aren't where they are expected to be, so the previously linked documentation doesn't help that much. I just can't figure out who made it. Someone called alexdu. Is that our a1ex? I should ask that person.

I just tried a frame grabber, but the result did not satisfy me. The delay from camera to computer was ~1 second. And that was at best; it even got up to many, many seconds at some points in time (we're talking 20 to 30 seconds here). That is in fact quite strange, since I have used my video grabber quite a lot. I was aware of the 1 second delay, but not of that. Adding the latency that would come from streaming, it will barely be usable.
Another alternative would be to put a little webcam next to the main camera and stream that guy. But that simply doesn't feel right.

I'll now try to play a bit with gPhoto. It doesn't have to be the Python wrapper; I can perfectly well live with that. Time to find gPhoto's documentation and start reading!

Good luck


carlo.licenziato