Multiple c920s on GStreamer 1.2.3

Mark Scudder mark at markscudder.com
Thu Feb 27 12:02:52 PST 2014


Hello,

I have GStreamer 1.2.3 built and working on a Linux Mint 16 system following Alex Csete's instructions on his wiki (so it's built in my home directory, ~/gst/runtime).

I have three Logitech c920s I would like to record from. Since each camera has a built-in H.264 encoder, my Linux box has enough power to record multiple cameras at once: it isn't doing three realtime H.264 encodes on the CPU, just writing three ~3 Mbit/s streams to disk. I did some experimenting with v4l2src and verified I could access and write the H.264 stream from all three c920s at once, but some issues with v4l2src led me to try uvch264src instead.
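For reference, the v4l2src test looked roughly like this (a sketch from memory; the exact caps and device node may differ):

```shell
# Sketch: pull the c920's hardware-encoded H.264 stream with v4l2src and
# write it straight to disk, no CPU encoding. Device node and output
# filename are assumptions.
~/gst/runtime/bin/gst-launch-1.0 -e v4l2src device=/dev/video0 \
    ! video/x-h264,width=1280,height=720,framerate=30/1 \
    ! h264parse ! filesink location=/home/mark/cam0.h264
```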

Following the examples on Alex's blog, I open a terminal window, run the set_env.sh script, and issue the following command (based on his example, but using a filesink instead of a videosink to save the video):

~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video0 name=src auto-start=true src.vfsrc ! queue ! video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 ! xvimagesink sync=false src.vidsrc ! queue ! video/x-h264,width=1280,height=720,framerate=30/1 ! filesink location=/home/mark/test1.mp4

This works. A small preview window is displayed on the screen, and the H.264 stream is recorded to test1.mp4, though it has to be run through "ffmpeg -i test1.mp4 -vcodec copy fixed.mp4" before it's usable in non-GStreamer-based players. (I don't mind that, if there's no way to properly mux it into a compatible MP4 file in the pipeline.)
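If in-pipeline muxing is possible, I'd guess it would look something like this (a sketch, assuming h264parse and mp4mux are present in my ~/gst/runtime build; untested):

```shell
# Sketch: parse the H.264 stream and mux it into MP4 inside the
# pipeline, so no ffmpeg remux step is needed afterwards. Assumes the
# h264parse and mp4mux elements are available in this build.
~/gst/runtime/bin/gst-launch-1.0 -e uvch264src device=/dev/video0 name=src auto-start=true \
    src.vfsrc ! queue ! video/x-raw,format=YUY2,width=320,height=240,framerate=10/1 \
        ! xvimagesink sync=false \
    src.vidsrc ! queue ! video/x-h264,width=1280,height=720,framerate=30/1 \
        ! h264parse ! mp4mux ! filesink location=/home/mark/test1.mp4
```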

I then open another terminal window, run the set_env.sh script, and change the command to connect to the next camera, and write a different file:

~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video1 name=src auto-start=true src.vfsrc ! queue ! video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 ! xvimagesink sync=false src.vidsrc ! queue ! video/x-h264,width=1280,height=720,framerate=30/1 ! filesink location=/home/mark/test2.mp4

And it doesn't work. GStreamer displays the following in the terminal window:

Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter2: caps = video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstUvcH264Src:src: ready-for-capture = false
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter3: caps = video/x-h264, width=(int)1280, height=(int)720, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter4: caps = video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0: num-clock-samples = 0
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0: device-fd = 12
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5: caps = image/jpeg, width=(int)320, height=(int)240, framerate=(fraction)30/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5.GstPad:src: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:jpeg: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:sink: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5.GstPad:sink: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
ERROR: from element /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...

And then it does nothing until I Ctrl-C it.

If the first pipeline is not running, this pipeline runs just fine, but they won't run in parallel. I don't feel that GStreamer is being clear about what or where the problem is, but I understand that since it's in development I have to dig and experiment. I also don't see a reason it wouldn't work: Linux enumerates the cameras as separate devices, and any binary should be able to run multiple times on the same box if each instance opens a different device. I just don't know where to start. Can someone help me get these working in parallel? Like I said, even though c920 support isn't perfect in v4l2src, I was able to record from three cameras in parallel with it.
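If it would help diagnosis, I can rerun the failing pipeline with element-level debug logging enabled, e.g. (the debug category names here are my guess):

```shell
# Sketch: raise the debug level for the source elements to see where
# the not-negotiated error originates. The GST_DEBUG category names
# "uvch264src" and "v4l2" are assumptions.
GST_DEBUG=uvch264src:5,v4l2:5 \
~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video1 name=src auto-start=true \
    src.vfsrc ! queue ! video/x-raw,format=YUY2,width=320,height=240,framerate=10/1 \
        ! xvimagesink sync=false \
    src.vidsrc ! queue ! video/x-h264,width=1280,height=720,framerate=30/1 \
        ! filesink location=/home/mark/test2.mp4
```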

Also, in case this is an easy question, here's another:

Does anyone know whether the c920 sends a muxed audio/video stream to the computer, with GStreamer just demuxing the video off it? In other words, do I always need to explicitly set up audio in my pipeline from the camera's audio device and then mux it in the pipeline, or is there a way to pull a fully muxed, ready-to-go MP4 stream off the camera without GStreamer doing the muxing?
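In case explicit muxing turns out to be the answer, I assume the audio side would look something like this (a sketch; the ALSA device string and the choice of AAC encoder are guesses, since the c920's mic appears here as a separate USB audio device):

```shell
# Sketch: mux the camera's H.264 video and AAC-encoded audio from its
# USB microphone into one MP4 inside the pipeline. The ALSA device
# "hw:1" and the voaacenc encoder element are assumptions.
~/gst/runtime/bin/gst-launch-1.0 -e mp4mux name=mux ! filesink location=/home/mark/av.mp4 \
    uvch264src device=/dev/video0 name=src auto-start=true \
    src.vfsrc ! queue ! video/x-raw,format=YUY2,width=320,height=240,framerate=10/1 ! fakesink \
    src.vidsrc ! queue ! video/x-h264,width=1280,height=720,framerate=30/1 ! h264parse ! mux. \
    alsasrc device=hw:1 ! audioconvert ! voaacenc ! mux.
```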

Thanks much. I love the potential of GStreamer and want to use it every day.

-mark
