Capturing jpegs from an h264 stream

Alex Hewson mock at mocko.org.uk
Wed Jun 5 03:42:15 PDT 2013


On 04/06/2013 16:05, Nicolas Dufresne wrote:
> This pipeline should work if the source can be streamed. What's the 
> transport being used by raspivid? As a reference, I took your pipeline 
> and made it work this way:
>
> gst-launch-1.0 videotestsrc ! x264enc tune=zero-latency byte-stream=1 ! mpegtsmux ! fdsink fd=1 | \
> gst-launch-1.0 -e fdsrc fd=0 ! decodebin ! jpegenc ! multifilesink location=img_%03d.jpeg

I managed to get it working a little while ago by dumping frames into a 
multifilesink, but the problem is that it tries to write 25 jpegs a 
second and swamps my little Raspberry Pi.  This is why I need to throw 
away all but one or two frames a second.  The plan is to use these for 
movement detection.
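
From what I can tell, the videorate element ought to be the right tool 
for this - as I understand it, if you cap its output with a framerate 
filter it drops buffers to hit the target rate.  So I'm guessing the 
decode side wants a fragment like this (untested on my end, and the 1/1 
framerate is just my stand-in for "one jpeg per second"):

  ... ! decodebin ! videorate ! video/x-raw,framerate=1/1 ! jpegenc ! ...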

I think there is no transport since it's pure video with no need to mux 
in anything else.  I've uploaded an example video to 
http://mocko.org.uk/objects/2013/06/testvideo.h264 if you'd like to 
examine it.



> In this example I use MPEG Transport Stream. I also got it to work 
> with avimux, but not with qtmux, as that's not a streamable format 
> (not without configuration). In the case where there is no transport 
> (H264 byte-stream), I could make it work using the h264parse element, 
> or simply setting caps after the fdsrc:
>
> gst-launch-1.0 videotestsrc ! x264enc tune=zero-latency byte-stream=1 ! fdsink fd=1 | \
> gst-launch-1.0 -e fdsrc fd=0 ! h264parse ! decodebin ! jpegenc ! multifilesink location=img_%03d.jpeg
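
For the archives: I assume the caps-setting variant he mentions would 
look something like the line below - the caps string is my guess, not 
his exact command:

gst-launch-1.0 -e fdsrc fd=0 ! 'video/x-h264,stream-format=byte-stream' ! \
decodebin ! jpegenc ! multifilesink location=img_%03d.jpeg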

The byte-stream pipeline does work (it produces a bunch of valid jpegs) 
but with one problem: I get a jpeg for every single frame, not just the 
occasional ones I need.

So I think the real problem here is persuading the pipeline to throw 
away most of the frames and turn the remaining ones into jpegs.
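
If videorate does what I think, the whole chain on the Pi would 
presumably end up something like this (untested; raspivid's -t 0 and 
-o - flags should write an endless H264 byte-stream to stdout):

raspivid -t 0 -o - | \
gst-launch-1.0 -e fdsrc fd=0 ! h264parse ! decodebin ! videorate ! \
video/x-raw,framerate=1/1 ! jpegenc ! multifilesink location=img_%03d.jpeg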

-- 
Alex Hewson
m: +44 7895 265219 | e: mock at mocko.org.uk | Skype: alex.hewson


