Capturing jpegs from an h264 stream

Mailing List SVR lists at svrinformatica.it
Fri Jun 7 01:39:28 PDT 2013


Hi,

I don't have a Raspberry Pi; I only tested your file on a desktop PC, and 
this pipeline works fine (7 JPEGs are produced):

cat /tmp/testvideo.h264 | gst-launch-0.10 -v fdsrc fd=0 ! h264parse ! 
ffdec_h264 ! videorate ! video/x-raw-yuv,framerate=1/1 ! jpegenc ! 
multifilesink location=/tmp/img_%04d.jpg

I was not able to get the same thing working in 1.0 (the pipeline runs but 
no JPEGs are produced), so either there is a regression somewhere in 1.0 or 
I'm simply doing something wrong when converting the above pipeline to 1.0.
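
For reference, a direct 1.0 translation of the above pipeline would look 
something like this (a sketch only: ffdec_h264 becomes avdec_h264, 
video/x-raw-yuv becomes video/x-raw, and I've added a videoconvert just to 
be safe):

cat /tmp/testvideo.h264 | gst-launch-1.0 -v fdsrc fd=0 ! h264parse ! \
avdec_h264 ! videorate ! video/x-raw,framerate=1/1 ! videoconvert ! \
jpegenc ! multifilesink location=/tmp/img_%04d.jpg

That is roughly the kind of conversion I tried, without getting any JPEGs 
out of it.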

Hope this helps,

Nicola

On 05/06/2013 12:42, Alex Hewson wrote:
> On 04/06/2013 16:05, Nicolas Dufresne wrote:
>> This pipeline should work if the source can be streamed. What's the 
>> transport being used by raspivid? As a reference, I took your 
>> pipeline and made it work this way:
>>
>> gst-launch-1.0 videotestsrc ! x264enc tune=zero-latency byte-stream=1 
>> ! mpegtsmux ! fdsink fd=1 | \
>> gst-launch-1.0 -e fdsrc fd=0 ! decodebin ! jpegenc ! multifilesink 
>> location=img_%03d.jpeg
>
> I've already managed to get it working a little while ago by dumping 
> frames into a multifilesink, but the problem is it tries to write 25 
> JPEGs a second and swamps my little Raspberry Pi. This is why I need 
> to throw away all but 1-2 a second. The plan is to use these for 
> movement detection.
>
> I think there is no transport since it's pure video with no need to 
> mux in anything else.  I've uploaded an example video to 
> http://mocko.org.uk/objects/2013/06/testvideo.h264 if you'd like to 
> examine it.
>
>
>
>> In this example I use MPEG Transport Stream. I also got it to work 
>> with avimux, but not with qtmux as it's not a streamable format (not 
>> without configuration). In the case where there is no transport (H.264 
>> byte-stream), I could make it work using the h264parse element, or 
>> simply by setting caps after the fdsrc:
>>
>> gst-launch-1.0 videotestsrc ! x264enc tune=zero-latency byte-stream=1 
>> ! fdsink fd=1 | \
>> gst-launch-1.0 -e fdsrc fd=0 ! h264parse ! decodebin ! jpegenc ! 
>> multifilesink location=img_%03d.jpeg
> This does work (produces a bunch of valid jpegs) but with one problem 
> - I get a jpeg for every single frame and not just the occasional ones 
> I need.
>
> So I think the real problem here is persuading the pipeline to throw 
> away most of the frames and turn the remaining ones into JPEGs.
>
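
To only keep a frame or two per second, the missing piece should just be a 
videorate element plus a framerate caps filter before jpegenc, the same 
trick as in the 0.10 pipeline above. A sketch based on the byte-stream 
pipeline Nicolas posted, reading your test file (untested on a Raspberry 
Pi, and it may hit the same 1.0 issue I mentioned):

cat /tmp/testvideo.h264 | gst-launch-1.0 -e fdsrc fd=0 ! h264parse ! \
decodebin ! videorate ! video/x-raw,framerate=1/1 ! videoconvert ! \
jpegenc ! multifilesink location=img_%03d.jpeg

Use framerate=2/1 instead if you want two JPEGs per second.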


