Capturing jpegs from an h264 stream
Tim-Philipp Müller
t.i.m at zen.co.uk
Wed Jun 5 11:33:41 PDT 2013
On Wed, 2013-06-05 at 11:42 +0100, Alex Hewson wrote:
> > In this example I use MPEG Transport Stream. I also got it to work
> > with avimux, but not with qtmux, as that's not a streamable format
> > (not without configuration). In case there is no transport (plain
> > H264 byte-stream), I could make it work using the h264parse element,
> > or simply by setting caps after the fdsrc:
> >
> > gst-launch-1.0 videotestsrc ! x264enc tune=zero-latency byte-stream=1 ! fdsink fd=1 | \
> > gst-launch-1.0 -e fdsrc fd=0 ! h264parse ! decodebin ! jpegenc ! multifilesink location=img_%03d.jpeg
> This does work (produces a bunch of valid jpegs) but with one problem -
> I get a jpeg for every single frame and not just the occasional ones I need.
>
> So I think the real problem here is persuading the pipeline to throw
> away most of the frames and turn remaining ones into jpegs.
I haven't read the entire thread in detail, so apologies if I'm saying
something that's been suggested elsewhere already, or if I misunderstood
the problem.
A couple of random suggestions:
a) why not reduce the framerate before feeding it into the encoder? Do
you want the full feed as well, in case something interesting happened?
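For (a), a minimal sketch of the sending side with the rate reduced
before the encoder (the 1/1 framerate is just an example value):

```shell
# Hypothetical sketch: insert videorate plus a caps filter in front of
# x264enc so only ~1 frame per second reaches the encoder at all.
gst-launch-1.0 videotestsrc ! videorate ! video/x-raw,framerate=1/1 ! \
    x264enc tune=zero-latency byte-stream=1 ! fdsink fd=1
```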
b) you could try using gdppay ! fdsink/tcpserversink and then
fdsrc/tcpclientsrc ! gdpdepay - that will pass through the original
timestamps from the sending pipeline. Make sure to use something like
dec ! videorate skip-to-first=true ! video/x-raw,framerate=1/1 ! enc
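A sketch of (b) over TCP (the host/port values are placeholders; the
receiver re-rates after decoding, as in the fragment above):

```shell
# Sender: serialise the timestamped H.264 buffers with gdppay
# (host/port are assumptions).
gst-launch-1.0 videotestsrc ! x264enc tune=zero-latency ! gdppay ! \
    tcpserversink host=127.0.0.1 port=5000

# Receiver: gdpdepay restores the original timestamps, then videorate
# drops frames down to 1 fps before the JPEG encoder.
gst-launch-1.0 tcpclientsrc host=127.0.0.1 port=5000 ! gdpdepay ! \
    h264parse ! decodebin ! videorate skip-to-first=true ! \
    video/x-raw,framerate=1/1 ! jpegenc ! multifilesink location=img_%03d.jpeg
```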
c) if the sender produces data in real time (live), then you could
probably just tell the source to timestamp buffers according to the
clock as they arrive, with do-timestamp=true.
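For (c), applied to the fdsrc pipeline from earlier in the thread (a
sketch; clock timestamps are only meaningful if the data really arrives
in real time):

```shell
# Hypothetical sketch: let fdsrc stamp each buffer with the pipeline
# clock, so videorate downstream has timestamps to work with.
gst-launch-1.0 -e fdsrc fd=0 do-timestamp=true ! h264parse ! decodebin ! \
    videorate skip-to-first=true ! video/x-raw,framerate=1/1 ! \
    jpegenc ! multifilesink location=img_%03d.jpeg
```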
d) even if the h264 data going into h264parse is not timestamped, I
would expect h264parse to timestamp if you put a capsfilter with a
framerate in front of it (or after it?). (If not, that should be fixed
imho).
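For (d), the capsfilter would look something like this (the 30/1 input
framerate is an assumption about the incoming feed):

```shell
# Hypothetical sketch: declare the stream's framerate on the caps in
# front of h264parse so it can derive timestamps from frame durations.
gst-launch-1.0 -e fdsrc fd=0 ! \
    video/x-h264,stream-format=byte-stream,framerate=30/1 ! \
    h264parse ! decodebin ! jpegenc ! multifilesink location=img_%03d.jpeg
```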
Cheers
-Tim
More information about the gstreamer-devel mailing list