Decode/demux h264 file to jpeg on Windows using DirectShow

Tim-Philipp Müller t.i.m at zen.co.uk
Sun Aug 11 04:22:44 PDT 2013


On Sun, 2013-08-11 at 00:14 -0700, Bo Lechnowsky wrote:

Hi,

> I've been trying for days to get gstreamer's gst-launch-1.0 to output an 
> h264 stream as individual jpegs, but want only one per second, and using 
> the DirectShow hardware acceleration. I've tried numerous iterations of 
> commands, and this is the closest I've gotten:
> 
>      gst-launch-1.0 filesrc location=test.h264 ! decodebin ! videorate ! 
> video/x-raw,framerate=1/30 ! jpegenc ! multifilesink location=img%03d.jpg
> 
> This gives me 300 jpegs from my 10 second h264 stream, and it doesn't 
> use the DirectShow hardware interface.
> 
> I've used gst-inspect to try to use what I thought was the DirectShow 
> decoder for h264 (video/x-h264) but that gives me errors. I've also 
> tried to change the framerate from 1/30 to 30/1 and 1/1, but I always get 
> the same 30-jpegs-per-second output. Therefore, I'm assuming that 
> framerate is a method for me to tell gst how many fps the source file is.

I'm afraid you're running into a bug here. h264parse + the h264 decoder
(avdec_h264 presumably) don't timestamp the frames properly (that is:
not at all), and without timestamps videorate can't do much, which is
why changing the framerate in the capsfilter has no effect for you. You
can probably work around it by inserting the 'videoparse' element after
decodebin, if you configure it with the right size, format and framerate.
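
Untested sketch (the videoparse values below are just placeholders and
must match your actual stream; I'm assuming 1920x1080 I420 at 30 fps
here, and framerate=1/1 in the capsfilter to get one jpeg per second):

  gst-launch-1.0 filesrc location=test.h264 ! decodebin ! \
      videoparse format=i420 width=1920 height=1080 framerate=30/1 ! \
      videorate ! video/x-raw,framerate=1/1 ! jpegenc ! \
      multifilesink location=img%03d.jpg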

It should work fine if the h264 video comes in a container such as MP4,
MPEG-TS, MPEG-PS or Matroska (as it usually does).
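
For example, something like this should then do what you want (untested,
assuming a test.mp4 with an h264 video track; framerate=1/1 gives you
one jpeg per second):

  gst-launch-1.0 filesrc location=test.mp4 ! decodebin ! videorate ! \
      video/x-raw,framerate=1/1 ! jpegenc ! multifilesink location=img%03d.jpg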

I don't think there's any plugin yet that makes use of
hardware-accelerated decoders on Windows (but I might not be entirely
up to date).

The "video/x-h264" thing you found is likely the typefinder for H.264.


> I thought decodebin was supposed to automatically select the best 
> decoder based on the input stream, but it appears to be using a CPU 
> intensive one (instead of GPU hardware-accelerated) judging by how the 
> CPU on my test machine pegs at 100% for the duration of the gstreamer 
> process.

That is to be expected whatever decoder is used, since there are no
elements in the pipeline that sync to the clock and would thus throttle
data processing. A transcoding pipeline like yours runs as fast as it
can, reading and processing data flat out and maxing out the CPU/cores.


> Ideally, I'd also like the jpegs to be output at a different resolution 
> than the resolution of the video, but everything I've tried 
> (width=640,height=480) either causes errors or doesn't result in a 
> resized jpg.

Something like ..

  ... ! videoscale ! video/x-raw,width=640,height=480 ! jpegenc ! ...

should work.
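
So a complete pipeline could look something like this (untested, same
assumptions as above):

  gst-launch-1.0 filesrc location=test.mp4 ! decodebin ! videorate ! \
      video/x-raw,framerate=1/1 ! videoscale ! \
      video/x-raw,width=640,height=480 ! jpegenc ! \
      multifilesink location=img%03d.jpg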

> I know how to do all this with avconv, but it also is CPU intensive and 
> I'm trying to free the CPU to perform other tasks during the decoding 
> process.

There are ways to keep it from maxing out the CPU if you don't want it
to run as fast as possible. Inserting an 'identity sync=true' element
might do the trick, for example.
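
Something along these lines (untested) should make the pipeline run in
real time instead of flat out:

  gst-launch-1.0 filesrc location=test.mp4 ! decodebin ! identity sync=true ! \
      videorate ! video/x-raw,framerate=1/1 ! jpegenc ! \
      multifilesink location=img%03d.jpg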

Cheers
 -Tim


