Decode/demux h264 file to jpeg on Windows using DirectShow

Bo Lechnowsky bo at respectech.com
Mon Aug 12 02:20:00 PDT 2013


Tim: Thanks for the response.  See below.

On 8/11/2013 4:22 AM, Tim-Philipp Müller wrote:
> On Sun, 2013-08-11 at 00:14 -0700, Bo Lechnowsky wrote:
>
> Hi,
>
>> I've been trying for days to get gstreamer's gst-launch-1.0 to output an
>> h264 stream as individual jpegs, but want only one per second, and using
>> the DirectShow hardware acceleration. I've tried numerous iterations of
>> commands, and this is the closest I've gotten:
>>
>>       gst-launch-1.0 filesrc location=test.h264 ! decodebin ! videorate !
>> video/x-raw,framerate=1/30 ! jpegenc ! multifilesink location=img%03d.jpg
>>
>> This gives me 300 jpegs from my 10 second h264 stream, and it doesn't
>> use the DirectShow hardware interface.
>>
>> I've used gst-inspect to try to use what I thought was the DirectShow
>> decoder for h264 (video/x-h264) but that gives me errors. I've also
>> tried to change the framerate from 1/30 to 30/1 and 1/1, but always get
the same 30-jpegs-per-second output.  Therefore, I'm assuming that
framerate is a way for me to tell gst how many fps the source file contains.
> I'm afraid you're running into a bug here. h264parse + the h264 decoder
> (avdec_h264 presumably) don't timestamp the frames properly (that is:
> not at all), and without timestamps videorate can't do much, that's why
> changing the framerate in the capsfilter has no effect for you. You can
> probably work around it using the 'videoparse' element after decodebin,
> if you configure it correctly with the right size plus format etc.
So, it's not possible to tell gstreamer to drop all but 1 out of every 
30 frames?
> It should work fine if the h264 video comes in a container such as MP4,
> MPEG-TS, MPEG-PS or Matroska or so (as it usually does).
I'm working with video generated by the Raspberry Pi's camera module 
using Raspivid, so it is just straight h264 not in any container. From 
what I understand, it also doesn't timestamp the frames.
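(For anyone else reading the archives: one workaround I may try, based on
Tim's container suggestion, is to remux the raw Raspivid stream into MP4
first so the demuxer can restore timestamps. This is an untested sketch --
the filenames are placeholders, and it assumes h264parse can parse the raw
byte-stream output of Raspivid:

```shell
# Untested sketch: remux the raw h264 from Raspivid into MP4 so that
# frames get timestamps on the way back out of the demuxer.
gst-launch-1.0 filesrc location=test.h264 ! h264parse ! mp4mux ! \
    filesink location=test.mp4

# With a container in front, videorate should be able to drop frames
# down to 1 fps. Note framerate=1/1 means one frame per second;
# 1/30 would mean one frame every 30 seconds.
gst-launch-1.0 filesrc location=test.mp4 ! decodebin ! videorate ! \
    video/x-raw,framerate=1/1 ! jpegenc ! multifilesink location=img%03d.jpg
```

If that works, it would answer my frame-dropping question above.)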
> I don't think there's any plugin yet that makes use of
> hardware-accelerated decoders on windows (but I might not be entirely
> up to date).
Do you know if there are any plugins yet that make use of 
hardware-accelerated decoders on Linux/BSD or any other platform? Do you 
know if gstreamer's OpenMAX support on Linux uses the hardware-accelerated 
decoder?
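(On the Pi specifically, my understanding is that the gst-omx plugin's 
omxh264dec element is meant to hand decoding to the VideoCore GPU. An 
untested sketch of what that pipeline might look like, assuming gst-omx 
for GStreamer 1.0 is installed:

```shell
# Untested sketch, assuming the gst-omx plugin is installed on the Pi.
# omxh264dec should offload h264 decoding to the VideoCore hardware.
gst-launch-1.0 filesrc location=test.h264 ! h264parse ! omxh264dec ! \
    videoconvert ! jpegenc ! multifilesink location=img%03d.jpg
```

I haven't verified this on my own setup yet.)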
>> I thought decodebin was supposed to automatically select the best
>> decoder based on the input stream, but it appears to be using a CPU
>> intensive one (instead of GPU hardware-accelerated) judging by how the
>> CPU on my test machine pegs at 100% for the duration of the gstreamer
>> process.
> That is to be expected whatever decoder is used, since there are no
> elements in the pipeline that sync to the clock and thus throttle data
> processing. A transcoding pipeline like yours will run as fast as
> possible, reading and processing data as fast as possible, maxing out
> the CPU/cores as much as possible.
I guess what I was looking for was a decoder that was hardware-accelerated.
>> Ideally, I'd also like the jpegs to be output at a different resolution
>> than the resolution of the video, but everything I've tried
>> (width=640,height=480) either causes errors or doesn't result in a
>> resized jpg.
> Something like ..
>
>    ... ! videoscale ! video/x-raw,width=640,height=480 ! jpegenc ! ...
>
> should work.
I'm pretty sure I already tried that, but I'll give it another shot.
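For the record, combining your videoscale suggestion with my original
pipeline would presumably look like this (untested on my end):

```shell
# Untested sketch: scale each decoded frame to 640x480 before
# jpeg encoding, per Tim's suggestion.
gst-launch-1.0 filesrc location=test.h264 ! decodebin ! videoscale ! \
    video/x-raw,width=640,height=480 ! jpegenc ! \
    multifilesink location=img%03d.jpg
```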
>> I know how to do all this with avconv, but it also is CPU intensive and
>> I'm trying to free the CPU to perform other tasks during the decoding
>> process.
> There are ways to make it not max out the CPU if you don't want it to
> run as fast as possible. Inserting an identity sync=true might do the
> trick, for example.
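If I follow, the throttled variant would be something like this (untested):

```shell
# Untested sketch: identity sync=true makes the element sync to the
# clock, so the pipeline runs in real time instead of as fast as
# possible, leaving CPU free for other tasks.
gst-launch-1.0 filesrc location=test.h264 ! decodebin ! \
    identity sync=true ! jpegenc ! multifilesink location=img%03d.jpg
```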
Thanks for all the pointers!

-Bo


More information about the gstreamer-devel mailing list