[gst-devel] timestamps on a live h264 source

PALFFY Daniel dpalffy at rainstorm.org
Tue Sep 8 16:28:17 CEST 2009


Hi,

I'm developing a GStreamer source element for a video grabber card that can 
output raw YUV or H.264. In raw mode the source works fine without setting 
GST_BUFFER_OFFSET, GST_BUFFER_OFFSET_END, GST_BUFFER_TIMESTAMP and 
GST_BUFFER_DURATION, but for live H.264 playback I can't find a working 
combination.

The example pipeline looks like this:
gst-launch mysource ! "video/x-h264,framerate=25/1" ! ffdec_h264 ! xvimagesink

The card delivers each frame in a separate buffer; in the current 
configuration a group consists of one SPS, one PPS, one I-frame and 14 
P-frames, each pushed downstream in its own GstBuffer.

When I don't set any of these fields, the pipeline consumes all grabbed 
frames but displays only the first one (or maybe the first few).

If I set all the values to what I believe is correct (a serial number 
incrementing from 0 in OFFSET, OFFSET+1 in OFFSET_END, a hardware-generated 
timestamp in TIMESTAMP, and a DURATION of 0 for SPS/PPS buffers and 
GST_SECOND/framerate for I/P frames), the pipeline takes and displays only 
the first four frames and then stalls.
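
In code, what I'm doing is roughly the following (just a sketch; the helper 
and its arguments are placeholders for what my driver layer actually 
provides):

/* Rough sketch of the metadata described above (GStreamer 0.10 API);
 * frame_count, hw_timestamp_ns and is_config_frame (TRUE for SPS/PPS)
 * come from the driver layer. */
static void
set_buffer_metadata (GstBuffer *buf, guint64 frame_count,
                     GstClockTime hw_timestamp_ns,
                     gboolean is_config_frame, gint fps_n, gint fps_d)
{
  GST_BUFFER_OFFSET (buf)     = frame_count;
  GST_BUFFER_OFFSET_END (buf) = frame_count + 1;
  GST_BUFFER_TIMESTAMP (buf)  = hw_timestamp_ns;

  if (is_config_frame)
    /* SPS/PPS carry no picture, so they get zero duration */
    GST_BUFFER_DURATION (buf) = 0;
  else
    /* one frame at fps_n/fps_d, e.g. 25/1 -> 40 ms */
    GST_BUFFER_DURATION (buf) =
        gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n);
}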

If I instead count the SPS/PPS frames as normal frames, giving them the 
same duration as I/P frames and incrementing the timestamp accordingly, the 
buffer in my element slowly fills up because the decoder takes frames more 
slowly than the card produces them.
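
The only difference from the sketch above is something like this (next_ts 
being a running counter I keep in the element):

/* Variant for the second scheme: every buffer, including SPS/PPS,
 * gets a full frame duration and advances the timestamp. */
GstClockTime frame_dur = gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n);

GST_BUFFER_TIMESTAMP (buf) = src->next_ts;  /* same for SPS, PPS, I and P */
GST_BUFFER_DURATION (buf)  = frame_dur;
src->next_ts += frame_dur;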

When I save the stream to a file and play it back from there, everything 
works fine.

What would be the correct values for the timestamps in this case? Or do I 
have to implement a clock-capable element?
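
If a clock is the way to go, I imagine it would look something like this 
(again just a sketch, "src" and its fields are placeholders) -- is that the 
idea?

/* Sketch of clock-based timestamping: take the running time from the
 * element's clock instead of the hardware timestamp. */
GstClock *clock = gst_element_get_clock (GST_ELEMENT (src));

if (clock != NULL) {
  GstClockTime now  = gst_clock_get_time (clock);
  GstClockTime base = gst_element_get_base_time (GST_ELEMENT (src));

  /* running time = absolute clock time minus the pipeline base time */
  GST_BUFFER_TIMESTAMP (buf) = now - base;
  gst_object_unref (clock);
}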

-- 
Dani
 			...and Linux for all.



