[gst-devel] BaseSink "sync" property

Arnout Vandecappelle arnout at mind.be
Thu Apr 30 16:19:30 CEST 2009

On Wednesday 29 April 2009 11:02:41 Jon Burgess wrote:
> Perhaps someone can explain to me what might be happening here.  I've been
> successfully using a pipeline, such as the following gst-launch equivalent,
> on my own development machine (32-bit Linux, Ubuntu Hardy):
> > gst-launch-0.10 uridecodebin uri=rtsp://some_mp4v_encoded_source ! queue
> > ! decodebin2 ! xvimagesink
> However, when it came time to deploy my code on a test machine (64-bit
> Linux, Ubuntu Hardy, all relevant packages the same versions as far as
> I can tell), playback was far from good: very stuttery, with
> missing/corrupted frames, etc.
> I was playing around with a few of the element properties, and discovered
> that setting the "sync" property to false (default is "true") fixed my
> problem - i.e. it resulted in smooth playback.
> Now, the documentation for BaseSink says: "sync" - Sync on the clock.
> Well it might be obvious to some, but I'm not sure why the behaviour would
> be different on the two machines.  Which element would be providing the
> clock in the above pipelines (I think the uridecodebin and decodebin2
> elements "resolve" to something like rtpmp4vdepay ! ffdec_mpeg4)?
> Does turning sync off just result in the sink rendering the data as soon
> as it arrives?

 Instead of turning off sync, you can also set the max-lateness property to -1 
(for video).  That way, you still get synchronisation between audio and video 
(when the video arrives on time), but no frames will be dropped.
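 For example, a sketch of the same gst-launch pipeline from the original 
question with max-lateness set on the video sink (property name taken from 
GstBaseSink; I haven't verified this exact command against that RTSP source):

```shell
# Keep sync at its default (true) but never drop late frames:
gst-launch-0.10 uridecodebin uri=rtsp://some_mp4v_encoded_source ! queue \
    ! decodebin2 ! xvimagesink max-lateness=-1
```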

 I had a similar problem but with a different cause: I was using a live HTTP 
source for MPEG4 streaming, with buffer timestamps based on the HTTP arrival 
time.  This caused the framerate to drop to 1fps, because all frames arrived 
late (unless max-lateness was set to -1).  The reason is that the IP camera 
set the frame duration to 1ms, which eventually trickles down to basesink 
estimating the end-to-end latency to be 1ms.  However, it actually is 
something like 80ms (because of double buffering in ffmpeg)...
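 To make the effect concrete, here is a simplified sketch (in Python, purely 
illustrative - the names and exact formula are my assumptions, not the real 
basesink code) of the kind of lateness check a sink performs:

```python
def is_too_late(timestamp_ms, duration_ms, clock_time_ms, max_lateness_ms):
    """Return True if a frame should be dropped as too late (simplified)."""
    if max_lateness_ms == -1:
        # -1 disables lateness-based dropping entirely
        return False
    end_time = timestamp_ms + duration_ms
    lateness = clock_time_ms - end_time
    return lateness > max_lateness_ms

# A frame stamped with a 1ms duration but rendered 80ms after its
# timestamp counts as late under a small max-lateness, so it is dropped:
print(is_too_late(0, 1, 80, 20))    # → True  (dropped)
print(is_too_late(0, 1, 80, -1))    # → False (kept: dropping disabled)
```

With a bogus 1ms duration from the camera, virtually every frame ends up on 
the "dropped" side of that comparison, which matches the 1fps symptom above.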

 Anyway, I don't see much point in having a max-lateness for video: the 
rendering in the sink itself is hardly ever the bottleneck, is it?  So there's 
not much sense in dropping the frame...


Arnout Vandecappelle                               arnout at mind be
Senior Embedded Software Architect                 +32-16-286540
Essensium/Mind                                     http://www.mind.be
G.Geenslaan 9, 3001 Leuven, Belgium                BE 872 984 063 RPR Leuven
LinkedIn profile: http://www.linkedin.com/in/arnoutvandecappelle
GPG fingerprint:  D206 D44B 5155 DF98 550D  3F2A 2213 88AA A1C7 C933
