[gst-devel] AV synchronization problem with elementary AV (AAC/H264) streaming

Marco Ballesio gibrovacco at gmail.com
Sat Jan 22 10:20:56 CET 2011


Hi,

On Fri, Jan 21, 2011 at 8:28 AM, amitpandya <amitpandya at itimes.com> wrote:
>
> Hi all,
>
> I am doing the elementary streaming of AAC/H264 data using RTP over UDP with
> gstreamer.
>
> In my case, I have two separate pipelines for Audio & video as below.
>
> video Case:(pipeline1)
> appsrc->input_capsfilter->packetizer->output_capsfilter->gstrtpbin.
>
>        where, appsrc is pushing H264 encoded data to pipeline
>
> Audio Case:(pipeline2)
> appsrc->input_capsfilter->packetizer->output_capsfilter->gstrtpbin.
>
>        where, appsrc is pushing AAC encoded data to pipeline

Using separate pipelines for interrelated streams is usually a bad
idea: you would somehow have to make sure that the two pipeline clocks
are perfectly synchronised and that any skew stays below a very low
threshold. Btw, I read below that you've tried a single pipeline as
well, so you might have more than one problem here.

>
> Now, the problem is: at the client side (QT player) both audio and video
> playback works fine, but the audio lags the video by several seconds (8-10
> sec, precisely).
>
> After initial debugging I found that the audio pipeline takes more time in
> processing and sends audio packets a bit later than the video packets. The
> audio and video processing happens in different pipelines, and each pipeline
> has its own processing time, which is unfortunately different.
>
> In short, the video pipeline starts sending video packets much earlier, and
> only after some time does the audio pipeline start sending audio packets to
> the network. That is why, at the client side, video playback happens first
> and audio starts playing only after some delay.
>
> I have done following testing to resolve the problem.
> 1) Instead of two pipelines I used a single pipeline for both A/V, but it
> made no difference.

This is a first step forward :)

> 2) With the single pipeline, instead of using two different gstrtpbins (one
> for audio and one for video) I used only one gstrtpbin, but that made no
> difference either.

Another step in the right direction.

> 3) I used queue elements in the pipeline to delay the video branch; still no
> difference.

A queue does not delay playback on the remote end, at least not in the
way you're expecting. The player always refers to the timestamps, which
are assigned at the source and are not modified by a queue.
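To see why, here is a sketch in plain Python (no GStreamer needed) of
how a payloader derives the RTP timestamp from a buffer's pipeline
timestamp; the 44100 Hz sample rate is an assumption for illustration:

```python
# A queue adds pipeline latency but never rewrites buffer timestamps,
# so the RTP timestamps below - and hence the playback position the
# receiver reconstructs - are unchanged by any queue you insert.

GST_SECOND = 1_000_000_000  # GStreamer timestamps are in nanoseconds


def rtp_timestamp(buffer_pts_ns, clock_rate, rtp_base=0):
    """Convert a buffer timestamp to an RTP timestamp at the given clock rate."""
    return rtp_base + buffer_pts_ns * clock_rate // GST_SECOND


# H.264 uses a 90 kHz RTP clock (RFC 3984); for AAC the RTP clock rate
# is the sample rate (RFC 3640), assumed here to be 44100 Hz.
video_rtp = rtp_timestamp(2 * GST_SECOND, 90000)  # video frame at t = 2 s
audio_rtp = rtp_timestamp(2 * GST_SECOND, 44100)  # AAC frame at t = 2 s

print(video_rtp)  # 180000
print(audio_rtp)  # 88200
```

In other words, to shift what the player does you have to change the
timestamps on the buffers, not where the buffers sit in the pipeline.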

>
> My question is: how can AV synchronization be handled within the pipeline in
> this case?

Are you setting proper timestamps and durations on the buffers you're
generating with the two appsrcs? Are the timestamps synchronised to the
pipeline clock (i.e. is there no drift between the two streams)?
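For example, here is one way to compute consistent timestamps and
durations for the buffers you push into the two appsrcs, from a common
time base. This is a plain-Python sketch; 25 fps video and 44100 Hz AAC
are assumptions for illustration (an AAC frame always carries 1024
samples):

```python
GST_SECOND = 1_000_000_000  # nanoseconds per second


def video_buffer_times(frame_index, fps=25):
    """(pts, duration) in ns for the n-th video frame."""
    duration = GST_SECOND // fps
    return frame_index * duration, duration


def audio_buffer_times(frame_index, sample_rate=44100, samples_per_frame=1024):
    """(pts, duration) in ns for the n-th AAC frame.

    Derive pts from the running sample count rather than by accumulating
    a rounded per-frame duration, so rounding errors do not drift over
    time relative to the video stream.
    """
    pts = frame_index * samples_per_frame * GST_SECOND // sample_rate
    next_pts = (frame_index + 1) * samples_per_frame * GST_SECOND // sample_rate
    return pts, next_pts - pts
```

If both appsrcs stamp their buffers like this against the same zero
point, the receiver has everything it needs to align the two streams,
regardless of which pipeline happens to start pushing packets first.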

Regards,
Marco

> or
> do I have to put some buffering mechanism outside the gstreamer pipeline
> which can handle such AV synchronization? If so, how do I do it?
>
> Any help is greatly appreciated.
>
> Thanks,
> Amit Pandya
> --
> View this message in context: http://gstreamer-devel.966125.n4.nabble.com/AV-synchronization-problem-with-elementary-AV-AAC-H264-streaming-tp3229054p3229054.html
> Sent from the GStreamer-devel mailing list archive at Nabble.com.
>
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
>
