Audio lags in aSPICE client while using gstreamer framework
Ritesh Prajapati
ritesh.prajapati at slscorp.com
Fri Jul 8 03:55:03 UTC 2016
Hi Nicolas,
Thanks for your reply.
When I get a lot of these audio timestamp discontinuity messages, the
audio lags behind the video and goes out of sync for that duration. So,
if I set the resync threshold to 40 ms, will my sync problem be solved,
or at least reduced?
Please let me know if any other configuration is required in the appsrc
pipeline to solve this problem.
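So, if I understand your suggestion correctly, I should timestamp the
buffers myself before pushing them into appsrc, and only resync when my
calculated timestamp drifts more than ~40 ms from the pipeline clock.
Is the idea roughly like the sketch below? (This is untested code on my
side; push_audio_chunk() and the other names are just placeholders I
made up for this mail, not existing spice-gtk code.)

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

#define RESYNC_THRESHOLD (40 * GST_MSECOND)   /* threshold you mentioned */

static GstClockTime next_ts = GST_CLOCK_TIME_NONE;  /* running timestamp */

/* Push one chunk of interleaved S16LE PCM into appsrc with a manually
 * computed timestamp, resyncing when we drift too far from the clock. */
static void
push_audio_chunk(GstAppSrc *appsrc, const guint8 *data, gsize size,
                 gint channels, gint rate)
{
    GstBuffer *buf;
    GstClock *clock;
    GstClockTime now = 0, duration;
    gsize bpf = channels * 2;                 /* bytes per frame, S16LE */
    gboolean discont = FALSE;

    duration = gst_util_uint64_scale(size / bpf, GST_SECOND, rate);

    /* current running time of the pipeline */
    clock = gst_element_get_clock(GST_ELEMENT(appsrc));
    if (clock != NULL) {
        now = gst_clock_get_time(clock) -
              gst_element_get_base_time(GST_ELEMENT(appsrc));
        gst_object_unref(clock);
    }

    if (!GST_CLOCK_TIME_IS_VALID(next_ts) ||
        ABS(GST_CLOCK_DIFF(next_ts, now)) > RESYNC_THRESHOLD) {
        /* first buffer, or we drifted too far: resync and mark discont */
        next_ts = now;
        discont = TRUE;
    }

    buf = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buf, 0, data, size);
    GST_BUFFER_PTS(buf) = next_ts;
    GST_BUFFER_DURATION(buf) = duration;
    if (discont)
        GST_BUFFER_FLAG_SET(buf, GST_BUFFER_FLAG_DISCONT);

    next_ts += duration;                      /* advance our own timeline */

    gst_app_src_push_buffer(appsrc, buf);     /* takes ownership of buf */
}

And should I then set do-timestamp=0 on appsrc, so that the timestamps I
set here are used as-is?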
Regards,
Ritesh Prajapati,
System Level Solutions (India) Pvt.Ltd.
On Thursday 07 July 2016 06:45 PM, Nicolas Dufresne wrote:
> See reply inline.
>
> On Thursday, 7 July 2016 at 14:12 +0530, Ritesh Prajapati wrote:
>> Hi All,
>>
>> I am working on one of our products, which runs Android 4.4 KitKat.
>> We have an application, a SPICE client, which runs on this product
>> and is used to receive audio and video data streamed over the
>> network from a SPICE server installed on an Ubuntu 16.04 server.
>>
>> We are using the GTK-based SPICE client code, in which the
>> GStreamer 1.0 framework is used.
>>
>> We are facing an audio sync issue: when audio and video streaming is
>> started from the SPICE server side, the video data arrives fine, but
>> the audio data is not in sync with the video frames and is dropped
>> for some initial duration. So it seems the audio data is out of sync,
>> or lags behind the video, at that time.
>>
>> Please find below the code snippet for the GStreamer pipeline we
>> have configured and used in our SPICE client Android code.
>>
>>> #ifdef WITH_GST1AUDIO
>>>     audio_caps = g_strdup_printf("audio/x-raw,format=\"S16LE\",channels=%d,rate=%d,"
>>>                                  "layout=interleaved", channels, frequency);
>>> #else
>>>     audio_caps = g_strdup_printf("audio/x-raw-int,channels=%d,rate=%d,signed=(boolean)true,"
>>>                                  "width=16,depth=16,endianness=1234", channels, frequency);
>>> #endif
>>>     gchar *pipeline = g_strdup(g_getenv("SPICE_GST_AUDIOSINK"));
>>>     if (pipeline == NULL)
>>>         pipeline = g_strdup_printf("appsrc is-live=1 do-timestamp=1 format=time min-latency=0 "
>>>                                    "caps=\"%s\" name=\"appsrc\" ! audioconvert ! audioresample ! "
>>>                                    "autoaudiosink name=\"audiosink\"", audio_caps);
>>
>> Also, we sometimes get the warning message below from GStreamer in
>> the Android Studio logs:
>>
>> gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<audiosink-actual-sink-opensles> Unexpected discontinuity in audio timestamps of -0:00:00.076145124, resyncing
> As audio may arrive in small bursts, what happens is that the timestamp
> distance between buffers becomes too small: the timestamp + duration of
> the current buffer may end up after the next buffer's timestamp. In that
> case the audio sink will resync. To prevent this, you should timestamp
> the buffers yourself: pick an initial timestamp and add each buffer's
> duration. You then compare the calculated timestamp against the current
> time, and if it drifts beyond a certain threshold (generally 40 ms), you
> resync (and set the discontinuity flag).
>
> In an ideal world, the streaming protocol would provide you with timing
> information that lets you correlate the video and audio frames in time.
> That would be used to create the timestamps and ensure perfect A/V sync.
> I don't know SPICE very well, but it might not be part of the protocol.
>
> Nicolas
>
>