How to correctly timestamp buffers in an appsrc element

J. Krieg bellum07 at googlemail.com
Thu Nov 26 20:55:55 UTC 2020


Hello,

No ideas?
Could anyone help please?
Unfortunately I can’t figure this out by myself.

Thank you very much.

Best Regards,
Joerg

On Fri, Nov 20, 2020 at 15:47, J. Krieg <bellum07 at googlemail.com> wrote:
>
> Hello,
>
> I’m currently working on an application to display live TV using
> GStreamer on a Raspberry Pi 2B.
>
> For this I use two appsrc elements (one for video and one for audio)
> which read PES packets in two separate threads directly from the
> V4L DVB demux device ‘/dev/dvb/adapter0/demux0’.
> My current test pipelines are:
>
> Video:
>   V4L DVB demux (DMX_OUT_TAP) -> appsrc ! h264parse ! v4l2h264dec ! queue ! kmssink
> Audio:
>   V4L DVB demux (DMX_OUT_TAP) -> appsrc ! mpegaudioparse ! mpg123audiodec ! queue ! alsasink
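>
> To make the setup clearer, here is a rough sketch of how such an appsrc
> could be configured as a live, time-based source (the variable names and
> caps strings below are only placeholders, not my exact code):
>
> // Sketch only: appsrc configured as a live source operating in TIME format
> GstElement *vappsrc = gst_element_factory_make ("appsrc", "videosrc");
> GstCaps *vcaps = gst_caps_new_simple ("video/x-h264",
>     "stream-format", G_TYPE_STRING, "byte-stream", NULL);
> g_object_set (vappsrc,
>     "is-live", TRUE,
>     "format", GST_FORMAT_TIME,
>     "caps", vcaps,
>     NULL);
> gst_caps_unref (vcaps);
> // The audio appsrc would be configured the same way with audio/mpeg caps.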
>
> I managed to get this working without timestamping the buffers at all
> in either appsrc element, but then video and audio are not synchronized.
>
> I tried to timestamp the buffers as described in
> https://gstreamer.freedesktop.org/documentation/application-development/advanced/pipeline-manipulation.html?gi-language=c#inserting-data-with-appsrc
> but when I do this I get slightly stuttering video and extremely
> stuttering or no audio.
>
> What I'm also struggling with is the following statement from the link above:
> "In live mode, you should timestamp the buffers with the pipeline
> running-time when the first byte of the buffer was captured before
> feeding them to appsrc."
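>
> My reading of that sentence, expressed as a rough sketch (not verified,
> which is exactly what I'm unsure about; 'pipeline' is my GstPipeline):
>
> // running time = pipeline clock time - pipeline base time
> GstClock *clock = gst_element_get_clock (GST_ELEMENT (pipeline));
> if (clock) {
>     GstClockTime now = gst_clock_get_time (clock);
>     GstClockTime base = gst_element_get_base_time (GST_ELEMENT (pipeline));
>     GST_BUFFER_PTS (gbuffer) = now - base;
>     gst_object_unref (clock);
> }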
>
> But according to my tests the pipeline only changes its state from
> PAUSED to PLAYING (and only then is the pipeline clock available)
> after some captured buffers have already been fed into the pipeline.
> So how can the very first buffers be timestamped with the running time
> before they are pushed into the pipeline while it is still in the
> PAUSED state, so that video and audio end up synchronized?
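>
> For illustration only, the workaround I can imagine is to block the capture
> threads until the pipeline reports PLAYING before pushing anything, roughly
> like the sketch below, although with a pipeline that is waiting for data I
> don't know whether this can even work:
>
> // Untested sketch: wait (possibly forever) until the pipeline is PLAYING,
> // so that a clock and a base time exist before the first buffer is pushed.
> GstState state;
> gst_element_get_state (GST_ELEMENT (pipeline), &state, NULL,
>     GST_CLOCK_TIME_NONE);
> if (state == GST_STATE_PLAYING) {
>     /* clock and base time are now valid; start timestamping and pushing */
> }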
>
> What am I doing wrong?
> Any help or pointing in the right direction would be really appreciated.
>
> Thanks,
> Joerg
>
> Code:
> // Create a new empty buffer for the data just read from the demux device
> gbuffer = gst_buffer_new_allocate (NULL, rc, NULL);
>
> // Timestamp the buffer with the current pipeline running time
> // (running time = pipeline clock time - pipeline base time)
> if (((CustomData *) data)->pipelineclock) {
>     pipeline_clock_time = gst_clock_get_time (((CustomData *) data)->pipelineclock);
>     pipeline_running_time = pipeline_clock_time - g_pipeline_base_time;
>     GST_BUFFER_PTS (gbuffer) = pipeline_running_time;
>     // Duration is taken as the time elapsed since the previous audio buffer
>     GST_BUFFER_DURATION (gbuffer) = pipeline_running_time - g_last_pipeline_running_time_a;
>     g_last_pipeline_running_time_a = pipeline_running_time;
>     printf ("*** DEBUG *** dmx_read_a | pipeline running timestamp for audio in ns: %" G_GUINT64_FORMAT "\n",
>         pipeline_running_time);
> } else {
>     printf ("*** DEBUG *** dmx_read_a | Sorry, pipelineclock NOT available...\n");
>     GST_BUFFER_PTS (gbuffer) = GST_CLOCK_TIME_NONE;
> }
>
> // Copy the PES data into the buffer
> bc = gst_buffer_fill (gbuffer, 0, buf, rc);
>
> // Push the buffer into the appsrc (push-buffer takes its own reference)
> g_signal_emit_by_name (((CustomData *) data)->aappsrc, "push-buffer", gbuffer, &rb);
>
> // Free the buffer now that we are done with it
> gst_buffer_unref (gbuffer);
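>
> For completeness, a sketch of how pipelineclock and g_pipeline_base_time
> could be obtained once the pipeline is up (this is not my exact code, just
> the idea):
>
> // Sketch: obtain the pipeline clock and base time for later use in the
> // capture threads ('pipeline' is the GstPipeline, 'data' as above)
> ((CustomData *) data)->pipelineclock = gst_element_get_clock (GST_ELEMENT (pipeline));
> g_pipeline_base_time = gst_element_get_base_time (GST_ELEMENT (pipeline));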

