Transmit absolute time

Pietro Bonfa' pietro_bonfa at
Wed Oct 21 07:15:06 PDT 2015

Dear You,

At the present stage I can send and receive timestamps with a slightly
modified version of the onvif plugin found in gst-plugins-bad, but
I'm facing two issues:

1) With both of the following pipelines

"videotestsrc is-live=true ! videoconvert ! x264enc tune=\"zerolatency\"
! rtph264pay pt=96 ! rtpatimetimestamp name=pay0 ntp-offset=0"

"appsrc name=mysrc ! videoconvert ! x264enc tune=\"zerolatency\" !
rtph264pay pt=96 ! rtpatimetimestamp name=pay0 ntp-offset=0"

the PTS of the incoming buffers in rtpatimetimestamp (*) starts from
3600000000000000 (ns), independently of the PTS information present in
the source element.
I tend to think that this is due to x264enc, but I cannot find anything
about it in the docs.

2) Rather than touching timestamps on the receiver side, wouldn't it be
better to attach metadata to the buffer with gst_buffer_add_meta()?


(*) rtpatimetimestamp is my modified version of rtponviftimestamp
(available in the repo I already posted; the changes are irrelevant in
this context).

On 10/19/2015 11:50 AM, Sebastian Dröge wrote:
> On So, 2015-10-18 at 20:19 +0200, Pietro Bonfa' wrote:
>> On 10/16/2015 03:28 PM, Sebastian Dröge wrote:
>>> On Do, 2015-10-15 at 11:42 +0200, Pietro Bonfa' wrote:
>>> Let us know if you run into any problems or have further questions
>>> :)
>> At the present stage I'm trying to understand the structure of rtp
>> plugins that are in gst-plugins-good in order to insert
>> gst_rtp_buffer_get/set_extension,
>> gst_rtp_buffer_get/set_extension_data
>> and gst_rtp_buffer_get/set_extension_bytes somewhere.
>> Readings, suggestions and/or examples are welcome.
> You would implement a new element, based around GstElement. Inside the
> chain function you would use the GstRTPBuffer API to add the header
> extension, or to get/parse it.
> I have some code here for doing that but I'll have to find some time to
> clean it up before I can publish it anywhere.
>>> So you mean that the frame grabbers are not producing a frame at
>>> 0.025 second boundaries but somewhere around that? And you want all frame
>>> grabbers to produce a frame at exactly N*0.025 seconds, i.e. all of
>>> them at the same time?
>> No. Frame grabbers are not perfect, so they produce frames at
>> N.025+epsilon1, N.050+epsilon1, ...
>> N.025+epsilon2, N.050+epsilon2, ...
>> This is out of my control, but I have timestamps.
>> What I wanted to clarify is that if I use two pipelines like in RTSP
>> net-clock-client code for receiving the two streams (from two
>> different servers), they are indeed synchronized but I'm missing a
>> way to estimate epsilon1 from pipeline1 and epsilon2 from pipeline2.
> You could add an element in the pipeline (that you write) to collect
> timestamp statistics, or just use a pad probe on one of the pads to
> collect timestamps.

More information about the gstreamer-devel mailing list