Get RTP timestamp of the buffer

Yurii Monakov monakov.y at gmail.com
Mon May 8 19:25:08 UTC 2017


Hi Sebastian!

Thank you for the answer.

> If you set buffer-mode=none on rtpbin, you will get the RTP timestamps
> (extended) as PTS on the buffers.
Would that work with the buffer timestamp in older GStreamer versions? There
is no concept of PTS/DTS in the 0.10 series.
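
If the 0.10 gstrtpbin supports the same setting, I imagine it would look
roughly like this (element and property names are my assumption for 0.10;
the resulting time would then be in GST_BUFFER_TIMESTAMP):

    #include <gst/gst.h>

    /* A minimal sketch, assuming the 0.10 gstrtpbin exposes a
     * "buffer-mode" property like the newer rtpbin does. */
    static GstElement *
    make_rtpbin_without_buffering (void)
    {
      GstElement *rtpbin = gst_element_factory_make ("gstrtpbin", NULL);

      /* 0 corresponds to "none" in the buffer-mode enum */
      g_object_set (rtpbin, "buffer-mode", 0, NULL);
      return rtpbin;
    }

    /* In 0.10 there is no PTS/DTS, so downstream I would read
     * GST_BUFFER_TIMESTAMP (buf) instead. */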

> There's also this extension that is not merged yet which would allow you
> to get things more directly:
That would be really useful in my situation. At some point we will migrate
from RHEL 6 (GStreamer 0.10.x) to RHEL 7, hopefully getting the most recent
version of GStreamer and dropping all of our slightly customized 0.10
plug-in packages.

> The best way to do it depends on the bigger picture of what you want to do
I want to record RTP streams, with correct timing information, into a
sequence of files (each ~20 minutes long). So I have a receiving pipeline:

udpsrc -> gstrtpbin -> rtpmp4vdepay -> mpeg4videoparse -> appsink

and a recording pipeline:

appsrc -> mp4mux -> filesink
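
For reference, a rough 0.10 sketch of how I wire up the receiving side and
pull buffers from appsink; the udpsrc caps (port, clock-rate, encoding-name)
are placeholders and must match the real sender:

    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>

    /* Rough sketch of the receiving side under GStreamer 0.10, not the
     * exact application code. */
    int
    main (int argc, char *argv[])
    {
      GstElement *pipeline, *sink;
      GstBuffer *buf;

      gst_init (&argc, &argv);

      pipeline = gst_parse_launch (
          "udpsrc port=5000 caps=\"application/x-rtp,media=video,"
          "clock-rate=90000,encoding-name=MP4V-ES\" ! gstrtpbin ! "
          "rtpmp4vdepay ! mpeg4videoparse ! appsink name=sink", NULL);
      sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");

      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Blocking pull loop; the real application reacts to each buffer
       * and feeds it into the appsrc of the recording pipeline. */
      while ((buf = gst_app_sink_pull_buffer (GST_APP_SINK (sink)))) {
        g_print ("ts=%" GST_TIME_FORMAT " offset_end=%" G_GUINT64_FORMAT "\n",
            GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buf)),
            GST_BUFFER_OFFSET_END (buf));
        gst_buffer_unref (buf);
      }

      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (sink);
      gst_object_unref (pipeline);
      return 0;
    }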

I need to break the full pipeline into two separate pipelines because there
is no obvious way to split the stream into a sequence of files. In this
article:
https://groakat.wordpress.com/2012/12/05/gstreamer-stream-h264-webcam-data-to-series-of-files/
the author fights with dynamic pipeline manipulation, and I decided that
approach is too complex and unreliable.

I'm aware of the splitmuxsink element, but I'm stuck on version 0.10.x (and
can't do anything about that).

At the recording site I need to produce file sequences with specific names
(these are ~2-minute intervals for testing purposes):

2017.05.08-12.00.56.143.mp4
2017.05.08-12.02.59.893.mp4
2017.05.08-12.05.03.643.mp4
2017.05.08-12.07.07.393.mp4
2017.05.08-12.09.11.143.mp4
2017.05.08-12.11.14.893.mp4
2017.05.08-12.13.18.643.mp4

So the name of each file is the NTP timestamp of its first frame, and this
timestamp should follow the sender's timeline (not the local wall clock).
I need this for later playback, synchronized with other air traffic
information.
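
For what it's worth, here is a small helper I could use to build those
names. It is a hypothetical sketch: it assumes the file start time is
expressed as nanoseconds since the Unix epoch on the sender's timeline, and
the helper name is made up:

    #include <glib.h>
    #include <time.h>

    /* Format a start time (nanoseconds since the Unix epoch, sender's
     * timeline) into the file name pattern above, e.g.
     * "2017.05.08-12.00.56.143.mp4".  Uses UTC; swap gmtime_r for
     * localtime_r if local time is wanted. */
    static gchar *
    make_file_name (guint64 start_ns)
    {
      time_t secs = (time_t) (start_ns / G_GUINT64_CONSTANT (1000000000));
      guint msecs = (guint) ((start_ns / G_GUINT64_CONSTANT (1000000)) % 1000);
      struct tm tm_utc;
      gchar datepart[64];

      gmtime_r (&secs, &tm_utc);
      strftime (datepart, sizeof (datepart), "%Y.%m.%d-%H.%M.%S", &tm_utc);
      return g_strdup_printf ("%s.%03u.mp4", datepart, msecs);
    }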

When a buffer comes out of appsink (for the first time) I check whether it
is a keyframe (and whether I have already received an SR) and start a new
recording pipeline.
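
The keyframe check in 0.10 is roughly this (assuming mpeg4videoparse marks
non-key frames with the DELTA_UNIT flag):

    /* Sketch: a keyframe is a buffer without the DELTA_UNIT flag. */
    static gboolean
    buffer_is_keyframe (GstBuffer * buf)
    {
      return !GST_BUFFER_FLAG_IS_SET (buf, GST_BUFFER_FLAG_DELTA_UNIT);
    }
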
Now I've ended up with three solutions:

1. Capture the local time of sender report arrival (LOCAL_SR) and then
calculate the file start time as:
    FILE_START = NTP_SR + (LOCAL_KEY - LOCAL_SR)
    where NTP_SR is the NTP timestamp from the sender report and LOCAL_KEY
is the local arrival time of the keyframe (see the sketch after this list).
    Here I get a random error (~150 ms) in the file start time, because I
don't know the real RTP timestamp of the keyframe.
    Another error source: the RTP/NTP pair in the SR is somewhat in the
future relative to the surrounding RTP stream timestamps (at least when the
source is another GStreamer pipeline).

2. A dirty hack in the depayloader. I've noticed that the offset_end field
of the buffers coming out of appsink is always invalid.
    So I've modified the rtpmp4vdepay element and added these lines to
gst_rtp_mp4v_depay_process:

        /* raw (unextended) RTP timestamp of this packet */
        guint32 rtp_ts = gst_rtp_buffer_get_timestamp (buf);
        GST_BUFFER_OFFSET_END (outbuf) = rtp_ts;

    Now I have the RTP timestamp in GST_BUFFER_OFFSET_END and the file names
become perfectly spaced (see the sketch after this list).
    You can see this in the file name example above: the incoming stream is
4 FPS, so the milliseconds in the file names cycle exactly every fourth file.
    I was unable to find another way to attach custom data to a GstBuffer in
the 0.10 series.

3. Get the timestamps with a custom appsrc (or tee) element instead of
taking them at udpsrc. This is not implemented yet.
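
To make options 1 and 2 concrete, here is a sketch of the start-time
calculation. All names are illustrative and RTP wrap-around is ignored; the
second helper is just how I would combine the RTP timestamp from option 2
with the SR's RTP/NTP pair:

    #include <gst/gst.h>

    /* Option 1: FILE_START = NTP_SR + (LOCAL_KEY - LOCAL_SR), with all
     * times expressed as GstClockTime nanoseconds. */
    static GstClockTime
    file_start_from_local (GstClockTime ntp_sr, GstClockTime local_sr,
        GstClockTime local_key)
    {
      return ntp_sr + (local_key - local_sr);
    }

    /* Option 2 variant: map the keyframe's (extended) RTP timestamp onto
     * the sender's NTP timeline via the RTP/NTP pair from the SR. */
    static GstClockTime
    file_start_from_rtp (GstClockTime ntp_sr, guint64 rtp_sr_ext,
        guint64 rtp_key_ext, guint clock_rate)
    {
      return ntp_sr + gst_util_uint64_scale (rtp_key_ext - rtp_sr_ext,
          GST_SECOND, clock_rate);
    }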

For now, the best solution is the dirty hack, which means yet another patch
in our RPMs. Another thing that worries me is that the GST_BUFFER_OFFSET_END
modification could someday break something further down the pipeline.

Best Regards,
Yurii


2017-05-08 16:30 GMT+03:00 Sebastian Dröge <sebastian at centricular.com>:

> On Sat, 2017-05-06 at 15:08 +0300, Yurii Monakov wrote:
> > Hi All!
> >
> > Any thoughts on this?
> >
> > I'm developing video recording application for air traffic control
> > system. And RTP <-> NTP timestamp matching is essential for later
> > playback of multiple streams (audio, video, digital records).
> > So I need to get perfect time of the first key frame in each video
> > sequence (~20 min long).
> >
> > Maybe I can alter the source code or subclass some gstreamer
> > elements?
>
> If you set buffer-mode=none on rtpbin, you will get the RTP timestamps
> (extended) as PTS on the buffers.
>
> There's also this extension that is not merged yet which would allow
> you to get things more directly:
> https://bugzilla.gnome.org/show_bug.cgi?id=762628
>
>
> And independent of that, rtpbin already has all functionality needed
> for synchronizing RTP packets based on their RTP timestamps, the
> NTP/RTP timestamp mapping and an NTP clock in various ways. If that is
> something you're looking for.
> Generally, all you want to do is already possible in various ways. The
> best way to do it depends on the bigger picture of what you want to do.
>
> --
> Sebastian Dröge, Centricular Ltd · http://www.centricular.com