RTP/RTSP Camera Capture Timestamping

Jeff Shanab jshanab at jfs-tech.com
Thu Sep 29 17:42:16 UTC 2022


IP cameras are full Linux machines and usually run an NTP daemon; some
cheat and just force a sync every hour.
The most common transport for video is RTSP, which describes a series of
RTP streams.
The standard says that within the first 2 seconds, and roughly every 5
seconds after that, a SENDER REPORT is sent as part of the RTCP protocol
on channel+1 or port+1.
The sender report carries a mapping from NTP time to the stream's RTP
timestamp, which runs on a faster clock; for H.264/H.265 it is a 90 kHz
clock.
At connect time the RTP client code stores the offset to the RTP
timestamp optionally provided in the PLAY response, or else to the
client's wall-clock time.
The RTCP sender reports are then used to adjust that offset, keeping the
video live and timestamped to the client's wall clock.
RTCP thus lets an ongoing, already-connected stream adjust its
timestamps as NTP corrections reach the camera, through the RTSP
session, without needing to reconnect.
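The arithmetic above can be sketched in a few lines. This is a minimal,
hypothetical example (the function names are mine, not from any library):
it assumes you already have the NTP/RTP timestamp pair from the latest
RTCP Sender Report and the 90 kHz H.264/H.265 clock rate.

```python
NTP_EPOCH_OFFSET = 2208988800  # seconds between NTP (1900) and Unix (1970) epochs
H264_CLOCK_RATE = 90000        # 90 kHz RTP clock used for H.264/H.265

def ntp_to_unix(ntp_seconds, ntp_fraction):
    """Convert a 64-bit NTP timestamp (32.32 fixed point) to Unix seconds."""
    return (ntp_seconds - NTP_EPOCH_OFFSET) + ntp_fraction / 2**32

def rtp_to_wallclock(rtp_ts, sr_rtp_ts, sr_ntp_unix, clock_rate=H264_CLOCK_RATE):
    """Project a packet's RTP timestamp onto the sender's wall clock.

    sr_rtp_ts and sr_ntp_unix come from the last Sender Report, which
    pairs an RTP timestamp with the sender's NTP time at one instant.
    """
    # Difference in RTP ticks, allowing for 32-bit wraparound.
    delta = (rtp_ts - sr_rtp_ts) & 0xFFFFFFFF
    if delta >= 0x80000000:  # packet predates the Sender Report
        delta -= 0x100000000
    return sr_ntp_unix + delta / clock_rate
```

For example, if an SR paired RTP timestamp 900000 with Unix time 10.0 s,
a frame stamped 990000 (90000 ticks, i.e. one second, later) maps to 11.0 s.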

If you want actual camera-side time, beyond seeing it burned into the On
Screen Display, you need the RTP header extension defined for replayed
video.

ONVIF uses this, for example, and page 27 of its streaming specification
has a pretty good explanation:

https://www.onvif.org/specs/stream/ONVIF-Streaming-Spec.pdf
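As a rough illustration of that extension, here is a hedged sketch of
parsing its payload. It assumes the layout described in the ONVIF
Streaming Specification: a header extension with profile ID 0xABAC and
length 3 (in 32-bit words), whose body is a 64-bit NTP timestamp, a byte
of C/E/D flags, and a CSeq byte. The function name and dict keys are my
own, purely illustrative:

```python
import struct

NTP_EPOCH_OFFSET = 2208988800  # seconds between NTP (1900) and Unix (1970) epochs

def parse_onvif_replay_ext(ext_data):
    """Parse the ONVIF replay RTP header extension body.

    ext_data is the 12-byte body that follows the 4-byte RTP extension
    header (profile 0xABAC, length 3 words): a 64-bit NTP timestamp,
    one flags byte, one CSeq byte, and two padding bytes.
    """
    ntp_sec, ntp_frac, flags, cseq = struct.unpack("!IIBB", ext_data[:10])
    return {
        "utc": (ntp_sec - NTP_EPOCH_OFFSET) + ntp_frac / 2**32,
        "clean_point": bool(flags & 0x80),    # C: frame is independently decodable
        "end_of_track": bool(flags & 0x40),   # E: last packet of the track
        "discontinuity": bool(flags & 0x20),  # D: gap before this frame
        "cseq": cseq,
    }
```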


On Thu, Sep 29, 2022 at 12:56 PM Joshua Quesenberry via gstreamer-devel <
gstreamer-devel at lists.freedesktop.org> wrote:

> Good Afternoon All,
>
>
>
> I’m looking into IP cameras right now and trying to understand how to
> accurately determine when a frame was captured. Hopefully some of you all
> have dealt with this and can give me some guidance? I have a couple
> different scenarios, each of which I want to be able to know the UTC time
> of when the frame was captured by the camera or at the very least when the
> packet hits a Linux box’s kernel.
>
>
>
> Scenario 1: USB Camera -> Linux Box 1 via v4l2src.
>
>
>
> Scenario 2: USB Camera -> Linux Box 1 via v4l2src -> Network via rtpbin
> (RTP/RTCP) -> Linux Box 2 via rtpbin.
>
>
>
> Scenario 3: GigE Camera RAW Stream -> Linux Box 1 via The Imaging Source
> element (tcamsrc).
>
>
>
> Scenario 4: GigE Camera RAW Stream -> Linux Box 1 via The Imaging Source
> element (tcamsrc) -> Network via rtpbin (RTP/RTCP) -> Linux Box 2 via
> rtpbin.
>
>
>
> Scenario 5: IP Camera H264 Stream with NTP Sync every 60min -> Linux Box 1
> via rtspsrc element.
>
>
>
> Scenario 6: IP Camera H264 Stream with NTP Sync every 60min -> Linux Box 1
> via rtspsrc element -> Network via rtpbin (RTP/RTCP) -> Linux Box 2 via
> rtpbin.
>
>
>
> Is what I’m looking for possible in all of these scenarios?
>
>
>
> When using the timeoverlay element, when are time-mode ==
> elapsed-running-time (4) and time-mode == reference-timestamp (5) available?
>
>
>
> When using the identity element with silent=false, is there a way to print
> to the screen the content of GstBuffers meta data? Seems like from reading
> this may contain some time elements I’d be interested in?
>
>
>
> Can RTP/RTCP/RTSP propagate UTC capture time information or does it get
> lost early on in the pipeline across multiple Linux boxes?
>
>
>
> All of my systems and the IP Cameras above should be looking at a local
> NTP Server instance at this point, so hopefully that will help things? It
> looks like the GigE cameras can’t talk to an NTP server.
>
>
>
> Thanks!
>
>
>
> Josh Q
>