[Bug 755125] rtp: RTCP mapping between NTP and RTP time could be capture or send time based

GStreamer (GNOME Bugzilla) bugzilla at gnome.org
Sun Sep 20 10:36:03 PDT 2015


https://bugzilla.gnome.org/show_bug.cgi?id=755125

Jan Schmidt <thaytan at noraisin.net> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |thaytan at noraisin.net

--- Comment #12 from Jan Schmidt <thaytan at noraisin.net> ---
Here's a long quote from the RFC, with a nice illustrative example. It
suggests we probably want to allow both behaviours as a programmatic
option; I've put a few rough sketches below the relevant paragraphs to
make that concrete:

   timestamp: 32 bits

      ....

      RTP timestamps from different media streams may advance at
      different rates and usually have independent, random offsets.
      Therefore, although these timestamps are sufficient to reconstruct
      the timing of a single stream, directly comparing RTP timestamps
      from different media is not effective for synchronization.
      Instead, for each medium the RTP timestamp is related to the
      sampling instant by pairing it with a timestamp from a reference
      clock (wallclock) that represents the time when the data
      corresponding to the RTP timestamp was sampled.  The reference
      clock is shared by all media to be synchronized.  The timestamp
      pairs are not transmitted in every data packet, but at a lower
      rate in RTCP SR packets as described in Section 6.4.
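
A rough sketch of what that receiver-side mapping looks like, given the
(wallclock, RTP) pair from a stream's most recent SR and the payload
clock rate. Names are made up for illustration, not taken from the
GStreamer code, and NTP time is kept in plain nanoseconds rather than
the on-wire 64-bit fixed-point format:

#include <stdint.h>

/* Map an RTP timestamp onto the shared wallclock timeline using the
 * (NTP, RTP) pair carried in the most recent RTCP SR for this stream.
 * Sketch only; names are hypothetical. */
typedef struct {
  uint64_t sr_ntp_ns;   /* wallclock time from the SR, in nanoseconds */
  uint32_t sr_rtp;      /* RTP timestamp from the same SR             */
  uint32_t clock_rate;  /* RTP clock rate of the payload, e.g. 90000  */
} SRMapping;

static uint64_t
rtp_to_wallclock_ns (const SRMapping *m, uint32_t rtp)
{
  /* 32-bit modular subtraction handles RTP timestamp wrap-around as
   * long as |rtp - sr_rtp| stays well below 2^31 ticks. */
  int32_t diff_ticks = (int32_t) (rtp - m->sr_rtp);

  /* Convert the (possibly negative) tick difference to nanoseconds
   * and offset the SR's wallclock time by it. */
  int64_t diff_ns = (int64_t) diff_ticks * 1000000000LL / m->clock_rate;

  return m->sr_ntp_ns + diff_ns;
}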

      The sampling instant is chosen as the point of reference for the
      RTP timestamp because it is known to the transmitting endpoint and
      has a common definition for all media, independent of encoding
      delays or other processing.  The purpose is to allow synchronized
      presentation of all media sampled at the same time.

      Applications transmitting stored data rather than data sampled in
      real time typically use a virtual presentation timeline derived
      from wallclock time to determine when the next frame or other unit
      of each medium in the stored data should be presented.  In this
      case, the RTP timestamp would reflect the presentation time for
      each unit.  That is, the RTP timestamp for each unit would be
      related to the wallclock time at which the unit becomes current on
      the virtual presentation timeline.  Actual presentation occurs
      some time later as determined by the receiver.
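
For that stored-data case, the sender side boils down to anchoring a
virtual presentation timeline at a (wallclock, RTP) origin and turning
elapsed wallclock time into RTP ticks. Again just a sketch with made-up
names:

#include <stdint.h>

/* Derive the RTP timestamp for a stored-media unit from a virtual
 * presentation timeline anchored at wallclock time.  base_rtp would
 * include the random initial offset required by the RFC; overflow for
 * very long timelines is ignored here. */
typedef struct {
  uint64_t base_wallclock_ns; /* wallclock time when playback started */
  uint32_t base_rtp;          /* RTP timestamp at that instant        */
  uint32_t clock_rate;        /* e.g. 90000 for video                 */
} PresentationTimeline;

static uint32_t
rtp_for_presentation_time (const PresentationTimeline *tl,
                           uint64_t unit_wallclock_ns)
{
  /* Elapsed wallclock time since the timeline origin, in RTP ticks. */
  uint64_t elapsed_ns = unit_wallclock_ns - tl->base_wallclock_ns;
  uint64_t ticks = elapsed_ns * tl->clock_rate / 1000000000ULL;

  /* RTP timestamps are 32 bits and wrap modulo 2^32. */
  return (uint32_t) (tl->base_rtp + ticks);
}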

      An example describing live audio narration of prerecorded video
      illustrates the significance of choosing the sampling instant as
      the reference point.  In this scenario, the video would be
      presented locally for the narrator to view and would be
      simultaneously transmitted using RTP.  The "sampling instant" of a
      video frame transmitted in RTP would be established by referencing
      its timestamp to the wallclock time when that video frame was
      presented to the narrator.  The sampling instant for the audio RTP
      packets containing the narrator's speech would be established by
      referencing the same wallclock time when the audio was sampled.
      The audio and video may even be transmitted by different hosts if
      the reference clocks on the two hosts are synchronized by some
      means such as NTP.  A receiver can then synchronize presentation
      of the audio and video packets by relating their RTP timestamps
      using the timestamp pairs in RTCP SR packets.
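
And to make the capture-time vs. send-time distinction from the bug
title concrete: the sender effectively chooses which wallclock instant
gets paired with the RTP timestamp in the SR. Another made-up sketch,
not what rtpbin/rtpsession actually do today:

#include <stdint.h>

/* The two possible anchors for the (NTP, RTP) pair advertised in an
 * RTCP SR.  NTP time is kept in nanoseconds for simplicity. */
typedef enum {
  SYNC_CAPTURE_TIME,  /* pair the RTP timestamp with its sampling instant */
  SYNC_SEND_TIME      /* pair the RTP timestamp with the send time        */
} RtcpSyncMode;

typedef struct {
  uint64_t ntp_ns;  /* wallclock half of the SR pair  */
  uint32_t rtp;     /* RTP timestamp half of the pair */
} SRPair;

static SRPair
build_sr_pair (RtcpSyncMode mode, uint32_t rtp_at_capture,
               uint64_t capture_ntp_ns, uint64_t send_ntp_ns)
{
  SRPair pair;

  pair.rtp = rtp_at_capture;

  if (mode == SYNC_CAPTURE_TIME) {
    /* Capture-time based: pair the RTP timestamp with the wallclock
     * time at which the data was sampled, as the RFC describes.
     * Receivers then align streams on their sampling instants,
     * independent of per-stream pipeline latency. */
    pair.ntp_ns = capture_ntp_ns;
  } else {
    /* Send-time based: pair the same RTP timestamp with the wallclock
     * time at which it leaves the sender, shifting the mapping by this
     * stream's capture-to-send latency. */
    pair.ntp_ns = send_ntp_ns;
  }

  return pair;
}

Capture-time based sync lines streams up on their sampling instants
(the narrator example above); send-time based sync folds each stream's
capture-to-send latency into the mapping, so streams end up offset by
the difference in their latencies.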
