Synchronization Issue

Patrick Doyle wpdster at gmail.com
Thu Jan 29 05:22:16 PST 2015


Hello fellow GStreamer travelers...

I have a system in which I collect and analyze videos on one computer
(let's call it A) and send the analysis reports to another computer
(B) over the network.  B needs to know the time at which computer A
saw something, but computer B's clock is completely independent of
computer A's.

It strikes me that this is an already-solved problem, and that
GStreamer must have infrastructure and techniques for dealing with
this sort of thing.  It seems very similar (in my mind) to the
situation where video frames are captured with a camera at 30
frames/sec, according to one crystal oscillator on the camera, while
audio is captured at 44100 samples/sec, according to the oscillator on
the audio card, and both must be combined to be transmitted over the
air via a (digital) broadcast television channel, whose (FCC
regulated) timing must be derived from yet a third, independent clock.

So how is this handled in GStreamer?

I have read http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-clocks.html,
which states that every GStreamer pipeline maintains a GstClock
object, and (a few pages later) that when "...the pipeline goes to the
PLAYING state, it will go over all elements in the pipeline from sink
to source and ask each element if they can provide a clock. The last
element that can provide a clock will be used as the clock provider in
the pipeline."

All of that makes sense intellectually, but how does it work practically?
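
For instance, I assume that once the pipeline is PLAYING I can at
least ask it which clock it picked.  Something like this (a sketch,
untested):

#include <gst/gst.h>

/* After the pipeline reaches PLAYING, ask it which clock it selected.
 * If a source element provided the clock, I'd expect to see that here
 * rather than the default GstSystemClock. */
static void
print_selected_clock (GstPipeline *pipeline)
{
  GstClock *clock = gst_pipeline_get_clock (pipeline);

  if (clock != NULL) {
    g_print ("pipeline clock: %s (type %s)\n",
        GST_OBJECT_NAME (clock), G_OBJECT_TYPE_NAME (clock));
    gst_object_unref (clock);
  }
}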

In my case, I have a GigE Vision camera, to which I interface using
Emmanuel Pacaud's excellent Aravis GStreamer plugin.  That plugin is
derived from GstPushSrc.  Can/should I assume that this object
provides the clock for the pipeline?  I can't find evidence of that in
gstpushsrc.c or gstbasesrc.c, but perhaps I'm not looking hard enough.
What should I look for?
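
My plan, unless someone tells me otherwise, is to check the element's
PROVIDE_CLOCK flag and ask it for a clock directly, something like
this (untested; 'element' would be the aravissrc instance):

#include <gst/gst.h>

/* Does this element offer a clock to the pipeline? */
static gboolean
element_provides_clock (GstElement *element)
{
  GstClock *clock;
  gboolean provides = FALSE;

  /* The element must advertise the flag... */
  if (!GST_OBJECT_FLAG_IS_SET (element, GST_ELEMENT_FLAG_PROVIDE_CLOCK))
    return FALSE;

  /* ...and actually hand out a clock when asked. */
  clock = gst_element_provide_clock (element);
  if (clock != NULL) {
    provides = TRUE;
    gst_object_unref (clock);
  }
  return provides;
}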

As a side note, I'm curious why GstBaseSrc thinks that the aravissrc
is a "pseudo-live" element instead of a "live" element.  What's the
difference between the two, and why would I care?  (Right now, I only
care because I was surprised when the GstBaseSrc element computed a
latency figure for the aravissrc object.)
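
I suppose I could poke at that figure with a latency query; a sketch
of what I mean (untested):

#include <gst/gst.h>

/* Ask an element whether it reports itself as live, and what latency
 * it claims.  'element' would be the aravissrc instance. */
static void
print_latency (GstElement *element)
{
  GstQuery *query = gst_query_new_latency ();

  if (gst_element_query (element, query)) {
    gboolean live;
    GstClockTime min_latency, max_latency;

    gst_query_parse_latency (query, &live, &min_latency, &max_latency);
    g_print ("live: %d, latency: %" GST_TIME_FORMAT " .. %" GST_TIME_FORMAT "\n",
        live, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency));
  }
  gst_query_unref (query);
}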

Where am I going with all of this?  I don't know yet, but I thought I
would start by gaining a better understanding of what GStreamer is
doing with clocks and such.

Back to my original problem: if I could synchronize the clocks on the
two computers with NTP or some such, and report the difference between
the CPU clock and the video-stream clock as measured on computer A,
then I should be able to relate that to time on computer B.
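
One thing I noticed while reading: GStreamer's net library
(gstnet-1.0) already has machinery for this.  Computer A could publish
its pipeline clock with a GstNetTimeProvider, and computer B could
slave a GstNetClientClock to it, instead of (or alongside) NTP.  A
sketch of what I mean (untested; the port is a placeholder):

#include <gst/gst.h>
#include <gst/net/gstnet.h>

/* On computer A: publish the pipeline's clock on the network. */
static GstNetTimeProvider *
publish_clock (GstPipeline *pipeline, gint port)
{
  GstClock *clock = gst_pipeline_get_clock (pipeline);
  /* NULL address binds to all interfaces */
  GstNetTimeProvider *provider =
      gst_net_time_provider_new (clock, NULL, port);

  gst_object_unref (clock);
  return provider;
}

/* On computer B: a clock that slaves itself to computer A's clock. */
static GstClock *
get_remote_clock (const gchar *host_a, gint port)
{
  return gst_net_client_clock_new ("remote-clock", host_a, port, 0);
}

If that works, computer B wouldn't even need NTP for this, since the
GstNetClientClock tracks the drift itself.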

If I knew that the timestamps reported in the video pipeline were
those produced by the camera, how could I compute the delta between
the camera clock and the CPU clock as early in the pipeline as
possible?  These clocks will drift over time, so I will need to track
this delta (or perhaps a low-pass filtered version of it).
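
If they are, my first thought is a pad probe on the source's src pad:
compare each buffer's PTS against the pipeline clock's running time
and low-pass filter the difference.  A sketch of that idea (untested;
the filter weight is arbitrary, and the unlocked global is just for
illustration):

#include <gst/gst.h>

static gdouble smoothed_delta;  /* low-pass filtered offset, in ns */

/* For every buffer, compare its PTS (assumed here to come from the
 * camera's clock) with the pipeline clock's running time, and keep a
 * low-pass filtered version of the difference. */
static GstPadProbeReturn
measure_clock_delta (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);
  GstBuffer *buffer = GST_PAD_PROBE_INFO_BUFFER (info);
  GstClock *clock = gst_element_get_clock (pipeline);

  if (clock != NULL && GST_BUFFER_PTS_IS_VALID (buffer)) {
    /* running time = absolute clock time minus the pipeline base time */
    GstClockTime now = gst_clock_get_time (clock);
    GstClockTime base = gst_element_get_base_time (pipeline);
    gdouble delta =
        ((gdouble) now - (gdouble) base) - (gdouble) GST_BUFFER_PTS (buffer);

    /* simple exponential moving average as the low-pass filter */
    smoothed_delta = 0.99 * smoothed_delta + 0.01 * delta;
  }
  if (clock != NULL)
    gst_object_unref (clock);

  return GST_PAD_PROBE_OK;
}

I'd attach it with gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BUFFER,
measure_clock_delta, pipeline, NULL).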

Any help or tips would be greatly appreciated.

Thanks for reading this far.

--wpd
