Synchronization Issue

Jan Schmidt thaytan at noraisin.net
Fri Jan 30 05:28:42 PST 2015


On 30/01/15 11:05, Patrick Doyle wrote:

Hi Patrick,

> On Thu, Jan 29, 2015 at 9:22 AM, Jan Schmidt <thaytan at noraisin.net> wrote:
>> On 30/01/15 00:22, Patrick Doyle wrote:
>>>
>>> Hello fellow GStreamer travelers...
>>>
>>> I have a system in which I collect and analyze videos on one computer
>>> (let's call it A) and send the analysis reports to another computer
>>> (B) via the network.  Now B needs to know the time at which Computer A
>>> saw something.  But computer B's clock is completely independent of
>>> that of Computer A, and both are independent of the clock used by the
>>> camera.
>>
>> What you want is the GStreamer network time provider and network clock. That
>> will let you synchronise a common clock across different machines.
>>
>> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-libs/html/gstreamer-net.html
>
> OK, I think I have a plan that should work, but I'd appreciate any
> feedback folks care to give me.  I'm either overthinking this, or I've
> managed to figure out the correct GStreamer way to do this.  I'm just
> not sure which :-)
>
> I am going to start by synchronizing the two independent computers
> using ntp.  If that doesn't prove sufficient, then I'll investigate
> getting enough of GStreamer on computer B that I can run the
> GstNetTimeProvider for computer A.  For this discussion, it doesn't
> matter.  I am focusing on how to synchronize the video stream clock to
> computer A's system clock (or to the GstNetClientClock, if that proves
> necessary).

If you're not using GStreamer for precise playback, then NTP on both 
machines should be sufficient.
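
If you do end up needing the shared GStreamer clock, it's only a couple
of calls on each side. An untested sketch - the port number and the
"computer-a.local" hostname are placeholders for whatever you use:

   /* Needs <gst/net/gstnet.h> and linking against gstreamer-net */

   /* On computer A: publish A's clock on UDP port 8554 */
   GstClock *sys_clock = gst_system_clock_obtain ();
   GstNetTimeProvider *provider =
       gst_net_time_provider_new (sys_clock, NULL, 8554);

   /* On computer B: a local clock that slaves itself to A's clock */
   GstClock *net_clock =
       gst_net_client_clock_new ("net_clock", "computer-a.local",
           8554, 0);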

> I think I need to create a new GstClock (GstAravisClock) that
> represents the timestamps in the GigE Vision supplied frames.  I will
> make that clock a slave to the GstSystemClock.  When I receive the
> header packet for a video frame, I will take a snapshot of the
> GstSystemClock at that instant.  Then I will call
> gst_clock_add_observation() for each frame received, and use the
> GstAravisClock as the timestamp for my video frames.  I think this
> will provide timestamps for the video frames that correspond to the
> system time for computer A at which the video frame was captured.
> Then ntp will ensure that the timestamps on computer A are (within
> reason) the same as timestamps on computer B.
>
> Does that sound like I've got it right?  Or am I overthinking this?

You probably don't need to synthesise your own clock. This isn't much 
different from v4l2src, which uses the provided pipeline clock to 
timestamp the buffers it captures, with some calculations to make sure 
the timestamp it applies is the time at which the kernel driver captured 
the frame, rather than the time GStreamer received it.

In your case, if possible, you want to timestamp each frame with the 
time (on whichever clock was provided to the pipeline) at which it was 
captured on the camera. If that's not possible, the moment the frame 
header is read off the network might be 'close enough' for your purposes.
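
Off the top of my head, the calculation in a live source boils down to
something like this (sketch only; 'src' and 'buffer' stand for whatever
your create()/fill() function is working with):

   /* Take the pipeline clock's current time and subtract the
    * element's base time to get a running-time timestamp */
   GstClock *clock = gst_element_get_clock (GST_ELEMENT (src));
   GstClockTime now = gst_clock_get_time (clock);
   GstClockTime base_time =
       gst_element_get_base_time (GST_ELEMENT (src));

   /* Subtract any known capture-to-arrival delay here if you
    * can measure it */
   GST_BUFFER_PTS (buffer) = now - base_time;
   gst_object_unref (clock);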

You will need to be somewhat careful. By default the GstSystemClock 
returns the POSIX monotonic clock time - which is not at all related to 
the NTP system clock.

To make it report the wall-clock NTP time, you'll need to do:

   /* Obtain the system clock and switch it from the default
    * monotonic clock to the realtime (wall) clock */
   GstClock *clock = gst_system_clock_obtain ();
   g_object_set (clock, "clock-type", GST_CLOCK_TYPE_REALTIME, NULL);
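
and then hand that clock to the pipeline so it doesn't pick one itself
(assuming 'pipeline' is your top-level GstPipeline):

   /* Make the pipeline use this clock instead of choosing its own */
   gst_pipeline_use_clock (GST_PIPELINE (pipeline), clock);
   gst_object_unref (clock);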

Even with NTP, you'll also need to make sure that any timestamps you 
convert to local time use the same timezone on both machines, or else 
the converted timestamps won't correlate.

> I still need to figure out how to "take a snapshot of the
> GstSystemClock" upon receipt of the GigE Vision header frame without
> breaking Aravis's encapsulation too badly.

I don't know much about how you'll do that. Depends on the API you have 
available.

> I also need to figure out whether the aravissrc should be
> "pseudo-live" or "live".  (Right now, it is "pseudo-live", but based
> on some feedback on IRC today, I think it should be "live").

A camera is a live source. Pseudo-live sources are ones which generate 
data internally as fast as possible, but control the output rate by 
waiting to push each buffer out. They're used to make live sources out 
of elements that aren't inherently tied to real time - 
videotestsrc is-live=true, for example.
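
If aravissrc derives from GstBaseSrc (as most sources do), live-ness is
just a flag set in the instance init - a minimal sketch:

   /* A camera delivers frames in real time: mark the source live
    * and produce TIME-format (timestamped) output */
   gst_base_src_set_live (GST_BASE_SRC (src), TRUE);
   gst_base_src_set_format (GST_BASE_SRC (src), GST_FORMAT_TIME);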

> (I should also spend some time figuring out if the GigE Vision
> protocol already has a solution for camera-to-computer
> synchronization.)

Do that :)

Cheers,
Jan.

>
> Thanks as always for any tips or pointers you can give me.
>
> --wpd

