How do I use a 90KHz input device clock?

Sebastian Dröge sebastian at
Wed Jun 17 00:12:51 PDT 2015

On Di, 2015-06-16 at 16:08 -0600, John P Poet wrote:
> Hi all,
> I am fairly new to gstreamer so I apologize for being clueless.  I 
> have tried to figure this out myself, but am just spinning my wheels 
> at this point.
> I am working on writing a src plugin for the Euresys Picolo H.264 
> cards, which have an on-board 90 kHz clock.
> I decided to use Slomo's work on the decklink src as a template:
> -blackmagic-decklink-cards/
> and I generally have it working.  Except for the clock. 

Did you take a look at the code of decklinkvideosrc, and especially the
clock handling? That should have everything you need, including the
relatively complicated calculations for slaving the capture clock to
whatever clock was selected for the pipeline.

The decklinkvideosrc code is very much written for the behaviour of
that hardware's clock, though; you will need to understand what it is
trying to do and then adjust it for the behaviour of your hardware's
clock.
In any case, for a pipeline where your clock is selected as pipeline
clock, the idea basically is that the timestamps on the buffers are the
capture clock times minus the base time. That is, minus the clock time
when the pipeline went to PLAYING. That way your timestamps will start
approximately at zero if your source starts together with the
pipeline.
Additionally you have to report the latency with the LATENCY query. A
frame will be captured at clock time X, but it will only be available
to you at least 1/framerate later, probably even more. That value has
to be reported as minimum latency.
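For example, with the framerate expressed as a fraction fps_n/fps_d,
one frame duration is the lower bound. A plain-C sketch of that
calculation (gst_util_uint64_scale() does the same with extra overflow
protection):

```c
#include <stdint.h>

#define NSEC_PER_SEC 1000000000ULL  /* same value as GST_SECOND */

/* Minimum latency to report for a capture source: one frame duration,
 * i.e. fps_d/fps_n seconds in nanoseconds. */
static uint64_t
min_capture_latency (uint64_t fps_n, uint64_t fps_d)
{
  return NSEC_PER_SEC * fps_d / fps_n;
}
```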

For the case when another clock is selected as pipeline clock, you will
have to convert capture times from your clock times to the pipeline
clock times. For that the clock slaving feature in GstClock exists,
where you can set a master clock on your clock which then causes
regular observations between both clocks to happen and automatic
adjustment of the clock parameters (the calibration). If you then read
those parameters back, you get a time for each clock observed at the
same instant (the offset) and a relative rate between the two clocks.
Together those give you all the information you need to convert
between the clock times.
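The conversion those parameters enable is a simple linear mapping,
which GstClock applies internally when slaving. A plain-C sketch
(parameter names follow gst_clock_get_calibration(); the clamp is a
simplification for the sketch):

```c
#include <stdint.h>

/* Map a slave (capture) clock time to master (pipeline) clock time
 * using calibration parameters: a pair of times observed at the same
 * instant (cinternal on the slave, cexternal on the master) plus a
 * relative rate cnum/cdenom. */
static uint64_t
slave_to_master (uint64_t internal,
    uint64_t cinternal, uint64_t cexternal,
    uint64_t cnum, uint64_t cdenom)
{
  if (internal < cinternal)
    return cexternal;  /* simplistic clamp for this sketch */
  return (internal - cinternal) * cnum / cdenom + cexternal;
}
```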

> If I don't set the timestamp for the buffer at all, then it seems 
> to work.  However, if I include the line:
> GST_BUFFER_TIMESTAMP (*buffer) = timestamp;
> in my gst_picoloh264_video_src_create() and 
> gst_picoloh264_audio_src_create() routines then I run into trouble.
> The timestamp coming from the Picolo H.264 is running at 90 kHz.  I 
> can convert that to nanoseconds using:
> time_stamp = gst_util_uint64_scale_int(time_stamp, 1000000, 90);
> but that leads to my first question:  Should I be using 
> gst_clock_set_calibration() and gst_clock_get_calibration() somehow 
> to achieve this instead?

No, that's for slaving a clock to a master clock.
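The 90 kHz conversion itself is fine as you wrote it. Spelling out the
full denominator avoids the magic 1000000/90 constants; note that
gst_util_uint64_scale() additionally guards against 64-bit overflow,
which this plain-C sketch does not:

```c
#include <stdint.h>

#define NSEC_PER_SEC 1000000000ULL  /* GST_SECOND */

/* Convert a 90 kHz tick count to nanoseconds:
 * ns = ticks * 1e9 / 90000 (equivalent to ticks * 1000000 / 90).
 * Overflows for tick counts beyond roughly 57 hours of running time,
 * which gst_util_uint64_scale() would handle. */
static uint64_t
ticks_90khz_to_ns (uint64_t ticks)
{
  return ticks * NSEC_PER_SEC / 90000;
}
```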

Sebastian Dröge, Centricular Ltd

More information about the gstreamer-devel mailing list