How do you construct the timestamps/duration for video/audio appsrc when captured by DeckLink?

Andres Gonzalez andres.agoralabs at gmail.com
Sat Feb 6 21:10:38 UTC 2016


Just to add some additional info in support of my question above...

With the DeckLink capture cards, I have the option to set a scale value.
Currently I am setting that scale value to 1000 * framerate, so for 30 fps
the scale value is 30000. In the DeckLink capture callback I am then given
the frame-time and the frame-duration: for each captured frame I get a
frame-duration of 1000, and the frame-time increases by 1000 with each
subsequent frame.

I am not sure how best to map this onto the GStreamer buffer times.
Currently this is what I am doing (m_tTimestamp starts out at 0; the
following works but is very kludgy):

GST_BUFFER_PTS(pVideoData)      = m_tTimestamp;
GST_BUFFER_DTS(pVideoData)      = m_tTimestamp;
GST_BUFFER_DURATION(pVideoData) = 33333333;   /* ~1/30 s in nanoseconds */
m_tTimestamp += 33333333;

The documentation for GstBuffer says that the PTS and DTS are in
nanoseconds, so I am just hard-coding them here. But this doesn't seem like
the proper way to do it, because it is based only on the fixed frame rate
and does not take into account any variation in the capture rate of the
DeckLink driver. (That said, I have *not* observed any variation in the
frame-time the DeckLink driver passes to each capture callback, so perhaps
it doesn't matter?)

Can anyone explain (in a simple way, hopefully :-)) what the *proper* way
to do this is?

Thanks,

-Andres



--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/How-do-you-construct-the-timestamps-duration-for-video-audio-appsrc-when-captured-by-DeckLink-tp4675678p4675701.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.

