How do you construct the timestamps/duration for video/audio appsrc when captured by DeckLink?

Tim Müller tim at centricular.com
Tue Feb 9 14:12:19 UTC 2016


On Thu, 2016-02-04 at 18:48 -0800, Andres Gonzalez wrote:

Hi Andres,

> I am using DeckLink capture cards and native C code to capture
> HD/SDI. In the callback of the DeckLink driver, I get the time_scale,
> video_frame_time, and video_frame_duration.  Then after processing, I
> use two gstreamer appsrc to construct a pipeline for streaming
> RTP/RTCP. So I have to construct a buffer then push it into appsrc
> like this (this is for the video, I do the same for the audio
> appsrc):
> 
> GstBuffer *pVideoData = gst_buffer_new();
> GstMemory *pVideoMemory =
>     gst_memory_new_wrapped(GST_MEMORY_FLAG_READONLY, pBuffer, m_nSize,
>                            0, m_nSize, nullptr, nullptr);
> 
> gst_buffer_append_memory(pVideoData, pVideoMemory);
> GST_BUFFER_PTS(pVideoData) = m_tTimestamp;
> GST_BUFFER_DURATION(pVideoData) = m_nVideoDuration;
> 
> tReturnCode = gst_app_src_push_buffer(GST_APP_SRC(m_pGstVideoSource),
>                                       pVideoData);
> 
> Question #1: Is this the correct way to push data into appsrc?

In principle, yes. There's also gst_buffer_new_wrapped()
and gst_buffer_new_wrapped_full() for what it's worth.

The first issue is memory management. I'm not sure this is quite
correct here yet, but you would probably have noticed crashes if it
weren't. Basically, you need to keep the memory alive (and make sure
it's not being reused) for as long as the GstMemory/GstBuffer is still
in use inside the GStreamer pipeline, and secondly you need to make
sure the memory gets freed properly once GStreamer is done with the
GstBuffer/GstMemory. As far as I can tell you don't really take care
of memory management here, since you're not setting a destroy notify
function or pointer - so you just assume the memory will stay around
long enough and not be re-used?
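For illustration only (release_frame() and pFrameHandle are
placeholders for whatever ref-counting your capture side provides,
not actual DeckLink API), something along these lines:

  static void
  frame_done (gpointer user_data)
  {
    /* GStreamer is done with the wrapped memory, so it is now safe
     * to recycle or release the captured frame (placeholder call) */
    release_frame (user_data);
  }

  /* in the capture callback: */
  pVideoData = gst_buffer_new_wrapped_full (GST_MEMORY_FLAG_READONLY,
      pBuffer, m_nSize, 0, m_nSize, pFrameHandle, frame_done);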

For starters it might be prudent to do something like:

 pVideoData =
     gst_buffer_new_wrapped (g_memdup (pBuffer, m_nSize), m_nSize);

which copies the data. You can then sort out proper memory handling
later.


> Question #2: What should the PTS timestamp be?

This is the question. The generic answer is "it depends". It depends
on what the rest of the pipeline is, whether it's a live pipeline
where something syncs to a clock or not, and whether you feed live
data into the pipeline or can generate it faster than real-time.

If you can generate data faster than real-time, you would typically
just push data into appsrc, timestamping from 0 and incrementing the
timestamps accordingly. This is equivalent to what e.g. videotestsrc
does: it just generates frames as quickly as possible and increments
the timestamp by 1/fps for each frame. The rest of the pipeline will
then either consume data as fast as possible (e.g.
videotestsrc ! x264enc ! matroskamux ! filesink) or in real-time (e.g.
videotestsrc ! autovideosink). In the latter case videotestsrc would
at some point be throttled by the buffering/queues in the pipeline -
when those run full, the pad_push() will block, and data will then be
consumed and produced in ~real-time. appsrc has queuing internally as
well, which will by default also run full fairly quickly.
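A minimal sketch of that, assuming a framerate of fps_n/fps_d and a
running timestamp variable:

  GST_BUFFER_PTS (buf) = timestamp;
  GST_BUFFER_DURATION (buf) =
      gst_util_uint64_scale (GST_SECOND, fps_d, fps_n);
  timestamp += GST_BUFFER_DURATION (buf);

  gst_app_src_push_buffer (GST_APP_SRC (appsrc), buf);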

But that is not your case. Your case is that you want to capture and
feed live data into a pipeline which is then also streamed out live.

In this case two things are important: (a) the timestamps on the
buffers, and (b) latency configuration/handling.

You probably want to set the "min-latency" property on appsrc to
GST_SECOND/fps and also "is-live" to TRUE. That should inform the rest
of the pipeline about the live-ness of the source and that there's some
latency involved when generating the data.
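For example (fps_n/fps_d being your capture framerate; you'll
typically also want appsrc in time format when you timestamp in
nanoseconds):

  g_object_set (m_pGstVideoSource,
      "is-live", TRUE,
      "format", GST_FORMAT_TIME,
      "min-latency",
      (gint64) gst_util_uint64_scale (GST_SECOND, fps_d, fps_n),
      NULL);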

Next, timestamps: in the simplest case you just don't put timestamps on
the buffers you push in and instead set the "do-timestamp" property to
TRUE. Then appsrc will just timestamp them according to the running
time of the clock. This should Just Work, but it might add some jitter
since the timestamp will reflect the time when the buffer is processed
by appsrc and not when it was captured. This may not be a problem of
course, but it's not perfect yet.
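In code that is simply (m_pGstVideoSource being your video appsrc):

  /* let appsrc stamp each buffer with the pipeline clock's running
   * time at the moment it is pushed */
  g_object_set (m_pGstVideoSource, "do-timestamp", TRUE, NULL);

and then you leave GST_BUFFER_PTS unset (GST_CLOCK_TIME_NONE) on the
buffers you push.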

Duration: setting the duration to GST_SECOND/fps seems fine. The
duration doesn't really matter that much in your scenario, but it's
nice to give an indication. I would expect the DeckLink card to
capture at a constant-ish framerate, and perhaps miss a frame every
now and then (unlike webcams, where the framerate is just going to be
all over the place in practice).

Generally speaking it's also best to count frames and then calculate
everything based on n_frames and n_frames + 1, to avoid rounding
errors accumulating (that is, every so often a frame's duration ends
up one nanosecond longer so that it all adds up ;)). I don't think
this matters much here though.
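For example (n_frames being a running frame counter):

  GstClockTime pts =
      gst_util_uint64_scale (n_frames, fps_d * GST_SECOND, fps_n);
  GstClockTime next_pts =
      gst_util_uint64_scale (n_frames + 1, fps_d * GST_SECOND, fps_n);

  GST_BUFFER_PTS (buf) = pts;
  GST_BUFFER_DURATION (buf) = next_pts - pts;
  n_frames++;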

For something more sophisticated timestamp-wise, you'll have to
generate timestamps from the DeckLink timestamps and their time base,
but expressed against the GStreamer clock + base time, and then you'll
also have to take drift etc. into account. I'm sure you can find some
inspiration inside the decklinksrc element somewhere.
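Very roughly, the clock + base time part looks like this (pipeline
being your top-level pipeline, which must be PLAYING and have a clock;
mapping the actual DeckLink capture timestamps into this domain and
handling drift is the harder bit):

  GstClock *clock = gst_element_get_clock (pipeline);
  GstClockTime now = gst_clock_get_time (clock);
  GstClockTime base_time = gst_element_get_base_time (pipeline);

  /* running time "now"; a real capture timestamp would be translated
   * into this same domain instead of sampling the clock here */
  GST_BUFFER_PTS (pVideoData) = now - base_time;

  gst_object_unref (clock);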


> Question #3: I am assuming that the duration is 1/fps.  Is this
> correct?

Yes.

> I am also capturing audio in the same DeckLink driver callback and
> using a separate appsrc for audio. 
> 
> Question: #3: I am assuming that I use the same PTS timestamp and
> duration for the audio appsrc. Is this correct?

Probably, but whether you can make that assumption depends a bit on
exactly how the DeckLink API feeds you data. Since you get raw audio
samples, you can easily calculate the duration based on the number of
samples and the sample rate. I would just do that.
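For example (n_samples and sample_rate coming from the DeckLink audio
callback, pAudioData being the buffer for your audio appsrc):

  GST_BUFFER_DURATION (pAudioData) =
      gst_util_uint64_scale (n_samples, GST_SECOND, sample_rate);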


> The reason why I want to verify my code/reasoning here is because my
> code works fine for a solo video RTP stream, but when I add the audio
> RTP session it breaks and the pipeline only sends RTCP but no RTP
> video/audio media pkts.

One would have to check the debug logs to see why/where it goes wrong,
I suppose.

You could try to just play stuff back for starters by doing

  appsrc name=vsrc ! queue ! videoconvert ! xvimagesink

  appsrc name=asrc ! queue ! audioconvert ! pulsesink provide-clock=false

and see if it plays back okayish or not.
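If you build those with gst_parse_launch() you can grab the appsrcs by
name to push into (you'll still need to set proper raw caps on them),
e.g.:

  GstElement *pipeline = gst_parse_launch (
      "appsrc name=vsrc ! queue ! videoconvert ! xvimagesink", NULL);
  GstElement *vsrc = gst_bin_get_by_name (GST_BIN (pipeline), "vsrc");

  /* set caps on vsrc, set the pipeline to PLAYING, then push buffers */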

Cheers
 -Tim


-- 
Tim Müller, Centricular Ltd - http://www.centricular.com
