How do you construct the timestamps/duration for video/audio appsrc when captured by DeckLink?
Andres Gonzalez
andres.agoralabs at gmail.com
Fri Feb 5 02:48:44 UTC 2016
Hi,
I am using DeckLink capture cards and native C code to capture HD-SDI. In
the callback of the DeckLink driver I get the time_scale, video_frame_time,
and video_frame_duration. After processing, I use two GStreamer appsrc
elements to build a pipeline that streams RTP/RTCP. So I construct a buffer
and push it into appsrc like this (this is the video path; I do the same for
the audio appsrc; how I derive the timestamp and duration is shown right
after the snippet):
GstBuffer *pVideoData = gst_buffer_new();

/* Wrap the frame bytes from the DeckLink callback without copying. */
GstMemory *pVideoMemory = gst_memory_new_wrapped(GST_MEMORY_FLAG_READONLY,
                                                 pBuffer, m_nSize, 0, m_nSize,
                                                 nullptr, nullptr);
gst_buffer_append_memory(pVideoData, pVideoMemory);

/* Presentation timestamp and duration for this frame (computed below). */
GST_BUFFER_PTS(pVideoData)      = m_tTimestamp;
GST_BUFFER_DURATION(pVideoData) = m_nVideoDuration;

tReturnCode = gst_app_src_push_buffer(GST_APP_SRC(m_pGstVideoSource),
                                      pVideoData);
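For reference, this is roughly how I derive m_tTimestamp and m_nVideoDuration
from the values the DeckLink callback gives me (the frame time and duration
come back in units of the time scale, so I rescale them to nanoseconds; the
variable names here are my own):

/* Inside the frame-arrived callback: videoFrame is the IDeckLinkVideoInputFrame.
 * GetStreamTime() fills frameTime/frameDuration in units of timeScale ticks per second. */
BMDTimeValue frameTime = 0, frameDuration = 0;
BMDTimeScale timeScale = 30000; /* e.g. 30000 for the 29.97 fps modes (frame duration 1001) */
videoFrame->GetStreamTime(&frameTime, &frameDuration, timeScale);

/* Rescale to nanoseconds, which is what GStreamer buffer timestamps use. */
m_tTimestamp     = gst_util_uint64_scale(frameTime,     GST_SECOND, timeScale);
m_nVideoDuration = gst_util_uint64_scale(frameDuration, GST_SECOND, timeScale);

With that rescaling, the duration works out to frame_duration / time_scale
seconds (1001/30000 s, about 33.4 ms, in this example), which is what made me
assume 1/fps in Question #3 below.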
Question #1: Is this the correct way to push data into appsrc? (How I
configure the appsrc itself is sketched after Question #3.)
Question #2: What should the PTS timestamp be?
Question #3: I am assuming that the duration is 1/fps. Is this correct?
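For context, this is roughly how I configure the video appsrc before pushing
anything into it (the caps values reflect my own capture format and are just
an example; I am not sure whether all of these properties are needed):

GstCaps *pCaps = gst_caps_new_simple("video/x-raw",
    "format",    G_TYPE_STRING,     "UYVY",
    "width",     G_TYPE_INT,        1920,
    "height",    G_TYPE_INT,        1080,
    "framerate", GST_TYPE_FRACTION, 30000, 1001,
    NULL);
gst_app_src_set_caps(GST_APP_SRC(m_pGstVideoSource), pCaps);
gst_caps_unref(pCaps);

/* Since I set PTS/duration myself in nanoseconds, the source runs in time format. */
g_object_set(m_pGstVideoSource,
             "format",  GST_FORMAT_TIME,
             "is-live", TRUE,
             NULL);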
I am also capturing audio in the same DeckLink driver callback and using a
separate appsrc for audio.
Question #4: I am assuming that I use the same PTS timestamp and duration
for the audio appsrc. Is this correct?
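In case it is relevant, the alternative I have been considering (instead of
reusing the video PTS/duration) is to timestamp the audio from the audio
packet itself, roughly like this (pAudioBytes, nAudioSize, and
m_pGstAudioSource are my own names, and I am assuming 48 kHz audio):

/* Inside the same callback: audioPacket is the IDeckLinkAudioInputPacket. */
BMDTimeValue audioPacketTime = 0;
audioPacket->GetPacketTime(&audioPacketTime, GST_SECOND); /* packet time directly in nanoseconds */
long nSampleFrames = audioPacket->GetSampleFrameCount();

GstBuffer *pAudioData = gst_buffer_new_wrapped_full(GST_MEMORY_FLAG_READONLY,
    pAudioBytes, nAudioSize, 0, nAudioSize, nullptr, nullptr);
GST_BUFFER_PTS(pAudioData)      = audioPacketTime;
GST_BUFFER_DURATION(pAudioData) = gst_util_uint64_scale(nSampleFrames, GST_SECOND, 48000);

gst_app_src_push_buffer(GST_APP_SRC(m_pGstAudioSource), pAudioData);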
I want to verify my code and reasoning here because the pipeline works fine
for a video-only RTP stream, but when I add the audio RTP session it breaks:
the pipeline sends RTCP but no RTP video/audio media packets.
Thanks,
-Andres