How do you construct the timestamps/duration for video/audio appsrc when captured by DeckLink?

Andres Gonzalez andres.agoralabs at gmail.com
Fri Feb 5 16:24:11 UTC 2016


Thank you for your reply Tim.

You ask a very good question. The reason is that I have spent over six months
developing a C++ Linux application that supports any number of video inputs
(DeckLink, USB webcams, IEEE 1394, etc.), each with its own processing
pipeline to which I can add any filter (format conversion, color space
conversion, resizing, video signal processing, etc., using the Intel IPP
library); filters can be added and removed in real time. All input pipeline
outputs are selectable into a main overlay buffer, which allows alpha
blending of other overlay source inputs. This main overlay buffer then feeds
any number of post-processing paths, where each post-processing path can have
any number of output processing pipelines (similar to the input processing
pipelines, using any number of Intel IPP-based filters). Each post-processing
path (feeding its multiple output pipelines) has its own separate overlay
functionality, which allows it to overlay text in its own language (currently
only English and Spanish). Each output pipeline then feeds a DeckLink SDI
output. I am now adding RTP streaming as an additional output option.
All of this functionality is currently working well with very low latency
(I use hardly any mutexes, since they are comparatively slow; instead I use
C++11 atomics for all serialization).

Then a couple of weeks ago someone told me about their video application,
which used the GStreamer libraries. I had heard of GStreamer; I even vaguely
remember checking out the website many years ago, long before I started my
current video application development, but I really knew nothing about it.
So when I finally looked into GStreamer, I was amazed at what I found. Here I
have been developing a source/pipeline/sink architecture video application
for some time now, and just two weeks ago I discovered that a whole
source/pipeline/sink media processing library ecosystem has already been
developed! It pains me to realize that I must live a very sheltered and
myopic life in this age of the vast Internet.

So the simple answer to your question is that I didn't know about GStreamer
when I started developing this video application. I wish I had known about
GStreamer before I started development on this, but I now have so much
well-functioning and tested code that I must continue with what I have.

So what I am doing is trying to come up to speed with GStreamer and to use it
to implement new functionality going forward, which currently means adding
output streaming support as an additional type of output sink. Thus, I am
using GStreamer's appsrc as an interface to my current code and as a way of
initially integrating GStreamer into my existing code base.

It is obvious to me that using GStreamer in my application provides tons of
additional benefits so I anticipate that I will use it more and more and
perhaps ultimately re-write many sections of my current code using the
GStreamer APIs.

But now I am struggling to learn GStreamer. GStreamer is both amazing and
amazingly frustrating. On the one hand, it is amazing how much a simple
gst-launch script actually does. On the other hand, it is very frustrating
trying to understand well enough what is actually happening under the covers
to be able to use it as a viable library for developing solid,
industrial-strength Linux applications. But I am really impressed with what
you guys have built here; it truly is amazing in its functionality, breadth
of coverage, and design. Please have patience with me as I continue to ask
such newbie/stupid questions as I start this journey of coming up to speed
with GStreamer.

So Tim.....now that I have answered your question..... :-)  .... can you
answer my questions?   :-)

-Andres

  




