<p data-sourcepos="1:1-1:182" dir="auto">Hello<br>
<br>
I'm a software programer from Barcelona and my company develops
professional video servers for broadcasting. We use propietary
SDKs for managing video and audio, like MatroxDSX, Medialooks,
BlackMagicSDK, ... For a recent project we are studying the
possibility to use GStreamer form sending an SRT stream, but the
resulting stream in some receivers, no all, like VLC, drops audio
samples and I don't know the reason.<br>
</p>
<p data-sourcepos="1:1-1:182" dir="auto">If I use this pipe, the
resulting stream is Ok. Players like VLC are able to connect them
using <a href="srt://127.0.0.1:5011?mode=caller"
class="moz-txt-link-freetext">srt://127.0.0.1:5011?mode=caller</a>
and the works fine, with no errors and valid video and audio data<br>
</p>
<p data-sourcepos="3:1-3:250" dir="auto">gst-launch-1.0 uridecodebin
uri="<a>file:///C:/temp/AT_0007</a> Corzo emboscado.mxf"
name=decode ! videoconvert ! x264enc ! queue ! mpegtsmux name=mux
! queue ! srtsink uri=<a href="srt://127.0.0.1:5011?mode=listener"
class="moz-txt-link-freetext">srt://127.0.0.1:5011?mode=listener</a>
decode. ! audioconvert ! avenc_aac ! queue ! mux.</p>
<p data-sourcepos="5:1-5:263" dir="auto">But if I use appsrc the
problem appears in some players, like VLC, but there are no errors
in the log window<br>
</p>
<p data-sourcepos="7:1-7:228" dir="auto">appsrc is-live=true
do-timestamp=true ! videoconvert ! x264enc ! queue ! mpegtsmux
name=mux ! queue ! srtsink uri=<a
href="srt://127.0.0.1:5011?mode=listener"
class="moz-txt-link-freetext">srt://127.0.0.1:5011?mode=listener</a>
appsrc is-live=true do-timestamp=true ! audioconvert ! avenc_aac !
queue ! mux.</p>
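In case it helps, this is roughly how the appsrc pipeline is built and the callbacks are hooked up in our application (the element names "vsrc"/"asrc" are illustrative, not our exact code):

GError *err = NULL;
GstElement *pipeline = gst_parse_launch(
    "appsrc name=vsrc is-live=true do-timestamp=true ! videoconvert ! x264enc ! queue ! "
    "mpegtsmux name=mux ! queue ! srtsink uri=srt://127.0.0.1:5011?mode=listener "
    "appsrc name=asrc is-live=true do-timestamp=true ! audioconvert ! avenc_aac ! queue ! mux.",
    &err);

// Grab the video appsrc, give it the caps and attach the need-data callback
vAppSrc = gst_bin_get_by_name(GST_BIN(pipeline), "vsrc");
gst_app_src_set_caps(GST_APP_SRC(vAppSrc), m_vCaps);  // raw video caps, incl. framerate
g_signal_connect(vAppSrc, "need-data", G_CALLBACK(cb_need_video_data), NULL);
// ... same for the audio appsrc ("asrc") ...
gst_element_set_state(pipeline, GST_STATE_PLAYING);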
<p data-sourcepos="9:1-9:193" dir="auto">I have done multiple tests
with no success, removing queues, using UDP instead of SRT, saving
to disk the trasport stream and playing back using VLC...<br>
The "need-data" callbacks for audio and video are very easy The
always push a buffer of one frame for video and the corresponding
audio data for audio. The timestamps are ok.<br>
If instead of send the data to a muxer we change the pipe to
render the incoming video and audio, there are no problems, and
the audio sounds ok. It really looks like a timestamp problem,
but I can't find the solution<br>
</p>
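For the disk test, the tail of the pipeline was simply swapped from srtsink to a filesink, roughly like this (the output path is just an example):

appsrc is-live=true do-timestamp=true ! videoconvert ! x264enc ! queue ! mpegtsmux name=mux ! queue ! filesink location=C:/temp/out.ts appsrc is-live=true do-timestamp=true ! audioconvert ! avenc_aac ! queue ! mux.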
This is the "need-data code"<br>
<br>
bool want_video = false;<br>
<br>
void cb_need_video_data(GstElement *src, guint unused_size, gpointer
user_data)<br>
{<br>
want_video = true;<br>
return;<br>
}<br>
<br>
void write_video_data(const void *data, int data_size)<br>
{<br>
if(!want_video) return false;<br>
want_video = false;<br>
<br>
// Fetch num den factors<br>
GstStructure *s = gst_caps_get_structure(m_vCaps, 0);<br>
gint num = 0, den = 0;<br>
gst_structure_get_fraction(s, "framerate", &num, &den);<br>
<br>
GstBuffer *buffer =
gst_buffer_new_wrapped_full(GST_MEMORY_FLAG_READONLY, (gpointer)
data, data_size, 0, data_size, NULL, NULL);<br>
static GstClockTime timestamp = 0;<br>
GST_BUFFER_PTS(buffer) = timestamp;<br>
GST_BUFFER_DURATION(buffer) =
gst_util_uint64_scale_int(GST_SECOND, den, num);<br>
// GST_BUFFER_DTS(buffer) = timestamp -
GST_BUFFER_DURATION(buffer);<br>
timestamp += GST_BUFFER_DURATION(buffer);<br>
GstFlowReturn ret = gst_app_src_push_buffer((GstAppSrc *) vAppSrc,
buffer);<br>
}<br>
And the same for audio. The incoming data comes from our video server.
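For completeness, the audio side looks essentially like this (a sketch: m_aCaps, aAppSrc and the assumption of interleaved S16 audio stand in for our actual code):

bool want_audio = false;

void cb_need_audio_data(GstElement *src, guint unused_size, gpointer user_data)
{
    want_audio = true;
}

bool write_audio_data(const void *data, int data_size)
{
    if (!want_audio) return false;
    want_audio = false;

    // Rate and channel count from the audio caps (m_aCaps is hypothetical here)
    GstStructure *s = gst_caps_get_structure(m_aCaps, 0);
    gint rate = 0, channels = 0;
    gst_structure_get_int(s, "rate", &rate);
    gst_structure_get_int(s, "channels", &channels);

    GstBuffer *buffer = gst_buffer_new_wrapped_full(GST_MEMORY_FLAG_READONLY,
        (gpointer) data, data_size, 0, data_size, NULL, NULL);

    // Duration derived from the sample count (assuming 2 bytes per sample)
    static GstClockTime timestamp = 0;
    guint64 samples = (guint64) data_size / (channels * 2);
    GST_BUFFER_PTS(buffer) = timestamp;
    GST_BUFFER_DURATION(buffer) = gst_util_uint64_scale(samples, GST_SECOND, rate);
    timestamp += GST_BUFFER_DURATION(buffer);

    return gst_app_src_push_buffer(GST_APP_SRC(aAppSrc), buffer) == GST_FLOW_OK;
}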
Can anybody help me? I'm working on Windows with the latest GStreamer version.