<div dir="ltr">when I can get "<span style="font-size:12.8px">time_already_playing"?</span><div><span style="font-size:12.8px">for sample it is correct?</span></div><div><span style="font-size:12.8px"><br></span></div><div><span style="font-size:12.8px">gint64 l_tmp;</span></div><div><span style="font-size:12.8px">gst_element_query_position((QGst::ElementPtr)m_pipeline_ptr, GST_FORMAT_TIME, &</span><span style="font-size:12.8px">l_tmp</span><span style="font-size:12.8px">);</span></div><div><span style="font-size:12.8px"><br></span></div><div><span style="font-size:12.8px">i use qtgstreamer</span></div></div><div class="gmail_extra"><br><div class="gmail_quote">2015-11-10 12:07 GMT+03:00 Tim Müller <span dir="ltr"><<a href="mailto:tim@centricular.com" target="_blank">tim@centricular.com</a>></span>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On Mon, 2015-11-09 at 23:07 -0800, _dmp wrote:<br>
<br>
Hi,<br>
<span class=""><br>
> I want to dynamically connect a filesrc branch, to play a file, to an
> existing pipeline, but the wav file does not play at all.
>
> The static pipeline looks like this:
>
> alsasrc ! audiomixer ! alsasink
>
> I dynamically create a bin and connect it to audiomixer:
>
> fileplaybin: filesrc ! wavparse ! capsfilter -> to "audiomixer"
>
> I attach a probe on the "capsfilter:src" pad to watch for EOS (to
> detect the end of file playback).
> Immediately after starting "fileplaybin" I get the EOS and the file
> does not play.
> If I connect "fileplaybin" directly to "alsasink" (that is, without
> "alsasrc" and "audiomixer"), the file plays normally. So the problem
> is with dynamically connecting the "filesrc" chain to an existing
> pipeline. What could be the problem and where should I look?
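
For illustration, here is a minimal sketch in plain C (not QtGStreamer) of how a dynamically added branch like the one described above might be built and linked to audiomixer. The function name add_file_branch and all variable names are assumptions made for this example, not taken from the original mail; the returned mixer sink pad is where the pad offset discussed in the reply below could be applied.

#include <gst/gst.h>

/* Build filesrc ! wavparse ! capsfilter inside a bin, add it to the
 * running pipeline and link it to a requested audiomixer sink pad. */
static GstPad *
add_file_branch (GstElement *pipeline, GstElement *audiomixer,
    const gchar *location)
{
  GstElement *bin, *filesrc, *wavparse, *capsfilter;
  GstPad *caps_src, *bin_src, *mixer_sink;

  bin = gst_bin_new ("fileplaybin");
  filesrc = gst_element_factory_make ("filesrc", NULL);
  wavparse = gst_element_factory_make ("wavparse", NULL);
  capsfilter = gst_element_factory_make ("capsfilter", NULL);
  g_object_set (filesrc, "location", location, NULL);

  gst_bin_add_many (GST_BIN (bin), filesrc, wavparse, capsfilter, NULL);
  gst_element_link_many (filesrc, wavparse, capsfilter, NULL);

  /* Expose the capsfilter src pad as a ghost pad on the bin. */
  caps_src = gst_element_get_static_pad (capsfilter, "src");
  gst_element_add_pad (bin, gst_ghost_pad_new ("src", caps_src));
  gst_object_unref (caps_src);

  gst_bin_add (GST_BIN (pipeline), bin);

  /* Request a new sink pad on audiomixer and link the bin to it. */
  mixer_sink = gst_element_get_request_pad (audiomixer, "sink_%u");
  bin_src = gst_element_get_static_pad (bin, "src");
  gst_pad_link (bin_src, mixer_sink);
  gst_object_unref (bin_src);

  /* Bring the new branch up to the state of the running pipeline. */
  gst_element_sync_state_with_parent (bin);

  return mixer_sink;
}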

wavparse will create a GstSegment + timestamps starting from 0.

It doesn't know that your other branch (alsasrc ! audiomixer !
alsasink) has been playing for a while already.

So if after some time you hook up that wavparse branch to audiomixer,
audiomixer will see timestamps (converted to running time) which are
already in the past from its perspective, so it will just throw away
all input buffers (or some, if you haven't been playing that long yet).

You can use

gst_pad_set_offset (pad, time_already_playing);

to shift the timeline of the newly-added branch.
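
One common way to obtain "time_already_playing" is to take the running time of the already-playing pipeline (current clock time minus the pipeline's base time) rather than querying the stream position. Below is a minimal sketch in plain C (not QtGStreamer), assuming "pipeline" is the running pipeline and "mixer_sink" is the audiomixer sink pad the new branch was linked to; the function name shift_new_branch_timeline is illustrative.

#include <gst/gst.h>

/* Shift the timeline of a newly-added branch by the time the pipeline
 * has already been playing, so audiomixer does not drop its buffers. */
static void
shift_new_branch_timeline (GstElement *pipeline, GstPad *mixer_sink)
{
  GstClock *clock;
  GstClockTime now, base_time, time_already_playing;

  clock = gst_element_get_clock (pipeline);
  if (clock == NULL)
    return;  /* pipeline is not playing yet */

  /* Running time = current clock time - pipeline base time. */
  now = gst_clock_get_time (clock);
  base_time = gst_element_get_base_time (pipeline);
  time_already_playing = now - base_time;

  gst_pad_set_offset (mixer_sink, (gint64) time_already_playing);

  gst_object_unref (clock);
}

Setting the offset on the audiomixer sink pad of the new branch is one choice; any pad along the newly-added branch should work equally well.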

Cheers
 -Tim

--
Tim Müller, Centricular Ltd - http://www.centricular.com