<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<div class="moz-cite-prefix">I had similar problems when using live
sources and adding in wav files at any time. I solved it by using
multifilesrc (not a filesrc) and applying a time offset to the mixer
sink pad:</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">See:
<a class="moz-txt-link-freetext" href="https://stackoverflow.com/questions/69624156/adding-10-second-wav-file-to-gstreamer-pipeline-that-is-already-playing/69717638#69717638">https://stackoverflow.com/questions/69624156/adding-10-second-wav-file-to-gstreamer-pipeline-that-is-already-playing/69717638#69717638</a></div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">On 14/01/2022 12:17 am, Marianna Smidth
Buschle via gstreamer-devel wrote:<br>
</div>
<blockquote type="cite"
cite="mid:4d121459-88fc-ed12-6ed7-8d1e0359e777@qtec.com">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<p>Hello,</p>
<p>I strongly believe I have experienced similar issues, but
related to images/video instead of audio.</p>
<p>Your basic problem is that 'multifilesrc' is not a live source,
while you are using 'audiotestsrc' as a live one.</p>
<p>That basically means that while a live source produces buffers
based on the configured timing (from the caps), a non-live source
produces buffers as fast as possible (or as fast as the downstream
elements allow).</p>
<p>You can try checking the difference by doing:</p>
<pre class="moz-quote-pre" wrap="">audiotestsrc is-live=true wave=ticks ! audio/x-raw,format=S16LE,rate=48000,channels=1 ! autoaudiosink sync=true</pre>
<p>And <br>
</p>
<pre class="moz-quote-pre" wrap="">audiotestsrc is-live=true wave=ticks ! audio/x-raw,format=S16LE,rate=48000,channels=1 ! autoaudiosink sync=false</pre>
<p>For the live source you shouldn't see any difference.<br>
But for the non-live source I expect you will, depending on whether
you use 'autoaudiosink sync=true' or 'autoaudiosink sync=false':</p>
<pre class="moz-quote-pre" wrap="">multifilesrc do-timestamp=true loop=true location=count.wav ! wavparse ignore-length=1 ! audio/x-raw,format=S16LE,rate=48000,channels=1 ! autoaudiosink sync=false
</pre>
<p><br>
</p>
<p>Now, the way I managed to get file sources working as "live
sources" was by adding either an 'identity sync=true' or a
'clocksync' to the pipeline.<br>
Something like:<br>
</p>
<pre class="moz-quote-pre" wrap="">multifilesrc do-timestamp=true loop=true location=count.wav ! wavparse ignore-length=1 ! audio/x-raw,format=S16LE,rate=48000,channels=1 ! clocksync ! autoaudiosink sync=false</pre>
<p><br>
</p>
<p>Now, I do remember some issues with 'multifilesrc', so I would
also recommend trying 'filesrc' instead.</p>
<p><br>
</p>
<p>And I was using H264 streams packed into MPEG-TS; in that case it
would only work after the demuxer:</p>
<pre class="moz-quote-pre" wrap="">gst-launch-1.0 filesrc location=/tmp/test1.ts ! tsdemux name=demux ! \
queue ! identity sync=true ! h264parse ! avdec_h264 qos=false ! videoconvert ! ximagesink \
demux. ! queue ! identity sync=true ! decodebin ! audioconvert ! autoaudiosink</pre>
<p><br>
</p>
<p>Note that I haven't tested any of the pipelines besides this
last one, which comes from my own project...<br>
</p>
<p><br>
</p>
<p>Best Regards</p>
<p>Marianna S. Buschle</p>
<p><br>
</p>
<div class="moz-cite-prefix">On 13.01.2022 13.00, <a
class="moz-txt-link-abbreviated moz-txt-link-freetext"
href="mailto:gstreamer-devel-request@lists.freedesktop.org"
moz-do-not-send="true">gstreamer-devel-request@lists.freedesktop.org</a>
wrote:<br>
</div>
<blockquote type="cite"
cite="mid:mailman.46.1642075203.14000.gstreamer-devel@lists.freedesktop.org">
<pre class="moz-quote-pre" wrap="">Hello everyone,
I'm having an issue here that's probably very simple, but I can't see what's wrong.
I've been using the following audio source for testing in my larger WebRTC pipeline:
audiotestsrc is-live=true wave=ticks !
audio/x-raw,format=S16LE,rate=48000,channels=1 ! tee allow-not-linked=true
name=audiotestsrc
Now I've tried to replace it with an audio file of a voice counting (to estimate
delay etc.):
multifilesrc do-timestamp=true loop=true location=count.wav ! wavparse
ignore-length=1 ! audio/x-raw,format=S16LE,rate=48000,channels=1 ! tee
allow-not-linked=true name=audiotestsrc
The audio file does have S16LE/48kHz/mono, so there shouldn't be any format
issues. Both variants work when I run them in gst-launch-1.0 and append an
autoaudiosink at the end; I can even replicate the encoder pipeline by appending
"... ! queue ! opusenc ! rtpopuspay ! queue max-size-time=100000000 !
rtpopusdepay ! opusdec ! autoaudiosink" and it still works for both of them.
However, when I connect the encoder output to webrtcbin within my larger
pipeline, then the multifilesrc seems to never start streaming. Caps negotiation
and WebRTC SDP negotiation both complete and seem to be fine, but I'm never
actually getting an audiostream unless I keep using audiotestsrc (or e.g.
alsasrc/pulsesrc, those work as well).
For the curious, the audio file in question is here:
<a class="moz-txt-link-freetext" href="https://floe.butterbrot.org/external/count.wav" moz-do-not-send="true">https://floe.butterbrot.org/external/count.wav</a>
Any suggestions?
Thanks and best regards, Florian</pre>
</blockquote>
<pre class="moz-signature" cols="72">--
Best regards / Med venlig hilsen
“Marianna Smidth Buschle”</pre>
</blockquote>
<p><br>
</p>
</body>
</html>