alsasink underrun recovery issue in a synchronized streaming usecase
Charlie Laub
charleslaub at sbcglobal.net
Sun Feb 25 18:06:25 UTC 2018
-----Original Message-----
From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedesktop.org] On Behalf Of gstreamer-devel-request at lists.freedesktop.org
Sent: Saturday, February 24, 2018 4:00 AM
To: gstreamer-devel at lists.freedesktop.org
Subject: gstreamer-devel Digest, Vol 85, Issue 56
------------------------------
Message: 4
Date: Sat, 24 Feb 2018 00:55:38 -0700 (MST)
From: "danny.smith" <danny.smith at axis.com>
To: gstreamer-devel at lists.freedesktop.org
Subject: alsasink underrun recovery issue in a synchronized streaming usecase
Hi all,
I am working with a scenario where we stream synchronized audio from one sender to multiple receivers, using the approach Sebastian Dröge outlined at the 2015 GStreamer Conference in Dublin (Synchronised multi-room media playback and distributed live media processing and mixing).
I have tuned my latencies towards a more "low-latency" use case and it works fine.
When I stress my receivers by doing other work on the same platform I get alsasink underruns, which are expected given the small buffer and period times and are not a problem for my use case. However, quite often the pipeline does not recover correctly from these underruns. Audio can disappear completely, sometimes coming back a while later, and it can also become distorted.
We are using GStreamer 1.10.4 and kernel 4.x. Any ideas or hints on what might be causing this? The pipeline itself appears to keep running fine.
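For reference, the receiver side is roughly of this shape (not my exact pipeline; the port, caps, jitterbuffer latency and buffer/period values below are only placeholders, and the net-clock/base-time setup from the talk is done in application code, so it is not shown here):

# placeholder values only: port, caps, jitterbuffer latency and the small
# alsasink buffer-time/latency-time (in microseconds) stand in for my real settings
gst-launch-1.0 -v \
  udpsrc port=5004 caps="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)L16,channels=(int)2,payload=(int)96" ! \
  rtpjitterbuffer latency=40 ! rtpL16depay ! audioconvert ! \
  alsasink buffer-time=20000 latency-time=5000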
Regards,
Danny
------------------------------
Hi Danny,
I am facing a similar problem. I developed a platform for RTP/UDP streaming with gst-launch, which streams audio to a number of clients; on each client I do further DSP processing using LADSPA plugins before sending the processed audio to one or more alsasinks.
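To give an idea of the shape of the receive side (heavily simplified, one output branch shown; the port, caps, device and the LADSPA element name below are only stand-ins for what I actually use):

# ladspa-foo-so-bar is a hypothetical stand-in for whichever LADSPA element is loaded;
# the port, caps and device are placeholders as well
gst-launch-1.0 -v \
  udpsrc port=5004 caps="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)L16,channels=(int)2,payload=(int)96" ! \
  rtpjitterbuffer latency=200 ! rtpL16depay ! audioconvert ! \
  ladspa-foo-so-bar ! audioconvert ! alsasink device=hw:0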
As part of an upgrade to the LADSPA capabilities I made use of some of the "audioXXX" elements, such as audiointerleave, audiomixer, etc. Even with latencies on the RTP jitterbuffer element and the sinks that are NOT low, this resulted in buggy behavior. Sometimes when I launch the send and receive pipelines both go into PLAYING state but no audio is produced. Sometimes the audio is garbled or stutters, either from the start or after transitioning away from playing properly. And sometimes the audio works fine and might seem to keep working properly for hours or days, until by chance the wheels fall off and it becomes problematic again.
I was able to get reliable playback by reverting all the "audioXXX" elements back to their non-audio equivalents, e.g. audiointerleave-->interleave, audiomixer-->adder, etc. It seems these "audio" elements are not yet stable (they are part of the "bad" plugins after all). That's unfortunate, because I really wanted to make use of their synchronization properties.
If you are using any of these elements, try to eliminate them and see if the problem disappears.
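For the mixing case, for example, the reverted version is of this shape (the audiotestsrc inputs and the default alsasink device are just for illustration; audiointerleave-->interleave is the same kind of swap):

# adder in place of audiomixer; the two test sources stand in for my real inputs
gst-launch-1.0 \
  adder name=mix ! audioconvert ! alsasink \
  audiotestsrc freq=440 ! mix. \
  audiotestsrc freq=880 ! mix.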
I have not been successful in getting help with these problems on this list, since all my posts went unanswered. Using the non-audio elements was the only way I was able to remedy the situation.
Charlie