[Bug 751179] New: rtpjitterbuffer: jitter buffer does not send do-lost events downstream when packets don't arrive in time

GStreamer (GNOME Bugzilla) bugzilla at gnome.org
Thu Jun 18 13:34:40 PDT 2015


https://bugzilla.gnome.org/show_bug.cgi?id=751179

            Bug ID: 751179
           Summary: rtpjitterbuffer: jitter buffer does not send do-lost
                    events downstream when packets don't arrive in time
    Classification: Platform
           Product: GStreamer
           Version: git master
                OS: Linux
            Status: NEW
          Severity: normal
          Priority: Normal
         Component: gst-plugins-good
          Assignee: gstreamer-bugs at lists.freedesktop.org
          Reporter: dv at pseudoterminal.org
        QA Contact: gstreamer-bugs at lists.freedesktop.org
     GNOME version: ---

When the jitter buffer is used with mode "synced" or "none", it does not behave
as expected when RTP packets arrive too late.

Here is how to reproduce the issue:

Sender:
gst-launch-1.0 audiotestsrc wave=saw ! opusenc audio=true ! rtpopuspay !
udpsink host=127.0.0.1 port=54001 -v
Setting pipeline to PAUSED ...

Receiver:
gst-launch-1.0 udpsrc port=54001 caps="application/x-rtp, media=(string)audio,
clock-rate=(int)48000, encoding-name=(string)OPUS" ! rtpjitterbuffer
do-lost=true mode=synced latency=200 ! rtpopusdepay ! opusdec plc=true !
autoaudiosink

Once both are running, start the network simulator in another console:
sudo tc qdisc add dev lo root netem delay 1000ms

This simulates a delay of one second on localhost.
To remove it again, use:
sudo tc qdisc del dev lo root netem


Expected behavior: when the simulated delay is active, I'd hear PLC kick in,
and once the delay is removed, regular audio continues.
Actual behavior: after the simulated delay is added, no audio can be heard.
There is sudden silence. No PLC kicks in. Once the simulated delay is removed,
PLC is active for quite a while, *then* regular audio follows.

The problem is that the rtpjitterbuffer isn't pushing anything downstream once
the delay is enabled. Only when a packet arrives in time again (that is, after
the delay is removed) does it suddenly push many do-lost events downstream all
at once. It should, however, send a do-lost event downstream as soon as a packet
fails to arrive in time, right? Apparently, some internal timeout timer isn't
firing correctly.
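To make the expectation concrete, here is a toy Python model of the timeout
behavior I would expect (purely illustrative; the class and method names are my
own and this is not the actual rtpjitterbuffer code): each expected seqnum gets
a deadline of its estimated arrival time plus the configured latency, and a
timer emits a lost event as soon as that deadline passes, instead of deferring
the lost events until the next packet actually arrives.

```python
# Toy model of the timeout behavior I would expect from rtpjitterbuffer.
# Illustrative only; names and structure are mine, not GStreamer's.

class ToyJitterBuffer:
    def __init__(self, latency):
        self.latency = latency    # max time (ms) to wait for a packet
        self.deadlines = {}       # seqnum -> time we expect it by
        self.lost_events = []     # "do-lost" events pushed downstream

    def expect(self, seqnum, expected_arrival):
        # Schedule a timeout: if the packet is not here by
        # expected_arrival + latency, it should be declared lost.
        self.deadlines[seqnum] = expected_arrival + self.latency

    def arrive(self, seqnum):
        # Packet made it in time; cancel its timeout.
        self.deadlines.pop(seqnum, None)

    def tick(self, now):
        # Runs on a timer: emit a lost event for every overdue packet
        # *now*, rather than waiting for the next packet to arrive.
        for seq, deadline in sorted(self.deadlines.items()):
            if now >= deadline:
                self.lost_events.append(seq)
                del self.deadlines[seq]


jb = ToyJitterBuffer(latency=200)
jb.expect(1, expected_arrival=0)
jb.expect(2, expected_arrival=20)
jb.arrive(1)          # seqnum 1 arrives in time
jb.tick(now=100)      # nothing overdue yet
jb.tick(now=300)      # seqnum 2 is now 80 ms past its deadline
print(jb.lost_events) # -> [2]
```

In this model, the silence I observed would correspond to `tick()` never firing
while packets are delayed, so the lost events only pile up and get flushed once
a packet finally arrives.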


I looked through the bug reports, and the ones I think are related are
https://bugzilla.gnome.org/show_bug.cgi?id=738363 and
https://bugzilla.gnome.org/show_bug.cgi?id=720655 .

Also, I am not even 100% sure whether my expectation is correct. Can anybody
confirm that the behavior I described is indeed how it should work?
