[Bug 744106] New: Maximum latency handling broken everywhere
bugzilla at gnome.org
Fri Feb 6 11:02:56 PST 2015
GStreamer | gstreamer (core) | unspecified
Summary: Maximum latency handling broken everywhere
Component: gstreamer (core)
AssignedTo: gstreamer-bugs at lists.freedesktop.org
ReportedBy: slomo at coaxion.net
QAContact: gstreamer-bugs at lists.freedesktop.org
CC: thaytan at noraisin.net, wim.taymans at gmail.com
GNOME version: ---
I think we have a problem with the maximum latency. Let's first summarize what
it means: the maximum latency is the maximum amount of time that data is
allowed to be blocked without overflowing fixed-size buffers in upstream
elements and losing data.
Now what we see in various elements is:
1) minimum and maximum latency are set to the same value, assuming that maximum
latency means the maximum latency this element produces
2) minimum latency is set correctly, maximum to GST_CLOCK_TIME_NONE, as the
element in question has no fixed-size buffer (like most filters, decoders, etc.)
3) the element's minimum latency is added to the upstream maximum latency if it
is not NONE
4) the element's maximum latency is set to the minimum latency if the upstream
maximum latency is NONE
I think what would be correct is that the minimum latency is added to the
maximum latency if the upstream maximum latency is not NONE. And if it is NONE,
the minimum latency should not influence the maximum latency.
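The proposed combination rule can be sketched in plain C. This is not the real GStreamer query API: LATENCY_NONE stands in for GST_CLOCK_TIME_NONE, and combine_latency is a hypothetical helper used purely for illustration.

```c
#include <stdint.h>

#define LATENCY_NONE UINT64_MAX /* stand-in for GST_CLOCK_TIME_NONE */

/* Fold an element's own minimum latency into the upstream query result:
 * the minimum is always added, the maximum only if upstream reported a
 * bounded (non-NONE) value. An unbounded (NONE) maximum stays NONE. */
static void
combine_latency (uint64_t own_min, uint64_t *min, uint64_t *max)
{
  *min += own_min;
  if (*max != LATENCY_NONE)
    *max += own_min;
}
```

With upstream min/max of 20/50 and an own minimum of 10 this yields 30/60; with an upstream maximum of NONE, the maximum stays NONE instead of collapsing to the minimum as in pattern 4) above.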
Also, every element that has a fixed-size buffer should add the maximum latency
defined by that buffer to the upstream maximum latency if that is not NONE, or
report its own value if the upstream one is NONE. However, this also means that
*all* queues that have a maximum limit set would have to do this.
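For such a fixed-size-buffer element the contribution could look like the following sketch (again with hypothetical names, not the real API; LATENCY_NONE stands in for GST_CLOCK_TIME_NONE):

```c
#include <stdint.h>

#define LATENCY_NONE UINT64_MAX /* stand-in for GST_CLOCK_TIME_NONE */

/* An element with a fixed-size buffer can hold data for at most
 * buffer_time before overflowing; that bound is its maximum-latency
 * contribution: it replaces an unbounded upstream maximum, otherwise
 * it is added on top. */
static void
add_buffer_latency (uint64_t buffer_time, uint64_t *max)
{
  if (*max == LATENCY_NONE)
    *max = buffer_time;
  else
    *max += buffer_time;
}
```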
So the problems I see right now are that elements get this wrong in various
combinations, queues are not setting the maximum latency (and often can't),
and base classes provide min/max latency setters but do the wrong thing
internally (apparently most often a combination of 3) and 4)). And to make
things perfect, the maximum latency is currently only used to create a
GST_ELEMENT_ERROR() in GstBin if it ever becomes smaller than the minimum
latency. Due to all the confusion about what maximum latency is, this causes
unnecessary error messages now.
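The only consumer of the value today is essentially this sanity check, sketched here in plain C (latency_is_valid is a hypothetical name; LATENCY_NONE stands in for GST_CLOCK_TIME_NONE):

```c
#include <stdint.h>

#define LATENCY_NONE UINT64_MAX /* stand-in for GST_CLOCK_TIME_NONE */

/* A latency query result is only configurable if the maximum is
 * unbounded or at least as large as the minimum; the bin posts an
 * element error when this does not hold. */
static int
latency_is_valid (uint64_t min, uint64_t max)
{
  return max == LATENCY_NONE || max >= min;
}
```

With elements mis-reporting the maximum as in patterns 1) to 4), this check trips even though the pipeline would work fine.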
I see two ways forward here:
1) We fix everything to work as described above (assuming this is correct), at
least all code under our control. This would also mean that base class
behaviour changes. All of this is arguably an ABI change, but as everything
is broken right now anyway it can't become worse ;)
2) We deprecate the maximum latency and don't post an error message anymore.
It's not used in useful ways currently, and there's too much code using it
incorrectly in various ways.
Ideas? Did I misunderstand anything? :) If we decide on a solution here I'm ok
with implementing it.