[Bug 744106] Maximum latency handling broken everywhere

GStreamer (bugzilla.gnome.org) bugzilla at gnome.org
Sat Feb 7 11:09:49 PST 2015


https://bugzilla.gnome.org/show_bug.cgi?id=744106
  GStreamer | gstreamer (core) | unspecified

--- Comment #7 from Jan Schmidt <thaytan at noraisin.net> 2015-02-07 19:09:45 UTC ---
I think at least the videoencoder/decoder base classes are doing the right
thing with the latency query: if upstream reported a max latency of -1, it is
replaced by the encoder's or decoder's own max latency (if any); otherwise
total_max_latency += local_max_latency.

And there are appropriate checks in gst_video_{de,en}coder_set_latency() that
max >= min.
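
Roughly, the accumulation I mean looks like this - just a sketch of the idea,
not the actual base class code (local_min/local_max stand for the element's
own latency contribution):

  #include <gst/gst.h>

  static void
  update_latency_query (GstQuery * query,
      GstClockTime local_min, GstClockTime local_max)
  {
    gboolean live;
    GstClockTime min, max;

    gst_query_parse_latency (query, &live, &min, &max);

    min += local_min;
    if (max == GST_CLOCK_TIME_NONE)
      max = local_max;          /* upstream gave no info: replace */
    else if (local_max != GST_CLOCK_TIME_NONE)
      max += local_max;         /* both known: accumulate */

    gst_query_set_latency (query, live, min, max);
  }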

The only real problem I see is that queue uses -1 to mean 'infinite' for max
latency, while every other user of the query interprets it as 'no info yet',
since the default value for max latency in the query is -1.
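
Schematically, queue's contribution to the max latency amounts to something
like this (paraphrasing from memory, not the exact queue code;
max_size_time == 0 means 'no time limit'):

  #include <gst/gst.h>

  static GstClockTime
  queue_adjust_max_latency (GstClockTime upstream_max, guint64 max_size_time)
  {
    /* no time limit, or upstream already 'infinite': queue can buffer
     * forever, so report an infinite max latency */
    if (max_size_time == 0 || upstream_max == GST_CLOCK_TIME_NONE)
      return GST_CLOCK_TIME_NONE;
    return upstream_max + max_size_time;
  }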

That is, when handling a latency query, an element has no way to
differentiate between an upstream queue with max-size-time=0 (which can buffer
infinitely and therefore sets max = -1) and an upstream which simply didn't
override the maximum latency value - but why would the latter ever happen? Any
element that handles the latency query should update the max latency if it's
-1, right?

It seems ambiguous to pass a latency query upstream and have max come back as
-1, since we can't know whether we're allowed to override that value.

An alternative is that only source elements are allowed to override
max_latency=-1 in a query - since at that point it can't have been set yet -
and that all downstream elements then need to assume -1 means upstream can do
infinite buffering.
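
Under that convention, a non-source element would combine max latencies along
these lines (sketch only):

  #include <gst/gst.h>

  static GstClockTime
  combine_max_latency (GstClockTime upstream_max, GstClockTime local_max)
  {
    /* -1 now always means 'can buffer forever', so it is never replaced,
     * only propagated; finite values accumulate */
    if (upstream_max == GST_CLOCK_TIME_NONE ||
        local_max == GST_CLOCK_TIME_NONE)
      return GST_CLOCK_TIME_NONE;
    return upstream_max + local_max;
  }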

The sticking point with that is queues which have max-size-bytes/buffers set,
but max-size-time=0 - at the moment that would make queue report infinite max
latency (infinite buffering capability), but obviously that isn't true. queue
can't make a sensible estimate of max latency in that case, and it should
probably warn or something in live pipelines if it ever hits the bytes/buffers
limit before it hits the time limit.
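
Something like this, say - hypothetical code, and
queue_is_filled_by_bytes_or_buffers() is a made-up helper, not actual queue
API:

  /* in a live pipeline, hitting the bytes/buffers limit while
   * max-size-time == 0 means the 'infinite' max latency we reported in
   * the latency query was wrong */
  if (live && queue->max_size_time == 0 &&
      queue_is_filled_by_bytes_or_buffers (queue)) {
    GST_WARNING_OBJECT (queue, "bytes/buffers limit reached although max "
        "latency was reported as infinite");
  }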

The end result is:

* A pipeline can't necessarily estimate its max latency - in which case it
should pick min-latency and hope for the best in terms of latency
compatibility between the different chains (see the sketch after this list)
* There's an ambiguity between two different interpretations of max-latency=-1:
queue uses it to mean 'infinite buffering, therefore infinite max-latency',
and everywhere else it means 'no one set this to anything sensible, so you
should assume max-latency = min-latency'
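
For the first point, I imagine the fallback looking something like this
(a sketch using the existing latency query/event machinery, not code from
GstPipeline):

  #include <gst/gst.h>

  static void
  configure_latency (GstElement * pipeline)
  {
    GstQuery *query = gst_query_new_latency ();
    gboolean live;
    GstClockTime min, max;

    if (gst_element_query (pipeline, query)) {
      gst_query_parse_latency (query, &live, &min, &max);
      if (live) {
        if (max != GST_CLOCK_TIME_NONE && max < min)
          GST_WARNING ("impossible latency configuration: max < min");
        /* pick min-latency and hope for the best */
        gst_element_send_event (pipeline, gst_event_new_latency (min));
      }
    }
    gst_query_unref (query);
  }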
