[Bug 744106] Maximum latency handling broken everywhere
bugzilla at gnome.org
Sat Feb 7 19:27:52 PST 2015
GStreamer | gstreamer (core) | unspecified
--- Comment #14 from Nicolas Dufresne (stormer) <nicolas.dufresne at collabora.co.uk> 2015-02-08 03:27:47 UTC ---
Maybe I should mention something:
... ! decoder ! queue ! sink
Just because the queue thinks it can accumulate N ms of data (and hence
contributes to max latency) does not mean the queue can actually be filled:
buffer pools may have a maximum number of buffers that prevents this. I think
it's all linked together. I wonder what the behaviour of the latency
query/message from the decoder should be (regardless of whether the pool is
its own or downstream's, the decoder decides the allocation and the pool).