[gst-devel] decodebin2 manipulation of multiqueue

Conrad Cooke Conrad.Cooke at palm.com
Tue Oct 20 19:10:10 CEST 2009


I recently debugged an issue where playback was failing for certain MPEG-4 container files.  It turned out to be hitting a limit placed on multiqueue.
This is a result of how the file was encoded: while it is a legal containerised media file, there is a large difference between the audio and video timestamps.
The purpose of multiqueue, I believe, is to buffer the streams so that audio and video can be kept aligned.

multiqueue has various properties which determine how far apart the audio and video queues are allowed to get.
From gst-inspect:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: null Current: "multiqueue0"
  extra-size-bytes    : Amount of data the queues can grow if one of them is empty (bytes, 0=disable)
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 4294967295 Default: 10485760 Current: 10485760
  extra-size-buffers  : Amount of buffers the queues can grow if one of them is empty (0=disable)
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 4294967295 Default: 5 Current: 5
  extra-size-time     : Amount of time the queues can grow if one of them is empty (in ns, 0=disable)
                        flags: readable, writable
                        Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 3000000000 Current: 3000000000
  max-size-bytes      : Max. amount of data in the queue (bytes, 0=disable)
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 4294967295 Default: 10485760 Current: 10485760
  max-size-buffers    : Max. number of buffers in the queue (0=disable)
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 4294967295 Default: 5 Current: 5
  max-size-time       : Max. amount of data in the queue (in ns, 0=disable)
                        flags: readable, writable
                        Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 2000000000 Current: 2000000000
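
For reference, these limits can also be set programmatically on a standalone multiqueue.  A minimal sketch (the values are arbitrary), mainly to show the units involved - bytes, buffer count, and nanoseconds (GST_SECOND == 1 second):

    #include <gst/gst.h>

    int
    main (int argc, char **argv)
    {
      GstElement *mq;

      gst_init (&argc, &argv);

      mq = gst_element_factory_make ("multiqueue", NULL);

      /* max-size-bytes is a byte count, max-size-buffers a buffer count,
       * max-size-time is in nanoseconds */
      g_object_set (G_OBJECT (mq),
          "max-size-bytes", 10 * 1024 * 1024,
          "max-size-buffers", 0,
          "max-size-time", 10 * GST_SECOND, NULL);

      gst_object_unref (mq);
      return 0;
    }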

It seems the extra-size-* properties are not implemented, but were intended to support 'poorly encoded media'.

The three supported properties are set to their defaults on construction of the multiqueue.  They are also set twice by decodebin2: once when a group is created (gst_decode_group_new) and once when its pads are exposed (gst_decode_group_expose).
The second case sets the values with:

        g_object_set (G_OBJECT (group->multiqueue),
            "max-size-bytes", 2 * 1024 * 1024,
            "max-size-time", 2 * GST_SECOND,
            "max-size-buffers", 5, NULL);

It seems that for some media there is a greater than 2 second difference between audio and video, so I changed max-size-time to 50 seconds.
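
In principle the same value could also be pushed in from the application instead of patching, at least when decodebin2 is created directly rather than through playbin2.  A rough sketch (the callback name and the GstMultiQueue type-name check are my own assumptions); note that decodebin2 re-applies its own limits in gst_decode_group_expose, so without the change below it would likely just be overwritten again:

    /* Rough sketch, assuming the application creates decodebin2 itself. */
    static void
    on_element_added (GstBin * bin, GstElement * element, gpointer user_data)
    {
      /* Assumption: the internal queue element's type name is "GstMultiQueue" */
      if (g_strcmp0 (G_OBJECT_TYPE_NAME (element), "GstMultiQueue") == 0) {
        /* allow up to 50 seconds of divergence, same value as the patch below */
        g_object_set (G_OBJECT (element),
            "max-size-time", 50 * GST_SECOND, NULL);
      }
    }

    /* ... while building the pipeline ... */
    GstElement *decode = gst_element_factory_make ("decodebin2", NULL);
    g_signal_connect (decode, "element-added",
        G_CALLBACK (on_element_added), NULL);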

My questions are:
Why is decodebin2 directly modifying this component at all?
Why is a 2 second maximum imposed - is the only real limitation the maximum memory that can be used?

Here is my change:

component: gst-plugins-base-0.10.21

file: gstdecodebin2.c, line 1983

  if (group->multiqueue) {
    /* update runtime limits. At runtime, we try to keep the amount of buffers
     * in the queues as low as possible (but at least 5 buffers). */
    g_object_set (G_OBJECT (group->multiqueue),
        "max-size-bytes", 2 * 1024 * 1024,
        "max-size-time", 50 * GST_SECOND, "max-size-buffers", 5, NULL);
    /* we can now disconnect any overrun signal, which is used to expose the
     * group. */
    if (group->overrunsig) {
      GST_LOG ("Disconnecting overrun");
      g_signal_handler_disconnect (group->multiqueue, group->overrunsig);
      group->overrunsig = 0;
    }
  }
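
As an aside on the overrun signal being disconnected above: multiqueue emits "overrun" when one of its internal queues hits a max-size-* limit, so a small diagnostic hook (connected from an element-added handler like the sketch earlier; the callback name is made up) can confirm whether that limit is what a given file is running into:

    static void
    on_multiqueue_overrun (GstElement * mq, gpointer user_data)
    {
      /* one of the single queues inside multiqueue hit a max-size-* limit */
      g_print ("overrun on %s\n", GST_OBJECT_NAME (mq));
    }

    /* inside the element-added handler, once the multiqueue is identified: */
    g_signal_connect (element, "overrun",
        G_CALLBACK (on_multiqueue_overrun), NULL);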


Best Wishes,

Conrad Cooke


