Using a queue element to store video frames

Ranti Endeley ranti_endeley at
Wed Nov 10 17:16:38 UTC 2021


I am quite new to GStreamer. I am trying to develop a couple of plugins:

  *   the first examines video frames in a buffer alongside inference metadata (object detection and classification); when it detects an out-of-bounds condition, it emits a custom event downstream, which is acted upon by the second plugin
  *   the second plugin is separated from the first by a queue (the idea being to keep the queue blocked until the custom event notifies it to start capturing the frames in the buffer)

My idea is to use a queue (leaky=downstream) between the two plugins to store the buffers until an event is detected by the first plugin. I would like the queue to fill up with buffers, dropping the oldest ones, until the downstream plugin is prepared to accept them. In theory this should allow the second plugin to capture the video frames emitted before the event occurred, giving me a record of some seconds of video leading up to the trigger (by setting the max-size-time property of the queue to the amount of time I want to retain before an event).
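To illustrate the behaviour I expect from leaky=downstream (the oldest buffers are dropped once the queue is full, so the newest ones are always retained), here is a plain-Python sketch. The LeakyQueue class is hypothetical, not GStreamer API; a bounded deque stands in for the queue element:

```python
from collections import deque

# Hypothetical sketch of leaky=downstream semantics (not GStreamer API):
# once the queue is full, the oldest buffer is discarded to make room,
# so draining always yields the most recent max_buffers frames.
class LeakyQueue:
    def __init__(self, max_buffers):
        # deque with maxlen silently discards the oldest item on overflow
        self.buf = deque(maxlen=max_buffers)

    def push(self, frame):
        self.buf.append(frame)

    def drain(self):
        """Release everything currently held, oldest first."""
        frames = list(self.buf)
        self.buf.clear()
        return frames

q = LeakyQueue(max_buffers=3)
for frame in range(6):   # frames 0..5 arrive while downstream is blocked
    q.push(frame)
print(q.drain())         # only the 3 newest frames survive: [3, 4, 5]
```

This is the behaviour I am hoping to get from the real queue element, with max-size-time playing the role of max_buffers.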

My pipeline is like this:
video/x-raw + inference metadata -> videoconvert ! trigger plugin ! queue max-size-buffers=G_MAXUINT max-size-bytes=G_MAXUINT max-size-time=60000000000 leaky=downstream ! capture plugin ! videoconvert ! videoscale ! videorate ! theoraenc ! oggmux ! filesink (hoping to store up to 60 seconds of video frames)
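For reference, the test version of this pipeline looks roughly as follows. The element names triggerplugin and captureplugin are placeholders for my plugins, and since G_MAXUINT is not available in gst-launch syntax, 0 (meaning "unlimited") is used for the size limits so that only the time limit applies:

```shell
# Sketch only: triggerplugin/captureplugin are hypothetical element names.
# max-size-buffers=0 and max-size-bytes=0 disable those limits, leaving
# max-size-time (60 s in nanoseconds) as the only constraint.
gst-launch-1.0 videotestsrc ! videoconvert ! triggerplugin ! \
  queue max-size-buffers=0 max-size-bytes=0 max-size-time=60000000000 leaky=downstream ! \
  captureplugin ! videoconvert ! videoscale ! videorate ! theoraenc ! oggmux ! \
  filesink location=capture.ogg
```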

Testing the above pipeline using videotestsrc (no metadata yet, just passing through all frames) has brought up some issues that are a little hard for me to understand.

  *   The output video runs for much longer than expected (for example, 2 seconds of runtime results in about 30 seconds of video).
  *   When leaky=downstream is set on the queue, frames are dropped much earlier than I would expect, leading to a very choppy output video (which, incidentally, is still longer than expected).

My questions:

  *   Is what I am trying to do possible with the pipeline I have described above? If not, why and what am I missing?
  *   Why is the length of the output video disproportionate to the run time of the pipeline?

Thanks in advance for your assistance.

More information about the gstreamer-devel mailing list