Pipeline 'eats' buffers
edgar.thier at theimagingsource.com
Wed Nov 22 07:07:12 UTC 2017
I have a custom source element for our cameras. This source lets the backing library allocate n buffers
for the image data and hands these buffers to the pipeline in the create function via
gst_buffer_new_wrapped_full(). Since pipelines can take a while, I use the GDestroyNotify callback to
re-queue a buffer only once downstream has released it.
The problem I am having is that under heavy system load the pipeline seems to 'eat' the buffers and
not return them, leaving me with fewer buffers to fill, or none at all.
Is there a recommended way of dealing with this situation?
Should I just allocate additional buffers in the hopes that the load goes away?
Can I somehow reclaim buffers that take too long?
Is my buffer management simply wrong and are there better ways to prevent this entirely?