Generic DMABUF to vaapi h264 encoding

Volker Vogelhuber v.vogelhuber at digitalendoscopy.de
Mon Jul 27 10:56:17 PDT 2015


>> Basically what I want to do is to render into a texture in another
>> application. This texture is exported using a DMABUF file descriptor
>> and passed to my video recording process. There I want to encode it
>> using the VAAPI h264 encoder and after buffer has been processed I
>> can reuse the buffer within the rendering application.
>> Therefore it would be also nice if gstreamer can tell me when it is
>> not using the pushed buffer anymore. Is there any callback one could
>> add to maybe the vaapiencode_h264 or vaapipostproc element, that
>> notifies me, when the buffer pushed via appsrc isn't used anymore by
>> the pipeline?
> You need to track the memory, not the Buffer itself. If you only care
> about rendering it, then just make you dmabuf read only. Otherwise, you
> could use gst_mini_object_weak_ref() to get notified when the memory
> has been discarded (GstMemory derives from GstMiniObject).
>
I tried adding a gst_mini_object_weak_ref, but it seems I cannot
prevent the memory from being released from within the callback. So
it's too late for me if I don't want to reallocate the whole GstBuffer
and GstMemory object.
How expensive is it to do

GstMemory* mem = gst_dmabuf_allocator_alloc(
      m_dmaAllocator, fd, h * stride );
GstBuffer* buffer = gst_buffer_new();
gst_buffer_append_memory( buffer, mem );
gsize offset[GST_VIDEO_MAX_PLANES] = {0, 0, 0, 0};
gint strides[GST_VIDEO_MAX_PLANES] = {w * 4, 0, 0, 0};
gst_buffer_add_video_meta_full( buffer, GST_VIDEO_FRAME_FLAG_NONE,
      GST_VIDEO_FORMAT_BGRx, w, h, 1, offset, strides );

for every frame? Is the overhead negligible, so that I should simply do it
that way?

If I don't unref the GstBuffer in the need-data callback, everything is
OK, but then I don't know when I'm allowed to reuse the buffer for GPU
rendering. Or is there any other callback or signal I can use to learn
when vaapipostproc is done converting the input buffer?

For now I guess that vaapipostproc converts my BGRx buffer into an NV12
buffer somewhere, which is then passed on to the h264 encoder.
Unfortunately I didn't find the source code responsible for that
conversion, so maybe that's why I didn't find a proper way to be
notified when the buffer is available again for the other process
(the one that renders into the buffers via OpenGL).

As GStreamer is set up very asynchronously, it is a bit tricky to
understand what's going on under the hood.

More information about the gstreamer-devel mailing list