Encoding a DMABuf into a video stream with VAAPI

Nicolas Dufresne nicolas at ndufresne.ca
Mon Feb 20 18:48:59 UTC 2023


Hi,

On Monday, 20 February 2023 at 18:00 +0100, Michael via gstreamer-devel wrote:
>  
> Hello,
>  
> I want to encode a framebuffer I got via dmabuf into a video stream. I have a
> dmabuf file descriptor which contains the framebuffer.
> I got the file descriptor from the Intel i915 driver via the ioctl
> VFIO_DEVICE_QUERY_GFX_PLANE.
> The dmabuf has the format BGRx, and the modifier is 0 or, at times, changes to
> I915_FORMAT_MOD_X_TILED.

I'm a bit unclear on what you mean: is it 0 / LINEAR, or I915_FORMAT_MOD_X_TILED?

>  
> Now I want to encode it with zero copy in GStreamer into a video stream (h264,
> h265 etc...).
> I push the single frames via appsrc into the GStreamer pipeline. Since I use
> Intel hardware, I thought it would make sense to use VAAPI.
> The problem is that the sink pads of vaapi only support video/x-raw and
> video/x-raw(memory:VASurface), and I have video/x-raw(memory:DMABuf).
>  
> Is there any way to convert video/x-raw(memory:DMABuf) to
> video/x-raw(memory:VASurface) (zero copy), or to import the DMABuf directly as
> video/x-raw(memory:VASurface)?

The data is in RGB and the encoders only support YUV, so you must apply a
conversion to the buffers anyway. You should use vaapipostproc (or vapostproc)
to do that adaptation with hardware acceleration. Note that you don't need the
dmabuf caps feature if you have linear buffers. The GstDmabufAllocator utility
will wrap your dmabuf and implement CPU mapping in case that turns out to be
the only option.
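
For illustration, a rough, untested sketch of what the appsrc caps could look
like in that case, reusing the names from your snippet and assuming a linear
1024x768 BGRx dmabuf; plain system-memory video/x-raw caps are enough, and the
RGB to YUV conversion then happens in vaapipostproc downstream:

    /* Sketch only: plain system-memory caps on the appsrc instead of the
     * memory:DMABuf caps feature; vaapipostproc downstream handles the
     * BGRx -> NV12 conversion on the GPU. */
    GstCaps *caps = gst_caps_new_simple ("video/x-raw",
        "format", G_TYPE_STRING, "BGRx",
        "width", G_TYPE_INT, 1024,
        "height", G_TYPE_INT, 768,
        "framerate", GST_TYPE_FRACTION, 0, 1,
        NULL);
    g_object_set (vedpy->gdata.source, "caps", caps, NULL);
    gst_caps_unref (caps);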

Nicolas

>  
> Alternatively, is there a plugin which is better suited than vaapi?
>  
> My code to push the frames into GStreamer currently looks like this:
>  
>     GstMemory* mem = gst_dmabuf_allocator_alloc(vedpy->gdata.allocator,
> dmabuf->fd, dmabuf->width * dmabuf->height * (dmabuf->stride/1024));

That's a very strange size formula; it should just be "dmabuf->height * dmabuf->stride".
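
In other words, something like this (a sketch based on your snippet, assuming
dmabuf->stride already includes any per-row padding):

    /* One BGRx frame occupies height * stride bytes, so that is the size to
     * hand to the allocator. */
    GstMemory *mem = gst_dmabuf_allocator_alloc (vedpy->gdata.allocator,
                                                 dmabuf->fd,
                                                 dmabuf->height * dmabuf->stride);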

>     vedpy->gdata.buffer = gst_buffer_new();
>     gst_buffer_append_memory(vedpy->gdata.buffer, mem );
>     
>     gsize offset[GST_VIDEO_MAX_PLANES] = {0, 0, 0, 0};
>     gint stride[GST_VIDEO_MAX_PLANES] = {dmabuf->stride, 0, 0, 0};
>     gst_buffer_add_video_meta_full( vedpy->gdata.buffer,
> GST_VIDEO_FRAME_FLAG_NONE,
>                                     GST_VIDEO_FORMAT_ENCODED,
>                                     dmabuf->width, dmabuf->height, 1, offset,
> stride);
>     GstFlowReturn ret;
>     vfio_encode_dpy *vedpy = container_of(dcl, vfio_encode_dpy, dcl);
>     g_signal_emit_by_name (vedpy->gdata.source, "push-buffer", vedpy->gdata.buffer, &ret);
>     ...
>    
> And my pipeline:
>  
> char launch_stream[]  = "appsrc name=source ! "
>     " video/x-raw(memory:DMABuf),width=1024,height=768,framerate=0/1,format={BGRx,BGRx:0x0100000000000001} ! "
>     " vaapipostproc !"
>     " vaapih265enc ! 
>     ...
>  
> which obviously does not work, because vaapipostproc cannot be linked with the
> filter.

I think you are using the work-in-progress drm-modifier branch. I think the plan
is to ignore the old vaapi plugin and only implement this for the new VA plugin
(just try replacing every vaapi* element with its va* counterpart).
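
Roughly, an untested sketch: only the element names change, your caps string
stays as it is, and the tail after the encoder is just a placeholder so the
example is complete:

    char launch_stream[] = "appsrc name=source ! "
        " video/x-raw(memory:DMABuf),width=1024,height=768,framerate=0/1,format={BGRx,BGRx:0x0100000000000001} ! "
        " vapostproc ! "
        " vah265enc ! "
        " h265parse ! filesink location=out.h265";  /* placeholder tail */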

>  
> Thanks in advance.
>  
> Best Regards,
> Michael


