Encoding a DMABuf into a video stream with VAAPI

Victor Jaquez vjaquez at igalia.com
Mon Feb 20 17:43:52 UTC 2023


On Mon, 20 Feb 2023 at 18:00, Michael via gstreamer-devel wrote:
>
> Hello,
>
> I want to encode a framebuffer I got via dmabuf into a video stream. I have
> a dmabuf file descriptor which contains the framebuffer.
> 
> I got the file descriptor from the Intel i915 driver via the ioctl
> VFIO_DEVICE_QUERY_GFX_PLANE. 
> 
> The dmabuf has the format BGRx and modifier is 0 respectively changes to
> I915_FORMAT_MOD_X_TILED.

I don't follow: is the modifier 0, or I915_FORMAT_MOD_X_TILED?

Right now, no DRM modifier is negotiated. There are ongoing efforts in this
direction, but they aren't finished yet.

>
> Now I want to encode it zero-copy in GStreamer into a video stream
> (h264, h265, etc.). 
> 
> I push the single frames via appsrc into the GStreamer pipeline. Since I use
> Intel hardware, I thought it made sense to use VAAPI.
> 
> The problem is that the sink pads of vaapi only support video/x-raw and
> video/x-raw(memory:VASurface) and I have video/x-raw(memory:DMABuf).

What version of gstreamer-vaapi are you using?

Right now it supports DMABuf as input. That said, it would be better to use
vah264enc / vah265enc from gst-plugins-bad, which will replace the encoders in
gstreamer-vaapi.

Still, neither consumes RGB frames directly. You'll need to add vapostproc /
vaapipostproc to do the color conversion. vapostproc / vaapipostproc also accept
DMABuf as input, and will negotiate a VA native format downstream with the
encoder.
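Something along these lines should work with the va elements (a sketch, not
tested here: videotestsrc stands in for your appsrc feed, and element
availability depends on your driver and GStreamer version):

```shell
# Sketch: convert BGRx to NV12 in vapostproc, then encode with vah264enc.
gst-launch-1.0 videotestsrc num-buffers=100 ! \
    video/x-raw,format=BGRx,width=1024,height=768 ! \
    vapostproc ! 'video/x-raw(memory:VAMemory),format=NV12' ! \
    vah264enc ! h264parse ! mp4mux ! filesink location=out.mp4
```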

>
> Is there any way to convert video/x-raw(memory:DMABuf) to
> video/x-raw(memory:VASurface) (zero copy) or import the DMABuf directly as
> video/x-raw(memory:VASurface)?
>
> Alternatively, is there a plugin better suited than vaapi?
>
> My code to push the frames into gstreamer currently looks like this:
>
>     vfio_encode_dpy *vedpy = container_of(dcl, vfio_encode_dpy, dcl);
> 
>     GstMemory* mem = gst_dmabuf_allocator_alloc(vedpy->gdata.allocator, dmabuf->fd,
>                                                 dmabuf->width * dmabuf->height * (dmabuf->stride/1024));
>     vedpy->gdata.buffer = gst_buffer_new();
>     gst_buffer_append_memory(vedpy->gdata.buffer, mem);
> 
>     gsize offset[GST_VIDEO_MAX_PLANES] = {0, 0, 0, 0};
>     gint stride[GST_VIDEO_MAX_PLANES] = {dmabuf->stride, 0, 0, 0};
>     gst_buffer_add_video_meta_full(vedpy->gdata.buffer, GST_VIDEO_FRAME_FLAG_NONE,
>                                    GST_VIDEO_FORMAT_ENCODED,
>                                    dmabuf->width, dmabuf->height, 1,
>                                    offset, stride);
> 
>     GstFlowReturn ret;
>     g_signal_emit_by_name(vedpy->gdata.source, "push-buffer",
>                           vedpy->gdata.buffer, &ret);
> 
>     ...
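One thing worth double-checking in the snippet above: GST_VIDEO_FORMAT_ENCODED
describes compressed data, while a BGRx framebuffer is raw, so the video meta
would normally carry the actual raw format. A hedged sketch of that change
(`dmabuf` and `vedpy` are your own structures, not GStreamer types):

```c
/* Sketch: describe the DMABuf as a single-plane raw BGRx frame. */
gsize offset[GST_VIDEO_MAX_PLANES] = {0, 0, 0, 0};
gint  stride[GST_VIDEO_MAX_PLANES] = {dmabuf->stride, 0, 0, 0};

gst_buffer_add_video_meta_full(vedpy->gdata.buffer,
                               GST_VIDEO_FRAME_FLAG_NONE,
                               GST_VIDEO_FORMAT_BGRx, /* raw format, not ENCODED */
                               dmabuf->width, dmabuf->height,
                               1 /* n_planes */, offset, stride);
```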
> 
> And my pipeline:
>
> char launch_stream[] = "appsrc name=source ! "
>     "video/x-raw(memory:DMABuf),width=1024,height=768,framerate=0/1,"
>     "format={BGRx, BGRx:0x0100000000000001} ! "
>     "vaapipostproc ! "
>     "vaapih265enc ! "
>     ...
>
> which obviously does not work, because vaapipostproc cannot be linked with
> the filter.

Why not? They should link.  A debug log would be needed.
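To capture such a log, something like this works (GST_CAPS at level 6 traces
the caps negotiation; GST_DEBUG_FILE redirects the output to a file):

```shell
GST_DEBUG=3,GST_CAPS:6 GST_DEBUG_FILE=gst.log \
    gst-launch-1.0 <your pipeline here>
```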

