[PATCH] drm: xlnx: zynqmp_dp: Support DRM_FORMAT_XRGB8888

Laurent Pinchart laurent.pinchart at ideasonboard.com
Mon Jun 30 09:33:35 UTC 2025


On Mon, Jun 30, 2025 at 11:29:08AM +0200, Maxime Ripard wrote:
> On Mon, Jun 30, 2025 at 12:11:56PM +0300, Laurent Pinchart wrote:
> > On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote:
> > > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote:
> > > > On 27-06-2025 20:19, Laurent Pinchart wrote:
> > > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote:
> > > > > > XRGB8888 is the default mode that Xorg will want to use. Add support
> > > > > > for this to the Zynqmp DisplayPort driver, so that applications can use
> > > > > > 32-bit framebuffers. Without this, the X server fails to start
> > > > > > unless one provides an xorg.conf that sets DefaultDepth to 16.
> > > > > > 
> > > > > > Signed-off-by: Mike Looijmans <mike.looijmans at topic.nl>
> > > > > > ---
> > > > > > 
> > > > > >   drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++
> > > > > >   1 file changed, 5 insertions(+)
> > > > > > 
> > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c
> > > > > > index 80d1e499a18d..501428437000 100644
> > > > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c
> > > > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c
> > > > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = {
> > > > > >   		.buf_fmt	= ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888,
> > > > > >   		.swap		= true,
> > > > > >   		.sf		= scaling_factors_888,
> > > > > > +	}, {
> > > > > > +		.drm_fmt	= DRM_FORMAT_XRGB8888,
> > > > > > +		.buf_fmt	= ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888,
> > > > > > +		.swap		= true,
> > > > > > +		.sf		= scaling_factors_888,
> > > > > 
> > > > > I'm afraid that's not enough. There's a crucial difference between
> > > > > DRM_FORMAT_ARGB8888 (already supported by this driver) and
> > > > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored.
> > > > > The graphics layer is blended on top of the video layer, and the blender
> > > > > uses both a global alpha parameter and the alpha channel of the graphics
> > > > > layer for 32-bit RGB formats. This will lead to incorrect operation when
> > > > > the 'X' component is not set to full opacity.
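
(As an illustration of the distinction above, a minimal sketch of how a
driver could mask out per-pixel alpha for an 'X' format. drm_format_info()
and its has_alpha flag are the real DRM core API; the BLEND_LAYER_CONTROL
register and EN_PIXEL_ALPHA bit are hypothetical stand-ins, since the real
ZynqMP blender registers aren't documented in this thread.)

static void disp_blend_set_pixel_alpha(struct zynqmp_disp *disp,
				       u32 drm_fmt)
{
	const struct drm_format_info *info = drm_format_info(drm_fmt);
	u32 val;

	val = readl(disp->blend.base + BLEND_LAYER_CONTROL);
	if (info->has_alpha)
		val |= EN_PIXEL_ALPHA;	/* ARGB8888: honour per-pixel alpha */
	else
		val &= ~EN_PIXEL_ALPHA;	/* XRGB8888: treat the layer as opaque */
	writel(val, disp->blend.base + BLEND_LAYER_CONTROL);
}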
> > > > 
> > > > I spent a few hours digging through the source code and through what I
> > > > could find in the TRM and register maps, but there isn't enough
> > > > information there to explain how the blender works. The obvious "XRGB"
> > > > implementation would be to just disable the blender.
> > > > 
> > > > What I got from experimenting so far is that the alpha component is ignored
> > > > anyway while the video path isn't active. So as long as one isn't using the
> > > > video blending path, the ARGB and XRGB modes are identical.
> > > > 
> > > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB
> > > > modes.
> > > > 
> > > > (For our application this patch is sufficient, as it solves issues like
> > > > X11 not starting up, OpenGL not working, and horrendously slow scaling
> > > > performance.)
> > > 
> > > Given that we consider XRGB8888 mandatory,
> > 
> > How about platforms that can't support it at all?
> 
> We emulate it.

Does that imply a full memcpy of the frame buffer in the kernel driver,
or is it emulated in userspace?
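
(If the emulation is done in the kernel, it generally means a CPU pass over
the full frame buffer on each update. DRM's format-conversion helpers live
in drivers/gpu/drm/drm_format_helper.c, e.g. drm_fb_xrgb8888_to_rgb565().
Below is a stripped-down sketch of what converting one line involves, just
to show the cost; it is not the helper's actual code.)

static void xrgb8888_to_rgb565_line(u16 *dst, const u32 *src,
				    unsigned int pixels)
{
	unsigned int x;

	for (x = 0; x < pixels; x++) {
		u32 px = src[x];

		/* Drop the 'X' byte, truncate 8:8:8 to 5:6:5. */
		dst[x] = ((px & 0x00f80000) >> 8) |
			 ((px & 0x0000fc00) >> 5) |
			 ((px & 0x000000f8) >> 3);
	}
}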

> > > this patch is a good thing to have anyway, even if it's suboptimal or
> > > broken in some scenario that we can always fix later.
> > 
> > It needs to at least be updated to disallow XRGB8888 usage when the
> > video plane is enabled, or when global alpha is set to a non-opaque
> > value.
> 
> Yeah, that's reasonable.
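
(A rough sketch of the check being discussed, untested against the real
driver: reject XRGB8888 on the graphics plane whenever blending could make
the 'X' byte matter. drm_atomic_get_new_plane_state() is the real helper;
plane_to_disp(), disp->video_enabled, disp->alpha and ZYNQMP_DISP_MAX_ALPHA
are made-up stand-ins for whatever state the driver actually tracks.)

static int disp_gfx_plane_atomic_check(struct drm_plane *plane,
				       struct drm_atomic_state *state)
{
	struct drm_plane_state *new_state =
		drm_atomic_get_new_plane_state(state, plane);
	struct zynqmp_disp *disp = plane_to_disp(plane);

	if (!new_state->fb)
		return 0;

	if (new_state->fb->format->format == DRM_FORMAT_XRGB8888 &&
	    (disp->video_enabled || disp->alpha != ZYNQMP_DISP_MAX_ALPHA))
		return -EINVAL;	/* the 'X' byte would leak into blending */

	return 0;
}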

-- 
Regards,

Laurent Pinchart

