[PATCH RFC 3/6] drm: Plane YCbCr to RGB conversion related properties

Ville Syrjälä ville.syrjala at linux.intel.com
Fri Apr 21 13:52:03 UTC 2017


On Fri, Apr 21, 2017 at 04:39:16PM +0300, Jyri Sarha wrote:
> On 04/21/17 14:17, Ville Syrjälä wrote:
> >> +static char *ycbcr_to_rgb_mode_name[] = {
> >> +	[DRM_PLANE_YCBCR_BT601_LIM_TO_RGB_BT601_FULL] =
> >> +		"YCbCr BT.601 limited range TO RGB BT.601 full range",
> >> +	[DRM_PLANE_YCBCR_BT601_FULL_TO_RGB_BT601_FULL] =
> >> +		"YCbCr BT.601 full range TO RGB BT.601 full range",
> >> +	[DRM_PLANE_YCBCR_BT709_LIM_TO_RGB_BT709_FULL] =
> >> +		"YCbCr BT.709 limited range TO RGB BT.709 full range",
> >> +	[DRM_PLANE_YCBCR_BT2020_LIM_TO_RGB_BT2020_FULL] =
> >> +		"YCbCr BT.2020 limited range TO RGB BT.2020 full range",
> >> +	[DRM_PLANE_YCBCR_BT601_LIM_TO_RGB_BT709_FULL] =
> >> +		"YCbCr BT.601 limited range TO RGB BT.709 full range",
> > We probably don't want to conflate the YCbCr->RGB part with the colorspace
> > conversion because the YCbCr->RGB part should be performed on gamma encoded
> > data and the colorspace conversion on linear data. So we need a degamma
> > stage in between. At least that seemed to be the general consensus after
> > the last round of mails on this topic.
> > 
> 
> I do not really have the expertise to argue with that. I merely copied
> the idea from the mail thread I referred to in the cover letter.
> However, there are several display HWs out there that do not have all
> bolts and knobs to make the color-space conversion in exactly the ideal
> order, omap DSS being one of them.

Yeah. Intel hardware is in the same boat for the time being. On current
hw I think we can only really expose the YCbCr->RGB and degamma stages.

On some limited set of platforms we could expose a blob as well, and I
suppose it would then be possible to use it for color space conversion
if you ignore gamma and/or only deal with linear RGB data. But it's such
a limited subset of hardware for us that I don't think I'm interested
in exposing it.
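
Something like this is roughly what I mean by exposing a blob (just a
sketch, the "PLANE_CSC" name is made up and simply reuses the CRTC CTM
blob layout):

#include <drm/drmP.h>
#include <drm/drm_crtc.h>

/*
 * Hypothetical per-plane CSC blob prop. The blob contents would reuse
 * struct drm_color_ctm (3x3 coefficients in S31.32 fixed point), same
 * as the CRTC CTM property.
 */
static int example_attach_plane_csc_prop(struct drm_plane *plane)
{
	struct drm_property *prop;

	prop = drm_property_create(plane->dev, DRM_MODE_PROP_BLOB,
				   "PLANE_CSC", 0);
	if (!prop)
		return -ENOMEM;

	/* userspace would later point this at a drm_color_ctm blob */
	drm_object_attach_property(&plane->base, prop, 0);

	return 0;
}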

In the future we should be getting a more fully fledged pipeline.
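
To sketch the ordering I have in mind (illustrative only, none of these
stage names are real uapi):

#include <uapi/drm/drm_mode.h>	/* struct drm_color_ctm, drm_color_lut */

/*
 * Hypothetical per-plane color pipeline; each stage feeds the next.
 */
struct example_plane_color_pipeline {
	/* 1) YCbCr->RGB decode, operates on gamma encoded data */
	struct drm_color_ctm ycbcr_to_rgb;
	/* 2) plane degamma LUT: gamma encoded RGB -> linear RGB */
	struct drm_color_lut *degamma_lut;
	unsigned int degamma_lut_size;
	/* 3) colorspace conversion (e.g. BT.601->BT.709), on linear RGB */
	struct drm_color_ctm csc;
};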

> 
> > After staring at the v4l docs on this stuff I kinda like their
> > "encoding" terminology to describe the YCbCr->RGB part, so I'm now a
> > little partial to calling the prop something like YCBCR_ENCODING. OTOH
> 
> I guess this property should be called YCBCR_DECODING.

Only if you think of it as a verb.
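
For illustration, such a prop could be created like any other enum prop
(the names and values below are made up, not taken from the patch):

#include <drm/drmP.h>
#include <drm/drm_crtc.h>

/* Hypothetical enum, just to show the shape of the thing. */
enum example_ycbcr_encoding {
	EXAMPLE_YCBCR_BT601_LIMITED,
	EXAMPLE_YCBCR_BT601_FULL,
	EXAMPLE_YCBCR_BT709_LIMITED,
	EXAMPLE_YCBCR_BT2020_LIMITED,
};

static const struct drm_prop_enum_list example_ycbcr_encoding_list[] = {
	{ EXAMPLE_YCBCR_BT601_LIMITED, "BT.601 limited range" },
	{ EXAMPLE_YCBCR_BT601_FULL, "BT.601 full range" },
	{ EXAMPLE_YCBCR_BT709_LIMITED, "BT.709 limited range" },
	{ EXAMPLE_YCBCR_BT2020_LIMITED, "BT.2020 limited range" },
};

static int example_attach_ycbcr_encoding_prop(struct drm_plane *plane)
{
	struct drm_property *prop;

	prop = drm_property_create_enum(plane->dev, 0, "YCBCR_ENCODING",
					example_ycbcr_encoding_list,
					ARRAY_SIZE(example_ycbcr_encoding_list));
	if (!prop)
		return -ENOMEM;

	drm_object_attach_property(&plane->base, prop,
				   EXAMPLE_YCBCR_BT601_LIMITED);

	return 0;
}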

> 
> > if we want to expose the raw matrix as a blob then maybe calling it a
> > CSC might be better. Not sure there's much point in exposing it though.
> 
> In my first version it was called just CSC, but then I wanted to make it
> explicit what this CSC was used for to avoid mixing the YCbCr decoding
> matrix with the YCbCr encoding matrix. At least on OMAP DSS we have pieces
> of HW that can do only one or the other, e.g. the offset calculations
> are supported in only one direction.

Are you planning to do RGB->YCbCr conversion in the plane as well? I
think we'll be only doing that at crtc/connector level.

> 
> > I don't think most people are in the habit of cooking up new ways to
> > encode their pixel data.
> > 
> 
> On the embedded side I can imagine there could be some custom appliances
> where one may want to do some custom thing with the CSC, and not needing
> a custom kernel for that could make life easier... but then again I am
> not really an expert in this area.

I would assume most custom things you'd do in the crtc (e.g. color
correction and whatnot). But it could be that I just lack imagination.

-- 
Ville Syrjälä
Intel OTC

