[PATCH RFC 3/6] drm: Plane YCbCr to RGB conversion related properties
Ville Syrjälä
ville.syrjala at linux.intel.com
Fri Apr 21 15:21:48 UTC 2017
On Fri, Apr 21, 2017 at 03:53:42PM +0100, Brian Starkey wrote:
> Hi,
>
> Thanks for picking this up Jyri.
>
> On Fri, Apr 21, 2017 at 04:52:03PM +0300, Ville Syrjälä wrote:
> >On Fri, Apr 21, 2017 at 04:39:16PM +0300, Jyri Sarha wrote:
> >> On 04/21/17 14:17, Ville Syrjälä wrote:
> >> >> +static char *ycbcr_to_rgb_mode_name[] = {
> >> >> + [DRM_PLANE_YCBCR_BT601_LIM_TO_RGB_BT601_FULL] =
> >> >> + "YCbCr BT.601 limited range TO RGB BT.601 full range",
> >> >> + [DRM_PLANE_YCBCR_BT601_FULL_TO_RGB_BT601_FULL] =
> >> >> + "YCbCr BT.601 full range TO RGB BT.601 full range",
> >> >> + [DRM_PLANE_YCBCR_BT709_LIM_TO_RGB_BT709_FULL] =
> >> >> + "YCbCr BT.709 limited range TO RGB BT.709 full range",
> >> >> + [DRM_PLANE_YCBCR_BT2020_LIM_TO_RGB_BT2020_FULL] =
> >> >> + "YCbCr BT.2020 limited range TO RGB BT.2020 full range",
> >> >> + [DRM_PLANE_YCBCR_BT601_LIM_TO_RGB_BT709_FULL] =
> >> >> + "YCbCr BT.601 limited range TO RGB BT.709 full range",
> >> > We probably don't want to conflate the YCbCr->RGB part with the colorspace
> >> > conversion because the YCbCr->RGB part should be performed on gamma encoded
> >> > data and the colorspace conversion on linear data. So we need a degamma
> >> > stage in between. At least that seemed to be the general consensus after
> >> > the last round of mails on this topic.
> >> >
>
> I thought the conclusion of the last round was that on some hardware
> this would be conflated as a conscious choice. If the HW doesn't have
> the required degamma->csc->gamma stages, and the implementor/user
> doesn't care about being a little incorrect, then it can all be
> described in this single property.
I was proposing the single prop approach initially, but now I think it
might just lead to more confusion, so a dedicated property for each stage
seems like the clearer design. We do potentially lose a bit of
discoverability when not all combinations are supported, but we have
that problem in many other places as well, so I don't think it's a big deal.
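
Roughly what I have in mind for the per-stage approach is one enum
property per plane for the YCbCr->RGB stage alone. Something like the
totally untested sketch below; the property name, the FOO_* enum values
and foo_plane_create_ycbcr_prop() are just placeholders for
illustration, not a final uapi proposal:

#include <linux/kernel.h>
#include <drm/drm_plane.h>
#include <drm/drm_property.h>

/* placeholder enum covering only the YCbCr->RGB stage, no colorspace csc */
enum foo_ycbcr_encoding {
	FOO_YCBCR_BT601_LIMITED,
	FOO_YCBCR_BT601_FULL,
	FOO_YCBCR_BT709_LIMITED,
	FOO_YCBCR_BT2020_LIMITED,
};

static const struct drm_prop_enum_list foo_ycbcr_encoding_list[] = {
	{ FOO_YCBCR_BT601_LIMITED, "BT.601 limited range" },
	{ FOO_YCBCR_BT601_FULL, "BT.601 full range" },
	{ FOO_YCBCR_BT709_LIMITED, "BT.709 limited range" },
	{ FOO_YCBCR_BT2020_LIMITED, "BT.2020 limited range" },
};

static int foo_plane_create_ycbcr_prop(struct drm_plane *plane)
{
	struct drm_property *prop;

	/* one property for this stage only; degamma/ctm/gamma get their own */
	prop = drm_property_create_enum(plane->dev, 0, "YCBCR_ENCODING",
					foo_ycbcr_encoding_list,
					ARRAY_SIZE(foo_ycbcr_encoding_list));
	if (!prop)
		return -ENOMEM;

	drm_object_attach_property(&plane->base, prop,
				   FOO_YCBCR_BT601_LIMITED);

	return 0;
}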
>
> If there is more capable hardware with the additional stages, then
> it should expose additional properties (in pipeline order):
>
> YCBCR_TO_RGB_*:
> Does YCbCr->RGB conversion on non-linear YCbCr data;
> only the enum values which don't have a CSC are exposed.
Not sure what you mean by that last part.
>
> DEGAMMA:
> Does the non-linear to linear conversion on the RGB data output from
> the YCBCR_TO_RGB stage.
>
> RGB_TO_RGB_*:
> Similar set of properties to YCBCR_TO_RGB_*, allowing RGB->RGB CSC
> on the linear data.
We may want to call this just CTM to match the corresponding crtc prop.
>
> GAMMA:
> Convert the CSC'd RGB data back into non-linear for blending (if
> blending is to be done with non-linear data).
>
> Drivers can expose as many or as few of the above properties as their
> hardware supports.
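
For reference, the RGB_TO_RGB stage above is more or less what the
"CTM" property already does at the crtc level, so a plane level version
could presumably just reuse the same blob layout (roughly this, going
from memory, from include/uapi/drm/drm_mode.h):

struct drm_color_ctm {
	/* 3x3 conversion matrix in S31.32 sign-magnitude fixed point */
	__u64 matrix[9];
};

and the DEGAMMA/GAMMA stages around it could similarly take the same
kind of drm_color_lut blobs as the crtc DEGAMMA_LUT/GAMMA_LUT props.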
>
> >>
> >> I do not really have the expertise to argue with that. I merely copied
> >> the idea from the mail thread I referred to in the cover letter.
> >> However, there are several display HWs out there that do not have all
> >> the knobs needed to do the color-space conversion in exactly the ideal
> >> order, omap DSS being one of them.
> >
> >Yeah. Intel hardware is in the same boat for the time being. On current
> >hw I think we can only really expose the YCbCr->RGB and degamma stages.
> >
> >On some limited set of platforms we could expose a blob as well, and I
> >suppose it would then be possible to use it for color space conversion
> >if you ignore gamma and/or only deal with linear RGB data. But it's such
> >a limited subset of hardware for us that I don't think I'm interested
> >in exposing it.
> >
>
> I'm not sure the custom blob is worth having either. It can easily be
> added later if we decide we want it after all.
>
> Thanks,
> -Brian
>
> >In the future we should be getting a more fully fledged pipeline.
> >
> >>
> >> > After staring at the v4l docs on this stuff I kinda like their
> >> > "encoding" terminology to describe the YCbCr->RGB part, so I'm now a
> >> > little partial to calling the prop something like YCBCR_ENCODING. OTOH
> >>
> >> I guess this property should be called YCBCR_DECODING.
> >
> >Only if you think of it as a verb.
> >
> >>
> >> > if we want to expose the raw matrix as a blob then maybe calling it a
> >> > CSC might be better. Not sure there's much point in exposing it though.
> >>
> >> In my first version it was called just CSC, but then I wanted to make it
> >> explicit what this CSC was used for, to avoid mixing up the YCbCr decoding
> >> matrix with the YCbCr encoding matrix. At least on OMAP DSS we have pieces
> >> of HW that can do only one or the other, e.g. the offset calculations
> >> are supported only in one direction.
> >
> >Are you planning to do RGB->YCbCr conversion in the plane as well? I
> >think we'll be only doing that at crtc/connector level.
> >
> >>
> >> > I don't think most people are in the habit of cooking up new ways to
> >> > encode their pixel data.
> >> >
> >>
> >> On the embedded side I can imagine there could be some custom appliances
> >> where one may want to do some custom thing with the CSC, and not needing
> >> a custom kernel for that could make life easier... but then again I am
> >> not really an expert in this area.
> >
> >I would assume most custom things you'd do in the crtc (e.g. color
> >correction and whatnot). But it could be that I just lack imagination.
> >
> >--
> >Ville Syrjälä
> >Intel OTC
--
Ville Syrjälä
Intel OTC