[RFC 0/5] Introduce drm sharpening property

Garg, Nemesa nemesa.garg at intel.com
Wed Jun 19 11:23:42 UTC 2024



> -----Original Message-----
> From: Pekka Paalanen <pekka.paalanen at haloniitty.fi>
> Sent: Thursday, March 28, 2024 3:35 PM
> To: Garg, Nemesa <nemesa.garg at intel.com>
> Cc: Simon Ser <contact at emersion.fr>; intel-gfx at lists.freedesktop.org; dri-
> devel at lists.freedesktop.org; G M, Adarsh <adarsh.g.m at intel.com>
> Subject: Re: [RFC 0/5] Introduce drm sharpening property
> 
> On Wed, 27 Mar 2024 13:29:16 +0200
> Pekka Paalanen <pekka.paalanen at haloniitty.fi> wrote:
> 
> > On Wed, 27 Mar 2024 07:11:48 +0000
> > "Garg, Nemesa" <nemesa.garg at intel.com> wrote:
> >
> > > > -----Original Message-----
> > > > From: Pekka Paalanen <pekka.paalanen at haloniitty.fi>
> > > > Sent: Wednesday, March 13, 2024 3:07 PM
> > > > To: Garg, Nemesa <nemesa.garg at intel.com>
> > > > Cc: Simon Ser <contact at emersion.fr>;
> > > > intel-gfx at lists.freedesktop.org; dri-devel at lists.freedesktop.org;
> > > > G M, Adarsh <adarsh.g.m at intel.com>
> > > > Subject: Re: [RFC 0/5] Introduce drm sharpening property
> > > >
> > > > On Tue, 12 Mar 2024 16:26:00 +0200 Pekka Paalanen
> > > > <pekka.paalanen at haloniitty.fi> wrote:
> > > >
> > > > > On Tue, 12 Mar 2024 08:30:34 +0000 "Garg, Nemesa"
> > > > > <nemesa.garg at intel.com> wrote:
> > > > >
> > > > > > This KMS property is not implementing any formula
> > > > >
> > > > > Sure it is. Maybe Intel just does not want to tell what the
> > > > > algorithm is, or maybe it's even patented.
> > > > >
> > > > > > and the values
> > > > > > that are being used are based on empirical analysis and
> > > > > > certain experiments done on the hardware. These values are
> > > > > > fixed and are not expected to change, though they can differ
> > > > > > from vendor to vendor. The client can choose any sharpness
> > > > > > value on the scale and the sharpness will be set on the basis
> > > > > > of it. The sharpness effect can change from content to content
> > > > > > and from display to display, so the user needs to adjust the
> > > > > > intensity value to its optimum to get a good experience on the
> > > > > > screen.
> > > > > >
> > > > >
> > > > > IOW, it's an opaque box operation, and there is no way to
> > > > > reproduce its results without the specific Intel hardware.
> > > > > Definitely no way to reproduce its results in free open source software
> > > > > alone.
> > > > >
> > > > > Such opaque box operations can only occur after KMS blending, at
> > > > > the CRTC or later stage. They cannot appear before blending, not
> > > > > in the new KMS color pipeline design at least. The reason is
> > > > > that the modern way to use KMS planes is opportunistic composition off-
> > > > > loading.
> > > > > Opportunistic means that userspace decides from time to time
> > > > > whether it composes the final picture using KMS or some other
> > > > > rendering method (usually GPU and shaders). Since userspace will
> > > > > arbitrarily switch between KMS and render composition, both must
> > > > > result in the exact same image, or end users will observe unwanted flicker.
> > > > >
> > > > > Such opaque box operations are fine after blending, because there they
> > > > > can be configured once and remain on forever. No switching, no flicker.
> > > >
> > > > If you want to see how sharpness property would apply in Wayland
> > > > design, it would be in step 5 "Adjust (settings UI)" of
> > > > https://gitlab.freedesktop.org/pq/color-and-hdr/-/blob/main/doc/color-management-model.md#compositor-color-management-model
> > > >
> > > > To relate that diagram to KMS color processing, you can identify step 3
> "Compose"
> > > > as the KMS blending step. Everything before step 3 happens in KMS
> > > > plane color processing, and steps 4-5 happen in KMS CRTC color processing.
> > > >
> > > > Sharpening would essentially be a "compositor color effect", it
> > > > just happens to be implementable only by specific Intel hardware.
> > > >
> > > > If a color effect is dynamic or content-dependent, it will
> > > > preclude colorimetric monitor calibration.
> > > >
> > > >
> > > > Thanks,
> > > > pq
> > > >
> > > >
> > > > > Where does the "sharpness" operation occur in the Intel color
> > > > > processing chain? Is it before or after blending?
> > > > >
> > > Thank you for the detailed explanation and the link.
> > > The sharpness operation occurs post-blending in the CRTC, i.e. on the
> > > final composed output after blending. Yes Pekka, you are right: as per
> > > the diagram it is done at step 5 "Adjust (settings UI)". I will also
> > > document this as part of the documentation change.
> > >
> > > > > What kind of transfer characteristics does it expect from the
> > > > > image, and can those be realized with KMS CRTC properties if KMS is
> > > > > configured such that the blending happens using some other
> > > > > characteristics (e.g.
> > > > > blending in optical space)?
> > > > >
> > > The filter values are not dependent on or calculated from the input
> > > image. However, the blended output changes depending on the blending
> > > space and other inputs, and the sharpness is applied post-blending,
> > > so the user needs to adjust the strength value according to the
> > > content to get a better visual effect. So tuning of the sharpness
> > > strength may be needed by the user, based on the input content and
> > > the blending policy, to get the desired experience.
> > >
> > > > > What about SDR vs. HDR imagery?
> > > > >
> > > The interface can be used for both HDR and SDR. The effect is more
> > > prominent for SDR use cases.
> > > For HDR, the filter and tap values may change.
> >
> > Who will be providing these values?
> >
> > The kernel driver cannot know if it is dealing with SDR or HDR or
> > which transfer function is in effect at that point of the
> > post-blending color pipeline.
> >
> > If the UAPI is one "strength" value, then how can it work?
> >
> > Maybe the UAPI needs more controls, if not providing all "filter and
> > tap" values directly. Maybe all the filter and tap values should be
> > provided by userspace?
> 
> Actually, is the hardware just doing a convolution with a filter defined by the
> driver?
> 
> Convolution algorithm (it is a formula!) is pretty standard stuff I believe. If the
> hardware is actually doing convolution, then the driver really should be exposing
> the convolution operation. Then people can choose to use it for sharpening with
> the Intel developed kernels, or for custom effects with custom kernels. Everyone
> would win. Convolution is also something that other hardware vendors could
> implement.
> 
> A convolution filter would fit very well in the new KMS color pipeline design for
> post-compositing operations, too.
> 
> Is the sharpening element doing something similar to unsharp masking?
> 
> I suppose users might want different strength based on what kind of content is
> the majority on the screen. That makes it something that a Wayland compositor
> would adjust automatically based on Wayland content type (similar to HDMI
> content type), for example.
> 
> 
> Thanks,
> pq

Hi Pekka,
 
Thank you for the feedback and suggestions. Based on the discussions at the 2024 Linux Display Next Hackfest and community feedback, we explored various solutions from the user-space side and came up with a Looking Glass solution. Below is the link to the mutter MR:
https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/3665
 
There is a sort of convolution, along with other optimizations, within the hardware, and user space should just control the extent of sharpness. This value will be received by the driver, and the appropriate programming will be done based on the respective hardware design; a rough sketch of how a compositor might set such a property is included below.
Also, as agreed, it should be fine if sharpness is applied post-blending, or if we drive a single-plane use case.
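
For illustration only, here is a minimal sketch of how a compositor could program such a per-CRTC control through the libdrm atomic API. The property name "SHARPNESS_STRENGTH" and its value range are assumptions for this sketch and may not match the final UAPI; error handling is minimal and the caller is assumed to have already enabled DRM_CLIENT_CAP_ATOMIC.

/*
 * Sketch only: set a hypothetical per-CRTC "SHARPNESS_STRENGTH" property
 * via the atomic API. The property name and semantics are assumptions and
 * may differ from the final UAPI.
 */
#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

static uint32_t find_crtc_prop(int fd, uint32_t crtc_id, const char *name)
{
	drmModeObjectProperties *props =
		drmModeObjectGetProperties(fd, crtc_id, DRM_MODE_OBJECT_CRTC);
	uint32_t prop_id = 0;

	if (!props)
		return 0;

	for (uint32_t i = 0; i < props->count_props; i++) {
		drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[i]);

		if (prop && strcmp(prop->name, name) == 0)
			prop_id = prop->prop_id;
		drmModeFreeProperty(prop);
	}
	drmModeFreeObjectProperties(props);
	return prop_id;
}

static int set_sharpness(int fd, uint32_t crtc_id, uint64_t strength)
{
	uint32_t prop_id = find_crtc_prop(fd, crtc_id, "SHARPNESS_STRENGTH");
	drmModeAtomicReq *req;
	int ret;

	if (!prop_id)
		return -1; /* driver does not expose the property */

	req = drmModeAtomicAlloc();
	if (!req)
		return -1;

	/* A real compositor would fold this into its regular atomic commit. */
	drmModeAtomicAddProperty(req, crtc_id, prop_id, strength);
	ret = drmModeAtomicCommit(fd, req, 0, NULL);
	drmModeAtomicFree(req);
	return ret;
}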
 
In cases where the hardware doesn't support sharpness for various reasons, we can implement a generic shader using some open-source algorithm, for example an unsharp-mask style filter as sketched below. This may not match the hardware pixel by pixel, but that should be OK given that the hardware applies it post-blending.
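
Purely as an illustration of such a fallback (this is not the hardware algorithm), a generic unsharp-mask formulation, sharpened = original + strength * (original - blurred), could look roughly like the following on a single luma channel; a compositor would normally express the same math in a fragment shader. How the UAPI's integer strength range maps to the float strength here would be a compositor policy decision.

/*
 * Illustrative unsharp-mask sharpening on a single 8-bit luma channel.
 * Not the hardware algorithm; just a generic open-source style fallback.
 *
 *   sharpened = original + strength * (original - blurred)
 */
#include <stdint.h>

static uint8_t clamp_u8(int v)
{
	return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v);
}

void unsharp_mask(const uint8_t *src, uint8_t *dst,
		  int width, int height, float strength)
{
	for (int y = 0; y < height; y++) {
		for (int x = 0; x < width; x++) {
			int sum = 0;

			/* 3x3 box blur, clamping coordinates at the edges */
			for (int dy = -1; dy <= 1; dy++) {
				for (int dx = -1; dx <= 1; dx++) {
					int sx = x + dx, sy = y + dy;

					if (sx < 0) sx = 0;
					if (sy < 0) sy = 0;
					if (sx >= width) sx = width - 1;
					if (sy >= height) sy = height - 1;
					sum += src[sy * width + sx];
				}
			}

			float blurred = sum / 9.0f;
			float orig = src[y * width + x];

			dst[y * width + x] =
				clamp_u8((int)(orig + strength * (orig - blurred) + 0.5f));
		}
	}
}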

Thanks and Regards,
Nemesa

