Color lookup support for the atmel-hlcdc driver

Boris Brezillon boris.brezillon at free-electrons.com
Thu Jun 15 10:15:58 UTC 2017


On Thu, 15 Jun 2017 11:54:29 +0200
Peter Rosin <peda at axentia.se> wrote:

> On 2017-06-13 17:30, Boris Brezillon wrote:
> > Hi Peter,
> > 
> > On Tue, 13 Jun 2017 16:34:25 +0200
> > Peter Rosin <peda at axentia.se> wrote:
> >   
> >> Hi!
> >>
> >> I need color lookup support for the atmel-hlcdc driver, and had a peek
> >> at the code. I also looked at the drivers/gpu/drm/stm driver and came
> >> up with the below diff. It compiles, but I have not booted it for the
> >> simple reason that I can't imagine it will work.
> >>
> >> Sure, the code fills the clut registers in the .load_lut helper function
> >> and I think it might even do the right thing when setting up the mode.
> >> I'm less sure DMA will work correctly. It might, but I haven't looked
> >> at it for many seconds...
> >>
> >> The big question is how the color info gets into the new clut array
> >> I created in atmel_hlcdc_crtc. When I look at the stm driver, which
> >> AFAICT does it just like this, I just don't see it. So, what am I
> >> missing?
> > 
> > You should look at drm_atomic_helper_legacy_gamma_set() and its users.  
> 
> Ok, thanks. I had a long look and could not get it to work at all.

Probably because you're trying to set this up through the emulated
fbdev interface. The solution I proposed is supposed to be accessed
through DRM ioctls.
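
If you want a feel for what that looks like from userspace, here is an
untested sketch going through libdrm's legacy gamma call (the 256-entry
palette size is an assumption, and getting hold of crtc_id is left out):

#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

static int set_clut(int fd, uint32_t crtc_id)
{
	uint16_t r[256], g[256], b[256];
	int i;

	/* Placeholder palette: identity ramp, 16 bits per channel. */
	for (i = 0; i < 256; i++)
		r[i] = g[i] = b[i] = (uint16_t)(i << 8);

	/*
	 * Legacy gamma ioctl; this is what ends up in the driver's
	 * .gamma_set hook.
	 */
	return drmModeCrtcSetGamma(fd, crtc_id, 256, r, g, b);
}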

> So,
> I did as below instead. However, there are a few glaring problems...

Well, I doubt this will be accepted. The fbdev emulation layer is
supposed to be rather dumb, partly because DRM people want developers
to switch to the DRM interface.

Also note that I won't accept a solution that supports setting the LUT
only through the fbdev interface, so please try to make it work the DRM
way before you even consider modifying the fbdev emulation layer.
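
To give you an idea of the DRM way on the driver side, here is a rough,
untested sketch (ATMEL_HLCDC_CLUT_SIZE and atmel_hlcdc_crtc_write_clut()
are made-up names for illustration; the real CLUT registers live in the
layer register ranges):

#include <drm/drm_atomic_helper.h>
#include <drm/drm_crtc.h>

#define ATMEL_HLCDC_CLUT_SIZE	256

/* Hypothetical helper wrapping the regmap write to one CLUT entry. */
static void atmel_hlcdc_crtc_write_clut(struct drm_crtc *c,
					unsigned int idx, u32 val);

/*
 * Call this from the CRTC's ->atomic_flush() when
 * crtc->state->color_mgmt_changed is set.
 */
static void atmel_hlcdc_crtc_load_clut(struct drm_crtc *c)
{
	struct drm_color_lut *lut;
	unsigned int i;

	if (!c->state->gamma_lut)
		return;

	lut = (struct drm_color_lut *)c->state->gamma_lut->data;
	for (i = 0; i < ATMEL_HLCDC_CLUT_SIZE; i++)
		atmel_hlcdc_crtc_write_clut(c, i,
			drm_color_lut_extract(lut[i].red, 8) << 16 |
			drm_color_lut_extract(lut[i].green, 8) << 8 |
			drm_color_lut_extract(lut[i].blue, 8));
}

static const struct drm_crtc_funcs atmel_hlcdc_crtc_funcs = {
	/* existing callbacks elided */
	.gamma_set = drm_atomic_helper_legacy_gamma_set,
};

plus, at CRTC init time, something like:

	drm_mode_crtc_set_gamma_size(crtc, ATMEL_HLCDC_CLUT_SIZE);
	drm_crtc_enable_color_mgmt(crtc, 0, false, ATMEL_HLCDC_CLUT_SIZE);

That exposes the GAMMA_LUT property and makes the legacy
drmModeCrtcSetGamma() path work without touching the fbdev emulation.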

> 
> Something, somewhere, sets up defaults for the first 16 entries of the
> palette and seems to expect them to stay as some kind of safe basic
> palette, and my applications don't handle that too well. When they want
> to draw black, they get a poisonous cyan/green instead, etc. But it's
> pretty close. Can anyone provide some clue as to how that is supposed
> to be handled?
> 
> Also, I had to change the second argument of the drm_fbdev_cma_init...
> call at the end of atmel_hlcdc_dc.c:atmel_hlcdc_dc_load() from 24 to 8
> to make it possible to set 8-bit modes. However, doing so breaks 24-bit
> modes, which made me suspect that no non-24-bit mode works w/o my patch.
> And indeed, I could only get 24-bit modes to work, unless I changed this
> value to 16. Then RGB565 works like a charm, but RGB888 doesn't. What's
> up with that? It seems like a rather silly limitation, but perhaps it's
> just a bug?

I'm pretty sure this is not a bug. AFAIR, preferred_bpp is used when you
don't explicitly specify the video mode you want to use on the cmdline
[1].

[1] http://elixir.free-electrons.com/linux/v3.11/source/Documentation/fb/modedb.txt
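
For the record, with the connector-based syntax from that document you
can force the depth on the cmdline; e.g. (connector name and mode are
just examples):

  video=LVDS-1:800x480-8

should get you an 8bpp fbdev framebuffer without patching preferred_bpp.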

