[PATCH] drm/ssd130x: Change pixel format used to compute the buffer size
Javier Martinez Canillas
javierm at redhat.com
Thu Jul 13 08:58:07 UTC 2023
The commit e254b584dbc0 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
ssd130x_buf_alloc()") used a pixel format info instead of a hardcoded bpp
to calculate the size of the buffer allocated to store the native pixels.
But that wrongly used the DRM_FORMAT_C1 fourcc pixel format, which is for
color-indexed frame buffer formats, while the ssd130x controllers support
neither multiple single-channel colors nor a Color Lookup Table (CLUT).
The correct format to use is DRM_FORMAT_R1, which describes single-channel
frame buffers.

Both formats use eight pixels per byte, so in practice this patch causes no
functional change, but the correct pixel format should still be used.
Suggested-by: Geert Uytterhoeven <geert at linux-m68k.org>
Signed-off-by: Javier Martinez Canillas <javierm at redhat.com>
---
drivers/gpu/drm/solomon/ssd130x.c | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/drivers/gpu/drm/solomon/ssd130x.c b/drivers/gpu/drm/solomon/ssd130x.c
index b3dc1ca9dc10..afb08a8aa9fc 100644
--- a/drivers/gpu/drm/solomon/ssd130x.c
+++ b/drivers/gpu/drm/solomon/ssd130x.c
@@ -153,7 +153,7 @@ static int ssd130x_buf_alloc(struct ssd130x_device *ssd130x)
 	const struct drm_format_info *fi;
 	unsigned int pitch;
 
-	fi = drm_format_info(DRM_FORMAT_C1);
+	fi = drm_format_info(DRM_FORMAT_R1);
 	if (!fi)
 		return -EINVAL;
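
For context, below is a minimal sketch of how the format info drives the
buffer-size computation after this change. It only paraphrases the relevant
part of ssd130x_buf_alloc(); the field names (ssd130x->width, ssd130x->height,
ssd130x->buffer) and the surrounding error handling are assumed for
illustration and may not match the actual driver:

#include <drm/drm_fourcc.h>
#include <linux/slab.h>

/* Hypothetical helper mirroring the buffer-size computation. */
static int ssd130x_buf_alloc_sketch(struct ssd130x_device *ssd130x)
{
	const struct drm_format_info *fi;
	unsigned int pitch;

	/* R1: single-channel format, 1 bit per pixel, 8 pixels per byte */
	fi = drm_format_info(DRM_FORMAT_R1);
	if (!fi)
		return -EINVAL;

	/* Minimum pitch for plane 0, i.e. DIV_ROUND_UP(width, 8) bytes */
	pitch = drm_format_info_min_pitch(fi, 0, ssd130x->width);

	/* One pitch-sized row of native pixels per scanline */
	ssd130x->buffer = kcalloc(pitch, ssd130x->height, GFP_KERNEL);
	if (!ssd130x->buffer)
		return -ENOMEM;

	return 0;
}

For example, on a typical 128x64 panel this gives a pitch of 16 bytes and a
1 KiB buffer, exactly the same as with DRM_FORMAT_C1, since both formats pack
eight pixels into each byte.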
--
2.41.0