[Intel-gfx] [PATCH 1/4] lib/hexdump.c: Allow 64 bytes per line

Petr Mladek pmladek at suse.com
Fri Apr 12 13:48:02 UTC 2019


On Wed 2019-04-10 13:17:17, Alastair D'Silva wrote:
> From: Alastair D'Silva <alastair at d-silva.org>
> 
> With modern high resolution screens, we can display more data, which makes
> life a bit easier when debugging.

I have serious doubts about this feature.

We are talking about more than 256 characters per line. I wonder
whether such a long line is really easier for a human to read.
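Just to make that number concrete, here is a quick back-of-the-envelope
estimate. It is my own approximation, not the exact lib/hexdump.c layout,
which also prepends a printk prefix and an optional offset column:

#include <stdio.h>

int main(void)
{
	int bytes = 64;        /* proposed bytes per row */
	int hex   = bytes * 3; /* "xx " per byte */
	int gap   = 2;         /* separator before the ASCII column */
	int ascii = bytes;     /* one printable character per byte */

	printf("approx. %d characters per line\n", hex + gap + ascii);
	return 0;
}

So even without any prefix, a 64-byte row already comes out at roughly
258 characters.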

I am not an expert, but there is a reason why the standard is 80
characters per line. I would say that anything above 100 characters
is questionable; https://en.wikipedia.org/wiki/Line_length
more or less confirms that.

Second, at roughly 8 pixels per character, we need about 2048 pixels
of width to show 256 characters, which is more than even a Full HD
(1920-pixel-wide) screen offers. IMHO, there is still a huge number
of people who do not even have a Full HD display, especially on notebooks.
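For completeness, the same arithmetic as a tiny test program; the
8-pixels-per-character figure is only an assumption about a typical
fixed-width terminal font, not something the patch specifies:

#include <stdio.h>

int main(void)
{
	int px_per_char = 8;   /* assumed width of one character cell */
	int chars       = 256; /* line length from the estimate above */

	printf("%d px needed vs. 1920 px on a Full HD screen\n",
	       px_per_char * chars);
	return 0;
}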


I am sorry, but I am strongly against hardcoding this anywhere.
And I doubt there would be enough users to justify complicating
the code to make it configurable.

Best Regards,
Petr

