[wayland HiDPI support, possible regression?]

Pekka Paalanen ppaalanen at gmail.com
Fri Mar 13 03:30:15 PDT 2015


Please, use reply-to-all.

I'm writing this down so that we have a thing to point to when the next
person comes asking the same questions.


On Fri, 13 Mar 2015 00:08:59 +0800
microcai <microcai at fedoraproject.org> wrote:

> On Monday 09 March 2015 08:59:35, Jasper St. Pierre wrote:
> > You misunderstood what pq said.
> > 
> > You work in "logical pixels". On a hi-DPI display, with twice the pixel
> > density, each "logical pixel" is backed by 4 device pixels. Your actual
> > backing surface containing the pixels is in "device pixels".
> 
> Assume we design for a 72 DPI screen; then, yes, we are working in logical
> pixels even when we use a 12 point font, because 1 point = 1 pixel on a
> 72 DPI screen.
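
For reference, the conversion assumed there is pixels = points * dpi / 72,
so a 12 pt font is 12 px at 72 DPI and 24 px at 144 DPI. A minimal sketch
of that arithmetic (the helper is mine, not from any toolkit):

  /* Font size in points to pixels, using the definition
   * 1 pt = 1/72 inch. The result is only as good as the dpi
   * value you feed it. */
  static double
  points_to_pixels(double points, double dpi)
  {
          return points * dpi / 72.0;
  }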

By the way, for the whole conversation you started, it seems you have
been assuming that the physical DPI is known. To that I say: what about
a projector?

But you don't need a projector for the concept of physical DPI to break
down. This is a fun read:
http://mjg59.dreamwidth.org/8705.html

Yes, it's about TVs. But at the beginning there are two links that
discuss computer monitors.

IOW, apart from being extremely lucky or having your users calibrate
their monitor sizes manually, you're not really going to get a real DPI
value.

What's DPI without the meaning of "inch"? A meaningless number you use
for scaling things. That brings us to
https://people.gnome.org/~federico/news-2007-01.html

The conclusion is that when you use a meaningless number to scale
another number, the result is meaningless. Font sizes are just
relative numbers for monitor output.

Here's a look into resolution independence of text rendering:
https://blogs.gnome.org/danni/2011/12/15/more-on-dpi/
The conclusion is that the DPI doesn't matter much; having readable
fonts does.

That refers to Daniel's post, which is mirrored here:
http://web.archive.org/web/20120102153021/http://www.fooishbar.org/blog
though it is mostly just linking to the pages mentioned earlier.

On Thu, 12 Mar 2015 09:08:57 -0700
"Jasper St. Pierre" <jstpierre at mecheye.net> wrote:

> Hi-DPI displays and resolution independence are two completely different
> problems. You don't get resolution independence by scaling; it needs
> separate design and a responsive layout. The scale is simply for displays
> where the pixel density is double or triple that of other displays.

What Jasper said is very important:
"Hi-DPI displays and resolution independence are two completely
different problems."

As I see it, Wayland's HiDPI support (buffer scale and output scale) is
meant to allow user interfaces designed in low-density pixels to remain
usable on HiDPI outputs. It's definitely not about making things
physically the same size. It is about preserving the visual appearance
as much as possible, e.g. a crisp line stays crisp. Differing buffer and
output scales, where the compositor needs to scale the window, are only
a fallback case, one that HiDPI-aware apps should rarely hit.
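
To illustrate the non-fallback path: a HiDPI-aware client listens for
the wl_output scale event (wl_output version 2 or later), draws its
buffer in device pixels, and announces that with
wl_surface.set_buffer_scale (wl_surface version 3 or later). A rough
sketch, with everything except the scale handling left out (the app
struct and its wiring are illustrative, not from any real client):

  #include <wayland-client.h>

  struct app {
          int32_t scale; /* highest wl_output scale seen; starts at 1 */
  };

  static void
  output_geometry(void *data, struct wl_output *output, int32_t x,
                  int32_t y, int32_t phys_w, int32_t phys_h,
                  int32_t subpixel, const char *make,
                  const char *model, int32_t transform)
  {
  }

  static void
  output_mode(void *data, struct wl_output *output, uint32_t flags,
              int32_t w, int32_t h, int32_t refresh)
  {
  }

  static void
  output_done(void *data, struct wl_output *output)
  {
  }

  static void
  output_scale(void *data, struct wl_output *output, int32_t factor)
  {
          struct app *app = data;

          /* Render for the densest output the surface might be on. */
          if (factor > app->scale)
                  app->scale = factor;
  }

  static const struct wl_output_listener output_listener = {
          output_geometry,
          output_mode,
          output_done,
          output_scale,
  };

With scale 2, a window that is 200x100 in surface (logical) coordinates
is drawn into a 400x200 device-pixel buffer and committed with:

  wl_surface_set_buffer_scale(surface, app->scale);
  wl_surface_attach(surface, buffer, 0, 0);
  wl_surface_damage(surface, 0, 0, 200, 100); /* surface coordinates */
  wl_surface_commit(surface);

When the buffer scale matches the output scale, the compositor has
nothing to scale and the crisp line stays crisp.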

Making things physically exactly the same size across different output
devices is a bad goal for all the reasons referred to above: you cannot
trust the given measurements, and you lose readability. I believe this
holds as long as low-density (~100 dpi) display devices exist and
single pixels are visible to the naked eye.

You can make things readable, though. We hopefully have a good enough
guess of the actual resolution (dpi) that a GUI can be drawn at a
usable size. The scale at which to draw a GUI may be arbitrary: a
non-integer factor relative to some reference size. That requires the
layout to be scalable: widget sizes change, content reflows, and so
on; the image, or even the content, does not simply get scaled
uniformly. Roughly what such a guess could look like is sketched below.
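
The reference density (96 dpi here) and the clamping limits are
arbitrary choices of mine, not anything Wayland specifies:

  /* Guess a UI scale factor from a mode's pixel width and the
   * advertised physical width in millimetres. The physical size may
   * be missing or nonsense (projectors, bad EDID), so clamp the
   * result to keep the GUI usable either way. */
  static double
  guess_ui_scale(int width_px, int width_mm)
  {
          double dpi, scale;

          if (width_mm <= 0)
                  return 1.0; /* no physical size: assume 1:1 */

          dpi = width_px / (width_mm / 25.4);
          scale = dpi / 96.0;

          if (scale < 1.0)
                  scale = 1.0;
          if (scale > 4.0)
                  scale = 4.0;
          return scale;
  }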


Thanks,
pq

