Re-thinking DPI and scaling (Re: Physical vs logical DPI on X)
Pekka Paalanen
ppaalanen at gmail.com
Mon Oct 5 08:35:23 UTC 2020
Hi all,
The below email was sent to xorg-devel, but I think it is such a good
discussion of the topic that I want to CC it to wayland-devel as well.
While the email does go into X11 specifics, the fundamental ideas are
well applicable to Wayland as well. There was an IRC discussion of the
very same topic recently on #wayland.
The difference in Wayland is the belief that the compositor should do
some kind of fallback scaling to keep windows legible in case the
application did not, or is not able to, do the scaling itself. Though
even that idea is mentioned below as having been investigated for Xorg
as well.
Currently in Wayland we have buffer scale, output scale, and the
fractional scaling implementations in some compositors. We also *had*
monitor physical size exposed via wl_output; however, that is a
controversial feature too, IIRC. Maybe this discussion could
eventually produce better ideas for Wayland as well.
Due to my work on Wayland color management and HDR extension, my
personal opinion on the Wayland buffer/output scale system has started
to shift as well. AFAIK, the output/buffer scale design chose integers
as the scaling factors because scaling raster images by non-integer
factors necessarily makes them blurry. This stems from graphics
designs that use pixels not as point samples but as colored "tiles",
relying on the edge between two tiles being crisp and visible on
low-DPI monitors, e.g. for one-pixel-wide lines. Obviously,
implementing fractional scaling makes image quality degradation
unavoidable under the current Wayland design when monitors are not
high-DPI enough to hide the effect.
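As a toy illustration of why (mine, not from any implementation):
resample a row of pixels containing a single black pixel by 1.5x with
linear filtering, and the crisp black pixel spreads into several grey
ones.

/* A 4-pixel row with one black pixel (0.0) on white (1.0), linearly
 * resampled to 1.5x the width: the black smears into greys. */
#include <stdio.h>

int main(void)
{
    const double src[4] = { 1.0, 1.0, 0.0, 1.0 };
    const double factor = 1.5;
    const int dst_w = 6;                      /* 4 * 1.5 */

    for (int x = 0; x < dst_w; x++) {
        double sx = (x + 0.5) / factor - 0.5; /* map back into the source */
        if (sx < 0.0)
            sx = 0.0;
        int i0 = (int)sx;
        int i1 = i0 + 1 < 4 ? i0 + 1 : 3;
        double f = sx - i0;
        double v = src[i0] * (1.0 - f) + src[i1] * f;
        /* prints 1.00 1.00 0.83 0.17 0.50 1.00 */
        printf("dst[%d] = %.2f\n", x, v);
    }
    return 0;
}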
The discussion below also criticises the choice of using a single
number to conflate both UI scale and output resolution (DPI). If these
two concepts need to be clearly separate, then I'm not sure the integer
scale factor design is sufficient going forward.
One more thing I'd like to have considered is the viewing distance. The
write-up below does mention it: monitor physical size, pixel resolution
and viewing distance all affect how "big" graphics appears to the user.
But when we talk about DPI, it does not include the viewing distance,
hence it is only a partial description of the output's "apparent
size" (for lack of a better term). DPI sets a lower limit on how small
graphical patterns can theoretically be and still be resolvable on an
output, but the viewing distance, combined with variation in people's
eyesight, determines what sizes are actually legible.
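To put rough numbers on that (my own back-of-the-envelope sketch, the
figures are only illustrative), the apparent density can be expressed
as pixels per degree of visual angle, which depends on both DPI and
viewing distance:

#include <math.h>
#include <stdio.h>

/* One degree of visual angle at distance d spans roughly
 * d * tan(1 degree) inches; multiply by DPI for pixels per degree. */
static double pixels_per_degree(double dpi, double distance_inches)
{
    const double one_degree = 3.14159265358979 / 180.0;
    return dpi * distance_inches * tan(one_degree);
}

int main(void)
{
    /* Illustrative numbers only. */
    printf("96-DPI monitor at 24\":     %.0f px/deg\n",
           pixels_per_degree(96.0, 24.0));
    printf("40-DPI projection at 120\": %.0f px/deg\n",
           pixels_per_degree(40.0, 120.0));
    return 0;
}

Note how the low-DPI projection seen from far away ends up with the
higher apparent density, which is exactly why DPI alone is only a
partial description.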
An example of what I mean above: if you have a projector, it does not
make sense to attempt to display a 12pt font (physical size! e.g. as
printed on A4 paper) on it if 12pt would come out only two pixels
high; the DPI of the projector is too low. You need at least a certain
number of pixels to make a font legible, so either you pick something
much bigger than 12pt or there is another scaling factor involved with
the projector.
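(For the arithmetic: 12 pt is 12/72 in, about 4.2 mm, so if that comes
out as two pixels the projector is effectively rendering at roughly
12 DPI.)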
I intended this email as an opening for discussion, where I do not plan
to participate actively, since CM&HDR is currently my priority.
Thanks,
pq
On Sun, 4 Oct 2020 17:42:43 +0200
Giuseppe Bilotta <giuseppe.bilotta at gmail.com> wrote:
> Hello Tor Arne and all,
>
> I'll try to give a reply to this, but keep in mind I'm not a core
> developer; my response is mostly guided by my experience working
> with Xorg in mixed-DPI environments, and by whatever insight I've
> managed to gather from it. That experience has matured mostly in:
>
> * the xdpi debug tool: https://github.com/Oblomov/xdpi
> * a write-up about the reality of mixed-DPI in X11 as of a couple of
> years ago: http://wok.oblomov.eu/tecnologia/mixed-dpi-x11/ (not that
> much has changed; also, if there's any feedback about the content of
> this article, suggestions are welcome)
> * a tentative patchset to include mixed-DPI support in awesome WM,
> https://github.com/awesomeWM/awesome/pull/2053 (currently without too
> much chance of going forward, and not only because I don't have the
> time to work on it as would be appropriate);
> * some discussion on IRC with keithp concerning his proposed
> window-scaling extension https://keithp.com/blogs/window-scaling/
>
> Before going forward, I'd like to clarify that I may have a somewhat
> different idea about DPI and scaling. To make sure we understand each
> other, I'd like to clarify some terminology (independently from the
> window system being used).
>
> For each device, there are three (at least; possibly four)
> display-related values that are relevant to the discussion.
>
> One is the physical pixel density, represented by the number of
> physical pixels spanning an inch of physical media. Ajax has written
> at length about the issues concerning the retrieval of correct
> information about this value, and I'm quite convinced that any
> possible solution for the issues related to this cannot come from
> within the display server itself, although the server may provide
> features to override any detected values (still, I think these would
> be better handled at a lower level, e.g. by the kernel). This is
> particularly true for cases (such as projectors) where the physical
> density is much more dependent on the user setup than on a particular
> hardware characteristic.
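> (For reference, the computation itself is trivial: physical DPI =
> horizontal pixels * 25.4 / panel width in mm, and likewise for the
> vertical axis; the hard part is trusting the reported millimetres.)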
>
> The second value is the “visual” pixel density, which depends on the
> physical pixel density as well as on the distance of the observer to
> the viewing surface. A high-resolution display held very close to the
> eyes (e.g. a VR headset) may have a “visual” pixel density the same
> as or lower than that of a coarse-resolution display that is much
> farther away (e.g. a standard-resolution projector seen from several
> meters away).
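> (As a first-order rule of thumb, two displays have the same visual
> density when DPI times viewing distance matches: a 96-DPI panel at
> 60 cm is visually comparable to a 48-DPI projection at 120 cm.)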
>
> The third value is the user preference for UI scaling, which is (or
> rather should be, see below) completely independent from the display
> resolution. A possible fourth value is the “reference” pixel density
> (for which we can consider the CSS “reference” of 96dpi), which is the
> one with respect to which the UI scaling _should_ be defined. And one
> of the biggest issues with the correct handling of DPI is that almost
> everywhere the UI scaling preference is “squashed together” with the
> physical-to-reference DPI setting, which ultimately causes a bit (or a
> lot) of confusion at both the display server and toolkit/application
> level.
>
> The fact that UI scaling and DPI handling should be separate
> becomes particularly important in mixed-DPI setups. Consider for
> example the (relatively common) case of two monitors (a 192-DPI and a
> 96-DPI one) attached to the same display server and viewed from the
> same distance. Then, for an image to appear at the same physical
> size, it should be scaled 2x on the 192-DPI monitor compared to the
> 96-DPI monitor, because the high-DPI monitor needs a 2x “DPI scaling”
> to reach the “reference” pixel density. This is _independent_ of any
> user preference for UI scaling, so if the end user opts for a 150% UI
> scaling (e.g. to compensate for poor eyesight) this ends up as a 3x
> _overall_ scaling on the high-DPI monitor vis-à-vis a 1.5x scaling on
> the standard-DPI monitor. Ideally,
> the user would only have to choose the UI scaling, with the DPI
> scaling managed automatically (as far as possible i.e. within the
> limits of the autodetection of the device DPI).
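> As a minimal sketch of the composition I have in mind (purely
> illustrative, the function name is mine):
>
> /* Overall scale = (DPI scaling to the reference density) times the
>  * user's UI preference, computed per output. */
> static double overall_scale(double output_dpi, double ui_scale)
> {
>     const double reference_dpi = 96.0;
>     return (output_dpi / reference_dpi) * ui_scale;
> }
>
> /* overall_scale(192.0, 1.5) == 3.0; overall_scale(96.0, 1.5) == 1.5 */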
>
> My understanding from reading
> https://keithp.com/~keithp/talks/xtc2001/paper/ is that the intent of
> the Xft.scale resource was to manage the “user UI preference”
> (hopefully keithp can confirm), but my understanding is that
> “everybody” has settled on using Xft.dpi for this instead —which is
> quite a bother, if you ask me.
>
> I'm not entirely sure how the Qt concept of logical DPI fits into
> these. I'm guessing it's somewhere between the reference DPI and the
> UI scaling configuration?
>
> Now onto your question:
>
> > Now, for X, there's at least four different things to consider, as far as I can tell:
> >
> > 1) The resolution and size of the X Screen
> > 2) The resolution and size of the individual outputs
> > 3) The resolution and size of the RandR 1.5 monitors
> > 4) The Xft.DPI setting.
> >
> > (For all the things exposed through RandR (1-3), as far as I can tell they are all stored as resolution and size (in mm), so all DPI-numbers going in or out of X are effectively converted to a width and height in mm to represent that DPI with the current resolution taken into account.)
>
> You may want to add to these the XSETTINGS, whose (dynamically
> adjustable) Xft/DPI value works in pretty much the same way as the
> Xft.dpi resource (and overriding it if both are present). This has the
> same limitation as Xft.dpi concerning globality, though.
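> As a reference point, here is a minimal sketch (mine, purely
> illustrative) of how a client typically reads the global Xft.dpi
> value from the root-window resource database; XSETTINGS, which can
> override it dynamically, is omitted:
>
> #include <stdio.h>
> #include <stdlib.h>
> #include <X11/Xlib.h>
>
> int main(void)
> {
>     Display *dpy = XOpenDisplay(NULL);
>     if (!dpy)
>         return 1;
>
>     double dpi = 96.0;                        /* conventional fallback */
>     char *s = XGetDefault(dpy, "Xft", "dpi"); /* RESOURCE_MANAGER / .Xdefaults */
>     if (s)
>         dpi = atof(s);
>
>     printf("global Xft.dpi: %.1f\n", dpi);
>     XCloseDisplay(dpy);
>     return 0;
> }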
>
> > The last one is the easy one, it's clearly a logical DPI, and we reflect that in Qt if set. Unfortunately it's a global DPI.
>
> Arguably, the biggest issue is that Xft.dpi is being used beyond its
> original intentions (defining the DPI for point size to pixel count
> conversion used by Xft). Since Xft isn't compatible with RANDR (in
> the sense that its API isn't output- or monitor-aware), the fact that
> it deals only in a global value would be acceptable. The unfortunate
> part is that the value is otherwise used to set the UI scaling (where
> Xft.scale would have been a better choice).
>
> Given the current usage, though, Xft.dpi is one of the ways in which
> users can override the global scaling (conflating UI scaling and
> physical-to-reference DPI scaling).
>
> > Now, I'm guessing that #1, as set by Xorg -dpi, xorg.conf DisplaySize, or xrandr --dpi, originally was meant as a physical DPI override, for cases where the detection and heuristics in X would fail? But nowadays, especially with a single X Screen representing multiple physical displays, with potentially different physical DPIs, it feels like it's effectively a logical DPI setting on an X level, with the same limitation as Xft.DPI in that it's a global setting. What is your take on this?
> >
> > If it's the former — a physical DPI override (however little that makes sense when reflecting multiple displays) — we don't want to reflect it per QScreen, as that would not be specific enough in a multi monitor setup. Nor do we want to reflect it for a QScreen's logicalDpi, if it's strictly defined as a physical property, not to be used for adjusting logical DPI.
> >
> > But if it's in practice the latter — a logical DPI override — then we should reflect it through a QScreen's logicalDpi, if Xft.DPI hasn't been set to override it.
>
> AFAIK, the DPI of the X Screen has no physical meaning today, which is
> why it's normally set to 96 rather than trying to second-guess the
> value from the RANDR setup. Legacy applications continue using it as
> a fallback if Xft.dpi is not defined (following the Xft.dpi
> specification), so it can still be used to control their rendering.
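> To make that concrete, the derivation legacy clients effectively use
> is just (sketch, not any particular toolkit's code):
>
> #include <X11/Xlib.h>
>
> /* 25.4 mm per inch; since the server usually reports a size computed
>  * for 96 DPI, this typically returns 96 regardless of the panel. */
> static double x_screen_dpi(Display *dpy)
> {
>     int scr = DefaultScreen(dpy);
>     return 25.4 * DisplayWidth(dpy, scr) / DisplayWidthMM(dpy, scr);
> }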
>
> Note however that its value is actively ignored by GTK3 (see also
> https://bugzilla.gnome.org/show_bug.cgi?id=757142 and associated
> issues such as https://gitlab.xfce.org/xfce/xfce4-settings/-/issues/34
> and https://bugs.mageia.org/show_bug.cgi?id=21201 for example).
> Personally, I disagree with the choice of the GTK3 developers, since
> ignoring the value is an unnecessary regression that also breaks the
> Xft.dpi fallback, and as a frequent user of mixed DPI configurations
> I'd rather see it used for the logical DPI override.
>
> > Now, for #2, as far as I can tell there isn't any option in xrandr to override this, nor does tweaking DisplaySize in xorg.conf affect it (even for multiple Monitor sections), so I'm guessing it's strictly a physical size picked up from EDID? If that's not the case, and it's possible to override it for the user, then the same questions as for #1 apply: Does that make it a logical DPI?
>
> According to the spec, RANDR reports the physical size (if known), and
> there is no way to change it via API (it's not user-settable), so from
> it you get a physical DPI.
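> For example (made-up but typical numbers), an output reported by
> xrandr as 3840x2160 with a physical size of 597mm x 336mm works out
> to 3840 * 25.4 / 597, i.e. roughly 163 DPI horizontally.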
>
> By the way, considering the globality of Xft.dpi, I think toolkits
> should agree on using a user-settable per-output property to define
> the physical-to-reference scaling of that output (_NETWM_SCALE or
> whatever). This could even be used by the server (with keithp's
> window-scaling extension) to automatically scale legacy apps (e.g.
> clients that do not have a specific hint saying that they can do the
> scaling themselves). At the very least Qt could start using this as a
> more flexible alternative to the environment variables currently used
> to set per-output scaling.
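> A rough sketch of what setting such a property could look like on
> the client side (the property name and encoding are hypothetical,
> nothing standardises them):
>
> #include <X11/Xlib.h>
> #include <X11/Xatom.h>
> #include <X11/extensions/Xrandr.h>
>
> /* Store the physical-to-reference scale of one output as a CARDINAL,
>  * multiplied by 1000 to allow fractional values. "_NETWM_SCALE" is
>  * the hypothetical name suggested above. */
> static void set_output_scale(Display *dpy, RROutput output, double scale)
> {
>     Atom prop = XInternAtom(dpy, "_NETWM_SCALE", False);
>     unsigned long milliscale = (unsigned long)(scale * 1000.0 + 0.5);
>
>     XRRChangeOutputProperty(dpy, output, prop, XA_CARDINAL, 32,
>                             PropModeReplace,
>                             (unsigned char *)&milliscale, 1);
> }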
>
> > Finally, for #3, this is where it gets interesting. From reading the RandR spec [3] about the new Monitors introduced in 1.5, this seems like a defined logical DPI:
> >
> > "This new object separates the physical configuration of the hardware
> > from the logical subsets of the screen that applications should
> > consider as single viewable areas."
> >
> > It's possible to combine two outputs into one monitor, to split a single output into multiple monitors,
>
> Correct me if I'm wrong, but I don't think it's possible to split a
> single output into multiple monitors, since adding an output to a
> monitor will remove it from the other monitors.
>
> > or even to override the auto-generated monitor for an output. And all these allow you to pass a width and height, effectively setting the DPI. E.g.
> >
> > xrandr --setmonitor DUMMY0-DPIOVERRIDE 1600/200x1200/200+0+0 DUMMY0
> >
> > This seems like the definition of logical DPI, where the desktop environment can give the user a nice control panel on how to adjust these things, either directly by adding/removing/moving monitors, or by setting a DPI or scale (200% e.g.) on an individual monitor, and then reflect that as RandR updates.
>
> Now this is an interesting side effect. I believe the original intent
> of the Monitor concept was to improve support for video walls and
> physical monitors that require two streams because of how large they
> are, but the possibility to override the physical size definitely
> allows for user-selection of the presented DPI. Would you then go look
> for the physical DPI as reported by the corresponding output(s)?
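> (For a concrete reading of the quoted example: 1600/200x1200/200
> declares 1600 pixels over 200 mm, i.e. 1600 * 25.4 / 200, roughly
> 203 DPI, for that monitor.)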
>
> > Based on all of this, it seems Qt should do the following:
> >
> > 1. If Xft.DPI has been set, respect that as a global override, and reflect that as the logical DPI for all QScreens
> > 2. If not, reflect the resolution and size of individual RandR 1.5 monitors as logical DPI per QScreen
> > 3. If 1.5 is not available, reflect the resolution and size of the X Screen as a global logical DPI for all QScreens
> > 4. Reflect the resolution and size of the individual outputs as physical DPI, or read EDID ourselves
> >
> > As far as I can tell this should cover DEs like Ubuntu 20.04 that sets a global 192 Xft.DPI to represent 200% scaling (and fractional scales in between 100% and 200%), as well as DEs that (in the future) allow per-monitor DPI/scale control via the 1.5 monitors.
>
> I suspect this might not be future-proof: DEs that allow per-monitor
> DPI/scale control via RANDR 1.5 may still want to use Xft.DPI for
> legacy applications. I don't think there's a way out of this without
> adding some kind of side-channel setting (_NET_PER_MONITOR_DPI boolean
> property on the root window). So the idea could be:
>
> 1. If _NET_PER_MONITOR_DPI is set, and RANDR 1.5 is present, use
> Monitor info for logical DPI per QScreen;
> 2. if _NET_PER_MONITOR_DPI is set, and RANDR 1.5 is not present, use
> a to-be-determined user-controllable per-output property for the
> logical DPI per QScreen (assuming we want to support this kind of
> configuration, with a new DE/WM on a pre-RANDR-1.5 server);
> 3. fall back to Xft/DPI => Xft.dpi => X Screen dpi as the global
> logical DPI for all QScreens (note that the X Screen dpi can change
> with RandR, and clients can get a notification when it happens; if
> possible, do take this into consideration).
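> Sketched as code (every helper name below is a hypothetical
> placeholder, not an existing Qt or Xlib call):
>
> /* All helpers (root_has_prop, randr_at_least, monitor_dpi,
>  * per_output_dpi, global_dpi) are hypothetical placeholders. */
> double logical_dpi_for_qscreen(Display *dpy, int qscreen)
> {
>     if (root_has_prop(dpy, "_NET_PER_MONITOR_DPI")) {
>         if (randr_at_least(dpy, 1, 5))
>             return monitor_dpi(dpy, qscreen);  /* 1: RANDR 1.5 Monitor   */
>         return per_output_dpi(dpy, qscreen);   /* 2: per-output property */
>     }
>     return global_dpi(dpy);  /* 3: Xft/DPI => Xft.dpi => X Screen dpi */
> }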
>
> Honestly, while we're at it, I would appreciate it if Qt spearheaded the
> separation of DPI scaling from UI scaling (with a separate root window
> property or XSETTING or whatever), but I understand if this is
> considered being “too much” (especially since AFAIK other OSes/display
> servers don't have the concept either, but feel free to correct me if
> I'm wrong).
>
> Cheers,
>
> Giuseppe Bilotta
> _______________________________________________
> xorg-devel at lists.x.org: X.Org development
> Archives: http://lists.x.org/archives/xorg-devel
> Info: https://lists.x.org/mailman/listinfo/xorg-devel