X server sets invalid DPI
Steven Newbury
steve at snewbury.org.uk
Wed Jan 23 17:49:04 PST 2013
On Wed, 2013-01-23 at 18:13 +0000, Simon Farnsworth wrote:
> On Wednesday 23 January 2013 18:38:41 Nicolas Mailhot wrote:
> >
> > On Tue, 22 January 2013 09:39, Steven Newbury wrote:
> >
> > > I thought the eventual resolution to the previous flamewars on this
> > > topic was that the whole idea of an X (DISPLAY) DPI was invalid (due to
> > > multi-screen and viewing distances) so 96 DPI was chosen as it was the
> > > default specified by various GUIs.
> >
> > This was wishful developer thinking (I don't want to fix dpi handling so
> > I'll pretend the problem does not exist). It's been thoroughly invalidated
> > now that there are high-density screens on the market (as everyone could
> > see would happen someday). There is no way a simple configuration like a
> > low-dpi laptop plugged into a high-dpi screen will work if the presentation
> > layer does not learn to handle the actual dpi differences between hardware
> > monitors.
> >
> X is already up to speed with this - we're waiting for the applications to
> catch up.
>
> RandR 1.2 exposes all the information available to the X server - the screen
> size in mm for each output, the locations of each output in pixels, and the
> size of each output in pixels. An application aiming to keep things the same
> size in mm on every output can do so by working in mm internally, and using
> RandR information to convert mm to pixels on output. What's more, RandR
> information is updated on hotplug, complete with events so that applications
> can recompute their scaling factors when the displays change.
>
I pretty much said this in my reply, but it was snipped off. It's
unfortunate how difficult it is to gain traction from
applications/DEs for new, innovative X technologies and extensions.
> X11 also has the concept of a DPI for the display fixed at server start time -
> this is the DPI setting that defaults to 96 DPI unless overridden in the
> configuration or by command line option. The trouble with this DPI figure
> (commonly used by applications, unfortunately) is that there is no right
> answer - at least 96 DPI is consistently wrong, whereas a figure chosen by
> reading the EDID of any monitors attached at startup varies.
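For reference, that fixed figure is the single one the core protocol
exposes through the screen size in millimetres; a minimal, untested way
to recover it with Xlib looks something like this:

/* Minimal sketch: print the single server-wide DPI the core protocol
 * reports.  It is derived from the screen size in mm, which the server
 * synthesizes so the result is 96 unless overridden in xorg.conf or
 * with the -dpi command-line option. */
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;
    int screen = DefaultScreen(dpy);
    double xres = (double)DisplayWidth(dpy, screen) * 25.4
                / (double)DisplayWidthMM(dpy, screen);
    printf("core-protocol DPI: %.1f\n", xres);  /* one value for everything */
    XCloseDisplay(dpy);
    return 0;
}

Since it's one value for the whole display, no per-monitor information
survives once it's set.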
Which is why environments such as GNOME, which (currently?) ignore
individual physical device characteristics, should at least allow the
selection of a preferred output (if currently plugged in) from which to
set the effective DPI for GUI scaling. What would be much better would
be a categorization system via udev which could provide a hint for how
to scale a GUI on a given output device; toolkits and WMs could then
intelligently apply appropriate default behaviour according to pixel
density and typical usage. We're probably not going to see anything
like this in GNOME unless there's a change of policy with respect to
configurability. I'm not specifically picking on GNOME; although it has
made a conscious move away from the more traditional X display DPI
concept (preferring the fixed 96 DPI model), the other DEs, as far as I
know, aren't offering anything superior.
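To make the udev idea a bit more concrete, here's a purely hypothetical
sketch (the connector name "card0-eDP-1" and the property name
"XORG_OUTPUT_SCALE_HINT" are invented for illustration; nothing sets
such a property today) of how a toolkit could read a scaling hint set
by a udev rule or hwdb entry, via libudev:

/* Hypothetical sketch only: read an invented scaling-hint property from
 * a DRM connector device via libudev.  The property would have to be
 * set by a udev rule or hwdb entry as part of the categorization scheme
 * suggested above.  Build with: cc scale_hint.c -ludev */
#include <stdio.h>
#include <libudev.h>

int main(void)
{
    struct udev *udev = udev_new();
    if (!udev)
        return 1;

    /* "card0-eDP-1" is just an example connector name. */
    struct udev_device *dev =
        udev_device_new_from_subsystem_sysname(udev, "drm", "card0-eDP-1");
    if (dev) {
        const char *hint =
            udev_device_get_property_value(dev, "XORG_OUTPUT_SCALE_HINT");
        printf("scale hint: %s\n", hint ? hint : "(none set)");
        udev_device_unref(dev);
    }
    udev_unref(udev);
    return 0;
}

The interesting part wouldn't be the lookup but the matching hwdb/udev
rules, which is where the categorization by pixel density and typical
usage would actually live.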
This isn't intended as a flame; I'd just like to inspire some thought
about how a dynamic multi-output graphical environment could be
implemented or evolved from our current crop of X desktops and mobile
UIs.