[PATCH 0/2] Support for high DPI outputs via scaling

Todd Showalter todd at electronjump.com
Wed May 8 07:43:49 PDT 2013


On Wed, May 8, 2013 at 6:51 AM,  <alexl at redhat.com> wrote:

> I'm working on trying to make high DPI (i.e. retina-class) outputs
> work well on Linux. I've written a proposal here:
>
> https://docs.google.com/document/d/1rvtiZb_Sm9C9718IoYQgnpzkirdl-wJZBBu_qLgaYyY/edit?usp=sharing

    I'm dubious about handling things this way.  This is what gets
done in iOS and OSX, and it's what Microsoft tried decades ago; at one
point in the win16/win32 era you were supposed to do everything in
twips and mickeys (a twip was a device-independent unit of 1/20th of a
point, i.e. 1/1440 of an inch; a mickey was a unit of mouse
movement...).

    There are problems with this.

    We're moving to a "HiDPI" world.  With 3DTV failing in the market,
it's looking like the TV manufacturers are going to push hard on 4K,
and once 4K panels are widely available they'll work their way down to
PCs fairly quickly.  By the time Wayland is moving into mainstream
adoption, if 4K isn't the standard yet for 20" - 24" monitors it will
be imminent.

    We're at that point in iOS now; with the exception of legacy
hardware, the iPad Mini and the (presumably soon to be discontinued)
iPad 2, everything is "retina" and has a 2.0 scale factor.  We're
probably less than a year away from the end of Apple selling any iOS
devices with a 1.0 scale factor, and at most two to three years from
the point where supporting 1.0 scale devices stops being a
consideration for developers.  Apple needed something to bridge the
gap between the devices, but that gap is just about behind us now, and
the scale factor is going to remain a minor programming wart for a
long time to come; something that trips up new developers.

    Windows blew it with their DPI adjustments; in my experience,
changing the DPI significantly on Windows (up to and including Win7)
breaks things.  Even OS tools render incorrectly, and I've often run
into games that don't handle mouse positions properly when the DPI is
changed.  If you run Stardock's Fallen Enchantress at double DPI, for
example, the mouse pointer can move across the entire screen, but as
far as the game is concerned the pointer's actual location is scaled
by 0.5, constraining it to the upper-left quarter of the screen.  This
is not an uncommon problem.

    More fundamentally, "HiDPI" doesn't really capture the nature of
the problem.  As an example, I have a living room PC I built mostly
for gaming.  It's hooked up to a 46" HDMI TV, and according to my
measuring tape I'm sitting about 10' from it (sorry for the
furlong/firkin/fortnight measurement system, but it seems like metric
hasn't made it to monitor discussions yet).

    The rough standard for monitors these days is a 22" monitor viewed
from an 18" distance.  Ish.  YMMV.  But that's roughly what most
people do.  Most monitors these days are 1920x1080, since that's what
TV panels are, and my TV is no different.  So, similar triangles: a
22" monitor at 18" is the equivalent of a roughly 146" monitor at
120", and conversely my 46" TV at 120" works out to about a 7" monitor
at 18".  The practical effect of this is that "normal" text is nearly
impossible to read.
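
    To put numbers on the similar-triangles math, here's a minimal C
sketch; it's plain geometry rather than anything from the Wayland
protocol, and the helper name equivalent_size() is made up purely for
illustration:

    #include <stdio.h>

    /* Diagonal (inches) a screen at 'dist' needs in order to cover the
     * same visual angle as a 'ref_size' screen viewed from 'ref_dist'.
     * All measurements in inches; plain similar-triangles scaling. */
    static double equivalent_size(double ref_size, double ref_dist,
                                  double dist)
    {
        return ref_size * (dist / ref_dist);
    }

    int main(void)
    {
        /* A 22" monitor at 18", pushed out to a 120" (10') viewing
         * distance; prints ~146.7. */
        printf("%.1f\"\n", equivalent_size(22.0, 18.0, 120.0));

        /* The other way around: what a 46" TV at 120" amounts to at
         * arm's length; prints ~6.9. */
        printf("%.1f\"\n", equivalent_size(46.0, 120.0, 18.0));
        return 0;
    }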

    The problem is entirely about the angle subtended by a pixel from
the position of your eye. If you aren't thinking about it in those
terms, you're misunderstanding the problem.  Oddly, Apple got it right
in their *marketing* for the retina iPhone, but they blew it in
implementation; if I plug a mac mini into my TV, it still assumes it's
a 22" monitor I'm sitting 18" away from, and there's no way I know of
to convince it otherwise.  It's all but unusable as a living room PC
unless you have a crazy-big TV, are sitting way too close, or are part
falcon.

    The angle subtended by a pixel to the viewer is the problem.
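
    Here's a minimal sketch of that angle, again just geometry; the
function name pixel_angle_arcmin() and the 96dpi "typical desktop"
figure are assumptions for illustration, not anything from the
proposal:

    #include <math.h>
    #include <stdio.h>

    /* Visual angle, in arc-minutes, subtended by one pixel of a 'dpi'
     * display viewed from 'distance' inches away. */
    static double pixel_angle_arcmin(double dpi, double distance)
    {
        double pitch = 1.0 / dpi;  /* pixel size in inches */
        return 2.0 * atan(pitch / (2.0 * distance)) * (180.0 / M_PI) * 60.0;
    }

    int main(void)
    {
        /* A ~96dpi desktop monitor at 18" (prints ~1.99) vs. a 48dpi
         * 46" 1920x1080 TV at 120" (prints ~0.60). */
        printf("desktop: %.2f arcmin/pixel\n", pixel_angle_arcmin(96.0, 18.0));
        printf("tv:      %.2f arcmin/pixel\n", pixel_angle_arcmin(48.0, 120.0));
        return 0;
    }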

    The reason this matters is that you can't know what that angle
*is*, outside of specialized environments.  My 46" TV is 48dpi, and
that could presumably be determined by EDID (assuming Sony didn't
screw the table up), but there's no way for the computer or TV to know
how far away from it I'm sitting, and that's a critical variable; at
the distance I sit, a pixel subtends only about 15% of the angle it
would if I were sitting at 18".
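
    (Running the pixel_angle_arcmin() sketch above with those numbers:
48dpi at 120" works out to roughly 0.6 arc-minutes per pixel, versus
roughly 4.0 arc-minutes for the same panel at 18", which is where the
~15% figure comes from; the assumed 96dpi desktop monitor at 18" lands
around 2.0.)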

    This is one of the major flaws with the OSX/iOS approach; they
went for convenience (an integer scale factor based partly on DPI and
partly on screen size) rather than solving the real problem.  So we've
got the 160dpi iPad Mini displaying the same GUI as the 130dpi iPad 2,
and we've got TVs displaying pixel-for-pixel the same GUI as desktop
displays.

    Ultimately, the answer may just be to do what you're planning and
make sure that there's some sort of simple tool to let the user set
their view distance.  Whatever we do, though, it needs to deal with
this problem.

                                       Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.

