[PATCH 0/2] Support for high DPI outputs via scaling

Todd Showalter todd at electronjump.com
Wed May 8 09:06:45 PDT 2013


On Wed, May 8, 2013 at 11:07 AM, Alexander Larsson <alexl at redhat.com> wrote:

>>     We're moving to a "HiDPI" world.  With 3DTV failing in the market,
>> it's looking like the TV manufacturers are going to push hard on 4K,
>> and once 4K panels are widely available they'll work their way down to
>> PCs fairly quickly.  By the time Wayland is moving into mainstream
>> adoption, if 4K isn't the standard yet for 20" - 24" monitors it will
>> be imminent.
>
> I disagree, external monitors will not be commonly HiDPI for a long
> time, so having a mix of "Hi" and "Low" DPI monitors will be common for
> a long time.

    Manufacturers want standards; once they move to making 4K TV
panels, they're going to want to drop 1080p panels as quickly as they
can.  There might be a year or two of transition, and obviously
there's the legacy hardware that people actually own, but have you
tried to buy a monitor that isn't 1080p lately?  It's still possible,
and if you're willing to pay a premium or do a lot of looking you can
get other resolutions, but if you just walk into a computer shop
looking for a screen, it's a sea of 22" to 24" 1080p monitors.

    Sure, it'll be a mix for a while, but monitors age out pretty
fast.  How many CRT monitors do you know of that are in active use?
It hasn't been a decade since CRTs were dominant and LCDs were an
expensive niche product.  We've still got some tube TVs kicking around
the office (a Commodore 1702 and a couple of Trinitrons), but I don't
think they've even been plugged in for the past four years.  Except
for the Commodore, it's only inertia and the possibility that one of
the LCDs might fail that has kept them out of recycling.

    Laptops age out fast too, and the screen density on those is
climbing as smartphones push panel densities higher.  Do you know
anyone running a laptop that's more than six or seven years old?  If
you do, either they're running it plugged in all the time, or else
they've probably spent more money on replacement batteries than it
would have cost to buy a new laptop.  Battery longevity is still
dismal.

    I'll give you projectors, though.

> I don't think we can assume everything is high DPI for a loong time.
> There is both the external monitor case and old/non-Apple hardware to
> consider. So apps and things like themes must specify coordinates in a
> scaled coordinate space, or things will just not work on HiDPI displays
> (I have one, it's unusable without it). And, for various technical as
> well as visual reasons (non-aligned/integer-width borders in e.g.
> buttons look *really* bad), we need this scaling factor to be an
> integer.

    The visual problems can be greatly reduced if you floorf() your
scaled coordinates for geometric elements and make sure you're
rendering text at the appropriate DPI.  I've done a fair amount of
scalable GUI work in games, and just doing those two things makes a
world of difference.
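    To be concrete, the trick is to floor the scaled edges rather than
the origin and size separately.  A sketch in C (the rect type and the
scale parameter are made up for illustration, not from any particular
toolkit):

    #include <math.h>

    struct rect {
        float x, y, w, h;
    };

    /* Snap a scaled rectangle to the pixel grid.  Flooring the scaled
     * edges (rather than flooring origin and size independently)
     * keeps adjacent elements butted together with no hairline gaps. */
    static struct rect
    scale_rect(struct rect r, float scale)
    {
        struct rect out;

        out.x = floorf(r.x * scale);
        out.y = floorf(r.y * scale);
        out.w = floorf((r.x + r.w) * scale) - out.x;
        out.h = floorf((r.y + r.h) * scale) - out.y;
        return out;
    }

With edges floored like that, a nominal 1-pixel border at 1.5x comes
out as a whole 1 or 2 pixels instead of a blurry pixel and a half.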

    That said, I'm not arguing against integer scaling so much as I'm
arguing that without care we wind up in the place where iOS is
heading, where 1:1 scale is only useful on hardware nobody has any
more, and everything else deals with scaled/quantized coordinate
systems.

> This is due to the way Microsoft handled this, forcing apps to do the
> scaling. This solution is completely different. All input will be
> pre-scaled and appear in "lodpi", and upscaling will be handled
> automatically by Wayland and/or the toolkit.

    That solves the pilot error problem; the main problem on Windows,
really, is that nobody thinks to test their software at other DPI
settings.

    Is there a way to bypass the scaling?  When I'm using the mouse in
a game, I want to know the pixel position of the pointer, with no
scaling applied.  I'm going to be drawing the GUI at native res, and I
want the mouse to interact with it at native res.
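    Failing that, a game winds up doing the inverse scaling itself.
Something like this sketch, assuming an integer output scale is
queryable (the function and names here are invented, not actual
Wayland API):

    /* Recover native pixel coordinates from pre-scaled "lodpi"
     * pointer events.  This is only exact if the compositor's scale
     * factor really is the integer we multiply by; any sub-pixel
     * precision the compositor discarded is gone for good. */
    static void
    pointer_to_pixels(double surface_x, double surface_y,
                      int output_scale, int *pixel_x, int *pixel_y)
    {
        *pixel_x = (int)(surface_x * (double)output_scale);
        *pixel_y = (int)(surface_y * (double)output_scale);
    }

Even then, a real protocol-level bypass would be better; if input is
delivered pre-scaled and quantized, multiplying back up can't recover
fractional pixel positions.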

>>     The angle subtended by a pixel to the viewer is the problem.
>
> It's true that the monitor DPI itself doesn't necessarily map directly to
> whether the monitor is HiDPI or not. We may very well need to scale the
> UI up on your 46" TV, but that can be done as well with buffer scaling,
> and it will work automatically with all apps, and scale
> icons/borders/padding, etc as well as fonts.

    This is my point; the OS X solution is to assume a DPI based on
resolution and screen size, and that's clearly insufficient to solve
the problem.  You can quantize the scale to integer pixel multiples if
you like, but I think there needs to be a system (even if it's just a
simple tool somewhere) that lets the user say "I'm sitting 1.3 meters
away from this display", and does the subtended angle math to decide
what the candidate scale factors should be.
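    The math itself is trivial.  A rough sketch, where the reference
point (a 96 DPI pixel viewed from 0.6 m) is just my assumption for
illustration:

    #include <math.h>

    /* Angle one pixel subtends at the viewer's eye, in arcminutes. */
    static double
    arcmin_per_pixel(double pixel_pitch_m, double distance_m)
    {
        return 2.0 * atan(pixel_pitch_m / (2.0 * distance_m))
               * (180.0 / M_PI) * 60.0;
    }

    /* Pick an integer scale factor that makes one scaled pixel
     * subtend roughly the same angle as the reference pixel. */
    static int
    candidate_scale(double panel_width_m, int width_px,
                    double distance_m)
    {
        double pitch = panel_width_m / width_px;
        double ref = arcmin_per_pixel(0.0254 / 96.0, 0.6);
        double ratio = ref / arcmin_per_pixel(pitch, distance_m);
        int scale = (int)(ratio + 0.5);

        return scale < 1 ? 1 : scale;
    }

Run the numbers and a 24" 4K panel at desk distance comes out at 2x,
while a 46" 1080p TV wants 1x when you sit 1.3 meters away but 2x from
a three-meter couch; that last case is exactly the one a bare DPI
number gets wrong.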

    That info should also be directly available to applications, so
(especially in the case of things like games and image editing
programs) they can make judgements about how to size display elements
that aren't part of the GUI toolkit: things like displaying documents
or images at "actual size", or games trying to decide how large they
should render their GUI text.
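    For the "actual size" case, what the app really needs is the
native resolution plus the physical panel dimensions.  A sketch (the
struct and field names are invented):

    struct output_info {
        int width_px;       /* native horizontal resolution */
        double width_mm;    /* physical panel width */
    };

    /* How many native pixels cover a given physical width on screen. */
    static int
    mm_to_pixels(const struct output_info *out, double mm)
    {
        return (int)(mm * (double)out->width_px / out->width_mm + 0.5);
    }

An A4 page at actual size is then just mm_to_pixels(&out, 210.0)
wide, with no guessing about what the compositor's scale factor means
physically.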

                                       Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.

