DPI and screen resolution on OS X
quikee at gmail.com
Fri Feb 5 10:29:50 UTC 2016
On Fri, Feb 5, 2016 at 2:02 AM, Chris Sherlock
<chris.sherlock79 at gmail.com> wrote:
> There were a series of patches that handled hi-DPI displays in 2014 that Keith did for us and that were pushed by Kendy:
Yes, and I continued work on those patches later, once I got a laptop
with a HiDPI screen running Fedora.
>> I've just pushed a backport of the hi-dpi patches from master to gerrit
>> for libreoffice-4-2 integration - as was requested earlier, to fix the
>> unfortunate state of LibreOffice on the hi-dpi displays. It is the
>> following 5 patches (order is important):
>> Keith confirmed that they fix the hi-dpi issues he was seeing in
>> LibreOffice 4.2.
>> They are supposed to be safe for normal displays; that is, anything
>> non-safe should be enclosed in an "if (mnDPIScaleFactor > 1)". A few
>> cases make the computation a bit more general, like:
>> + long yOffset = (aRect.GetHeight() - mpImpl->maImage.GetSizePixel().Height()) / 2;
>> if( mpImpl->mnState == SIGNATURESTATE_SIGNATURES_OK )
>> - ++aRect.Top();
>> + aRect.Top() += yOffset;
> I’m wondering if this is the area I should focus on.
> I’m not entirely sure how the scaling factor is being worked out, we seem to do this in Window::ImplInit and Window::ImplInitResolutionSettings with the following calculation:
> mnDPIScaleFactor = std::max(1, (mpWindowImpl->mpFrameData->mnDPIY + 48) / 96);
> Does anyone know what the underlying theory is behind this calculation? 96 seems to be a hardcoded DPI value assumed for all screens, but I can’t quite work out where the 48 number comes from…
You need the scale factor for bitmaps (icons) and a few other places in
the UI (mostly the places where we draw 1-pixel lines), because when
you increase the DPI everything becomes larger in pixel terms, but
bitmaps stay the same size. As you approach 192 DPI (2 * 96), we
increase the scaling factor and scale the bitmaps by it. The 48 is
there only so that we start scaling before we hit 192 DPI - at
around 144 DPI (however, in the latest code this starts at 169 DPI).
OS X is however excluded from this - it does its scaling in the
backend AFAIK. (see Window::CountDPIScaleFactor)
> I’m also wondering if it might not be better for us to move this calculation out of Window and into SalGraphics, given it is the SalGraphics backend that really gives the DPI via GetResolution.
Yes, I guess it would be better to do it in the backend.
> Another thing is: we seem to have this idea of logical coordinates, as opposed to device coordinates all through OutputDevice, and also there is a way of setting the OutputDevice mapmode. I’ve never quite understood what the idea behind this is. Can anyone give me any insights into this?
The OutputDevice backends work only with pixels - you can set the map
mode to a logical unit, and then all inputs can be given in those
logical coordinates; OutputDevice will automatically convert them to
pixels. I don't like this, however. I think it doesn't belong in
OutputDevice and just adds bloat - if we need something like this,
it should be a wrapper around OutputDevice that does the conversion.
> P.S. I’ve just checked my Mac and the default scaling option is indeed lower than what I was expecting - the default on my Mac with a Retina screen is 2560x1440.
> Hint for OS X developers: to change actual screen resolutions, you need to:
> 1. Go to System Preferences
> 2. Go into the Display panel
> 3. In the Display tab, hold down the option key on your keyboard and click on the Scaled radio option
> This will give you the chance to set the actual screen resolution, as opposed to the more limited graphical option that OS X gives you.