Pixels Per Inch needs to be standardized
Mattias Andrée
maandree at member.fsf.org
Wed May 4 17:45:30 UTC 2016
On Wed, 4 May 2016 19:01:09 +0200
Alberto Salvia Novella <es20490446e at gmail.com> wrote:
> Mattias Andrée:
> > What's wrong with dots per inch?
>
> How can an application reliably know which is the current
> pixel density of the desktop?
>
>
Well, you cannot know anything reliably. The EDID
does contain all the information you need for DPI,
but with limited precision. X.org reports a bogus DPI.
But if we pretend that all monitors' dimensions are in
whole centimetres, then the number of pixels per
centimetre can be calculated:
ppc_x = output_width_px(monitor) / output_width_cm(monitor);
ppc_y = output_height_px(monitor) / output_height_cm(monitor);
Notice that this is easier to calculate than the pixels per inch.
ppi_x = output_width_px(monitor) / output_width_cm(monitor) * 2.540;
ppi_y = output_height_px(monitor) / output_height_cm(monitor) * 2.540;
But why are pixels preferred over dots?