Re: Pixels Per Inch needs to be standardized
Jasper St. Pierre
jstpierre at mecheye.net
Wed May 4 18:12:38 UTC 2016
What are the dimensions of a projector, whose pixels-per-inch or
dots-per-inch value is a function of how far away the projector is
from the wall, or, in a keystoned case, isn't even constant across
the display? For limited scenarios, you can make it work (with caution).
But we cannot calculate a sensible DPI value in the general case.
On Wed, May 4, 2016 at 10:45 AM, Mattias Andrée <maandree at member.fsf.org> wrote:
> On Wed, 4 May 2016 19:01:09 +0200
> Alberto Salvia Novella <es20490446e at gmail.com> wrote:
>> Mattias Andrée:
>> > What's wrong with dots per inch?
>> How can an application reliably know which is the current
>> pixel density of the desktop?
> Well, you cannot know anything reliably. The EDID
> does contain all the information you need for DPI,
> though with limited precision. X.org reports a bogus
> DPI. But if we pretend that all monitors' dimensions
> are in whole centimetres, then the number of pixels
> per centimetre can be calculated:
> ppc_x = output_width_px(monitor) / output_width_cm(monitor);
> ppc_y = output_height_px(monitor) / output_height_cm(monitor);
> Notice that this is easier to calculate than the pixels per inch:
> ppi_x = output_width_px(monitor) / output_width_cm(monitor) * 2.540;
> ppi_y = output_height_px(monitor) / output_height_cm(monitor) * 2.540;
> But why are pixels preferred over dots?