[PATCH 2/2] protocol: Support scaled outputs and surfaces

Alexander Larsson alexl at redhat.com
Thu May 23 11:51:16 PDT 2013


> What if a client sets scale=0?

I guess we should forbid that, as it risks division by zero.
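
Something like this on the compositor side, say. (A rough sketch only: the
my_surface type is a stand-in for the compositor's surface struct, and the
invalid_scale error code is something we'd have to add to wl_surface.)

  #include <wayland-server.h>

  struct my_surface {
      int32_t pending_buffer_scale;
  };

  static void
  surface_set_buffer_scale(struct wl_client *client,
                           struct wl_resource *resource, int32_t scale)
  {
      struct my_surface *surface = wl_resource_get_user_data(resource);

      if (scale < 1) {
          /* invalid_scale would be a new wl_surface error code */
          wl_resource_post_error(resource, WL_SURFACE_ERROR_INVALID_SCALE,
                                 "buffer scale %d is not >= 1", scale);
          return;
      }

      surface->pending_buffer_scale = scale;
  }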

> Maybe the scale should also be signed here? I think all sizes are
> signed, too, even though a negative size does not make sense. We seem
> to have a convention, that numbers you compute with are signed, and
> enums and flags and bitfields and handles and such are unsigned. And
> timestamps, since there we need the overflow behaviour. I
> believe it's due to the C promotion or implicit cast rules more than
> anything else.

Yeah, we should change it to signed.
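
The failure mode with an unsigned scale is the usual C promotion one.
A minimal illustration (assuming 32-bit int):

  #include <stdio.h>

  int main(void)
  {
      int x = -10;        /* a computed size/coordinate, may go negative */
      unsigned scale = 2; /* scale left as unsigned */

      /* x is implicitly converted to unsigned before the division,
       * so this prints 2147483643 instead of -5 */
      printf("%u\n", x / scale);

      /* with a signed scale we get the expected -5 */
      printf("%d\n", x / (int)scale);

      return 0;
  }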

> > @@ -1548,6 +1596,8 @@
> >  	     summary="indicates this is the current mode"/>
> >        <entry name="preferred" value="0x2"
> >  	     summary="indicates this is the preferred mode"/>
> > +      <entry name="scaled" value="0x4"
> > +	     summary="indicates that this is a scaled mode"/>
> 
> What do we need the "scaled" flag for? And what does this flag mean?
> How is it used? I mean, can we get duplicate native modes that differ
> only by the scaled flag?
> 
> Unfortunately I didn't get to answer that thread before, but I had some
> disagreements or misunderstandings there.

Yeah, this is the area of the scaling stuff that is least baked. 

Right now what happens is that the modes get listed at the scaled resolution
(i.e. divided by two, etc.), and each such scaled mode gets reported with a
bit set so that clients can tell it is not the native size. However, this
doesn't seem quite right for a few reasons:

* We don't report rotated/flipped modes, nor do we swap the width/height
  for these, so this is inconsistent.
* The clients can tell what the scale is anyway, so what use is it? (See
  the sketch below.)
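
For instance, a client can already reconstruct the native size by itself.
A sketch against the done and scale events from this series (the listener
hookup via wl_output_add_listener is elided):

  #include <stdio.h>
  #include <wayland-client.h>

  struct output_info {
      int32_t scale;          /* defaults to 1 until a scale event arrives */
      int32_t mode_w, mode_h; /* as advertised, already divided by scale */
  };

  static void
  output_mode(void *data, struct wl_output *output, uint32_t flags,
              int32_t width, int32_t height, int32_t refresh)
  {
      struct output_info *info = data;

      if (flags & WL_OUTPUT_MODE_CURRENT) {
          info->mode_w = width;
          info->mode_h = height;
      }
  }

  static void
  output_scale(void *data, struct wl_output *output, int32_t factor)
  {
      struct output_info *info = data;

      info->scale = factor;
  }

  static void
  output_done(void *data, struct wl_output *output)
  {
      struct output_info *info = data;

      /* the native size is just mode * scale, so the "scaled" bit
       * carries no information the client doesn't already have */
      printf("native mode: %dx%d\n",
             info->mode_w * info->scale,
             info->mode_h * info->scale);
  }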

However, listing the unscaled resolution for the modes is also somewhat
problematic. For instance, if we listed the raw modes and an app wanted
to go fullscreen in a mode, it would need to create a surface of the scaled
width/height (with the right scale), as otherwise the buffer size would not
match the scanout size.

For instance, if the output scale is 2 and there is an 800x600 native mode,
then the app should use a 400x300 surface with an 800x600 buffer and a
buffer_scale of 2.
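
Roughly like this (an untested sketch: create_buffer() stands in for
whatever wl_shm or EGL path the client uses to allocate the buffer, and
set_buffer_scale is the new request from this series):

  #include <wayland-client.h>

  /* stand-in for the client's buffer allocation */
  extern struct wl_buffer *create_buffer(int width, int height);

  static void
  go_fullscreen_scaled(struct wl_surface *surface,
                       struct wl_shell_surface *shell_surface,
                       struct wl_output *output)
  {
      /* the buffer holds the full 800x600 native mode */
      struct wl_buffer *buffer = create_buffer(800, 600);

      wl_surface_attach(surface, buffer, 0, 0);

      /* with buffer_scale 2 each surface unit covers 2x2 buffer pixels,
       * so the surface is 400x300 in the compositor's coordinate space
       * while the buffer still matches the 800x600 scanout size */
      wl_surface_set_buffer_scale(surface, 2);
      wl_surface_damage(surface, 0, 0, 400, 300);

      wl_shell_surface_set_fullscreen(shell_surface,
                                      WL_SHELL_SURFACE_FULLSCREEN_METHOD_DEFAULT,
                                      0, output);
      wl_surface_commit(surface);
  }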

Hmmm, I guess if the app used an 800x600 surface with buffer scale 1 we could
still scan out from it, although we'd have to be very careful about how we
treat input and pointer position then, as it's not quite the same.

I'll have a look at changing this.

