[Interest] qt5 window setGeometry and move do not work on the wayland platform

Rutledge Shawn Shawn.Rutledge at digia.com
Mon Aug 11 06:12:48 PDT 2014


On 11 Aug 2014, at 12:57 PM, Giulio Camuffo wrote:

> 2014-08-11 13:29 GMT+03:00 Rutledge Shawn <Shawn.Rutledge at digia.com>:
>> 
>> On 11 Aug 2014, at 11:34 AM, Giulio Camuffo wrote:
>> 
>>> 2014-08-11 12:20 GMT+03:00 Rutledge Shawn <Shawn.Rutledge at digia.com>:
>>>> 
>>>> On 11 Aug 2014, at 9:10 AM, Pier Luigi wrote:
>>>> (top-posting fixed)
>>>>> 2014-08-11 8:13 GMT+02:00 Steve (YiLiang) Zhou <szhou at telecomsys.com>:
>>>>>> Dear all,
>>>>>> 
>>>>>> My app has a main window and a QDialog which is a child of the main window,
>>>>>> and I want to move the app to position 0,0.
>>>>>> 
>>>>>> I used both setGeometry and move to 0,0. No luck, both failed. The window’s
>>>>>> position is not fixed and may appear anywhere on the screen.
>>>> 
>>>> I was wondering about that too.  I understand that it's generally good policy to leave positioning of generic windows up to the window manager, but sometimes you want to write a dock or taskbar which anchors itself to screen edges, and can animate in and out of view; or a splash screen which is centered on one screen.  What is the right way to do that on Wayland?
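For concreteness, here is a rough sketch of the kind of self-positioning a dock (or the setGeometry/move calls in the original question) relies on, using plain QWidget/QScreen calls; the file and widget names are made up. On X11 the requested geometry is honoured, while on the wayland platform the compositor decides where the window ends up:

// dock_sketch.cpp - illustrative only: a dock-like window that tries to
// anchor itself to the top edge of the primary screen.
#include <QApplication>
#include <QWidget>
#include <QScreen>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QWidget dock;
    dock.setWindowFlags(Qt::FramelessWindowHint);

    // Anchor to the top edge of the primary screen: full width, 40 px tall.
    // On X11 this geometry is applied; on Wayland the requested position
    // is not honoured and the compositor places the window as it sees fit.
    const QRect screenRect = QGuiApplication::primaryScreen()->geometry();
    dock.setGeometry(screenRect.x(), screenRect.y(), screenRect.width(), 40);

    dock.show();
    // Moving after show() makes no difference on the wayland platform either.
    dock.move(screenRect.topLeft());

    return app.exec();
}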
>>> 
>>> The right way is to have a protocol designed for that. A taskbar
>>> should use some taskbar_protocol with a request like
>>> put_on_edge(edge), and the compositor will then move the surface to
>>> the edge and apply a slide-in/out or whatever effect it wants.
>> 
>> I understand the advantage of taking a higher-level approach.  But then someone will think of a use case for which the scenario-specific protocol doesn't suffice.  If windows could move themselves, it might be more flexible.  It may be too low-level, but it's hard to think of any other protocol that is universal enough, which I suppose is why it's not standardized.
> 
> The problem is that windows don't always have a meaningful position.
> If a window is shown on two outputs at the same time, one of which
> may be a remote one, what is the window's position?

On X11 (and other window systems) all outputs are mapped into the "virtual desktop" space, side-by-side or overlapping or whatever, so that there is a unified coordinate system.  Does that assumption not hold on Wayland?
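For what it's worth, here is a small sketch of how that unified coordinate space shows up in the QScreen API (ordinary Qt calls, nothing Wayland-specific): on X11 each screen's geometry() is positioned inside the shared virtualGeometry(), and the question is whether Wayland has an equivalent notion at all.

// screens_sketch.cpp - print how each output is placed inside the shared
// "virtual desktop" coordinate space that X11 assumes.
#include <QGuiApplication>
#include <QScreen>
#include <QDebug>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    const auto screens = QGuiApplication::screens();
    for (const QScreen *screen : screens) {
        qDebug() << screen->name()
                 << "geometry in virtual desktop:" << screen->geometry()
                 << "whole virtual desktop:" << screen->virtualGeometry();
    }
    return 0;
}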

> And what is the
> position of a window rotated 45 degrees?

Something could be made up; perhaps the position should always be the centroid instead of the upper-left corner (although in other use cases that would be less convenient)?  Rotation doesn't make sense without a center of rotation either.

>> What about when a window provides its own "skinned" window decorations: there will probably be some area in which you can drag to move the window, as you normally can on the titlebar.  Is there another protocol for that?  How would that be different from a generic protocol which windows could use to position themselves?
> 
> wl_shell_surface/xdg_surface have a "move" request. The clients call
> that and then the compositor actually does the moving.

So interactive moving only, with no way to programmatically request that a window be moved by some delta.
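For concreteness, the pattern such skinned windows have traditionally used is to track the press position and call move() with the delta on every mouse move; a rough sketch follows (plain Qt widgets, the class name is made up). Those per-delta move() calls are exactly the programmatic repositioning that has no Wayland equivalent, which is presumably why the compositor-driven "move" request exists instead.

// drag_move_sketch.cpp - illustrative only: the classic hand-rolled
// "drag anywhere to move" pattern used by windows with custom ("skinned")
// decorations. On the wayland platform the move() calls have no effect.
#include <QApplication>
#include <QWidget>
#include <QMouseEvent>

class SkinnedWindow : public QWidget
{
public:
    SkinnedWindow()
    {
        setWindowFlags(Qt::FramelessWindowHint);
        resize(300, 200);
    }

protected:
    void mousePressEvent(QMouseEvent *event) override
    {
        // Remember where inside the frame the press happened.
        if (event->button() == Qt::LeftButton)
            m_pressOffset = event->globalPos() - frameGeometry().topLeft();
    }

    void mouseMoveEvent(QMouseEvent *event) override
    {
        // Reposition the window by the drag delta; ignored on Wayland.
        if (event->buttons() & Qt::LeftButton)
            move(event->globalPos() - m_pressOffset);
    }

private:
    QPoint m_pressOffset;
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    SkinnedWindow w;
    w.show();
    return app.exec();
}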

