[PATCH 1/1] randr: make RROutputChanged change the main protocol screen not the gpu screen
Alberto Milone
alberto.milone at canonical.com
Mon Dec 9 01:52:34 PST 2013
On 09/12/13 03:53, Dave Airlie wrote:
> On Sun, Dec 8, 2013 at 9:08 PM, Alberto Milone
> <alberto.milone at canonical.com> wrote:
>> We only set changes on the main protocol screen, as, for example,
>> in RRSetChanged() and RRTellChanged(); therefore we should follow
>> the same logic when reporting that an output changed in
>> RROutputChanged().
>>
>> This means that RRTellChanged() will then update the relevant
>> timestamps also when events come from gpu screens.
>
> This seems sane; is RROutput the only one to suffer this, though?
>
> Do we need to do this for other ones?
>
> Dave.
>
I've had another look at the code, specifically the part responsible
for sending the events that the protocol exposes.
In TellChanged() we look for crtc, output, and provider events from
gpu screens, and the relevant delivery functions (RRDeliverCrtcEvent,
etc.) all use pWin->drawable.pScreen, so this looks correct.
When we call RRDeliverPropertyEvent, I think we always use the main
protocol screen, as per RRChangeOutputProperty, ProcRRGetOutputProperty,
and RRDeleteProperty.
The one thing I'm not sure about is RRResourcesChanged, which is
called by RRCrtcCreate, RRCrtcDestroyResource, RROutputCreate,
RROutputDestroyResource, xf86platformAddDevice, and
xf86platformRemoveDevice. This should be fine, though: the protocol
doesn't mention any RRResourceChangeNotify events, and I wouldn't
expect any clients to use it.
To sum it up, I think things are looking good now, unless there's
something obvious that I'm missing.
Cheers,
--
Alberto Milone
Software Engineer
Hardware Enablement Team
Professional and Engineering Services
More information about the xorg-devel mailing list