[PATCH v2 0/3] Experimental freesync video mode optimization
Christian König
ckoenig.leichtzumerken at gmail.com
Fri Dec 11 10:28:36 UTC 2020
Am 11.12.20 um 10:55 schrieb Pekka Paalanen:
> On Fri, 11 Dec 2020 09:56:07 +0530
> Shashank Sharma <shashank.sharma at amd.com> wrote:
>
>> Hello Simon,
>>
>> Hope you are doing well,
>>
>> I was helping out Aurabindo and the team with the design, so I have
>> taken the liberty of adding some comments on behalf of the team,
>> Inline.
>>
>> On 11/12/20 3:31 am, Simon Ser wrote:
>>> Hi,
>>>
>>> (CC dri-devel, Pekka and Martin who might be interested in this as
>>> well.)
> Thanks for the Cc! This is very interesting to me, and because it was
> not Cc'd to dri-devel@ originally, I would have missed this otherwise.
>
>>> On Thursday, December 10th, 2020 at 7:48 PM, Aurabindo Pillai <aurabindo.pillai at amd.com> wrote:
>>>
>>>> This patchset enables freesync video mode usecase where the userspace
>>>> can request a freesync compatible video mode such that switching to this
>>>> mode does not trigger blanking.
>>>>
>>>> This feature is guarded by a module parameter which is disabled by
>>>> default. Enabling this parameter adds additional modes to the driver's
>>>> mode list, and also enables the optimization to skip a modeset when using
>>>> one of these modes.
>>> Thanks for working on this, it's an interesting feature! However I'd like to
>>> take some time to think about the user-space API for this.
>>>
>>> As I understand it, some new synthetic modes are added, and user-space can
>>> perform a test-only atomic *without* ALLOW_MODESET to figure out whether it can
>>> switch to a mode without blanking the screen.
>> The implementation is along those lines, but a bit different. The idea
>> is to:
>>
>> - check if the monitor supports VRR,
>>
>> - if it does, add some new modes that fall within the VRR tolerance
>> range to the mode list (flagged by the driver),
>>
>> - when a modeset is requested for one of these modes, skip the full
>> modeset and just adjust the front_porch timing.
>>
>> so they are not test-only as such; to any user space these modes
>> are as real as any other probed mode in the list.
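>> To make the front porch idea above concrete, here is a minimal,
>> hypothetical sketch (not the actual patch code; the structure and
>> helper are invented purely for illustration) of deriving such a
>> fixed-rate mode from the preferred mode:
>>
>>     #include <stdbool.h>
>>     #include <stdint.h>
>>
>>     struct simple_mode {
>>         uint32_t clock_khz;             /* pixel clock in kHz */
>>         uint32_t hdisplay, htotal;
>>         uint32_t vdisplay, vsync_start, vsync_end, vtotal;
>>     };
>>
>>     static bool derive_fixed_rate_mode(const struct simple_mode *base,
>>                                        uint32_t target_hz,
>>                                        uint32_t vrr_min_hz,
>>                                        uint32_t vrr_max_hz,
>>                                        struct simple_mode *out)
>>     {
>>         if (target_hz < vrr_min_hz || target_hz > vrr_max_hz)
>>             return false;       /* outside the monitor's VRR window */
>>
>>         /* Pick vtotal so that clock / (htotal * vtotal) == target. */
>>         uint64_t vtotal_new = (uint64_t)base->clock_khz * 1000 /
>>                               ((uint64_t)base->htotal * target_hz);
>>         if (vtotal_new < base->vtotal)
>>             return false;       /* we can only stretch, never shrink */
>>
>>         *out = *base;
>>         /* Shift sync and back porch down as a block, keeping the pixel
>>          * clock, active area and sync width: only the vertical front
>>          * porch grows. */
>>         out->vsync_start += vtotal_new - base->vtotal;
>>         out->vsync_end   += vtotal_new - base->vtotal;
>>         out->vtotal       = vtotal_new;
>>         return true;
>>     }
>>
>> Switching between modes derived this way only changes where the next
>> vertical front porch ends, which is the adjustment VRR-capable hardware
>> can already make on the fly.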
> But is it worth allowing a modeset to be glitch-free if userspace
> does not know it is glitch-free? I think if this is going in, it
> would be really useful to give those guarantees to userspace explicitly,
> and not leave the feature at an "accidentally no glitch, sometimes"
> level.
>
>
> I have been expecting and hoping for the ability to change video mode
> timings without a modeset ever since I learnt that VRR is about
> front-porch adjustment, quite a while ago.
>
> This is how I envision userspace making use of it:
>
> Let us have a Wayland compositor, which uses fixed-frequency video
> modes, because it wants predictable vblank cycles. IOW, it will not
> enable VRR as such.
Well, in general please keep in mind that this is just a short-term
solution for X11 applications.
For things like Wayland we probably want to approach this from a
completely different angle.
> When the Wayland compositor starts, it will choose *some* video mode
> for an output. It may or may not be what a KMS driver calls "preferred
> mode", because it depends on things like user preferences. The
> compositor makes the initial modeset to this mode.
I think the general idea we settled on is that we specify an earliest
display time for each frame and give feedback to the application when a
frame was actually displayed.
This approach should also be able to handle multiple applications with
different refresh rates. E.g. just think of one video playing back at
25 Hz and another at 30 Hz in two windows when the maximum refresh rate
is something like 120 Hz.
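Purely to illustrate the scheduling side of that idea (none of this is
an existing interface, the helper below is invented): with a fixed
120 Hz scanout, both a 25 Hz and a 30 Hz client can be served by
presenting each frame on the first vblank that is not earlier than its
requested display time:

    #include <stdint.h>

    /* period_ns is ~8333333 at 120 Hz; last_vblank_ns is the timestamp
     * of the most recent vblank. */
    static uint64_t pick_vblank_ns(uint64_t earliest_ns,
                                   uint64_t last_vblank_ns,
                                   uint64_t period_ns)
    {
        if (earliest_ns <= last_vblank_ns)
            return last_vblank_ns + period_ns;  /* next vblank */

        /* First vblank at or after the requested display time. */
        uint64_t n = (earliest_ns - last_vblank_ns + period_ns - 1) /
                     period_ns;
        return last_vblank_ns + n * period_ns;
    }

A 30 Hz client then lands on every 4th vblank, and a 25 Hz client on a
repeating 5,5,5,5,4 vblank cadence, without the two interfering with
each other.
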
Regards,
Christian.
>
> Use case 1:
>
> A Wayland client comes up and determines that its window would really
> like a refresh rate of, say, 47.5 Hz. Yes, it's not a traditional video
> player rate, but let's assume the application has its reasons. The
> client tells the compositor this (Wayland protocol still to be designed
> to be able to do that). (Hey, this could be how future games should
> implement refresh rate controls in cooperation with the window system.)
>
> The compositor sees the wish, and according to its complex policy
> rules, determines that yes, it shall try to honor that wish by changing
> the whole output temporarily to 47.5 Hz if possible.
>
> The compositor takes the original video mode it modeset on the output,
> and adjusts the front-porch to create a new custom 47.5 Hz mode. Using
> this mode, the compositor does a TEST_ONLY atomic commit *without*
> ALLOW_MODESET.
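>
> Just to make that flow concrete, a minimal compositor-side sketch using
> libdrm could look like the following; mode_id_prop is assumed to be the
> CRTC's "MODE_ID" property id looked up earlier, and new_mode the custom
> 47.5 Hz drmModeModeInfo built by stretching the front porch:
>
>     #include <xf86drm.h>
>     #include <xf86drmMode.h>
>
>     /* Returns 1 if the timings can be applied without a full modeset. */
>     static int can_switch_without_glitch(int fd, uint32_t crtc_id,
>                                          uint32_t mode_id_prop,
>                                          const drmModeModeInfo *new_mode)
>     {
>         uint32_t blob_id;
>         if (drmModeCreatePropertyBlob(fd, new_mode, sizeof(*new_mode),
>                                       &blob_id))
>             return 0;
>
>         drmModeAtomicReq *req = drmModeAtomicAlloc();
>         if (!req) {
>             drmModeDestroyPropertyBlob(fd, blob_id);
>             return 0;
>         }
>         drmModeAtomicAddProperty(req, crtc_id, mode_id_prop, blob_id);
>
>         /* TEST_ONLY and deliberately *no* DRM_MODE_ATOMIC_ALLOW_MODESET:
>          * success means the timings can be applied without a blanking
>          * modeset. */
>         int ret = drmModeAtomicCommit(fd, req, DRM_MODE_ATOMIC_TEST_ONLY,
>                                       NULL);
>
>         drmModeAtomicFree(req);
>         drmModeDestroyPropertyBlob(fd, blob_id);
>         return ret == 0;
>     }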
>
> If the test commit succeeds, the compositor knows that changing timings
> will not cause any kind of glitch, flicker, blanking, or freeze, and
> proceeds to commit this video mode without ALLOW_MODESET. The whole
> output becomes 47.5 Hz until the compositor policy again determines
> that it is time to change, e.g. to go back to the original mode. Going
> back to the original mode also needs to work without ALLOW_MODESET -
> but a compositor cannot test for this with atomic TEST_ONLY commits.
>
> If the test commit fails, the compositor knows that it cannot change
> the timings like this without risking a visible glitch. Therefore the
> compositor does not change the video mode timings, and the client's
> wish is not granted.
>
> The client adapts to whatever the refresh rate is in any case.
>
> Use case 2:
>
> A client comes up, and starts presenting frames with a target timestamp
> (Wayland protocol for this still to be designed). The compositor
> analyzes the target timestamp, and according to the complex compositor
> policy, determines that it should try to adjust video mode timings to
> better meet the target timestamps.
>
> Like in use case 1, the compositor creates a new custom video mode and
> tests if it can be applied without any glitch. If yes, it is used. If
> not, it is not used.
>
> This use case is more complicated, because the video mode timing
> changes may happen refresh by refresh, which means they need to
> apply for the very next front-porch in the scanout cycle in
> hardware. Hence, I'm not sure this use case is realistic. It can also
> be argued that this is better implemented by just enabling VRR and
> handling the flip timing in userspace, in the compositor: issue an
> atomic flip at the exact time it needs to be executed instead of
> issuing it well in advance and letting the driver wait for vblank.
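>
> For comparison, that VRR-based alternative could be sketched roughly as
> follows (vrr_prop is assumed to be the CRTC's "VRR_ENABLED" property id
> and fb_prop the plane's "FB_ID" property id, both looked up earlier;
> target is an absolute CLOCK_MONOTONIC timestamp):
>
>     #include <time.h>
>     #include <xf86drm.h>
>     #include <xf86drmMode.h>
>
>     static int flip_at(int fd, uint32_t crtc_id, uint32_t plane_id,
>                        uint32_t vrr_prop, uint32_t fb_prop,
>                        uint32_t fb_id, const struct timespec *target)
>     {
>         /* Wait in userspace until the frame is actually due... */
>         clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, target, NULL);
>
>         drmModeAtomicReq *req = drmModeAtomicAlloc();
>         if (!req)
>             return -1;
>         drmModeAtomicAddProperty(req, crtc_id, vrr_prop, 1); /* VRR on */
>         drmModeAtomicAddProperty(req, plane_id, fb_prop, fb_id);
>
>         /* ...then flip right away; with VRR enabled scanout restarts as
>          * soon as the panel's minimum frame time has elapsed instead of
>          * waiting for a fixed vblank. */
>         int ret = drmModeAtomicCommit(fd, req,
>                                       DRM_MODE_ATOMIC_NONBLOCK |
>                                       DRM_MODE_PAGE_FLIP_EVENT, NULL);
>         drmModeAtomicFree(req);
>         return ret;
>     }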
>
>
> Worth noting: neither case needs the kernel to expose new manufactured
> video modes. Whether the feature is available or not is detected by an
> atomic TEST_ONLY commit without ALLOW_MODESET.
>
>>> However the exact modes amdgpu adds are just some guesses. I think it would be
>>> great if user-space could control the min/max refresh rate values directly.
> Setting min==max could be used to achieve the fixed refresh rate
> proposed here, but it would also allow setting custom min < max limits.
> This would be more flexible, but I'm not sure what the use case for it
> could look like... oh, there are the use cases mentioned below: user
> preferences. :-)
>
> Maybe the min/max setting is better than fiddling with custom video
> modes. If we have min/max to control, then there is no problem with
> going back to the "original" video mode like in my example use case 1.
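>
> If such min/max range properties existed (they do not today, so the
> property names below are purely invented), fixing the refresh rate from
> userspace could be as simple as setting min == max:
>
>     #include <xf86drm.h>
>     #include <xf86drmMode.h>
>
>     /* min_prop/max_prop: hypothetical connector property ids for a
>      * "vrr min rate"/"vrr max rate" pair, looked up by name. */
>     static int set_fixed_rate(int fd, uint32_t conn_id,
>                               uint32_t min_prop, uint32_t max_prop,
>                               uint32_t rate_hz)
>     {
>         drmModeAtomicReq *req = drmModeAtomicAlloc();
>         if (!req)
>             return -1;
>         drmModeAtomicAddProperty(req, conn_id, min_prop, rate_hz);
>         drmModeAtomicAddProperty(req, conn_id, max_prop, rate_hz);
>         int ret = drmModeAtomicCommit(fd, req, 0, NULL);
>         drmModeAtomicFree(req);
>         return ret;
>     }
>
> Restoring the monitor's full advertised VRR range afterwards would just
> set the two values back to the EDID-derived limits.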
>
>>> Not only would this remove the need for the kernel to hard-code "well-known
>>> video refresh rates", but this would also enable more use-cases. For instance
>>> some users might want to mitigate flickering on their screen by reducing the
>>> VRR range. Some users might want to lower their screen refresh rate for power
>>> savings.
>>>
>>> What do you think? Would you be fine with adding min/max VRR range properties?
>>>
>>> If you're scared about the user-space code requirement, I can
>>> provide that.
>> This sounds like a reasonable approach, and there is no reason why we
>> can't do this if we have the proper userspace support as you
>> mentioned.
> Maybe the min/max controls are the way to go, considering that
> the seamless refresh rate change feature in general cannot be
> implemented without VRR. Or can it?
>
> But if it can be implemented on hardware that does not support VRR,
> then the video mode fiddling without ALLOW_MODESET is still a usable
> approach. Or maybe such a driver could special-case VRR=enabled &&
> min==max.
>
> Yeah, min/max controls seem like the best idea to me so far.
>
>
> Thanks,
> pq
>