RFC for a render API to support adaptive sync and VRR
manasi.d.navare at intel.com
Fri Apr 20 20:32:41 UTC 2018
On Wed, Apr 18, 2018 at 09:39:02AM +0200, Daniel Vetter wrote:
> On Wed, Apr 18, 2018 at 5:58 AM, Keith Packard <keithp at keithp.com> wrote:
> > Michel Dänzer <michel at daenzer.net> writes:
> >> Time-based presentation seems to be the right approach for preventing
> >> micro-stutter in games as well, Croteam developers have been researching
> >> this.
> > Both the Vulkan GOOGLE_display_timing extension and X11 Present
> > extension offer the ability to specify the desired display time in
> > seconds.
> > Similarly, I'd suggest that the min/max display refresh rate values be
> > advertised as time between frames rather than frames per second.
So there is a global min and max refresh rate as advertised by the monitor
range descriptor. That, I guess, can be exposed as a global range of min and
max times between frames, as a property of the connector.
We don't need the per-mode min and max refresh rates to be exposed, right?
> > I'd also encourage using a single unit for all of these values,
> > preferably nanoseconds. Absolute times should all be referenced to
> > CLOCK_MONOTONIC.
> +1 on everything Keith said. I got somehow dragged in khr vk
> discussions around preventing micro-stuttering, and consensus seems to
> be that timestamps for scheduling frames is the way to go, most likely
> absolute ones (not everything is running Linux unfortunately, so can't
> go outright and claim it's guaranteed to be CLOCK_MONOTONIC).
And yes, I also got consensus from the Mesa and media folks on using an
absolute timestamp for scheduling frames; the driver will then modify the
vblank logic to "present no earlier than the timestamp".
> Daniel Vetter
> Software Engineer, Intel Corporation
> +41 (0) 79 365 57 48 - http://blog.ffwll.ch
> dri-devel mailing list
> dri-devel at lists.freedesktop.org