[Glamor] [RFC] Delay render flushing in DDX driver

Chris Wilson chris at chris-wilson.co.uk
Wed Jul 11 01:45:13 PDT 2012


On Wed, 11 Jul 2012 16:20:18 +0800, "Zhigang Gong" <zhigang.gong at linux.intel.com> wrote:
> Hi,
> 
> I'm considering implementing a delayed flushing mechanism to reduce the
> huge overhead of many tiny operations. Currently I have found two typical
> scenarios:
> 
> 1. Tiny rect filling. The example is firefox-particles. It renders many
>    small rects with a solid color,

Note those are using a radial gradient, not a solid. :)

My experience with using a delayed flush is that it does work extremely
well for throughput, especially in the scenarios you outline, which are
surprisingly common. However, I also found that you need to keep the GPU
as busy as possible, and if you ever suspect that it is idle, you need to
give it more work. So my current block handler looks like

  block_handler {
    if (!pending_requests) submit();
    if (outstanding_work && !timer_active) start_timer();
    if (timer_expired) { flush_dirty(); submit(); reset_timer(); }
  }
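
Spelled out a little more (the helpers here are placeholders rather than
actual driver code, and the millisecond clock is just for illustration),
that flow is roughly:

  #include <stdbool.h>
  #include <stdint.h>
  #include <time.h>

  #define FLUSH_INTERVAL_MS 16        /* roughly one vrefresh at 60Hz */

  static uint64_t flush_deadline_ms;  /* 0 means no timer armed */

  /* Placeholder driver hooks, stubbed so the sketch stands alone. */
  static bool pending_requests(void) { return false; } /* GPU still busy? */
  static bool outstanding_work(void) { return true; }  /* unflushed batch? */
  static void flush_dirty(void)      { }  /* resolve damage into the batch */
  static void submit(void)           { }  /* hand the batch to the kernel */

  static uint64_t now_ms(void)
  {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
  }

  /* Run every time the server is about to go to sleep. */
  static void block_handler(void)
  {
    uint64_t now = now_ms();

    /* GPU looks idle: give it whatever we have immediately. */
    if (!pending_requests())
      submit();

    /* Work queued but no deadline armed: arm one roughly a vrefresh out. */
    if (outstanding_work() && flush_deadline_ms == 0)
      flush_deadline_ms = now + FLUSH_INTERVAL_MS;

    /* Deadline passed: flush the accumulated damage and start over. */
    if (flush_deadline_ms != 0 && now >= flush_deadline_ms) {
      flush_dirty();
      submit();
      flush_deadline_ms = 0;
    }
  }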

The periodic flush interval is set to vrefresh to avoid visible delays
between batches. If the screen is cleared in one vblank, and then the
subsequent redraw is delayed, the user sees the flash of the redraw
rather than the result in the first vblank.

Another issue I found is that on older machines, the OsTimer code likes
to use the hpet, and reading it by itself ends up being the most expensive
function call on the entire system. For that reason, I had to avoid the
OsTimer interface and so reduce the number of calls to GetTimeInMillis().
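
One option on Linux (purely as an illustration, not necessarily the
approach described above) is to have the millisecond helper in the
earlier sketch read CLOCK_MONOTONIC_COARSE, which is answered from a
timestamp cached at the last tick rather than by reading the hpet:

  static uint64_t coarse_now_ms(void)
  {
    /* Tick granularity (~1-4ms) is plenty for a vrefresh-scale deadline. */
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC_COARSE, &ts);
    return (uint64_t)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
  }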

Good luck,
-Chris

-- 
Chris Wilson, Intel Open Source Technology Centre

