Xserver and Gettimeofday

Lukas Hejtmanek xhejtman at ics.muni.cz
Tue Aug 28 01:48:45 PDT 2007


Hello,

I was playing with some HD streaming and noticed that XV overlays make heavy
use of gettimeofday (in particular with the nvidia closed-source driver, but the
open-source one is even worse), resulting in up to 50% of CPU time spent in the
kernel in clock_gettime and in context switches.

Is there any possible solution for this? I guess it is just a stupid driver
architecture that polls gettimeofday in a loop instead of waiting for an IRQ.
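For what it's worth, here is a minimal sketch of what I suspect is going on --
this is not code from either driver, and the frame period and function names
are made up for illustration. The first variant spins on gettimeofday() until
the next frame deadline, issuing one syscall per loop iteration; the second
blocks in a single clock_nanosleep() until the same absolute deadline and
burns essentially no CPU:

#define _GNU_SOURCE
#include <stdio.h>
#include <time.h>
#include <sys/time.h>

#define FRAME_US 16667L          /* ~60 Hz frame period, illustrative only */

/* Suspected pattern: busy-poll gettimeofday() until the deadline passes. */
static void wait_frame_polling(const struct timeval *deadline)
{
    struct timeval now;
    do {
        gettimeofday(&now, NULL);          /* one syscall per iteration */
    } while (timercmp(&now, deadline, <));
}

/* Alternative: one blocking sleep until the absolute deadline. */
static void wait_frame_blocking(const struct timespec *deadline)
{
    clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, deadline, NULL);
}

int main(void)
{
    struct timeval tv_deadline;
    struct timespec ts_deadline;

    /* Poll-based wait for one frame period. */
    gettimeofday(&tv_deadline, NULL);
    tv_deadline.tv_usec += FRAME_US;
    if (tv_deadline.tv_usec >= 1000000) {
        tv_deadline.tv_sec  += 1;
        tv_deadline.tv_usec -= 1000000;
    }
    wait_frame_polling(&tv_deadline);
    puts("poll-based wait done (compare system time in top or time)");

    /* Blocking wait for one frame period. */
    clock_gettime(CLOCK_MONOTONIC, &ts_deadline);
    ts_deadline.tv_nsec += FRAME_US * 1000L;
    if (ts_deadline.tv_nsec >= 1000000000L) {
        ts_deadline.tv_sec  += 1;
        ts_deadline.tv_nsec -= 1000000000L;
    }
    wait_frame_blocking(&ts_deadline);
    puts("blocking wait done");
    return 0;
}

Running the two waits side by side and watching the process in top shows the
difference: the polling variant accumulates system time for the whole frame
period, while the blocking one spends it asleep.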

-- 
Lukáš Hejtmánek


