[PATCH 1/2] drm/kms: Make i2c buses faster

Jean Delvare jdelvare at suse.de
Wed Mar 21 03:45:12 PDT 2012


Hi Keith,

Sorry for the late reply.

On Sunday 29 January 2012 02:26:25 am Keith Packard wrote:
> On Sat, 28 Jan 2012 11:07:09 +0100, Jean Delvare <jdelvare at suse.de>
> wrote:
> > A udelay value of 20 leads to an I2C bus running at only 25 kbps.
> > I2C devices can typically operate faster than this, 50 kbps should
> > be fine for all devices (and compliant devices can always stretch
> > the clock if needed.)
> >
> > FWIW, the vast majority of framebuffer drivers set udelay to 10
> > already. So set it to 10 in DRM drivers too, this will make EDID
> > block reads faster. We might even lower the udelay value later if
> > no problem is reported.
> 
> That runs the DDC at a whopping 50kbps, which is half of the maximum
> rate specified in the DDC/CI standard. I don't know if we can count
>  on clock stretching (http://www.i2c-bus.org/clock-stretching/), but
>  if so, I don't know why we wouldn't just go to the standard 100kbps
>  data rate and be done with it.

We may end up doing that. I wanted to play it safe for now as at least 
Alan Cox expressed concerns with increasing the speed of DDC buses. I 
don't share them, but being cautious can't hurt.
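For reference, the figures quoted above follow directly from how 
i2c-algo-bit interprets udelay: it is the SCL half-cycle time in 
microseconds, so the bit rate in kbps is 500 / udelay. A trivial sketch 
of the arithmetic:

```c
/*
 * In i2c-algo-bit, .udelay is the half clock cycle time in us,
 * so one full SCL cycle takes 2 * udelay us and the bit rate is
 * 1000 / (2 * udelay) = 500 / udelay kbps.
 */
static int udelay_to_kbps(int udelay_us)
{
	return 500 / udelay_us;
}
```

This gives 25 kbps for udelay = 20, 50 kbps for udelay = 10, and the 
standard 100 kbps would need udelay = 5.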

Clock stretching is optional: each slave is free to implement it or not. 
I very much doubt it is needed when reading an EDID though, even at 100 
kbps. Typically what takes time is writing to EEPROMs, but in general 
EEPROMs will buffer the write and simply stop responding to their slave 
addresses until done. This is why most EEPROMs have a documented write 
page size.
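To illustrate the mechanism: a bit-banging master honors clock 
stretching by releasing SCL after each bit and then polling the line 
until the slave actually lets it go high. This is only a sketch; a real 
driver reads SCL from hardware (adap->getscl() in i2c-algo-bit), while 
here a counter simulates a slave that stretches the clock for a few 
polls before releasing it:

```c
#include <stdbool.h>

static int scl_polls;			/* simulation state, not real HW */

/* Simulated line read: the slave releases SCL on the 4th poll. */
static bool read_scl(void)
{
	return ++scl_polls > 3;
}

/*
 * The master has released SCL; wait until the slave stops
 * stretching (line reads high) or we give up after max_polls.
 */
static bool scl_wait_high(int max_polls)
{
	while (!read_scl()) {
		if (scl_polls >= max_polls)
			return false;	/* slave stuck, abort transfer */
	}
	return true;
}
```

A master that skips this wait simply runs at its configured rate and 
relies on every slave keeping up, which is exactly the situation being 
debated here.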

Displays must be interoperable by design, so I'd hope that every serious 
display maker would only use EEPROMs that can either cope with 100 kbps 
or do clock stretching as needed. I have no doubt crappy hardware 
exists, but I'd rather decrease the clock speed on repeated errors than 
default to a slow clock speed. Users with good hardware should get the 
best out of it.
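The fallback idea could be as simple as halving the rate until a read 
succeeds. Purely a sketch of the policy, not real driver code: 
try_edid_read() stands in for an actual EDID block read, and is 
simulated here to fail above an assumed hardware limit of 50 kbps:

```c
#include <stdbool.h>

static int tolerated_kbps = 50;		/* simulated hardware limit */

/* Stand-in for a real EDID block read at the given bus speed. */
static bool try_edid_read(int kbps)
{
	return kbps <= tolerated_kbps;
}

/*
 * Start fast, halve the clock rate on failure, and stop at the
 * first rate that works; return 0 if even min_kbps fails.
 */
static int negotiate_speed(int start_kbps, int min_kbps)
{
	int kbps;

	for (kbps = start_kbps; kbps >= min_kbps; kbps /= 2) {
		if (try_edid_read(kbps))
			return kbps;
	}
	return 0;
}
```

With this policy, users with good hardware stay at the fast rate, and 
only crappy hardware pays the retry cost.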

> Might be nice to see what frequency Windows uses for i2c; anyone want
>  to pull a vga cable apart and hook up a logic analyser?

Can't do that, sorry. It would certainly be valuable if someone has the 
time, hardware and interest. However, I don't think "Windows" itself 
uses a single frequency; rather, individual video drivers are likely to 
have their own implementation and speed (if nothing else, simply because 
recent video cards use hardware I2C engines rather than bit-banging.)

-- 
Jean Delvare
Suse L3


More information about the dri-devel mailing list