[RFC] Deprecate AGP GART support for Radeon/Nouveau/TTM
Michel Dänzer
michel at daenzer.net
Fri May 22 10:49:09 UTC 2020
On 2020-05-22 12:40 p.m., Christian König wrote:
> Am 20.05.20 um 18:25 schrieb Michel Dänzer:
>> On 2020-05-20 4:43 p.m., Christian König wrote:
>>> Am 13.05.20 um 13:03 schrieb Christian König:
>>>> Unfortunately AGP is still too widely used for us to just drop
>>>> support for using its GART.
>>>>
>>>> Not using the AGP GART also doesn't mean a loss of functionality, since
>>>> drivers will just fall back to the driver-specific PCI GART.
>>>>
>>>> For now just deprecate the code and don't enable the AGP GART in TTM
>>>> even when general AGP support is available.
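(To make the fallback concrete, here is a rough sketch of the idea, with
hypothetical names rather than the actual patch: on an AGP bus the driver
simply skips the AGP GART and initializes its own PCI GART, so nothing
stops working.)

#include <linux/kconfig.h>
#include <linux/types.h>

/* Hypothetical device state, reduced to the one bit that matters here. */
struct example_device {
        bool is_agp;    /* device sits behind an AGP bridge */
};

/* Stubs standing in for a driver's two real GART init paths. */
static int example_agp_gart_init(struct example_device *dev) { return 0; }
static int example_pci_gart_init(struct example_device *dev) { return 0; }

static int example_gart_init(struct example_device *dev)
{
        /* CONFIG_DRM_EXAMPLE_AGP is made up; deprecation means it
         * defaults to off, so AGP devices take the PCI GART path. */
        if (dev->is_agp && !IS_ENABLED(CONFIG_DRM_EXAMPLE_AGP))
                dev->is_agp = false;    /* treat it like a PCI(e) device */

        return dev->is_agp ? example_agp_gart_init(dev) :
                             example_pci_gart_init(dev);
}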
>>> So I've used an ancient (32-bit) system to set up a test box for this.
>>>
>>>
>>> The first GPU I could test is an RV280 (Radeon 9200 PRO) which is easily
>>> 15 years old.
>>>
>>> What happens in AGP mode is that glxgears shows artifacts during
>>> rendering on this system.
>>>
>>> In PCI mode those rendering artifacts are gone and glxgears seems to
>>> draw everything correctly now.
>>>
>>> Performance is obviously not comparable, because in AGP mode we don't
>>> render all triangles correctly.
>>>
>>>
>>> The second GPU I could test is an RV630 PRO (Radeon HD 2600 PRO AGP)
>>> which is more than 10 years old.
>>>
>>> As far as I can tell, this one works perfectly fine in both AGP and
>>> PCIe mode.
>>>
>>> Since this is only a 32-bit system, I couldn't really test any OpenGL
>>> game that well.
>>>
>>> But for glxgears, switching from AGP to PCIe mode seems to result in a
>>> roughly 5% performance drop.
>>>
>>> The surprising reason for this is not the better TLB performance, but
>>> the lack of USWC support for the PCIe GART in radeon.
>> I suspect the main reason it's only 5% is that PCIe GART page tables are
>> stored in VRAM, so they don't need to be fetched across the PCIe link
>> (and presumably it has more than one TLB entry as well). The difference
>> is much bigger with native AGP ASICs with PCI GART.
>
> Do you have some hardware you could give that a try on?
As I mentioned before, I tested this many times on my AGP PowerBooks
back in the day. The result was always a similar, big hit with PCI GART
vs AGP (even just 1x). I haven't seen any reason to believe this has
changed.
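(For context on the USWC point above: what the PCIe GART path is missing
is a write-combined CPU mapping of the GART-backed pages, so streaming
uploads go out as merged bursts instead of individual uncached writes.
Roughly, and again with hypothetical names rather than actual radeon
code, it boils down to something like this:)

#include <linux/io.h>
#include <linux/mm.h>

/* Hypothetical sketch: map GART-backed pages to the CPU write-combined
 * (USWC) instead of uncached. */
static int example_mmap_gart_wc(struct vm_area_struct *vma, unsigned long pfn)
{
        /* Let the CPU buffer and merge writes before they hit the bus. */
        vma->vm_page_prot = pgprot_writecombine(vma->vm_page_prot);

        return remap_pfn_range(vma, vma->vm_start, pfn,
                               vma->vm_end - vma->vm_start,
                               vma->vm_page_prot);
}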
> While I agree that it means a performance regression, this is rather
> strong motivation to go ahead with at least the first patch.
I totally agree with the benefits; I just want everyone to be honest and
clear about the performance hit with native AGP Radeons, which already
have very weak performance by today's standards even with AGP.
--
Earthling Michel Dänzer | https://redhat.com
Libre software enthusiast | Mesa and X developer