[Bug 702] Radeon only supports a maximum point size of 1.0.

bugzilla-daemon at freedesktop.org bugzilla-daemon at freedesktop.org
Sat Nov 6 11:17:39 PDT 2010


--- Comment #12 from Daniel Richard G. <skunk at iskunk.org> 2010-11-06 11:17:39 PDT ---
> I think either this should be commited or instead it should be handled
> by the driver and the DD_POINT_SIZE flag completely removed as it's
> currently just broken. Since the plan initially was to remove the
> whole _TriangleCaps stuff and r200 is the only driver which makes use
> of this particular flag I'm leaning towards the latter.

Sounds like a good way to go. Fewer idiosyncrasies in older drivers should be a
good thing.

> In fact, the driver is quite broken anyway wrt point size, since with
> vertex programs you could output the point size per vertex, but the
> driver might still use the 1-sized point primitive, since
> _TriangleCaps DD_POINT_SIZE only looks at the global point size.

Still a better bug to have, at least, since vertex programs are a newer
construct anyway (and less likely to be used in CAD-type scenarios).
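For reference, the per-vertex point size the quote refers to is the
`result.pointsize` output defined by ARB_vertex_program. A minimal (hypothetical)
program that forwards a generic attribute as the point size might look like
this; the attribute index is an arbitrary choice for illustration:

```
!!ARBvp1.0
# Pass position through untransformed (illustrative only).
MOV result.position, vertex.position;
MOV result.color, vertex.color;
# Per-vertex point size from generic attribute 1 -- this is the value
# the driver ignores when _TriangleCaps/DD_POINT_SIZE only checks the
# global glPointSize() state.
MOV result.pointsize.x, vertex.attrib[1].x;
END
```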

> But I really have no idea why the driver tries to use the point
> primitive instead of point sprite for 1-pixel points, otherwise it
> would be easiest to just always use point sprite prim (at least for aa
> points) which would get rid of both bugs.

FWIW, I modified the "point" demo into a poor man's benchmark (by putting the
glBegin(GL_POINTS) and glVertex*() calls inside large loops, and adding
gettimeofday() calls), and I'm seeing basically no differences between 1.0 and
1.001 for glPointSize(). You'd think the 1-pixel-point primitives would be
faster on some hardware, but r200 doesn't seem to special-case those.

Configure bugmail: https://bugs.freedesktop.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.

More information about the dri-devel mailing list