[Bug 40931] r600g: interpret integer texture types as ints regresses VDPAU/XVMC decode.

bugzilla-daemon at freedesktop.org
Fri Sep 16 09:52:30 PDT 2011


https://bugs.freedesktop.org/show_bug.cgi?id=40931

Christian König <deathsimple at vodafone.de> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|NEW                         |ASSIGNED

--- Comment #4 from Christian König <deathsimple at vodafone.de> 2011-09-16 09:52:30 PDT ---
Hi Dave & Andy,

I think we have a disagreement here about what SCALED types should be.

As I understood it, SCALED types should be represented as integers in memory,
but converted to floats in the range 0..2^n when loaded into a shader (as
opposed to normalized types, which map to the range 0..1), and that's how I
used them in g3dvl.
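To illustrate the distinction I mean, here is a minimal C sketch of the two
conversions for an unsigned 8-bit channel; the helper names are hypothetical
and not part of any actual Gallium/Mesa API:

    #include <stdint.h>

    /* UNORM: integer in memory -> float in [0, 1] */
    static float fetch_unorm8(uint8_t texel)
    {
        return (float)texel / 255.0f;
    }

    /* USCALED (as described above): integer in memory -> float in
     * [0, 2^8 - 1], i.e. the raw value simply cast to float */
    static float fetch_uscaled8(uint8_t texel)
    {
        return (float)texel;
    }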

I also added code to r600g to support those types as textures and render
targets, but that code got removed before the merge because I figured out why
working with normalized textures/render targets didn't give the expected
results.

@Andy: The attached patch should fix your issue, but I will delay committing it
until we have figured out how SCALED types should really be handled.

-- 
Configure bugmail: https://bugs.freedesktop.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
