[Xorg] Pixmap 24/32 on dual videocard system

Gene Heskett gene.heskett at verizon.net
Mon Jul 5 18:57:10 PDT 2004


On Monday 05 July 2004 18:00, Nils Breunese wrote:
>Thomas Winischhofer wrote:
>>>> The SiS 6326 does not support 32 bpp (32 bit framebuffer depth).

I am not seeing Thomas's posts coming in from the list; what's up 
with this?

But the above statement brought me up a bit short, because I have a 
SiS 6326 chipset on an old Diamond SpeedStar A50 card that claims to 
have 8 MB of RAM, although the only limit I've seen mentioned is that 
several folks have posted the chipset can only address 4 MB of it.

But to me, it appears that it supports 32-bit inputs just fine, hence 
my surprise at seeing that statement.  This, BTW, has absolutely 
nothing to do with nvidia, which was also mentioned in this thread.

Nevertheless, here is a snippet of the xdpyinfo output from that machine:

name of display:    localhost:10.0
version number:    11.0
vendor string:    The XFree86 Project, Inc
vendor release number:    40300000
XFree86 version: 4.3.0
maximum request size:  4194300 bytes
motion buffer size:  256
bitmap unit, bit order, padding:    32, LSBFirst, 32
image byte order:    LSBFirst
number of supported pixmap formats:    7
supported pixmap formats:
    depth 1, bits_per_pixel 1, scanline_pad 32
    depth 4, bits_per_pixel 8, scanline_pad 32
    depth 8, bits_per_pixel 8, scanline_pad 32
    depth 15, bits_per_pixel 16, scanline_pad 32
    depth 16, bits_per_pixel 16, scanline_pad 32
    depth 24, bits_per_pixel 32, scanline_pad 32
    depth 32, bits_per_pixel 32, scanline_pad 32
[... skipping ahead to some actual settings]
screen #0:
  dimensions:    1600x1200 pixels (542x406 millimeters)
  resolution:    75x75 dots per inch
  depths (7):    24, 1, 4, 8, 15, 16, 32
  root window id:    0x8e
  depth of root window:    24 planes
  number of colormaps:    minimum 1, maximum 1
  default colormap:    0x20
  default number of colormap cells:    256
  preallocated pixels:    black 0, white 16777215
  options:    backing-store NO, save-unders NO
  largest cursor:    64x64
[...]

So it appears to be acknowledging a 32-bit setting but running at 
24, and that on a 1600x1200 screen.  In any event, there is no 
difference in color reproduction between that machine and this one, 
which has a GeForce2 MX200 card with 32 MB on it.  There is a 
noticeable difference in speed, but that box is only a 500 MHz AMD 
K6-III, where this is a 1600 MHz Athlon.
Gimp runs just fine there for photo editing and printing, but I have 
to import the pix from this box, as that old Tyan S1590 has a badly 
broken set of USB chips on it.  It's my firewall anyway.

>>> Apparently, but the nvidia driver also doesn't seem to support
>>> 24. Just a bad combination of videocards then?
>>
>> I find it strange that you encounter this problem at all. Are you
>> trying to run "Xinerama" or the "normal" kind of dual-head where
>> each screen gets an X session of its own?
>>

>> If the latter, I would think that the pixmap depth should not
>> matter at all; the screens are entirely independent. With
>> Xinerama, however, the screens must be in sync wrt color depth,
>> framebuffer depth and some other criteria.
>>
>> Please let me know if my current driver fixes your problem.
>
>I haven't had the time to try your driver yet, but I would like to
>respond to the statement above. I currently have a 'normal' kind of
>dual-head setup (as can be seen from the xorg.conf I attached to my
>original post). If the pixmap depth should not matter at all, then how
> should I set it up? Can I specify a separate pixmap depth for each
> card/driver? Now it's just a server-wide flag I'm setting.
>
>Thanks,
>
>Nils.
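
On the separate-pixmap-depth question quoted above: as far as I 
know, the depth 24 pixmap format (Option "Pixmap" in ServerFlags, or 
-pixmap24/-pixmap32 on the command line) really is server-wide.  But 
with the "normal" kind of dual-head, each Screen section carries its 
own DefaultDepth, so the two cards don't have to agree on depth.  
Roughly like this -- the identifiers, drivers, bus IDs and modes 
below are placeholders, and I haven't tested it myself:

Section "ServerFlags"
    Option "Pixmap" "32"            # depth 24 pixmap format, server-wide: 24 or 32
EndSection

Section "Device"
    Identifier "Card0"
    Driver     "sis"                # example drivers only
    BusID      "PCI:1:0:0"          # placeholder bus IDs
EndSection

Section "Device"
    Identifier "Card1"
    Driver     "nv"
    BusID      "PCI:2:0:0"
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Card0"
    Monitor      "Monitor0"
    DefaultDepth 16                 # each screen picks its own depth
    SubSection "Display"
        Depth 16
        Modes "1280x1024"
    EndSubSection
EndSection

Section "Screen"
    Identifier   "Screen1"
    Device       "Card1"
    Monitor      "Monitor1"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1280x1024"
    EndSubSection
EndSection

Section "ServerLayout"
    Identifier "DualHead"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
    # no Xinerama option here; the screens stay independent, so the
    # depths don't have to match
EndSection

With Xinerama turned on you would be back to needing matching depths 
on both cards, which is what Thomas said above.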

-- 
Cheers, Gene
There are 4 boxes to be used in defense of liberty. 
Soap, ballot, jury, and ammo.
Please use in that order, starting now.  -Ed Howdershelt, Author
Additions to this message made by Gene Heskett are Copyright 2004, 
Maurice E. Heskett, all rights reserved.



