30-bit X session and programs requiring 24-bit depth

Antoine Martin antoine at nagafix.co.uk
Mon Mar 9 10:53:54 UTC 2020


On 17/02/2020 16:40, Ilya Anfimov wrote:
> On Mon, Feb 17, 2020 at 04:01:35PM +0700, Antoine Martin wrote:
>> On 17/02/2020 15:51, Ilya Anfimov wrote:
>>> On Sat, Feb 15, 2020 at 12:32:15AM +0700, Antoine Martin wrote:
>>>> On 14/02/2020 18:53, Marek Szuba wrote:
>>>>> Hello,
>>>>>
>>>>> I do quite a lot of photo editing on my box, and with both my monitors
>>>>> and my graphics card (amdgpu) supporting 10-bit colour channels, I tend
>>>>> to run X at colour depth 30. Unfortunately some software, most notably
>>>>> programs using OpenGL it seems, refuses to run in this mode - presumably
>>>>> (I have seen this mentioned as the reason for problems with 30-bit mode
>>>>> under Windows when the first cards supporting it came out) because the
>>>>> software assumes an 8-bit alpha channel, but with the framebuffer still
>>>>> being only 32 bits wide the alpha channel is only 2 bits. Seeing as there
>>>>> seems to be no way of switching colour depth on the fly, the best I have
>>>>> been able to come up with is two separate X sessions - one at depth 30
>>>>> and one at 24.
>>>>>
>>>>> Is there, or perhaps will there be some way in the near future, to work
>>>>> around this problem - either by increasing framebuffer BPP (tried it a
>>>>> while ago but it didn't seem to accept anything more than 32), using a
>>>>> virtual X server (tried using Xephyr but it complained about there being
>>>>> no matching screen), or some other way I haven't thought of?
>>>> If you don't mind the indirection this introduces:
>>>> xpra start --start=xterm --attach=yes
>>>> This will start your application (ie: xterm) on a 24-bit virtual
>>>> framebuffer and display it on your local 30-bit display.
>>>
>>>  And with software-only OpenGL.
>> If you want accelerated OpenGL within the xpra session, use VirtualGL:
>> https://xpra.org/trac/wiki/Usage/OpenGL
>> ie:
>> xpra start --start="vglrun xterm" --attach=yes
>  
> 
>   If virtualgl could be used -- then yes, but then there is no
> need for xpra.
VirtualGL will tunnel 32-bit / 30-bit rendering to a real GPU, but it
won't change the X11 visuals that the application sees when it connects
to the X11 server. When applications fail to render on a 30-bit display,
it is usually because they can't find a matching X11 visual, not a GL
visual; 30-bit displays will happily give you regular 24/32-bit OpenGL
rendering contexts.
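
To see which visuals a given display actually exports, the quick sketch
below may help (this is just an illustration, not something from xpra or
VirtualGL: it assumes the Xlib development headers are installed and that
DISPLAY points at the server you want to test). It asks the server for a
TrueColor visual at depths 24, 30 and 32, which is roughly what toolkits
do before creating their windows:

    /* vischeck.c - hypothetical example, build with: cc vischeck.c -lX11 */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        int screen = DefaultScreen(dpy);
        int depths[] = { 24, 30, 32 };
        for (size_t i = 0; i < sizeof(depths) / sizeof(depths[0]); i++) {
            /* Ask for a TrueColor visual at this depth, as an application
             * or toolkit would before creating its window. */
            XVisualInfo vinfo;
            if (XMatchVisualInfo(dpy, screen, depths[i], TrueColor, &vinfo))
                printf("depth %2d TrueColor visual: id 0x%lx\n",
                       depths[i], vinfo.visualid);
            else
                printf("depth %2d TrueColor visual: none\n", depths[i]);
        }
        XCloseDisplay(dpy);
        return 0;
    }

Running it against the 30-bit display and then inside the xpra session
shows which depths each server offers; "xdpyinfo | grep -i depth" gives
a similar overview without compiling anything.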

Antoine

>   virtualgl itself has some other rough edges.
> 
>>>  xpra itself is not really good at either proxying or accelerating
>>> glx/dri.
>> Xpra doesn't even attempt to do these things; that's not its job.
> 
>  Well, I don't agree with that either, but the result is the same.
> 

