Question about the future of Xorg
Vladimir Dergachev
volodya at mindspring.com
Sat Jun 14 00:34:59 UTC 2025
On Sat, 14 Jun 2025, Carsten Haitzler wrote:
>>
>> This is a Dell; it has an integrated Intel GPU and an NVidia one. I
>> usually don't use the NVidia GPU - too much heat and no visible benefit.
>>
>> The NVidia chip is a GeForce MX130, 2 GB RAM.
>
> what intel chip/gpu?
Intel Core i7-8550U
>>
>> I found that restarting kwin and restarting plasmashell helps, and also
>> occasionally I kill Firefox and restart it. The latter is a nuisance,
>> because while it does try to restore windows and tabs, it does not restore
>> all of them.
>
> this smells of some kind of leak? where? ... dunnos.
Could be a leak, or it could be entropy (e.g. memory fragmentation).
>
>> Btw, if you have the same problem, there is a setting in about:config
>> that lets you increase the timeout for reaching a website. When you restart
>> Firefox and it tries to open 300 windows with 10-15 tabs each, the URLs
>> will time out too early in the default configuration. Changing that setting
>> fixes it (mostly).
>
> i never restore tabs... when i close my browser.. i'm done. :)
Except that nowadays you cannot easily run more than one browser instance.
Somehow there is a drive to turn every app into an operating system.
Firefox has "about:processes" to let you find out what each window and tab
is doing, because top lumps it all together irregularly.
>> What's your screen resolution? Mine is 3840x2160, so a single full-screen
>> buffer should be 33MB. A few hundred buffers and you would be out of RAM
>> on a discrete GPU (and way earlier on my NVidia chip).
>
> 2560x1440. i do have 16gb ram on the card - but ... how many windows do you
> have normally around... and are they all "maximized" ? as it's a window that
> consumes a buffer.
Firefox is usually maximized. I think right now I have around 120 Firefox
windows.
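To make the buffer arithmetic concrete - a back-of-the-envelope sketch,
assuming one 32-bit (4 bytes/pixel) buffer per maximized window and
ignoring the double or triple buffering a compositor may add:

/* Rough window-buffer arithmetic for the numbers in this thread. */
#include <stdio.h>

int main(void)
{
    const long long w = 3840, h = 2160;          /* 4K screen */
    const long long per_window = w * h * 4;      /* one 32-bit buffer */
    const long long vram = 8000LL * 1000 * 1000; /* an "8GB" GPU */

    printf("one 4K buffer:    %lld MB\n", per_window / 1000000);
    printf("120 windows:      %lld MB\n", 120 * per_window / 1000000);
    printf("fit in 8GB VRAM:  %lld windows\n", vram / per_window);
    return 0;
}

That reproduces the ~33MB-per-buffer figure above and puts 120 maximized
4K windows at roughly 4GB before any extra buffering.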
>
>>> what's your working style? put 50+ windows on a single desktop and "alt+tab"
>>> between them?
>>
>> I've got 8 virtual screens, and a bunch of windows on each of them. Most
>> are Firefox.
>
> and how many ffox windows?
>
>> What happens is that I work on something and it usually involves 10-20
>> windows, but then I have to pause or wait for one reason or another and
>> I switch to something else.
>
> 10-20 is not too much. and at your fullscreen 33m per window that's ~300-700m -
> so not a problem for memory usage. even if you run out of vram the gpu can
> migrate some buffers/textures to system ram and map them over the pcie bus as
> render targets. its a bit slower. it might not migrate and just alloc new
> buffers there never migrating lesser used ones off. that'd be a "poor caching
> algorithm" :)
It's 10-20 windows times the number of different things I work on, so it
adds up.
>
>> It is very convenient that I can just leave things as they are and then
>> come back and pick up where I left off. Often I minimize the windows
>> because I've only got 8 virtual screens - it's a compromise for space in
>> the KDE panel.
>>
>> Looking at xrestop right now, kwin has 69 pixmaps and uses 2.7GB RAM,
>> while Firefox has 276 pixmaps and only 300MB RAM. There are a bunch of
>> other windows, mostly konsole. Compositing is on right now. I am pretty
>> sure Firefox has a lot more than 69 windows.
>
> you said like 10-20 windows above?
No, no, not 10-20 total, but 10-20 at a time. So I am using 10-20 windows,
and there are another 100 or more I'll return to later.
I think I counted around 120 Firefox windows right now (not tabs - many of
the windows have multiple tabs). I am travelling, so fewer than usual.
> 69 windows? or 69 tabs? i just opened 18
Actually I got it wrong - KDE Plasma has 69 pixmaps; kwin has 199. I
don't know what kwin uses pixmaps for, but I would imagine it needs at
least one per window.
> maximized terminals. also have this email client (2 windows) plus hexchat +
> chromium ... e uses about 330m with 25 pixmaps but reality is pixmaps are
> really only for the windows and nothing else - everything else is rendered
> inside the compositor with gl (or software) and thus is part of texture
> atlases etc. nvtop says e uses 440m of video memory which makes sense
> (2 screens, each 2560x1440). it'd be 270m just for the terminal textures mapped
> in (shared between x and the client and compositor), as well as wallpaper,
> other icons/buffers and backbuffers for rendering (90m for backbuffers for both
> screen if triple buffered). so all in all 440m doesn't sound wrong to me.
18 terminals is too small a test. Does it work if you open 1000?
It makes sense that if you expect users to have no problem handling 100
windows, you need to test with at least an order of magnitude more.
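As an aside, those per-client counts come from the X-Resource extension.
A minimal sketch of the query xrestop performs - using libXres, error
handling omitted, compile with cc xres.c -lX11 -lXRes:

/* Sketch of the per-client pixmap query done via X-Resource. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/XRes.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int ev, err, nclients;
    XResClient *clients;

    if (!dpy || !XResQueryExtension(dpy, &ev, &err))
        return 1;

    Atom pixmap = XInternAtom(dpy, "PIXMAP", False);
    XResQueryClients(dpy, &nclients, &clients);

    for (int i = 0; i < nclients; i++) {
        unsigned long bytes = 0;
        int ntypes;
        XResType *types;

        XResQueryClientPixmapBytes(dpy, clients[i].resource_base, &bytes);
        XResQueryClientResources(dpy, clients[i].resource_base,
                                 &ntypes, &types);
        for (int j = 0; j < ntypes; j++)
            if (types[j].resource_type == pixmap)
                printf("client 0x%07lx: %u pixmaps, %lu bytes\n",
                       clients[i].resource_base, types[j].count, bytes);
        XFree(types);
    }
    XFree(clients);
    XCloseDisplay(dpy);
    return 0;
}

Scanning each client's resource types for PIXMAP is how per-client pixmap
counts like the ones above are produced.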
>>>
>>> they are the same really as a composited x11. no real difference at all.
>>
>> Naively, if all the windows always have a buffer, then an 8GB GPU can only
>> afford 242 4K windows. And you don't get more memory in consumer GPUs
>> because they would then compete with AI-market devices.
>
> ??? the default for consumer gpu's is 16g these days. 8g is a low end "cut
> price" gpu. the latest gen of gpu's is now more pushing towards 24/32g.
They are all "cut price" right now - you cannot buy 24/32gb, at least in
stores near me. The companies do this on purpose for market segmentation.
And on a notebook the RAM and bandwidth are even smaller.
Also, right now Microsoft is very busy alienating a lot of people with
computers without TPM that cannot upgrade to new Windows version.
Those people are happily installing Linux and we should not impose
requirements of more than 8GB video RAM just to open some webpages.
>>> if its a 1-off "screenshot then display a copy of it and just scale that up"
>>> then there are wayland protocols for that - but the idea is that
>>> screenshotting protocol access will be limited and a compositor may do a
>>> very android/ios thing of ask you to grant permission first.
>>
>> I hate the permission stuff on Android. The worst is that they've taken to
>> removing permissions from apps when you don't use them. So you have some
>
> so you're happy with rogue games you run screenshotting your browser with
> banking details and sending it back to home? :)
This problem only arises on Android and iOS because they are designed for
closed-source apps and for controlling the user.
On Linux there is no such problem, as long as you use software you can
examine.
On Android you could improve things immeasurably if open-source apps were
installed with complete user access to the app directory (to check which
binary actually shipped) and no permission restrictions.
>
> that's the point of this. the point is that the display system should stop
> being a leak of info/security. it cant force you to sandbox apps... but it can
> STOP being the problem that makes sandboxing ineffective.
I would actually argue that X is very secure, and has gotten more secure
over the years.
Why? Because before 2000 you often had multiple users on the same system.
Now I have several systems and I am the only user. They are on the same
network, which I control, and there is no way to access those sessions.
The only potential problem comes from Firefox, and that is really mostly
due to JavaScript. And, as far as I can see, the Firefox developers (and
the authors of uBlock and NoScript) are on top of it.
>
>> app that you use once a month and then you have to debug why it does not
>> work. Especially sucks if you need to take a quick snapshot with a thermal
>> camera or a similar tool.
>
> and this is the current problem area - how to grant permission AND keep it
> granted persistently.
It is a very simple problem - you have an xmag/kmag-like app. You examine
the code. You see that it does not send screenshots to some random IP or
random country. You install it and use it with no restrictions.
The same goes for a screenshot app, WaylandVNC (if such a thing exists), a
screen recorder and so on. And you can let Neko run around your screen as
well.
>> A lot of it is pointless anyway - the apps that do shady stuff will find a
>> way regardless, and the users of good apps are just getting annoyed.
>
> and on the flip side if you go to a lot of effort to sandbox an app in a
> container or a smack label (read up on them) that then quite effectively limits
> that app - the display system is a massive leaking hole you cant plug... and
> this is one of the things wayland wants to address and does.
My number one step now after installing Kubuntu is to "de-snap" Firefox
and install a Debian package.
>> A screenshot is not perfect - you really want a video, so you can play an
>> animation.
>
> a screenshot is just 1 frame of a video... that is how zoom, teams and every
> video conf app works now today. they keep taking screenshots repeatedly and
> quickly. that's how they can "share my screen" over that video conference...
> they grab these frames then encode them into a video stream - on the fly. they
> do that in x11 today...
Yes, but ideally you could do it in such a way as to guarantee a frame
every 1/N seconds, and also guarantee that each frame is fully rendered, to
avoid tearing. That is something I think X cannot do right now.
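For reference, a minimal sketch of that grab-frames-repeatedly loop in
plain Xlib. Note what is missing: nothing synchronizes the grab with
rendering, so a frame can be captured mid-render - which is exactly the
tearing problem. Compile with cc grab.c -lX11:

/* Polling screen capture, the way X11 screen sharers work today. */
#include <stdio.h>
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    Window root = DefaultRootWindow(dpy);
    XWindowAttributes attr;
    XGetWindowAttributes(dpy, root, &attr);

    for (int frame = 0; frame < 100; frame++) {  /* ~10 fps for 10 s */
        XImage *img = XGetImage(dpy, root, 0, 0, attr.width, attr.height,
                                AllPlanes, ZPixmap);
        if (img) {
            /* a real screen sharer would hand img->data to a video
             * encoder here */
            printf("frame %d: %dx%d, %d bpp\n",
                   frame, img->width, img->height, img->bits_per_pixel);
            XDestroyImage(img);
        }
        usleep(100000);  /* fixed 1/N-second cadence, N = 10 */
    }
    XCloseDisplay(dpy);
    return 0;
}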
>
>> Another useful tool is a screen recorder like vokoscreenNG - with it you
>> can record your talk to be played back later. Again, you want the ability
>> to record a video of the screen.
>
> guess how those work too... :) see above :)
I was just giving examples of apps that you want to have full access to
the screen.
>
> then this is certainly something your vnc viewer should support IMHO. as how
> much you want to scale THAT session may vary from target to target it is
> connecting to... and it should remember such scale settings machine by machine
> you register/connect to. the compositor has no clue what is inside that app's
> window. in wayland or x11. it's the app's business.
Not really - I just run x11vnc on the remote machine and connect to it. I
don't start a new session, and I don't change fonts. Very handy, both for
helping someone else and for using your desktop when you are away.
best
Vladimir Dergachev