"Name" key value in desk. entry spec collides with file names, could misguide users?
jub at sun.com
Mon Mar 21 15:30:34 EET 2005
Mike Hearn wrote:
> On Sun, 2005-03-20 at 18:55 +0100, Diego Calleja wrote:
>>I must admit I hate compatibility when it stops us from doing things better.
> I'm assuming you never actually wrote software that millions of people
> depend on, after all it's easy to hate compatibility until you need it.
> Then it's too late.
Well, free desktops have been breaking far too many things between
releases. I've certainly come to hate non-compatibility. But why should
we stop for this one?
Requiring an 'executable' attribute on a file in order to execute its
contents is long-standing and universally supported practice in the
unix/linux world. The concept of preventing arbitrary data from being
executed as code by requiring a special permission attached to
executables still looks valid to me. Of course this is not the end of it
all, but it is something we have commonly available today.
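The mechanism described above is just the ordinary mode bits. A minimal Python sketch (not from the mail, purely illustrative) of how a launcher could refuse to run a freshly downloaded file until someone explicitly marks it executable:

```python
import os
import stat
import tempfile

def may_execute(path):
    """Return True only if the file carries an executable permission bit.

    This mirrors the long-standing Unix rule: a download arrives without
    any x bit, so it cannot be launched until a user explicitly sets one.
    """
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH))

# Demo: a freshly created "download" is not executable ...
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"#!/bin/sh\necho hello\n")
    path = f.name
os.chmod(path, 0o644)
print(may_execute(path))   # False: opening it in a viewer is the only option

# ... until an explicit chmod +x equivalent records the user's decision.
os.chmod(path, 0o755)
print(may_execute(path))   # True
os.unlink(path)
```

The point of the check is exactly the explicit step: nothing the downloader does on the user's behalf flips the bit.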
>>Many people won't fall so easily on that one
> Are you sure? I, again, see no evidence, just assertion.
But neither have I seen evidence that it doesn't work. I don't know if
anyone has done a study with real users to check whether it works. Would
you accept any other evidence?
OTOH I find it highly plausible that it is effective at preventing the
one sort of exploit it is meant to address. It should be safe to
download a file and then open it from a file manager (barring outright
vulnerabilities in the software involved - that is outside the scope of
this discussion).
>>>Why? Because I don't believe requiring a magic flag to be set to run a
>>>program makes things more secure, it just decreases usability.
Do you believe that downloading executables that can then run with full
access to the user's data is something that is common and that uneducated
users need to be able to do effortlessly? Requiring that flag to be set
means that we require an explicit step to tell the system that we want
this to be executable.
It is the responsibility of file managers to make this functionality
available in a way that is usable for those who need it, but that
prevents unsuspecting users from triggering it by accident.
>>...selinux actually makes this worse, by not allowing you to do anything at all with what you
>>downloaded. And if it does have a method to run it, it's no different from the +x solution,
>>except for being more complex and less portable to other systems.
> Who says you can't do anything at all? I never said what a quarantined
> program would be able to do, or even implied it. Perhaps a good set of
> abilities would be:
> a) Read/write to a special "quarantine zone", a directory under $HOME
> b) Display windows on the X server (but not access others)
> c) Load shared libraries
> Note that establishing network connections is not in that list. Nor is
> being able to delete users documents, or modify the system
Can you describe the use cases for which users need exactly this
facility? If I download a viewer for a new file type, I'd probably want
it to be able to read any user document; for an editor, even writing is
necessary. But when I download and launch something where I don't even
know that it is an executable rather than data, I probably don't want it
to be able to do any of those things.
> You could argue that this just pushes the decision to a later time - if
> the program asks for it, do you release it from quarantine or not? I
> think it's certainly possible to present this decision in a way that any
> users can comprehend. Before you tell me that users are clueless again
> please, read on.
If you think that this is the case, why do you believe that it is not
possible to present the decision to allow or disallow execution of a
file that lacks an 'executable' bit in an equally comprehensible way?
>>Again, this is not a hypothetical situation.
>>This is what people have been observed to do
>>for years on "other platforms".
> No, what we have observed is that it's extremely easy to impersonate
> people on the internet and that generally, people are trusting of email
> or files that appear to come from people they know.
This is something else that has also been observed. Your argument is
"The problem you describe isn't worth addressing, because there also is
a different problem."
> This has nothing to
> do with how easy or hard it is to make your own hardware do what you
> want. It has everything to do with how easy it is to gain somebody's
> trust by spoofing communications.
No. The problem that has been discussed here is how easy it is to get
something executed even without trust, by making active content look
like passive data.
> Solving that requires we make it harder
> to spoof emails, instant messages etc and easier to make correct
> decisions in the presence of untrusted (but also potentially trusted) content.
Yours is a different problem and requires different countermeasures. Of
course, if people trust content and because of this follow instructions
they've been sent more or less blindly, then it is much more difficult
to prevent them from harming their system accidentally while still
allowing legitimate uses to remain usable.
> Requiring the +x bit set on anything (not just .desktop files) does not
> give the user any more information than they already had, so it will not
> change the decision they make.
If the user did not know that the downloaded file is not just another
piece of data (image, text, whatever) that will open in one of the
preinstalled applications, then telling them that it would in fact need
to be executed - and needs certain privileges for that - is new
information. The fact that they have to make an explicit decision at all
certainly does change something.
> It *will* increase their sense of
> frustration and helplessness if they're unable to figure out how to do
> what they want to do, but that achieves nothing except making people
> trust computers even less than they already do.
If you think that there must be a more usable way to mark content
downloaded from the internet as executable, then that is a different
issue. Nothing prevents file managers from presenting informative
dialogs and setting the executable bit on the fly when they recognize
that a user has double-clicked an executable file that doesn't have this
bit set.
>>>b) Breaks existing software and specifications (this is BAD!)
>>I agree that it does, but I have never cared about breaking things when it means
>>doing things better.
> Do you want Linux to be a research OS that never leaves the lab? No?
> Good. Me neither.
> Producing something that people actually *use* to get things done
> requires stability. There are better APIs than POSIX out there. It
> doesn't matter. The value stability brings outweighs the value a better
> API would bring. It's a simple cost:benefit analysis, and in this case
> the cost is high and the benefit is low to non-existent.
I don't see the cost as that high, if we provide a transition strategy
for legitimate legacy .desktop files - e.g. ones that are installed as
part of legitimate packages on platforms that use a package manager, or
whose Exec line is just a simple command. And I do see a benefit.
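One possible shape for that transition heuristic - a sketch of my own, not anything from the desktop entry spec - is to treat a non-executable .desktop file as legacy-safe only when its Exec key is a bare command name, with no arguments, field codes, or shell metacharacters:

```python
import re

def is_simple_legacy_exec(desktop_text):
    """Return True if the .desktop content has an Exec line that is just
    a bare command name - one plausible whitelist for legacy files that
    predate an executable-bit requirement. Purely illustrative.
    """
    for line in desktop_text.splitlines():
        if line.startswith("Exec="):
            command = line[len("Exec="):].strip()
            # A single token of harmless characters: no spaces, no
            # arguments, no %f/%u field codes, no shell metacharacters.
            return re.fullmatch(r"[A-Za-z0-9_.+-]+", command) is not None
    return False

print(is_simple_legacy_exec("[Desktop Entry]\nName=Browser\nExec=firefox\n"))        # True
print(is_simple_legacy_exec('[Desktop Entry]\nName=Evil\nExec=sh -c "rm -rf ~"\n'))  # False
```

Anything that fails the check would fall back to the normal rule: ask the user, or refuse until the file is marked executable.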
>>>c) Gives people an impression of security which does not exist, so
>>> reducing incentives to work on real solutions
If internet tools don't set executable bits on downloaded files (as
they've been doing) and files don't get executed unless marked as
executable, then attempts to open downloaded files to view them (if
there is a viewer for them) should be safe. That allows simple security
guidelines for users that don't require a grasp of the underlying
concepts. I don't see a false sense of security here.
>>The set of population for which I propose this change (ie: more than 90%) have no
>>clue about what "security" means. This is not a real reason; it doesn't stop
>>other people from working on other things.
> I think this idea of people not caring or knowing about security is
> wrong. It doesn't apply in any other area of life: cars have
> immobilisers, houses have burglar alarms and cards have PINs. Nobody
> blames stupid users for card fraud, do they? This is a rather nasty
> habit of IT workers alone.
Maybe they care or know about security. But many lack the knowledge to
take security-related decisions themselves. Car immobilisers work largely
transparently and people don't choose to buy them - they are just there
(or the insurance company makes them buy them). Where I live most houses
don't have burglar alarms.
And PINs are actually close to being a counter example: people are held
responsible for keeping their PIN secret, but many people don't live up
to that in ways that would protect them from a malicious attack. I am
pretty sure I could glean the PIN of some people, if I really wanted
to, by watching an ATM or a shop counter. And (too) many people even
write down their PINs and carry them in their wallets despite all
education to the contrary. I believe 'stupid' users are regularly blamed
for card fraud: if cards are abused (before you notice and lock the
account) there is an assumption that users haven't taken sufficient care
with their PINs. As you can't normally prove the opposite, the damage
stays with them.
> Nobody suggests making it harder to buy things as a solution to card
> fraud - instead better security systems like the European Chip+PIN
> schemes are designed to replace signature checks. These aren't quick 5
> minute fixes, in fact they're very expensive to implement. But they work
> and solve the problem at its roots.
Well, IMHO PINs aren't a prime example of usability. Having to memorize
random sequences of digits - and possibly a multitude of them for
multiple cards - doesn't come easy to many people, and that is what
causes people to either act stupidly (carrying the PIN with the card) or
to make do without PIN card usage. So requiring PINs is already making
it harder. And only now, after that, are people working on mitigating
that problem with a 'real' solution.
Still without the clumsy PINs the system wouldn't have gotten off the
ground, because people would not have used a completely insecure system.
So yes, to that degree people do care about security. But they don't
care how PINs, smart cards, magnetic stripes and online checks interact
to get security out of these instruments.
> Instead of writing off users as "clueless" and "idiots" as it's so
> tempting to do, how about we provide them with the information they need
> to make accurate decisions and strengthen security to make it less of a
> winner-takes-all situation if they make the wrong one?
People don't want to be charged with taking these decisions, regardless
of how much information they have been presented with. I don't want my
card to
ask me to choose between not getting the money or revealing my PIN to an
ATM that may have been compromised - even if I have been told 20 ways to
distinguish an ATM that is fine from one that has been tampered with.
By making the risky choice harder and requiring extra steps to go that
way, I make that choice explicit. The safe choice should be implicit -
even if it has some drawbacks.