[Clipart] OpenClipart and Debian.

Jonadab the Unsightly One jonadab at bright.net
Thu Jan 13 13:28:13 PST 2005

Daniel Carrera <dcarrera at math.umd.edu> writes:

> Your personal opinion on this is not the issue. 

No, I just felt the need to express surprise.  I thought Germany was
more civilised than to censor abstract political symbols that are
important in history.  It worries me that they have not got past
that; it makes me think they would rather repeat their history than
learn from it.

>> If we put the keyword "nazi" in its metadata, will that be adequate, or 
>> do we need to work out a more elaborate system of 
>> remove-for-such-and-such-a-country keywords? "remove-for-germany", perhaps?
> I think that country names are a bad idea. Too much effort.
> "nazi" sounds good, but I wouldn't advocate doing the same thing every 
> time we find anything that is offensive or illegal in one place, simply 
> because it might be too much work. Then again, maybe it's no work at all.

I was thinking in terms of categories.  If we use "nazi" for things
associated with Nazism, then we might use other keywords such as
"nudity" for other kinds of content that someone might want to
filter.  Similarly, we might flag religious content with keywords
related to the particular religion in question, so that for
distribution e.g. in Islamic countries someone might choose to filter
out anything with "judaism" or "christianity" keywords, and so on.
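To make the idea concrete, here is a minimal sketch of how such a
keyword check could work.  It assumes (hypothetically; the real OCAL
metadata layout may differ) that keywords are stored as Dublin Core
<dc:subject> entries inside each SVG's RDF metadata:

```python
import xml.etree.ElementTree as ET

# Namespace URIs for the assumed Dublin Core / RDF keyword layout.
DC = "http://purl.org/dc/elements/1.1/"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"

def keywords_of(svg_text):
    """Return the lower-cased keywords found in <dc:subject> entries."""
    root = ET.fromstring(svg_text)
    found = set()
    for subject in root.iter("{%s}subject" % DC):
        for li in subject.iter("{%s}li" % RDF):
            if li.text:
                found.add(li.text.strip().lower())
    return found

def is_filtered(svg_text, blocked):
    """True if the image carries any keyword on the caller's block list."""
    return bool(keywords_of(svg_text) & {k.lower() for k in blocked})
```

A distributor for Germany might then call is_filtered(svg, ["nazi"]),
while one for another market passes a different list; the images and
their keywords stay the same, only the caller's list changes.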

> I like the idea of separating UN and Olympic flags (which are fairly
> safe) and other flags elsewhere. That's a fairly simple criterion
> that scales well,

Someone will need to determine which flags are UN flags and which ones
are Olympic flags and add the appropriate keywords.

> and the majority of the issue will be in non-UN non-Olympic flags, 

I've seen this asserted about three times now, always without any
reasons given.  What makes you think most of the offensive content
will be flags, and what makes you think most of the offensive flags
will not be UN or Olympic flags?

> so it addresses most of the problem. And really, "most" is
> enough. It's okay if we can't do everything, no one is expecting
> OCAL to be perfect.  Just make a "reasonable" effort to address the
> issue.

Any image that can be identified as offensive for a particular reason
can be flagged with a keyword related to that reason, and then people
who are offended by such things (or, more likely, people distributing
to people who are offended by such things) can filter accordingly.
This has the additional benefit of also providing (down the road, when
we get certain pieces of infrastructure in place) better search
functionality, since what offends one person, someone else may be
looking for.  An example: images related to Islam (or almost any
specific religion) are exactly the sort of thing one person would be
seeking out and another would be offended by.  Indeed, down the
road, I personally might want to put together a collection of
specifically Christian-themed clipart, but someone else might want it
filtered out.  Conversely, someone might be searching for Santa
Claus images, while I would be more likely to search for Christmas
images that are *not* related to Santa Claus, because Santa Claus
offends me.

I don't think we can flag images as simply "offensive" or
"inoffensive".  We'd never be able to agree on what's what.  It is IMO
better to flag them with keywords by category.

> Reminder: I didn't suggest complete removal, but putting it in a
> separate collection (e.g. non-UN non-Olympic flags).

Down the road, when more of our infrastructure is complete (such as
the document management system), it will be easier to retrieve
tailor-made subsets of the collection based on desired criteria,
directly from the OCAL website, but for now we can easily provide a
filtering script for purging images with troublesome keywords (where
"troublesome keywords" is defined as keywords on a list provided by
the person doing the purging).

Other keywords that some distributors might want to use for purging,
depending on the intended audience, include "gradient" and "pattern",
both of which are used for categories that would only be useful to
image content creators, not end users.  These are not politically
offensive, but users who don't want to draw any images of their own
might find that they merely take up space.

split//,"ten.thgirb\@badanoj$/ --";$\=$ ;-> ();print$/
