[Clipart] More nsfw stuff
chovynz at gmail.com
Mon May 3 19:27:43 PDT 2010
(i) Just to clarify, I don't think seeing naked images is traumatic (and I
don't know of anyone who thinks that way either), but there are a whole
bunch of people who don't want to see that stuff - for various reasons. We
are dealing with people's perceptions here. If we want them to be
comfortable browsing the site, we need to consider how to meet their
needs. We are not dealing only with individual people here; we are also
dealing with their children and the generations that follow them.
I can tell you now, until those filters are in place, I will not allow my
little kids to browse the site by themselves. They are not ready, and I am
not ready to explain that stuff to them either. I can guarantee there are
others who think the same. Normal searches like "green" or "school"
shouldn't turn up these images. I would even extend that stance to
searches for "sexy" or "woman". I think it would be good to have them
available in the library, but not in any normal search. Tasteful sexiness
is different from porn (which is what these cliparts are). With the current
search results, I cannot recommend OCAL to churches or youth groups - and
that straight away cuts out 400+ people right there.
What I would love to do is let my kids browse, download and share freely on
OCAL. I want THEM to guide their learning. That said, some things are
meant for adults, not children.
What needs to be done for this sort of filtering? Is this a job that only
Bassel can do? And again, why are we relying on one person? That's not
healthy in my opinion.
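For what it's worth, the filtering itself is simple once images carry an
nsfw tag - hidden from normal searches by default, visible only on explicit
opt-in. A minimal sketch of that idea (the data shapes and field names here
are my assumptions, not the actual Openclipart schema):

```python
def search(cliparts, query, include_nsfw=False):
    """Return titles of cliparts matching the query tag.

    Anything tagged "nsfw" is hidden from normal searches and only
    shown when the caller explicitly opts in with include_nsfw=True.
    """
    results = []
    for clip in cliparts:
        tags = {t.lower() for t in clip["tags"]}
        if query.lower() not in tags:
            continue
        if "nsfw" in tags and not include_nsfw:
            continue  # hidden on normal searches
        results.append(clip["title"])
    return results

# Hypothetical library entries, just to show the behaviour:
library = [
    {"title": "Green Tree", "tags": ["green", "tree"]},
    {"title": "Naked Asian Lady 2", "tags": ["green", "nsfw"]},
]

print(search(library, "green"))                     # safe results only
print(search(library, "green", include_nsfw=True))  # full library
```

A search for "green" would then show only the safe result by default,
which is exactly the behaviour I'm asking for.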
(ii) "potential copyright violation" would probably suffice as a tag. @Jon &
Bassel, that would probably be a system tag, yes? I can apply that as a tag
at the moment, but there's nothing stopping the uploader from removing it.
Sigh ... who's working on Openclipart? Who do we turn to for solutions?
Where can we get training on the new system?
On Tue, May 4, 2010 at 1:11 PM, Francis Bond <bond at ieee.org> wrote:
> There are two issues here:
> (i) NSFW images --- I agree that we should flag them, and that they
> should not be shown by default. On the other hand, I don't actually
> think that seeing a picture of a naked person is so traumatic, so I
> see no reason not to include NSFW pictures in the collection.
> However, to make the collection accessible to schools, I think it is
> fine to tag even borderline cases as nsfw. My experience with schools
> and online dictionaries is that what will actually happen is that kids
> will search for everything tagged with nsfw, but that is their choice.
> I think it is probably useful for things like the open-office
> clipart repackaging to be able to filter out the NSFW images.
> (ii) the particular image Naked Asian Lady 2 depicts a popular Japanese
> porn star (I can't recall her name, but I am sure I have seen the face
> before). I very much doubt that the image creator has permission to
> use it, so it should be flagged as a probable copyright violation.
> I don't yet know how to do that, so could someone else please do so?
> The face is so obscured in Naked Asian Lady that I think it is
> probably OK, but IANAL.
> Francis Bond <http://www3.ntu.edu.sg/home/fcbond/>
> Division of Linguistics and Multilingual Studies
> Nanyang Technological University