[Clipart] More nsfw stuff
Bassel Safadi
bassel.safadi at gmail.com
Tue May 4 08:50:17 PDT 2010
On Tue, May 4, 2010 at 3:04 PM, chovynz <chovynz at gmail.com> wrote:
>
>
> On Tue, May 4, 2010 at 9:08 PM, Nicu Buculei <nicu_gfx at nicubunu.ro> wrote:
>>
>> On 05/04/2010 11:28 AM, chovynz wrote:
>>>
>>> On Tue, May 4, 2010 at 6:01 PM, Nicu Buculei wrote:
>>>
>>> I think automatic traces are valid. Jon does (as per his recent upload
>>> of 300+). However, it's a different style. Autotraced output is still vector.
>>> Vectored photos can be scaled up with no loss of quality; original
>>> photos often cannot. As we can see, Jon has been redefining "clipart"
>>> for a long time now, so I have no problem with autotraced photos.
>>
>> What's the benefit of scaling the vectored image with no quality loss when
>> the vectorisation itself already brought a *huge* quality loss compared
>> with the original photo?
>
> Comparing apples with oranges. It has no relevance to the issues at hand.
> Vectors are scalable and do not lose quality in themselves, whereas a photo
> can't necessarily be scaled up without losing quality. I used to think they
> weren't clipart; however, Jon says they are. Try convincing Jon that his
> vectorised photos are "shit" and are not clipart. If you have a problem with
> vectorised photos then you need to talk to Jon. What's valid for Jon is
> valid for all, and vice versa.
>
> If you say remove these cliparts on the basis that they are not clipart,
> then Jon will need to remove 300 of his vectorised photos. If you say Jon's
> photos are clipart then there is nothing stopping anyone from auto-tracing
> photos and uploading them.
>
> That's beside the points I am trying to make and a distraction from the topic
> at hand of adult content. On with it.
>
>>
>>
>>> Free speech does not apply when we (OCAL) are supplying pornography to
>>> minors. Vectorised or not, that is what those particular images are, and
>>> there is no precedent "against" them. If there ARE original PD
>>> pornographic cliparts then my kids will find them if there are no filters
>>> in place. It is not about offense. It is about being able to include a
>>> large number of people from all walks of life who do not want to
>>> see pornographic images - for whatever reason.
>>
>> Hold your horses: http://en.wiktionary.org/wiki/pornographic
>>
>> Showing a nipple is NOT pornography. You did a huge jump here from NSFW to
>> pornography. To my knowledge, we don't have any pornography on the website.
>
> No, showing a nipple is not porno. Did I say that? No. Showing genital muff
> and organs is classified as porno. For the purposes of age-appropriateness,
> these cliparts, and the proposed filters, there's no difference between
> NSFW and porno. What's to stop someone uploading a clipart of some child
> porn or bestiality? OCAL doesn't censor "offensive material", remember?
> Through these conversations, it has become clear to me that there IS nothing
> to stop someone from uploading porno. So, as a result, we must also have
> filters so that those who do not want to see the
> generally-and-widely-accepted-as-adult behaviours and images can browse
> freely.
>
> I want my children to be active in OCAL.
> I want my students to be active in OCAL.
> I want many more people involved in OCAL.
> I want many of those universities who would ban and block students from
> using OCAL because of porno supply to be able to use and be involved in
> OCAL.
>
> Having porn visibly available on OCAL will stop many people from using and
> joining OCAL.
> I am proposing to remove that hindrance to OCAL's growth.
>
> http://en.wiktionary.org/wiki/pornography
> http://en.wikipedia.org/wiki/Legal_status_of_Internet_pornography#United_States
> http://en.wikipedia.org/wiki/Pornography_in_the_United_States
>
> Note: OCAL is registered under US legislation.
>
> Registrant City:San Francisco
> Registrant State/Province:CA
> Registrant Postal Code:94107
> Registrant Country:US
>>
>>
>>> Also, note here that I am not advocating removal of the cliparts in
>>> question, but rather that users can choose to hide them. There is a
>>> huge difference. They are still there IN the library, but the user can
>>> hide them if they don't want to see them. OR they can view them if they
>>> want to. THAT is free speech in spirit and in truth.
>>
>> I agree with tagging so people can hide stuff, but hiding should be
>> opt-in.
>
> Other way around, friend. Showing should be the choice. Hiding porno or other
> things tagged as "adult-only" should be the default, so that kids can browse
> without being exposed to these things without their or their parents'
> consent. They are not adults, and therefore should not have easy access to
> "adult content."
>
>>
>>> Many churches will not be able to use OCAL because of the search results
>>> providing pornographic clipart. I personally know of a church that has
>>> 400+ members. I know for certain they will not use OCAL because of the
>>> potential porno issue. I know of many churches that could benefit from
>>> using OCAL, IF there were suitable filters in place. Why do we need OCAL
>>
>> My first reaction would be: why do I care about churches? But I can go on:
>> the nudity we have so far does not show more skin than you can see, for
>> example, in the Sistine Chapel.
>
> What about universities? What about schools? What about other people with
> different values than yours? Are you thinking of the bigger picture or only
> of yourself? I find it interesting that you "picked up" on caring about
> churches. What about the people IN those churches? What about the people IN
> those universities? What about my 6-year-old daughter? Or my friend's
> 4-year-old daughter? Or your cousin's 5-year-old boy or 11-year-old daughter?
> Those are the people that I would like to see using OCAL regularly.
>
> Again, you are not comparing relevant images and info. The Sistine Chapel is
> physical "art". It's also 60 feet (?) in the air and can't be easily
> recreated by my little kids. Nor can it be copied easily - really only by
> taking photographs of it, and only by being there. It is a recognised
> masterpiece, done by one person. And the Chapel does not display the stuff I
> am talking about.
>
> OCAL, on the other hand, is readily available, is on the internet, has a high
> Net profile, can be accessed from anywhere, promotes using its content
> in any way, and can be seen and copied by any human with a computer. The
> content is uploaded by users from around the world with a variety of
> viewpoints, interests and skills. Any pedophile with computer skills can
> upload images, and if I understand Jon's stance of "no censorship" and "no
> value judgements" correctly, there is nothing we can do about the image
> that such a pedophile has uploaded.
>
>>
>>
>>> to be a porn factory? Isn't Google enough?
>>
>> You went to such an extreme... from a bit of pubic hair to "porn factory",
>> seriously... take a step back, breathe and think.
>
> I am thinking. Very seriously. See below.
>>
>>
>>> Other cases. Universities are not keen on the same. Their school
>>> policies don't allow those kinds of things. Students get expelled if
>>> caught looking at pornographic material. And so on.
>>
>> Stop repeating "pornographic", we don't have such stuff.
>
> We do. And we might (probably will) have more in the future. This is an
> issue that has come up now, and one that we can deal with now, BEFORE those
> images get onto OCAL. See below.
>
>>
>>
>>> You misunderstand. My children cannot take parts of the library; they
>>> must take the whole. We must assume that any user that comes to OCAL
>>> must take it as a whole. My children do not have the technical
>>> knowledge to separate out those images that they do not want. Neither do
>>> they have the life-experience knowledge to filter out images that I do
>>> not want them to see. Don't you dare tell me that they have freedom to
>>> choose what images they can view, when that's exactly what we are not
>>> providing in the first place.
>>
>> The sky is falling: your children will see the badly vectored "Naked Asian
>> Lady", masturbate to it and go blind.
>
> Exaggeration and misleading information based on misinformation and fear (or
> frustration!). Masturbation doesn't cause blindness. You are ignoring my
> points because you don't like the stance. Reply with reason or not at all.
>
>>
>>> *Choice. Filters. Filters give choice. They don't remove "free speech"
>>> as you like to call it, rather filters enhance free speech. They enhance
>>> freedom rather than take away freedom. Filters are a good thing.
>>> Deletion is judgemental censorship. Filters are freedom.*
>>
>> You tag the images, then go into your children's profile and activate some
>> tags to be kept out.
>
> Which is what I am advocating. We cannot do this yet. You seem to be
> contradicting yourself. You say we shouldn't filter, yet you tell me to do
> so. Which is it to be? Filter or not filter? Growth by inclusion of people,
> or smallness and elitism by sticking to a false "free speech" mentality?
>
>>
>>
>>> I accept that I have used an appeal to emotion; however, I do not accept
>>> that I have used fallacy. Be wary of throwing out the good points by
>>> bringing up "fallacy" issues.
>>
>> You used a *huge* fallacy by equating "nudity" with "porn".
>
> You misunderstood my statements. I have not stated that nudity is the same
> as porn. It occurred to me while we were talking that this IS an issue that
> needs sorting out, and I changed my words accordingly. The real issue goes
> way beyond just these cliparts mentioned here. I apologise for not being
> more clear with my words. I don't have a problem with these cliparts being
> available. I do have a problem with the visibility of the cliparts stopping
> other (LARGE) groups of people from using OCAL freely. It's supposed to be a
> community, not a select few. It could be a global community, and OCAL has
> the potential to be HUGE! Not just 63,000 members, but over 11,425,733
> active, contributing people. I want that. I'm proposing a method of growth
> for OCAL.
>
>>
>>
>>> The current system excludes many people (all those mentioned before,
>>> including many Muslims and people of other faiths, not to mention the
>>> practical side of schools and universities). I want to see more people
>>> use OCAL, from which OCAL gains directly. Hence: put filters in place
>>> so that they can browse "safely" while allowing those who want to view
>>> all content to do so. I would go so far as to suggest we make a weapons
>>> filter as per your example above. Or a Christianity filter if you
>>> prefer. That's bordering on ridiculous though. Where do we stop? Magic
>>> mushrooms? Aqua Icons?
>>
>> Yes, it is ridiculous; that's why I don't want to go down the slippery slope.
>
> You aren't a librarian as far as I'm aware. You don't need to do any work on
> this issue in the background. You will still have access to your nude
> cliparts and pictures. So why do you have an issue with bringing in some
> processes that would help to bring in more people onto OCAL? To me, your
> reasoning is one of the reasons why OCAL has grown so slowly. It's been 6
> years. SIX YEARS! since OCAL opened.
>
> 30,000 cliparts is a sad number for a clipart site that has been around for
> six years.
> According to a recent news feed, we have 60,000 members. But looking at the
> artist page, I have to ask: how many people are contributing?
>
> There are only 13 artists (WORLDWIDE) who have contributed over 300
> cliparts. The combined total from these 13 artists is 13,100 cliparts, one
> account of which is used by librarians to upload "old clipart". That's
> almost half of the "reported" cliparts on the site contributed by a small
> handful of people.
>
> That does not speak of a healthy community to me. My aim with this proposal
> is to promote the growth of OCAL by:
>
> - allowing adults to view adult content
> - allowing children to view child-appropriate content
> - enabling them both to browse together, with age-appropriateness filters in
>   place
> - enabling the freely given word of mouth and advertising that will happen
>   when we build up the trust of the involved community that their kids will
>   be "safe".
>
> This would encourage more groups of people to join - people who may not want
> to see nude, or obscene, or pornographic, or drug-use-promoting clipart on a
> normal search for "green", or on an innocent search for a colouring-in
> picture of a "girl" or a "skirt" for their dress-up doll program. Filters
> are a must in this environment of supplying PD clipart to all humanity, for
> all ages.
>
> Note that these are statements of fact. I don't have a problem with nudity.
> I do have a problem with small growth for something as special as OCAL.
>
> Never underestimate the power of word of mouth. Currently, Nicu's mindset
> towards this issue promotes negative word of mouth. Jon's seeming
> indifference doesn't help. Am I the only one who cares about what our
> children see? No. I do not just speak for myself when I propose these
> measures.
>
>>
>>
>>> Porno (and drugs) is a recognised adult thing. It is not suitable for
>>> children. If we want more people who are involved with children to use
>>> OCAL "freely" then porno/drug-promotion filters are a must. That's not
>>> fallacy. That has been proven via many surveys. I do not need to go into
>>> it at any great length here. The very reason many universities have such
>>> policies is because of those surveys.
>>
>> It is a fallacy; we don't have any porn.
>
> Really? What would you call showing genitals in public, under US law, then?
> If it's an offense to do so in public under the law, then it's an offense to
> supply such cliparts to minors. I'm no expert on US law, but it IS an
> offense to show your genitals in public in NZ. I assume that's the same in
> the US?
>
> Case in point: this is legitimate clipart, and (as far as I know) IS public
> domain. This is a legitimate clipart for OCAL.
>
> http://www.openclipart.org/detail/26406
>
> However it IS also porn. It is sexually suggestive, it displays genitals, and
> it is not appropriate for 6-year-old children to see. It also comes up when
> searching for "skirt", and "school", and "work", all of which are innocent
> little searches. You cannot argue with this. My arguments are not based on
> fallacy. They are based on fact. Porn CAN and will appear on OCAL.
> Bestiality can and will appear (it's only a matter of time, due to the
> entropy of human social standards). Perhaps even child porno. Would you still
> support "free speech" when those types of images appear on OCAL in mainstream
> searches for "Baby" or "Kid"? That is the mindset that you currently have,
> Nicu, and I will not stand by and allow that mindset to influence what my
> children view when they are simply looking for pictures to colour in. That
> is my personal viewpoint, but it stands alongside many tens of thousands who
> think this way (not fallacy - fact). I share it with you in the hope that
> you will see I'm not just talking hot air. Nor is my stance based on fear
> or ignorance, but on a desire to see things grow and community become
> stronger. The ONLY way to build community is to include other people in your
> community, in your circle of influence, in your friendship.
>
> If someone wants a nude fix then there are many other places they can go
> to get it. OCAL does not need to be one of those places. Because of the
> founders' stance of "no censorship" and "no value judgements", we are
> forced into a place where we either change the way we think, for the good
> growth of OCAL, or we face becoming "just another clipart library" with
> little membership and not much community. I personally don't want to see
> that happen. I also don't want to see another six years go by and OCAL have
> only 120,000 members - most of whom don't do anything.
>
> Filters are a good way forward, so that people can choose to view what they
> want. Tags are not filters. They can be used BY the filters, but by
> themselves, tags are useful only for finding things. Filters are the next
> step to making a clipart community safe for all. It is not a bad thing when
> you supply choice to the users. Heck, you could even think of it as a bonus
> to have access to the adult material when you become a member!
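To make the "tags are not filters" point above concrete, here is a minimal
Python sketch, with hypothetical names and a made-up data model rather than
OCAL's actual code: tags stay plain metadata attached to each clipart, and a
filter is simply a rule applied to search results that uses those tags to
decide what a given profile gets to see.

    # Minimal sketch: tags are metadata; a filter is a rule that USES the tags.
    # Names and data model are hypothetical, not OCAL's real implementation.
    from dataclasses import dataclass, field
    from typing import List, Set

    @dataclass
    class Clipart:
        title: str
        tags: Set[str] = field(default_factory=set)  # plain metadata, useful for finding things

    def apply_filter(results: List[Clipart], hidden_tags: Set[str]) -> List[Clipart]:
        """Drop any result carrying a tag the active profile has chosen to hide."""
        return [c for c in results if not (c.tags & hidden_tags)]

    # Example: a child's (or default) profile hides anything tagged as adult content.
    results = [
        Clipart("girl colouring page", {"girl", "colouring"}),
        Clipart("figure study", {"nudity", "adult"}),
    ]
    print([c.title for c in apply_filter(results, hidden_tags={"adult", "nudity"})])
    # -> ['girl colouring page']

The same library and the same tags are served to everyone; only the set of
hidden tags changes from one profile to the next, which is the choice being
argued for here.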
>
>
>>
>>
>>> I thought we needed proof to remove them. I have no way of knowing - I'm
>>> not a porno expert - and I don't want to go searching either. Some of
>>> you have said that she looks recognisable so I guess it would be good to
>>> remove her. I disagree that they are not clipart. I think they are.
>>
>> I am no porn expert either and I don't recognize the model, but my
>> experience as a photographer tells me they are not self-made photos.
>
> Then, on the copyright issue alone, they should be deleted from the Library,
> not just "hidden." This is a moot point since we librarians cannot yet
> delete images. The visibility toggle only "hides" clipart from searches. You
> can still access those cliparts if you know their URL.
>>
>>
>>> Thanks again Nicu. You had thoughtful replies. I appreciate that you've
>>> taken the time to reply. I hope we can come to some workable solution to
>>> this.
>>
>> But please stop with the exaggerations, again, nudity != porn.
>
> When have I called nudity porn? You are bringing in something I have not
> said. I think my best points so far are up above. Re-read them.
>
> Summary:
> Let's bring in filters.
> Let's let the user choose what they want to view, by providing adult-content
> filter toggles in their user preferences.
> Let's enable young people and children to browse the site "safely" and
> therefore allow universities and large groups of people involved with
> children to interact, together with others who have no such restriction, in
> a tailored Open Clipart experience.
>
> Now it's off to bed for me. I'm tired after all this thinking and advocacy.
> :) Be well. Debate hard. The floor is yours.
>
>
>
> _______________________________________________
> clipart mailing list
> clipart at lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/clipart
>
>
OK, I'm implementing filters now. What filters do we need? NSFW and... ?
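As a starting point, here is a minimal sketch of the preference-driven
filtering proposed in the thread, in Python, assuming hypothetical category
names and a made-up preferences dictionary (not a description of the live
site code). The idea from the thread: flagged categories are hidden from
search by default, and a logged-in user can opt in per category.

    # Sketch only: category names and preference keys are hypothetical.
    from typing import Dict, List, Optional, Set

    # Candidate filter categories raised in the thread (hidden by default).
    DEFAULT_HIDDEN: Set[str] = {"nsfw", "nudity", "violence", "drugs"}

    def hidden_for(prefs: Optional[Dict[str, bool]]) -> Set[str]:
        """Anonymous visitors get every default category hidden;
        logged-in users can flip individual categories to visible (opt-in)."""
        if prefs is None:
            return set(DEFAULT_HIDDEN)
        return {cat for cat in DEFAULT_HIDDEN if not prefs.get("show_" + cat, False)}

    def search(query: str, index: List[dict], prefs: Optional[Dict[str, bool]] = None) -> List[dict]:
        """Toy title search that drops anything in a hidden category."""
        excluded = hidden_for(prefs)
        return [item for item in index
                if query.lower() in item["title"].lower()
                and not (set(item["categories"]) & excluded)]

    index = [
        {"title": "skirt (dress-up doll)", "categories": []},
        {"title": "skirt (figure study)", "categories": ["nsfw", "nudity"]},
    ]
    print(len(search("skirt", index)))                            # anonymous: 1 result
    print(len(search("skirt", index,
                     {"show_nsfw": True, "show_nudity": True})))  # opted-in: 2 results

So one possible answer to "NSFW and... ?" is a short list along the lines of
nudity, violence and drug use, each as its own opt-in toggle in the user
preferences rather than a single on/off switch.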