[Foundation-l] Controversial content software status - the image filter disguised under a new label
Andreas Kolbe
jayen466 at gmail.com
Mon Mar 12 19:22:25 UTC 2012
On Mon, Mar 12, 2012 at 5:05 PM, Fae <faenwp at gmail.com> wrote:
> Strangely enough, searching Commons for "Male figure" rather than
> "Male human" shows me artwork from the National Museum of African Art
> and a Michelangelo Buonarroti sketch from the Louvre in top matches.
> No problem with wading through "100 dicks and arseholes". In fact,
> carefully checking through the first 100 matches of that search gave
> me no explicit photographs of naked people or their private parts at
> all.
>
Well, if you just search for "male", you still get lots of penises and
sphincters.
http://en.wikipedia.org/w/index.php?title=Special%3ASearch&profile=images&search=male&fulltext=Search
Bear in mind that this is what students get in schools, too.
> Having a better optimized search engine is the issue here, not
> filtering all images of body parts.
I agree that a better search engine is part of the answer. Niabot made an
excellent proposal (clustered search) a week ago, which is written up here:
http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#Clustering_for_search_results_on_Commons
But I don't think it obviates the need for a filter, which is frankly
standard even on mainstream *Western* sites that host adult material.
> Commons has over 10,000,000
> images, having several hundred images of human genitals is not to be
> unexpected, or a reason to give up on collaboration and turn to
> extremes of lobbying multiple authorities and newspapers with claims
> that the WMF is promoting paedophilia with the side effect of fuelling
> well known internet stalkers to harass staff and users.
>
We have had a consistent problem with pedophilia advocates in Commons
becoming involved in curating sexual images. It is a problem when an editor
with a child pornography conviction prominent enough to hit the press, who
served several years in jail and was deported from the US, is so involved
in our projects.
It is a problem when that editor's block is promptly endorsed by the
arbitration committee on English Wikipedia, but is equally quickly
overturned in Commons.
It is a problem if a Commons admin, when made aware of Sue Gardner's
statement about Wikimedia's zero-tolerance policy towards pedophilia
advocacy, responds:
"You can quote Sue if you want - but Sue is Sue and not us. Sue also tried
to install a image filter and was bashed by us."
http://commons.wikimedia.org/w/index.php?title=Commons:Administrators%27_noticeboard/User_problems&diff=prev&oldid=68051777
By the way, that statement of Sue's has now been removed from the Meta page
on pedophilia:
http://meta.wikimedia.org/w/index.php?title=Pedophilia&diff=3557747&oldid=3546718
Now, English Wikipedia has for some time had a well-defined process for
such cases: they are not to be discussed on-wiki, but are a matter for
private arbcom communication. That is sensible. Commons, however, has
lacked both an arbitration committee and any equivalent policy. (There are
efforts underway now to write one:
http://commons.wikimedia.org/wiki/Commons:Child_protection)
This being so, there has been no way to address this in Commons other than
to discuss it on-wiki. And it is a problem if an editor who posts evidence
on Commons showing that the person in question has continued to advocate
pedophilia online quite recently, well after their release from prison, is
blocked for "harassment", while the convicted editor remains free to help
curate pornographic material. But that is Commons for you.
I am afraid that to most people out there in the real world, it will seem
absolutely extraordinary that an educational charity lets someone with a
child pornography conviction curate adult material, while its
administrators block an editor who points out that the person has continued
to be an open and public "childlove" advocate online.
Andreas