[Foundation-l] Letter to the community on Controversial Content
WereSpielChequers
werespielchequers at gmail.com
Mon Oct 17 14:10:55 UTC 2011
>> I claim that you are talking total crap. It is not *that* difficult to
>> get the categories of an image and reject based on which categories the
>> image is in. There are enough people out there busily categorizing all
>> the images already that any org that may wish to could block images
>> that are in disapproved categories.
>It is incredibly easy. One just says any image within Category:Sex is
>not acceptable. It's not hard to do. An organisation can run a script
>once a week or so to delve down through the category hierarchy to pick
>up any changes. You already categorize the images for anyone with enough
>processing power, or the will to censor the content. I doubt that anyone
>doing so is going to be too bothered whether they've falsely censored an
>image that is in Category:Sex that isn't 'controversial' or not.
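For concreteness, here is a minimal sketch of the kind of weekly script
being described, using the public Commons API (action=query,
list=categorymembers). The root category, the depth cap, and what is done
with the result are illustrative assumptions on my part, not part of any
agreed scheme:

# Illustrative sketch only: walk the Commons category tree below a
# "disapproved" root category and collect every file title found there.
import requests

API = "https://commons.wikimedia.org/w/api.php"
MAX_DEPTH = 5  # assumed cap; the category graph is deep and contains cycles

def category_members(title):
    """Yield (title, is_subcategory) for every member of a category."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": title,
        "cmtype": "file|subcat",
        "cmlimit": "500",
        "format": "json",
    }
    while True:
        data = requests.get(API, params=params, timeout=30).json()
        for m in data["query"]["categorymembers"]:
            yield m["title"], m["title"].startswith("Category:")
        if "continue" not in data:
            break
        params.update(data["continue"])  # standard API continuation

def collect_files(root, depth=0, seen=None, files=None):
    """Recursively gather file titles under root, skipping visited categories."""
    seen = seen if seen is not None else set()
    files = files if files is not None else set()
    if depth > MAX_DEPTH or root in seen:
        return files
    seen.add(root)
    for title, is_subcat in category_members(root):
        if is_subcat:
            collect_files(title, depth + 1, seen, files)
        else:
            files.add(title)
    return files

if __name__ == "__main__":
    blocked = collect_files("Category:Sex")  # e.g. run weekly from cron
    print(len(blocked), "files found")

Note that the depth cap and the visited set are not optional: the Commons
category graph is enormous and contains cycles, which is part of why
"delving down through the category hierarchy" is less trivial than it
sounds, even before you ask who does the categorising.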
To anyone who thinks that a category-based solution can work because we
have enough categorisers: may I suggest that you go to
http://commons.wikimedia.org/wiki/Category:Images_from_the_Geograph_British_Isles_project_needing_categories_by_date
and try categorising 0.01% of that part of the backlog yourself before
being so over-optimistic about our categorisation resources.
When we've cleared all the subcategories in there then maybe I could be
convinced that Commons has enough categorisers to handle what it already
does. Taking on a major new obligation would be another matter though, even
if that obligation could be defined and its categories agreed.
A categorisation approach also has the difficult task of getting people to
agree on what porn is. This is something that varies enormously around the
world, and while there will be some images that we can all agree are
pornographic, I'm pretty sure there will be a far larger number where people
will be genuinely surprised to discover that others have dramatically
different views as to whether they should be classed as porn. For some
people this may seem easy: anything depicting certain parts of the human
anatomy or certain poses is pornographic to them. But different people will
have a different understanding as to which parts of the body should be
counted as pornographic. Getting the community to agree whether all images
depicting human penises are pornographic will not be easy, and that's before
you get into arguments as to how abstract a depiction of a penis has to be
before it ceases to be an image of a penis.
We also need to consider how we relate to outside organisations,
particularly with important initiatives such as the GLAM program. This
mildly not-safe-for-work image
http://commons.wikimedia.org/wiki/File:JinaVA.jpg is a good example of the
challenge here. To those concerned about human penises it may well count as
pornographic, though the museum that has it on display certainly does not
bar children from that gallery. This image was uploaded by one of our GLAM
bar children from that gallery. This image was loaded by one of our GLAM
partners, our hope for the GLAM project is that hundreds of partners will
load millions perhaps tens of millions of images onto Commons. If that
succeeds then our current categorisation backlog will be utterly dwarfed by
future backlogs. If we start telling GLAM partners that yes we want them to
upload images, but they will need to categorise them through an ill defined
and arbitrary offensiveness criteria, then our GLAM program will have a
problem. In principle I support an image filter, I've even proposed one
design. But if people want to go down the route of a category based image
filter they don't just have to convince the many who oppose any filter as
censorship, they also need to be aware that to me and probably others GLAM
is core to our mission and important, whilst an image filter is non-core and
of relatively low importance. If the two conflict then choosing between them
would be easy.
If people want to advocate a categorisation approach to an image filter I
would suggest they start with the difficult areas of defining where the
boundary would be between porn and non-porn, or between hardcore and
softcore. Drawing clear and sharp lines between different shades of grey is
not easy, especially where you want them to be perceived as right by a
globally diverse population. My advice to anyone considering a
category-based filter system is to focus on the shades of grey, not on the
extreme examples at either end of the uncontentious-to-contentious scale.
Then if you manage to square that particular circle, an equally difficult
task would be to recruit sufficient categorisers. As someone who has
categorised many hundreds of the Geograph images, I'd be surprised to find
any Geograph images that I would be offended by. The sorts of statues of
topless ladies that you find on display in England certainly don't offend
me, but bare breasts are pornographic to some people in some contexts. So
there will be some long-uncategorised images amongst the 1.7 million from
the Geograph load that meet some people's definition of porn. Any
categorisation-based approach needs to explain how it would recruit more
categorisers, retain those we have, and get those volunteers to work to a
categorisation scheme that for many will seem arbitrary and foreign to their
culture.
As for "I doubt that anyone doing so is going to be too bothered whether
they've falsely censored an image that is in Category:Sex". Quality matters
to Wikimedians, false positives and a tolerance for shoddy work offend
almost all of us. A large proportion of the community don't approve of
censorship even if it was done conscientiously and with a deep concern for
getting it right. Personally I'm in the camp that thinks we could justify an
image filter as part of making our data available to some of the people we
don't currently reach; But I'm all too aware that there are Wikimedians who
are not just bothered by inaccurate censorship, but who consider any
censorship to be out of scope and Foundation money spent on it to be a
misuse of charitable funds. Simply asserting that such people don't exist is
unlikely to get them to agree to any form of censorship, better in my view
to try and design a censorship tool that would give a high quality result.
WereSpielChequers