/Please distribute this message widely/
*Call for referendum*: The Wikimedia Foundation, at the direction of the Board of Trustees, will be holding a vote to determine whether members of the community support the creation and usage of an opt-in personal image filter, which would allow readers to voluntarily screen particular types of images strictly for their own account.
Further details and educational materials will be available shortly. The referendum is scheduled for 12-27 August, 2011, and will be conducted on servers hosted by a neutral third party. Referendum details, officials, voting requirements, and supporting materials will be posted at http://meta.wikimedia.org/wiki/Image_filter_referendum shortly.
For the coordinating committee, Philippe (WMF) Cbrown1023 Risker Mardetanha PeterSymonds Robert Harris
On 7/1/2011 3:55 PM, Casey Brown wrote:
*Call for referendum*: The Wikimedia Foundation, at the direction of the Board of Trustees, will be holding a vote to determine whether members of the community support the creation and usage of an opt-in personal image filter, which would allow readers to voluntarily screen particular types of images strictly for their own account.
This proposal is just astonishingly vague. It sounds like something similar to the "safe" flag on Flickr.
The software side of implementing such filters is simple, but the real problem is maintaining the tags that keep track of what's offensive.
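To illustrate why I say the software side is simple: once an image carries tags, the filter itself is little more than a lookup against the reader's opt-in preferences. A rough sketch -- the tag names and data shapes here are purely illustrative, not anything that has actually been specified:

    // Illustrative only: assumes each image already carries a list of content
    // tags and that the reader has opted in to hiding some of them.
    var userHiddenTags = ["nudity", "sexuality", "drugs"];  // the reader's opt-in choices

    function shouldHide(imageTags) {
        // Hide the image if any of its tags is on the reader's list.
        return imageTags.some(function (tag) {
            return userHiddenTags.indexOf(tag) !== -1;
        });
    }

    // e.g. an image tagged ["anatomy", "nudity"] gets hidden; ["botany"] does not.

The hard part, as I said, is agreeing on those tags in the first place and keeping them accurate.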
I run a site that uses content from Wikimedia Commons and other sources and I've recently had some problems with my partners after one of them found a picture of a human anus on the site. I'd made some effort to remove offensive content from the site before, but I redoubled my efforts after this.
I did a lot of thinking about it, and personally I decided I'd rather take the chance of excluding a few good things if I can get rid of some bad things. Currently my system thinks that about 0.15% of images used in Wikipedia are "offensive", which roughly means "connected with nudity, sexuality, pornography, or illegal drugs." Now, I'm trying to make something that's useful for K-12 education, so I'm probably more exclusionary than some people would be -- the site has already gotten an endorsement from a board of education in a relatively conservative state, and frankly, I'd rather preserve relationships that help students find the 99.85% of images that everyone will agree on.
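One simple way to approximate that kind of screening is to match an image's Commons category names against a keyword list. A rough sketch, purely illustrative -- the keywords and helper below are examples, not my actual system:

    // Illustrative approximation of category-based screening: an image is
    // treated as "offensive" if any of its category names matches a keyword.
    var keywords = ["nude", "sex", "porn", "drug"];  // example keywords only

    function looksOffensive(categoryNames) {
        return categoryNames.some(function (name) {
            var lower = name.toLowerCase();
            return keywords.some(function (kw) {
                return lower.indexOf(kw) !== -1;
            });
        });
    }

    // looksOffensive(["Category:Drug paraphernalia"]) -> true
    // looksOffensive(["Category:Playing cards"]) -> false

Crude keyword matching like that is exactly why the edge cases below are so hard to get right.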
Now somebody else can make a different decision for another site and that's fine.
I had to make all sorts of decisions here: for instance, I wasn't sure if I wanted to get rid of illegal drugs, because there's a slippery slope: a picture of a pot plant is relevant to botany, people abuse uncontrolled drugs such as cough syrup, and there's a very common mushroom, quite possibly growing on your lawn, that contains trace quantities of psilocybin. On the other hand, I felt that about 30% of drug images were offensive, such as pictures of identifiable people using cocaine. Since it would have been hard to make an operational definition of what exactly is 'offensive', I decided to just remove all of them.
Now, Wikipedia is widely used in K-12 education, but people don't often mention all of the things you can find in Wikipedia that aren't in the Encyclopedia Britannica, such as the video and images that you'd find on this page:
http://en.wikipedia.org/wiki/Ejaculation
In a consensus-based organization, I think it will be very difficult to set tagging standards and get them consistently enforced. Where I'm the king of my own domain I had a lot of agony about getting things right -- add politics to the mix, and it all gets worse.
To take a specific example, the category "Gay Culture" in Wikipedia is particularly problematic because "Gay" as a category is related to sexuality (just as "Straight" is). Maybe 60% or so of "Gay Culture" topics (in Category:LGBT_culture) could be said to be offensive, such as
http://en.wikipedia.org/wiki/Glory_hole_%28sexual_slang%29
Now, the way I see it, most of the "offensive" acts related to homosexuality can also be performed by heterosexuals and would be equally offensive. On the other hand, there might be some people who'd see an "offensive" tag on a gay-related topic and see that as some kind of hate speech, even if an effort is being made to treat gay and straight the same. If, however, a conservative school board complained that I had pictures of the Stonewall Inn or a gay pride parade, I'd tell them to go to hell.
Other areas of "offensiveness" that may be problematic are gambling and hate speech. Cards and dice, for instance, are used for many non-gambling games, and pictures of the exteriors of casinos on the Vegas strip have a high relevance to post-modern architecture and aren't likely to incite people to gamble illegally or destructively. Similarly, there are reasons to suppress active hate speech, but you can't flag every picture of Nazi Germany as "offensive:hate_speech" or, going a bit further back in history where things are murkier and more controversial, every picture that has a Confederate flag in it.
On Sat, Jul 2, 2011 at 10:37 AM, Paul Houle paul@ontology2.com wrote:
This proposal is just astonishingly vague. It sounds like something similar to the "safe" flag on Flickr.
Thank you for your detailed reply, Paul! I just want to comment on your initial mini-paragraph:
Yes, this isn't the proposal itself; it's just the announcement that a referendum will be held and that more information and the proposal itself will be available on Meta-Wiki in the coming days.
Although it isn't "official" or at all definitive, I believe the "personal image filter mockup" would be interesting to look at if you haven't already: http://www.mediawiki.org/wiki/Personal_image_filter
2011/7/2 Casey Brown lists@caseybrown.org
Although it isn't "official" or at all definitive, I believe the "personal image filter mockup" would be interesting to look at if you haven't already: http://www.mediawiki.org/wiki/Personal_image_filter
I took a look; at first glance I dislike such an idea. I think that a good set of categories, freely classifying as many kinds of "offensive content" as needed from the endless list of personal idiosyncrasies, coupled with some simple user-side js tool built by users themselves to "filter" images, should be sufficient. Users should simply be encouraged to add such categories to offending images and to build filtering tools, by themselves or with some help from willing friends; problem solved, IMHO.
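Roughly, I imagine something along these lines in one's common.js -- just a sketch, with example category names; the selectors and API details are assumptions that would need checking against the real wikis:

    // Rough sketch of a user-side filter script: hide thumbnails whose file
    // pages belong to categories the user has chosen to avoid.
    var hiddenCategories = ["Category:Nudity", "Category:Sex"];  // example choices

    $(function () {
        $("#mw-content-text a.image").each(function () {
            var link = this;
            var match = link.href.match(/\/wiki\/(File:[^?#]+)/);
            if (!match) { return; }
            var title = decodeURIComponent(match[1]);
            // Ask the wiki which categories the file page belongs to.
            $.getJSON("/w/api.php", {
                action: "query",
                prop: "categories",
                titles: title,
                cllimit: "max",
                format: "json"
            }, function (data) {
                var pages = (data.query && data.query.pages) || {};
                $.each(pages, function (id, page) {
                    $.each(page.categories || [], function (i, cat) {
                        if (hiddenCategories.indexOf(cat.title) !== -1) {
                            $(link).hide();  // hide the offending thumbnail
                        }
                    });
                });
            });
        });
    });

Of course this fires one API request per image; a real version would batch the titles into a single query.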
And what about Wikipedia, or its sister projects, using "offensive images"? Would such a filtering procedure also be extended to the articles that use offensive images, on any wiki?
Alex brollo
I do have concerns about censorship and can see both sides of the equation, but to suggest that filters be "*built by users themselves*" is an unrealistic endeavour, and to hold such a position is an arrogant assumption that people have the ability and resources to create their own filter. It is also ignorant of the needs of many of the people who benefit from the work of our contributors; remember, our basic principle is to make knowledge freely available. Misconceptions that Wikipedia's demographic is the male computer geek aged between 16 and 24 are only reinforced by such comments. When we make such misconceptions we knowingly discriminate against people outside that group, and when we discriminate we no longer make the sum of human knowledge freely available.
Whether a person chooses to use a filter or not should be their own choice. I know that some of the content is an issue in certain environments, and being personally able to filter out such content would enable greater participation. If a voluntary, self-selective filter enables minorities within our society to participate, then that must be a good thing; those same people can then participate and expand content within their knowledge/resource base, which benefits all of us.
On 5 July 2011 06:35, Alex Brollo alex.brollo@gmail.com wrote:
2011/7/2 Casey Brown lists@caseybrown.org
Although it isn't "official" or at all definitive, I believe the "personal image filter mockup" would be interesting to look at if you haven't already: http://www.mediawiki.org/wiki/Personal_image_filter
I took a look; at first glance I dislike such an idea. I think that a good set of categories, freely classifying as many kinds of "offensive content" as needed from the endless list of personal idiosyncrasies, coupled with some simple user-side js tool built by users themselves to "filter" images, should be sufficient. Users should simply be encouraged to add such categories to offending images and to build filtering tools, by themselves or with some help from willing friends; problem solved, IMHO.
And what about Wikipedia, or its sister projects, using "offensive images"? Would such a filtering procedure also be extended to the articles that use offensive images, on any wiki?
Alex brollo