Hiya Bishakha
On Sat, Oct 1, 2011 at 2:20 AM, Bishakha Datta bishakhadatta@gmail.com wrote:
On Fri, Sep 30, 2011 at 10:54 PM, Theo10011 de10011@gmail.com wrote:
Bishakha, call it editorial content, call it censorship, or any other euphemism - at the heart of it, it is deciding what someone gets to see and what they do not.
Theo: they are different things, and given the premium on accuracy and precision at Wikipedia, I don't think we can claim that editorial judgments and censorship are the same.
As I have said, it is a matter of perspective how you view them. But if we go by the assumption that editorial judgment is a separate thing, whose job is it to exercise it? The WMF has long held the position that the projects are independent and that it has no editorial control over what the community decides - this would no longer be the case if we consider the filter an editorial judgment. Keeping in mind the reaction shown by different communities, would it mean the WMF would be exercising that control? Using an already existing structure of categories, created earlier, possibly by editors who don't agree with the filter, to implement said editorial control? What about editorial independence[1]?
It should not be our job to censor our own content.
We're not suggesting that as far as I know. Nothing is being removed from the sites. [1]
No, it is only being hidden, based on an arbitrary system of categories that can be exploited. We are indeed hiding our content, just as any dictatorial regime that chooses to hide works of literature, art or knowledge (I hope the last one is not us) from its people.
MediaWiki also works in a similar fashion: it hides revisions rather than deleting them outright when an article is deleted - irony?
The strongest argument I have read against this is that it is not something the WMF and the Board should implement and develop. If there were a real need to censor or cleanse graphic content, there would already be a successful mirror or fork of the project somewhere. Instead, we have small distributions/projects which use 1-2 year old offline dumps to cleanse and then consider safe.
Now, if you were to apply this argument to a government or a regime that decides to remove things that make them flinch - how different would we be from dictatorial regimes that limit/restrict access to Wikipedia for all the people who do flinch?
There is no proposal to remove anything from the sites; as I understand it, it is proposed that users can click a button to turn off some images - those who want to continue to see everything can do so. Nothing goes.
I never said there was. I said "restrict access to Wikipedia for all the people who do flinch". There is a big gap in how this system would be implemented; if we go by the proposed system in the mock-up, it would use categories to decide what is deemed offensive. The problem is that when you click on a filter, the decision on what is offensive might not be the user's alone, but a standardized one across the board.
But when the Indian government bans Salman Rushdie's Satanic Verses or James Lane's book on Shivaji, that is censorship.[2]
I can point to Indian I&B Ministry issues or the Film Censor Board of India, but you probably know more about them than I do.
Yes, I know from personal experience - I had a huge brush with the Censor Board in 2001 and refused to remove any content from my docu as they demanded. [3]
Depending on the perspective, one can argue that they only wanted the content hidden, not visible to those who do flinch. Would it be different if they argued that they were only exercising editorial control - for the children, the general public, and all the people who do flinch?
Regards,
Theo