On Sun, May 9, 2010 at 2:34 PM, Gregory Maxwell <gmaxwell(a)gmail.com> wrote:
On Sun, May 9, 2010 at 9:24 AM, Derk-Jan Hartman <d.j.hartman(a)gmail.com> wrote:
This message is CC'ed to other people who might wish to comment on this potential approach.
---
Dear reader at FOSI,
As a member of the Wikipedia community and the community that develops the software on which Wikipedia runs, I come to you with a few questions. Over the past years Wikipedia has become more and more popular and omnipresent. This has led to enormous [...]
I am strongly in favour of allowing our users to choose what they see. "If you don't like it, don't look at it" is only useful advice when it's easy to avoid looking at things, and it isn't always on our sites. By marking up our content better and providing the right software tools we could _increase_ choice for our users, and that can only be a good thing.
I agree, and I'm in favor of the WMF allocating resources to develop a system that allows users to filter content based on the particular needs of their setting.
At the same time, and I think we'll hear a similar message from the EFF and the ALA, I am opposed to these organized "content labelling systems". These systems are primarily censorship systems and are overwhelmingly used to subject third parties, often adults, to restrictions against their will. I'm sure these groups will gladly confirm this for us, regardless of the sales patter used to sell these systems to content providers and politicians.
(For more information on the current state of compulsory filtering in the US, I recommend the filing in Bradburn v. North Central Regional Library District, an ongoing legal battle over a library system refusing to allow adult patrons to bypass the censorware in order to access constitutionally protected speech, in apparent violation of the suggestion by the US Supreme Court that the ability to bypass these filters is what made the filters lawful in the first place:
http://docs.justia.com/cases/federal/district-courts/washington/waedce/2:20…
)
It's arguable whether we should fight against the censorship of factual information to adults or merely play no role in it, but it isn't really acceptable to assist it.
And even when not used as a method of third-party control, these systems require users to have special software installed, so they aren't all that useful as a way for our users to self-determine what they will see on the site. It sounds like a lose-lose proposition to me.
Labelling systems are also centred around broad classifications, e.g. "Drugs" or "Pornography", with definitions which defy NPOV. This will obviously lead to endless arguments about applicability within the site. Many places exempt Wikipedia from their filtering, since after all it's all educational, so it would be a step backwards for these people if we started applying labels that they would have gladly gone without.
They filter the "drugs" category because they want to filter pro-drug advocacy, but if we follow the criteria we may end up with our factual articles lumped into the same bin. A labelling system designed for the full spectrum of internet content simply will not have enough words for our content... or are there really separate labels for "Drug _education_", "Hate speech _education_", "Pornography _education_", etc.?
Urban legend says the Eskimos have 100 words for snow; it's not true... but I think it is true that for the Wiki(p|m)edia projects we really do need 10 million words for education.
With a third-party labelling system we can also expect issues where we fail to "correctly" apply the labels, whether due to vandalism, limitations of the community process, or simply a genuine and well-founded difference of opinion.
Instead I prefer that we run our own labelling system. By controlling it ourselves we determine its meaning, avoiding terminology disputes with outsiders; we can operate the system in a manner which inhibits its usefulness for the involuntary censorship of adults (e.g. not actually putting the label data in the pages users view in an accessible way, and creating site TOS which make the involuntary application of our filters to adults unlawful) and which maximizes its usefulness for user self-determination by making the controls available right on the site.
The Wikimedia sites have enough traffic that it's worth people's time to customize their own preferences.
There are many technical ways in which such a system could be constructed, some requiring more development work than others, and while I'd love to blather on about possible methods, the important point at this time is to establish the principles before we worry about the tools.
I agree, and I prefer a system designed for the special needs of WMF wikis and our global community. We may take some design elements and underlying concepts from existing systems, but our needs are somewhat unique.
The main objective is not to child-proof our sites. We need to recognize that the people using the system may be adults who choose to put a filtering system in place so that they can edit from settings where some types of content are inappropriate or disallowed. This makes WMF projects more accessible, which moves us toward our overall mission.
Cheers,
Sydney Poore
(FloNight)
_______________________________________________
foundation-l mailing list
foundation-l(a)lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l