[Foundation-l] Letter to the community on Controversial Content

Thomas Morton morton.thomas at googlemail.com
Tue Oct 18 12:00:13 UTC 2011


On 18 October 2011 11:56, Tobias Oelgarte <tobias.oelgarte at googlemail.com> wrote:

> I don't assume that. I say that they should have the opportunity to
> change if they like to.


Absolutely - we do not disagree on this.


> That controversial content is hidden or that we
> provide a button to hide controversial content is prejudicial.


I disagree on this, though. There is a balance between encouraging people to
question their views (and, yes, even our own!) and giving them no option but
to accept our view.

This problem can be addressed through the wording used around the filter
and by avoiding phrases like "controversial", "problematic" etc.

I disagree very strongly with the notion that providing a button to hide
material is prejudicial.

> It deepens the viewpoint that this content is objectionable and that it
> is generally accepted this way, even if it is not. That means that we
> would be patronizing the readers who have a tendency to enable a filter
> (not even particularly an image filter).
>

This is a reasonable objection; and again it goes back to this idea of how
far we enforce our world view on readers. I think that there are ways a
filter could be implemented that improve Wikipedia for our readers
(helping neutrality), and equally there are ways it could be implemented
that adversely affect this goal.

So if it is done, it needs to be done right.


> ... and that is exactly what makes me curious about this approach. You
> assume that we aren't neutral, and Sue described us, on average, as a
> little bit geeky, which goes in the same direction.


We are not; over time it has become fairly clear that we reflect certain
world views. To pluck an example out of thin air: in the 9/11 article
there is extremely strong resistance to adding a "see also" link to the
article on 9/11 conspiracy theories. This reflects a certain bias/world
view we are imposing. That is an obvious example - there are many more.

The bias is not uniform; we have various biases depending on the subject,
and over time those biases can swing back and forth depending on the
prevalent group of editors at the time. Many of our articles have a
distinctly different tone/content/slant from their foreign-language
counterparts (which is a big giveaway IMO).

Another example: English Wikipedia has a pretty strong policy on BLP
material that restricts a lot of what we record - other-language Wikis do
not have the same restrictions, and things we would not consider noting
(such as non-notable children's names) are not considered a problem on
other Wikis.


> But if we aren't neutral at all, how can we even believe that a
> controversial-content filter system based upon our views would be
> neutral in judgement or, as proposed in the referendum, "culturally
> neutral"? (Question: Is there even such a thing as cultural neutrality?)
>

No; this is the underlying problem I mentioned with implementing a filter
that offers pre-built lists.

It is a problem to address, but not one that kills the idea stone dead IMO.


> We also don't force anyone to read Wikipedia.
>

Oh come on :) we are a highly visible source of information with millions
of inbound links/pointers. No, we don't force anyone to read, but that is
not an argument against accommodating as many people as possible.


> If he does not like it, he has multiple options. He could close it; he
> could still read it, even if he doesn't like any part of it; he could
> participate and change it; or he could start his own project.
>

And most of those options run counter to our primary purpose.

> We can definitely think about possible solutions. But first I have to
> insist on an answer to the question: Is there a problem big and worthy
> enough to make it a main priority?
>

Absolutely - and the first question I asked in this debate (weeks ago) was
when we were going to poll readers for their opinion. This devolved slightly
into an argument over whether our readers should have a say in Wikipedia...
but the issue still stands - no clear picture has been built.

We are still stuck in our little house...

I doubt it will ever be done; which is why if it comes to a "vote", despite
my advocacy here, I will staunchly oppose any filter on grounds of process
and poor planning.

I am willing to be pleasantly surprised.


> After that comes the question of (non-neutral) categorization of
> content. That means: do we need to label offensive content, or could the
> same goal be reached without doing so?
>

Well, from a practical perspective a self-managed filter is the sensible
option.

I think we can do an objective categorisation of things people might not
like to see, though. Take nudity: we could have an entirely objective
classification for it. Just thinking off-hand, and imperfectly:

- Incidental nudity (background etc.)
- Partial nudity
- Full nudity
- Full frontal nudity / Close ups
- Sexual acts

And then, independently, classify articles as "sexuality topic",
"physiology topic" or "neither" (with neither being the default). By
combining the two classifications you can build a dynamic score of how
likely it is that an image of nudity in a given article should be shown,
based on the user's preference (on the basis that nudity is more expected
in sexuality topics).

Then the user would have the option to filter all nudity, or some level of
nudity - defaulting to no filter. Plus they could include/remove images from
the filter at will.
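
To make that concrete, here is a rough sketch in Python of how such a
score might be computed. It is purely illustrative: the level names, the
per-topic allowances and the threshold scheme are all invented for this
example, not an actual proposal.

    # Purely illustrative sketch -- the level names, per-topic
    # allowances and threshold scheme below are invented, not an
    # actual proposal.

    NUDITY_LEVELS = {
        "none": 0,
        "incidental": 1,      # background nudity etc.
        "partial": 2,
        "full": 3,
        "full_frontal": 4,    # incl. close-ups
        "sexual_act": 5,
    }

    # How much an article's topic classification relaxes the filter,
    # on the basis that nudity is more expected in these topics.
    TOPIC_ALLOWANCE = {
        "neither": 0,         # the default classification
        "physiology": 1,
        "sexuality": 2,
    }

    def show_image(image_level, article_topic, user_threshold=None):
        """Return True if the image should be displayed.

        user_threshold is the highest nudity level the reader has
        chosen to see; None means no filter (the default).
        """
        if user_threshold is None:
            return True  # filter off by default
        score = NUDITY_LEVELS[image_level] - TOPIC_ALLOWANCE[article_topic]
        return score <= NUDITY_LEVELS[user_threshold]

    # A "partial" image in a sexuality article is shown even to a
    # reader filtering everything above "incidental" nudity:
    assert show_image("partial", "sexuality", "incidental") is True
    # The same image in an unrelated article is hidden:
    assert show_image("partial", "neither", "incidental") is False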

I hope this rough idea shows how you could approach the difficult topic
of classification under objective criteria without allowing
article-specific abuse.

I think part of the problem is the constant re-use of the term
"offensive", because it biases us to approach content from the
perspective of controversy. But I go back to a previous point I made,
which is that this is about filtering things users don't want to see.
That may not be offensive content (although it is obviously the most
common example).

Many people have a fear of clowns; allowing them to filter out clown
images is a good thing and helps make their experience better.


> In the other mail you said that you have a problem with reading/working
> on Wikipedia in public, because some things might be offensive to
> bystanders. A typical, widespread argument. But this problem can easily
> be solved without the need for categorization. The brainstorming
> sections are full of easy, non-disturbing solutions for exactly this
> potential problem.
>

That wasn't me; and it was perhaps not a good example anyway.

Although I would love to see a "public" mode that I could enable to hide
nudity etc. when in public - nice and simple, without totally degrading
my browsing experience.


> The only group claiming that not being able to hide something is
> censorship are the people that want to hide content from others.


A strong accusation, and total nonsense I'm afraid - if you mean it in
the way it comes across (i.e. that we are pursuing an agenda to try to
hide content from people).

You'll notice I have not offered a counter-accusation (say, that the
people arguing the filter would be censorship are trying to enforce their
politics/world view on everyone); perhaps we could be afforded the same
courtesy?

Tom

