[Foundation-l] Letter to the community on Controversial Content

Thomas Morton morton.thomas at googlemail.com
Wed Oct 12 14:41:09 UTC 2011


> It contains facts about opinions - it does not itself express an opinion.
> It is both factual, and a fact.

It expresses the *opinion* of the judge that Abbey killed Betty :) We
include it because the global *opinion* is that judges are in a position to
make such statements with authority. And the fact is that Abbey is convicted
of killing Betty.

My point was that opinion influences both our content and our choice of
material (just not our opinion, theoretically).

Perhaps I was confused by your original:

*an encyclopedia, at least in intention, does not deal in opinions at all,
but rather in facts*

Which suggested we were uninterested in opinion (not true, of course).

> There is information content in an image - if there wasn't, we wouldn't
> need any.

We regularly (and rightly) use images in a purely illustrative context -
this is fine. Images look nice. They can also express the same concepts as
the prose in a different way (which might connect with different people).
But in the vast majority of cases images are supplementary to the prose.

Yes; in some cases an image may contain information not in the prose - this
is a legitimate problem to consider (although if we are just hiding images
and leaving them accessible, this doesn't seem to me to be an issue).


> Making a decision to use or not to use an image is an editorial decision,
> and in some cases it could enhance or detract from the neutrality of the
> article.

Yes, it could. But this is where we get to the finicky part of the situation:
if we get the filtering right this won't matter, because it becomes an
individual choice about what to see or not see.

What you are talking about there is abusing any filter to bias or detract
from the neutrality of an article for readers.

When putting together a product for a user base, you have to look at what
they want and what they need. They want a filter to hide X, and they need
one that does so properly and without abuse.

So, yes, I agree that a filter has potential for abuse - and any technical
solution should take that into consideration and prevent it.


> > Removal of, say, a nude image on the Vagina article does not bias or
> > detract from the information.
>
> Then we can solve the problem by removing the image completely, since the
> article would be completely unaffected by it.


Not really; the image certainly has value for some. Hiding it on page load
for those who do not wish it to appear is also good. We don't have to have a
binary solution....

So long as the image

a) Appears for people who use and appreciate it
b) Is initially hidden for those who do not wish to see it
c) Appears for those apathetic to its appearance

Then this is surely a nice improvement to the current situation of "Appears
for everyone", one which does not remove *any* information from the reader
and provides them with the experience they wish.
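
(As a very rough sketch of what that decision looks like - Python, with
entirely made-up names, assuming images carry category tags and readers keep
a list of tags they would rather have collapsed on load:)

    def initially_visible(image_tags, reader_hidden_tags):
        """Hypothetical sketch: should this image be expanded on page load?

        image_tags         - tags attached to the image, e.g. {"nudity"}
        reader_hidden_tags - tags this reader has chosen to collapse initially
        """
        # (a) and (c): readers with no matching preference - including the
        # apathetic majority - get the image exactly as they do today.
        # (b): readers who opted out get a collapsed placeholder they can
        # expand with one click; the image is never removed from the page.
        return not (set(image_tags) & set(reader_hidden_tags))

The content served is identical in all three cases; the only per-reader
difference is the initial state of the placeholder.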

Here's a similar point: if we had a setting that said "do not show plots
initially", collapsing the plot sections of articles on movies, books, etc.,
it would achieve the same thing. The reader would have expressed a preference
about how material is presented; none of that material is removed from his
access, but he is able to browse Wikipedia in a format he prefers. Win!

If a reader wanted to read Wikipedia with the words "damn" and "crap"
substituted for every (non-quoted) "fuck" and "shit" why is this a problem?
It alters presentation of the content to suit their sensibilities, but
without necessarily detracting from the content.
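
(Again purely as an illustration - an assumed two-word table, and
deliberately ignoring the genuinely hard part of leaving direct quotations
untouched - the substitution itself is trivial:)

    import re

    # Hypothetical, reader-chosen substitution table.
    SUBSTITUTIONS = {"fuck": "damn", "shit": "crap"}

    _PATTERN = re.compile(r"\b(" + "|".join(SUBSTITUTIONS) + r")\b",
                          re.IGNORECASE)

    def soften(text):
        """Swap the listed words; a real version would also need to skip
        quoted passages, which this sketch makes no attempt to do."""
        return _PATTERN.sub(lambda m: SUBSTITUTIONS[m.group(0).lower()], text)

All of this would happen at display time for that one reader; the stored
article text is untouched.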

Another thought: the mobile interface collapses all sections by default on
page load (apart from the lead). Hiding material in this format (where the
reader has expressed an implicit preference to use Wikipedia on a mobile
device) doesn't seem to be controversial.

Hiding an image to suit individual preference is a good thing. It's just a
technical challenge to make sure the preference is well reflected, the
system is not abused and the content remains accessible.

Tom

