[Foundation-l] Letter to the community on Controversial Content
lifeisunfair at gmail.com
Tue Oct 11 18:50:06 UTC 2011
Andreas Kolbe wrote:
> If we provide a filter, we have to be pragmatic, and restrict its application
> to media that significant demographics really might want to filter.
Define "significant demographics." Do you have a numerical cut-off
point in mind (below which we're to convey "you're a small minority,
so we've deemed you insignificant")?
> We should take our lead from real-world media.
WMF websites display many types of images that most media don't.
That's because our mission materially differs. We seek to spread
knowledge, not to cater to majorities in a manner that maximizes
commercial appeal.
For most WMF projects, neutrality is a core principle. Designating
certain subjects (and not others) "potentially objectionable" is
inherently non-neutral.
> Real-world media show images of lesbians, pork, mixed-race couples, and
> unveiled women (even Al-Jazeera).
> There is absolutely no need to filter them, as there is no significant target
> group among our readers who would want such a filter.
So only insignificant target groups would want that?
Many ultra-Orthodox Jewish newspapers and magazines maintain an
editorial policy forbidding the publication of photographs depicting
women. Some have even performed digital alterations to remove them
from both the foreground and background.
These publications (which routinely run photographs of deceased
women's husbands when publishing obituaries) obviously have large
enough readerships to be profitable and remain in business.
"As of 2011, there are approximately 1.3 million Haredi Jews. The
Haredi Jewish population is growing very rapidly, doubling every 17 to
[...] years."
Are we to tag every image containing a woman, or are we to deem this
religious group insignificant?
> You mentioned a discussion about category-based filter systems in your other
> post.
The ability to blacklist categories is only one element of the
proposal (and a secondary one, in my view).
> One other avenue I would like to explore is whether the existing Commons
> category system could, with a bit of work, be used as a basis for the filter.
> I've made a corresponding post here:
This was discussed at length on the talk pages accompanying the
"referendum" and on this list.
Our current categorization is based primarily on what images are
about, *not* what they contain. For example, a photograph depicting a
protest rally might include nudity in the crowd, but its
categorization probably won't specify that. Of course, if we were to
introduce a filter system reliant upon the current categories, it's
likely that some users would seek to change that (resulting in harmful
disruption to the category system).
Many "potentially objectionable" subjects lack categories entirely
(though as discussed above, you evidently have deemed them
insignificant).
On the brainstorming page, you suggest that "[defining] a small number
of categories (each containing a group of existing Commons categories)
that users might want to filter" would "alleviate the concern that we
are creating a special infrastructure that censors could exploit." I
don't understand how. What would stop censors from utilizing the
categories of categories in precisely the same manner?
> I understand you are more in favour of users being able to switch all images
> off, depending on the page they are on.
The proposal that I support includes both blacklisting and whitelisting.
> This has some attractive aspects, but it would not help e.g. the Commons user
> searching for an image of a pearl necklace. To see the images Commons
> contains, they have to have image display on, and then the first image they
> see is the image of the woman with sperm on her throat.
This problem extends far beyond the issue of "objectionable" images,
and I believe that we should pursue solutions separately.
> It also does not necessarily prepare users for the media they might find in
> WP articles like the ones on fisting, ejaculation and many others; there are
> always users who are genuinely shocked to see that we have the kind of media
> we have on those pages, and are unprepared for them.
Such users could opt to block images by default, whitelisting only the
articles or specific images whose captions indicate content that they
wish to view.