[Foundation-l] Letter to the community on Controversial Content

Andreas Kolbe jayen466 at yahoo.com
Tue Oct 11 22:17:16 UTC 2011


> From: David Levy <lifeisunfair at gmail.com>

> Andreas Kolbe wrote:

> > If we provide a filter, we have to be pragmatic, and restrict its application
> > to media that significant demographics really might want to filter.

> Define "significant demographics."  Do you have a numerical cut-off
> point in mind (below which we're to convey "you're a small minority,
> so we've deemed you insignificant")?


I would use indicators like the number and intensity of complaints received.

That is one of the indicators the Foundation has used as well. 


> WMF websites display many types of images that most media don't.
> That's because our mission materially differs.  We seek to spread
> knowledge, not to cater to majorities in a manner that maximizes
> revenues.


Generally, what we display on Wikipedia should match what reputable educational sources in the field display, just as Wikipedia's text reflects the text in reliable sources. Anything that goes beyond that should be accessible via a Commons link, rather than displayed on the article page.

Commons, however, is different, and has a wider scope. It has an important role in its own right, and its media categories should be linked from Wikipedia.


> For most WMF projects, neutrality is a core principle.  Designating
> certain subjects (and not others) "potentially objectionable" is
> inherently non-neutral.


What we present should be neutral (where neutrality is, as always, defined by reliable sources, rather than editor preference).

That does not mean we should refuse to listen to users who tell us that they don't want to see certain media because they find them upsetting or unappealing.


> So only insignificant target groups would want that?

> Many ultra-Orthodox Jewish newspapers and magazines maintain an
> editorial policy forbidding the publication of photographs depicting
> women.  Some have even performed digital alterations to remove them
> from both the foreground and background.

> http://en.wikipedia.org/wiki/The_Situation_Room_(photograph)

> These publications (which routinely run photographs of deceased
> women's husbands when publishing obituaries) obviously have large
> enough readerships to be profitable and remain in business.

> "As of 2011, there are approximately 1.3 million Haredi Jews.  The
> Haredi Jewish population is growing very rapidly, doubling every 17 to
> 20 years."

> http://en.wikipedia.org/wiki/Haredi_Judaism

> Are we to tag every image containing a woman, or are we to deem this
> religious group insignificant?


I would deem them insignificant for the purposes of the image filter. They are faced with images of women everywhere in modern life, and we cannot cater for every fringe group. At some point, there are diminishing returns, especially when it amounts to filtering images of more than half the human race.

We need to look at mainstream issues (including Muhammad images). 


> > You mentioned a discussion about category-based filter systems in your other
> > post.

> The ability to blacklist categories is only one element of the
> proposal (and a secondary one, in my view).

> > One other avenue I would like to explore is whether the existing Commons
> > category system could, with a bit of work, be used as a basis for the filter.
> > I've made a corresponding post here:
> >
> > http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#Refine_the_existing_category_system_so_it_forms_a_suitable_basis_for_filtering

> This was discussed at length on the talk pages accompanying the
> "referendum" and on this list.

> Our current categorization is based primarily on what images are
> about, *not* what they contain.  For example, a photograph depicting a
> protest rally might include nudity in the crowd, but its
> categorization probably won't specify that.  Of course, if we were to
> introduce a filter system reliant upon the current categories, it's
> likely that some users would seek to change that (resulting in harmful
> dilution).

> Many "potentially objectionable" subjects lack categories entirely
> (though as discussed above, you evidently have deemed them
> insignificant).


I believe the most important content is identifiable by categories.
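
For what it's worth, the building blocks already exist: the categories a given image belongs to can be read programmatically from the standard MediaWiki API on Commons. A quick sketch of what a filter would be working with (the file name is a placeholder and error handling is omitted):

    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    def file_categories(file_title):
        """Return the list of category titles a Commons file belongs to."""
        params = {
            "action": "query",
            "titles": file_title,
            "prop": "categories",
            "cllimit": "max",
            "format": "json",
        }
        data = requests.get(API, params=params).json()
        cats = []
        for page in data["query"]["pages"].values():
            for cat in page.get("categories", []):
                cats.append(cat["title"])
        return cats

    print(file_categories("File:Example.jpg"))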


> On the brainstorming page, you suggest that "[defining] a small number
> of categories (each containing a group of existing Commons categories)
> that users might want to filter" would "alleviate the concern that we
> are creating a special infrastructure that censors could exploit."  I
> don't understand how.  What would stop censors from utilizing the
> categories of categories in precisely the same manner?



What I meant is that compiling a collection of a few hundred categories would not save censors an awful lot of work. They could -- and already can -- achieve the same thing in an afternoon, based on our existing category system.
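
To make that concrete, here is roughly what that afternoon's work might look like: start from a handful of categories and walk their subcategories via the public API, collecting file titles as you go. The starting category below is a hypothetical stand-in, not a real list:

    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    def collect_files(root_category, max_depth=2):
        """Gather file titles in root_category and its subcategories."""
        files, seen = set(), set()
        queue = [(root_category, 0)]
        while queue:
            cat, depth = queue.pop()
            if cat in seen:
                continue
            seen.add(cat)
            params = {
                "action": "query",
                "list": "categorymembers",
                "cmtitle": cat,
                "cmtype": "file|subcat",
                "cmlimit": "max",
                "format": "json",
            }
            while True:
                data = requests.get(API, params=params).json()
                for m in data["query"]["categorymembers"]:
                    if m["title"].startswith("Category:"):
                        if depth < max_depth:
                            queue.append((m["title"], depth + 1))
                    else:
                        files.add(m["title"])
                if "continue" not in data:
                    break
                params.update(data["continue"])
        return files

    # A hypothetical starting point; a censor's real list would just be
    # a few hundred such category names.
    blacklist = collect_files("Category:Example topic")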



> > I understand you are more in favour of users being able to switch all images
> > off, depending on the page they are on.

> The proposal that I support includes both blacklisting and whitelisting.


That would involve a user switching all images off and then whitelisting those they wish to see -- is that correct? Or blacklisting individual categories?

This would be better from the point of view of project neutrality, but would seem to involve a *lot* more work for the individual user. 

It would also be just as likely to aid censorship: the software would have to recognise the user's blacklists, and a country or ISP could then generate its own blacklists and apply them across the board to all users.
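
To illustrate what I mean, here is a sketch of the per-image decision logic as I understand David's proposal. The data structures are my own invention for illustration, not anything specified in the actual proposal:

    def should_display(image_categories, page_title, image_title, prefs):
        """Decide whether to render an image for one user.

        prefs is assumed to hold:
          hide_all         -- user has switched all images off by default
          whitelist        -- page or file titles the user opted back in
          blacklisted_cats -- category titles the user opted out of
        """
        if image_title in prefs["whitelist"] or page_title in prefs["whitelist"]:
            return True
        if prefs["hide_all"]:
            return False
        if any(cat in prefs["blacklisted_cats"] for cat in image_categories):
            return False
        return True

    # Example: a user who hides everything except one article they trust.
    prefs = {"hide_all": True,
             "whitelist": {"Example article"},
             "blacklisted_cats": set()}
    print(should_display([], "Example article", "File:Example.jpg", prefs))  # True

The censorship point follows directly: nothing in logic like this distinguishes a preference set chosen by the user from one imposed wholesale by a country or ISP.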


> > It also does not necessarily prepare users for the media they might find in
> > WP articles like the ones on fisting, ejaculation and many others; there are
> > always users who are genuinely shocked to see that we have the kind of media
> > we have on those pages, and are unprepared for them.

> Such users could opt to block images by default, whitelisting only the
> articles or specific images whose captions indicate content that they
> wish to view.


> David Levy


Again, requiring these users to do without *any pictures at all*, except those they individually whitelist, doesn't seem like a feasible proposition. It's not user-friendly.

Regards,
Andreas

