You say that these organisations do what they do to maximise their profits. I would
say that they maximise their profits by serving their customers as well as they can.
Serving customers well is something that we should aspire to as well, regardless of
whether our customers are paying us or not. We are providing a service; it is a
well-established principle of quality management that the quality of a service is
defined by the customer, not the provider. Providers who insist that they know better
than the customer population go out of business. You'll probably say now that no one
surveyed the customer population, and I would agree with you -- as far as I am
concerned, the referendum should have targeted readers. But going by what commercial
companies do is not a bad method of gauging customer preferences. While we're not out
of pocket if we fail to respond to customer wishes, these companies are, and they pay
for research to prevent that.
As for your question about the creampie example, some Wikipedians have said they might
use the image filter at work, just so they don't have explicit images popping up on
their screens. Speaking for myself, my wife and son do sometimes give me a funny look
when they walk past me and I'm on some page with in-your-face explicit media content,
like the Creampie article ( http://en.wikipedia.org/wiki/Creampie_(sexual_act) ).
The point is I'm not on that page to look at the juvenile and embarrassing picture,
but to sort out some issue in the text. Yet that is not apparent to someone walking
past you. I might well use the filter, just to stop freaking out my son when he comes
out of the kitchen.
--- On Sat, 1/10/11, David Levy <lifeisunfair(a)gmail.com> wrote:
From: David Levy <lifeisunfair(a)gmail.com>
Subject: Re: [Foundation-l] Blog from Sue about censorship, editorial judgement, and image
Date: Saturday, 1 October, 2011, 13:42
Andreas Kolbe wrote:
We'd still be in good company, as all other major sites, like
Google, YouTube and Flickr, use equivalent systems, systems that are
I'm going to simply copy and paste one of my earlier replies (from a
Websites like Flickr (an example commonly cited) are commercial
endeavors whose decisions are based on profitability, not an
obligation to maintain neutrality (a core element of most WMF
projects). These services can cater to the revenue-driving majorities
(with geographic segregation, if need be) and ignore minorities whose
beliefs fall outside the "mainstream" for a given country. We mustn't
One of the main issues regarding the proposed system is the need to
determine which image types to label "potentially objectionable" and
place under the limited number of optional filters. Due to cultural
bias, some people (including a segment of voters in the "referendum,"
some of whom commented on its various talk pages) believe that this is
as simple as creating a few categories along the lines of "nudity,"
"sex," "violence" and "gore" (defined and populated in
For a website like Flickr, that probably works fairly well; a majority
of users will be satisfied, with the rest too fragmented to be
accommodated in a cost-effective manner. Revenues are maximized.
The WMF projects' missions are dramatically different. For most,
neutrality is a nonnegotiable principle. To provide an optional
filter for "image type x" and not "image type y" is to formally
validate the former objection and not the latter. That's
An alternative implementation, endorsed by WMF trustee Samuel Klein,
is discussed here:
If I google for images of cream pies in my office in
the lunch break,
because I want to bake one, I'm quite happy not to have dozens of images of
sperm-oozing rectums and vaginas pop up on my screen. Thanks, Google.
Are you suggesting that a comparable situation is likely to arise at a
foundation-l mailing list