Andreas Kolbe wrote:
We'd still be in good company, as all other major websites, including
Google, YouTube and Flickr, use equivalent systems.
I'm going to simply copy and paste one of my earlier replies (from a
previous discussion):
Websites like Flickr (an example commonly cited) are commercial
endeavors whose decisions are based on profitability, not an
obligation to maintain neutrality (a core element of most WMF
projects). These services can cater to the revenue-driving majorities
(with geographic segregation, if need be) and ignore minorities whose
beliefs fall outside the "mainstream" for a given country. We mustn't
follow suit.
One of the main issues regarding the proposed system is the need to
determine which image types to label "potentially objectionable" and
place under the limited number of optional filters. Due to cultural
bias, some people (including a segment of voters in the "referendum,"
some of whom commented on its various talk pages) believe that this is
as simple as creating a few categories along the lines of "nudity,"
"sex," "violence" and "gore" (defined and populated according to their
own cultural standards).
For a website like Flickr, that probably works fairly well; a majority
of users will be satisfied, with the rest too fragmented to be
accommodated in a cost-effective manner. Revenues are maximized.
The WMF projects' missions are dramatically different. For most,
neutrality is a nonnegotiable principle. To provide an optional
filter for "image type x" and not "image type y" is to formally
validate the former objection and not the latter. That's inherently
non-neutral.
An alternative implementation, endorsed by WMF trustee Samuel Klein,
is discussed here:
If I google for images of cream pies in my office during the lunch
break, because I want to bake one, I'm quite happy not to have dozens
of images of sperm-oozing rectums and vaginas pop up on my screen.
Thanks, Google.
Are you suggesting that a comparable situation is likely to arise at a
Wikimedia project?