[Foundation-l] Wikipedia is not the Kama Sutra, was Re: commons and freely licensed sexual imagery

Thomas Dalton thomas.dalton at gmail.com
Thu May 14 18:43:20 UTC 2009


2009/5/14 Brion Vibber <brion at wikimedia.org>:
> IMHO any restriction that's not present in the default view isn't likely
> to accomplish much. The answer an objecting parent wants to "my daughter
> saw a lady with semen on her neck on your website" is *not* "you should
> have told her to log in and check 'no sexual imagery' in her profile"!

Indeed, that wouldn't work. How about a link parents can click to
set a long-lived cookie blocking the things they don't like?
Easy for kids with some technical know-how to bypass, but that's
always going to be the case. There are plenty of existing options for
parents to block things from their children; we can probably learn
something from them.
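Very roughly, something like this on the client side (a sketch in
TypeScript; the cookie name and the "explicit" class are made up for
illustration - nothing like this exists in MediaWiki today):

    // Hypothetical opt-in filter: the parents' link sets a long-lived
    // cookie, and pages hide tagged images whenever it is present.
    // The cookie name and "explicit" class are assumptions.
    const FILTER_COOKIE = "hideExplicitImages";

    // enableFilter() would be wired to the link parents click.
    function enableFilter(): void {
      // A max-age of ~10 years is "indefinite" for practical purposes.
      document.cookie = `${FILTER_COOKIE}=1; path=/; max-age=${10 * 365 * 24 * 3600}`;
    }

    function filterEnabled(): boolean {
      return document.cookie.split("; ").some(c => c === `${FILTER_COOKIE}=1`);
    }

    // On page load, hide anything editors have tagged as explicit.
    if (filterEnabled()) {
      document.querySelectorAll<HTMLImageElement>("img.explicit")
        .forEach(img => { img.style.display = "none"; });
    }

The hard part, as ever, wouldn't be the code but deciding what gets
tagged.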

> Slippery-slope arguments aside, it seems unfortunate that as creators of
> "educational resources" we don't actually have anything that's being
> created with a children's audience in mind -- Wikipedia is primarily
> being created *by adults for adults*.

Well, there are things like www.schools-wikipedia.org, which is a
start, although that approaches the problem from the opposite
direction to the one we're talking about (selecting good stuff,
rather than deselecting bad stuff).

> The challenge here isn't technical, but political/cultural; choosing how
> to mark things and what to mark for a default view is quite simply
> _difficult_ as there's such a huge variance in what people may find
> objectionable.

That's why we long ago decided to be completely uncensored. It's not
just difficult, it's impossible.

> Sites like Flickr and Google image search keep this to a single toggle;
> the default view is a "safe" search which excludes items which have been
> marked as "adult" in nature, while making it easy to opt out of the
> restricted search and get at everything if you want it.

Image searching is very different to reading an encyclopaedia, though.
You know when you're typing the search terms whether you are looking
for porn or not, and that's all the filters are really there for.
There aren't many instances where someone would be searching Flickr or
Google and wanting to find "adult" images when they aren't simply
looking for porn.

> Ultimately it may be most effective to implement something like this
> (basically an expansion of the "bad image list" implemented long ago for
> requiring a click-through on certain images which were being frequently
> misused in vandalism)

How would you implement that? You could make it so "adult" images
aren't displayed straight away and are instead replaced by a notice
saying "Explicit image - click here to view" or something, with an
option somewhere to view all images by default and, while you're
there, you might as well make an option to view no images by default
with the same click-to-view system. That would stop people
accidentally seeing explicit images, but it wouldn't be a major
inconvenience to anyone else (the images should be pre-loaded, so
it's an instant *click* and the image is there). Obviously, this is
so easy to get around that it would require paranoid parents to
supervise their children's browsing, but if they are that paranoid
they should be doing that anyway.
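As a sketch of the click-to-view bit (again in TypeScript, and again
the "explicit" class is an assumption - editors would have to tag the
images somehow):

    // Rough click-to-view: swap each tagged image for a notice and
    // restore it on click. The image is still fetched by the browser,
    // just hidden, so the click is effectively instant.
    document.querySelectorAll<HTMLImageElement>("img.explicit").forEach(img => {
      const notice = document.createElement("a");
      notice.href = "#";
      notice.textContent = "Explicit image - click here to view";
      img.style.display = "none";
      img.insertAdjacentElement("beforebegin", notice);
      notice.addEventListener("click", e => {
        e.preventDefault();
        notice.remove();
        img.style.display = "";  // instant, since the image was pre-loaded
      });
    });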

However, such a system doesn't solve the problem of determining what
to censor, and I don't think that problem has a solution.


