[Foundation-l] Principle and pragmatism with nudity and sexual content

Marcus Buck me at marcusbuck.org
Tue Apr 21 13:06:20 UTC 2009

Yaroslav M. Blanter wrote:
> Maybe I misunderstand something, but as far as nudity is concerned (to
> return to the original topic), standards are obviously very different in
> Denmark and Iran. Does it make sense to create a global standard and impose
> it on the Danish and Farsi Wikipedias at the same time? Especially if this
> standard gets voted on at Meta, dominated by Americans, who have a third
> standard? As far as I am concerned, as soon as the issue goes beyond BLP (for
> which we do need a common policy), it is no longer a subject for a global
> Meta or Foundation decision.
> Cheers
> Yaroslav
In my opinion the best system would look like this:
We create a software feature to apply tags to specific content. These 
tags would say something like "contains explicit depiction of human 
vaginal intercourse", "contains explicit depiction of human penis", 
"contains depiction of fascist propaganda material", "contains 
depiction of a naked child", or "contains explicit depiction of a 
'very ugly' disease" (e.g. an open ulcer), so the system would cover 
all types of content that are not universally considered inoffensive. 
These tags should be rather fine-grained (no tag would just say 
"contains nudity"; the exact naked body parts would be part of the tag) 
and should cover all topics considered "taboo" by any of the societies 
on our planet. By "society" I mean ethnic, cultural, and religious 
communities, not political ones: I would not like to see this system 
used for political censorship, only for restrictions that are common 
sense within a specific civil society.

Every project could then decide which of these tags are relevant to it. 
As options, a project could prohibit the inclusion of such content 
(ideally per namespace), or hide it by default (so if a tagged image is 
included in an article, you would have to make one more click to 
display it).
Every user could then change this behavior in their preferences. So if 
sexually explicit images are hidden by default on a project, any user 
could change their preferences to show those pictures by default.
If a user visits the description page of an image tagged with one of 
the tags specified in the project's settings or in their personal 
preferences, they will get a warning "Do you really want to see this 
X?" (where X is the type of "tabooness") and can decide whether to 
view it.
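To make the proposal above concrete, here is a minimal sketch of how such a resolution could work. All names (the tag strings, `PROJECT_POLICY`, `resolve`) are hypothetical illustrations, not part of any existing MediaWiki feature: each tag gets a project-level policy ("prohibit", "hide", "warn", or "allow"), and a user's personal preference overrides the project default for a tag, except where the project prohibits the content outright.

```python
# Hypothetical sketch of the proposed tag system. The project chooses
# a policy per tag; a user's preferences override "hide"/"warn", but
# not a project-level "prohibit".

PROJECT_POLICY = {
    "explicit-human-penis": "hide",   # hidden by default, one click to show
    "fascist-propaganda": "warn",     # warning before display
    "open-ulcer": "allow",
}

# Strictness order, least to most restrictive.
ORDER = ["allow", "warn", "hide", "prohibit"]

def resolve(tags, user_prefs):
    """Return the strictest action among an image's tags.

    tags: set of tags applied to the image.
    user_prefs: per-user overrides, e.g. {"explicit-human-penis": "allow"}.
    """
    action = "allow"
    for tag in tags:
        policy = PROJECT_POLICY.get(tag, "allow")
        if policy != "prohibit":  # prohibition is not user-overridable
            policy = user_prefs.get(tag, policy)
        if ORDER.index(policy) > ORDER.index(action):
            action = policy
    return action
```

An image carrying several tags would get the most restrictive applicable action, so opting in to one kind of content does not silently reveal another kind tagged on the same image.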

Perhaps this could also be extended to provide a child-protection 
measure: instead of getting a warning, the user would be blocked from 
viewing the content. This would border on real censorship, though, and 
could be abused.

I think a system like that would provide the greatest flexibility 
without imposing restrictions on anybody.

Marcus Buck
