Hi,
As has been mentioned elsewhere in comments on your writings, you have
good ideas which aren't directly related to nudity or sexual content.
1) respect human subjects of photos and other media
1a) get explicit model consent, both for models who are 'many meters
away' and for significant models even when their faces are not visible
1b) double-check model ages; minors cannot consent without guardian
approval, and cannot appear in some media at all. The more
controversial the image, the more important it is to confirm consent
2) support readers who want a SFW browsing experience (say, on their
machines at work!)
2a) this is largely about setting expectations. If you might
reasonably follow a link while at work, and be surprised and
embarrassed by the result, that's something you'd rather avoid. So
while there's nothing unexpected or NSFW about a dead fetus on
[[abortion]] or a naked body on [[human body]] or [[anatomy]], if a
gallery of 400px mangled-fetus images is on a prominently linked
"Brilliant photography" page, it might merit an NSFW tag. Ditto for a
gallery of adult models on an "Internet sensations" page, or a prank
video with a mild-mannered name designed to make a loud noise and
scare you (and anyone nearby).
3) delete potentially illegal content thoroughly
Start by implementing these ideas, and discuss the rest later.
On Mon, Apr 20, 2009 at 1:19 AM, private musings <thepmaccount(a)gmail.com> wrote:
> Here's a few questions about the foundation's role in ensuring the
> projects are responsible media hosts - Can the foundation play a role
> in discussing and establishing things like what it means to be
> 'collegial' and 'collaborative' on the various projects? Can the
> foundation offer guidance, and dare I say it 'rules' for the
> boundaries of behaviour? Is there space, beyond limiting project
> activities to legality, to offer firm leadership and direction in
> project governance?
> I'm hoping the answer to all of the above is a careful 'yes'.
I believe the answer to the above, as worded, may be a careful 'no'.
These are important decisions, and they should be made and improved
over time, but I believe it is the community's role to make them - and
the foundation's to help provide the interfaces and infrastructure to
support the community's resolutions. Feel free to elaborate if you
disagree.
A strong and sustainable group within the community can absolutely
work towards and establish the definitions and guidance you suggest.
Past discussions have generally been useful, and not spiteful, but
have never pushed through to a resolution, at least on meta and en:wp.
> Currently commons and the english wikipedia have very few
> restrictions beyond limiting media to what volunteers hope is legal.
> Media which is deleted as possibly illegal remains available to
> administrators, and no effort beyond the assumption of good faith is
> possible to ascertain model ages and release permissions - I neither
> hope nor believe this is sustainable.
> On a tangential note, I've also been looking at various governmental
> and NGO 'codes of conduct', some of which recommend things like
> accurate record keeping on model information
Valid points.
- Permanent image deletion: this could be a new class of deletion
request, and a technical change.
- Guidelines for getting consent/model info are a good idea; have you
proposed any elsewhere?
> It's also my view that, generally speaking, the level of conversation
> about this is rubbish - please try to avoid pulling either the
> 'censorship' and 'prude' guns or the 'immoral' and 'depraved' guns
> out - they're just not helpful.
Sometimes it is, but I've seen perfectly reasonable discussions
on-wiki in the past. Try starting with the above, sharing specific ideas and
suggestions for implementation, and building discussion and
implementation out from a base of consensus.
SJ