Hi all,
On Wikipedia Review, 'tarantino' pointed out that on WMF projects, self-identified minors (in this case User:Juliancolton) are involved in routine maintenance stuff around sexually explicit images reasonably describable as porn (one example is 'Masturbating Amy.jpg').
http://wikipediareview.com/index.php?showtopic=27358&st=0&p=204846&a...
I think this is wrong on a number of levels, and I'd like to see better governance from the foundation in this area. I really feel we need to talk about some child protection measures; they're overdue.
I'd really like to see the advisory board take a look at this issue. Is there a formal way of suggesting or requesting their thoughts, or could I just ask here for a board member or community member with the advisory board's ear to raise this with them?
best,
Peter, PM.
On Sun, Nov 15, 2009 at 6:04 PM, private musings <thepmaccount@gmail.com> wrote:
[...]
Wikipedia is not porn.
29 posts left.
I should add that if folk are interested in the English Wikipedia, and have any ideas / comments etc. in this area, I kicked this off here:
http://en.wikipedia.org/wiki/Wikipedia:Child_Protection
cheers,
Peter, PM.
On Mon, Nov 16, 2009 at 12:07 PM, Brian J Mingus <Brian.Mingus@colorado.edu> wrote:
[...]
2009/11/16 private musings thepmaccount@gmail.com:
[...]
Already been addressed:
http://en.wikipedia.org/wiki/Wikipedia:Youth_protection
In practice, the de facto policy is that we remove personal information posted by younger users.
Thanks for the link to the 'youth protection' page, geni - I've linked to it from Wikipedia:Child protection rather than redirect or abandon that page just yet - I hope we might make some progress :-)
With that in mind, it occurred to me that this list would be a good spot to ask folks if they are aware of any child protection measures in place in any of the WMF communities. I get the feeling that some feel child protection measures might either be fundamentally a bad thing, or in some way be a net negative, perhaps in terms of participation etc. - if this is your view, pipe up! It would be good to debate a little, and try and find some common ground and move forward :-)
So here's my question - is anyone out there aware of any community discussion / policies / practices in regard to regulating children's participation in a project, be it a Wikipedia of any language, or any project in the wiki-verse? I'd also really like to extend this to ask those readers of this list who participate in other collaborative projects, or have experience of other large web sites, to see what measures are out there, and how they might work.
Finally, I'd like to repeat my request that smarter brains than mine on the advisory board might offer some thoughts in this area - or maybe any foundation staff and / or board members could indicate whether or not it's been discussed, and whether or not I might have any luck in getting this issue onto the radar - I think it's very important.
best,
Peter, PM.
On 16/11/2009, at 1:04 AM, private musings wrote:
[...]
You just won't give up this topic, will you?
I'm not sure where you get the idea that it's somehow inappropriate for minors to be viewing or working on images depicting human nudity and sexuality. Cultural sensibilities on this matter are inconsistent, irrational and entirely lacking in substance.
I'm also unsure how you propose to define "sexually explicit". The definitions under law are elaborate, attempting to make distinctions that would be irrelevant to any negative impact on children, if one existed. Are images of the statue of David, the Manneken Pis or the Ecstasy of Saint Teresa deserving of such restrictions? What about the detailed frescoes of sexual acts displayed in brothels and living rooms in ancient Pompeii and Herculaneum? How are those distinct from the image you've used as an example, and how is that distinction relevant to whatever supposed harm you are claiming to children?
If it is truly inappropriate or harmful for children to be working on such images, then those children should be supervised in their internet access, or have gained the trust of their parents to use the internet within whatever limits those parents (or, indeed, the minor) believe is appropriate.
It is absolutely not the job of the Wikimedia Foundation, nor the Wikimedia community, to supervise a child's internet access and/or usage, and certainly not to make arbitrary rules regarding said usage on the basis of a single culture's sensibilities on children and sexuality, especially sensibilities as baseless and harmful as this one.
-- Andrew Garrett agarrett@wikimedia.org http://werdn.us/
on 11/17/09 5:37 AM, Andrew Garrett at agarrett@wikimedia.org wrote:
[...]
Yes. Very well said, Andrew.
Marc Riddell, Ph.D. Clinical Psychology/Psychotherapy
Even though I agree with you to some extent, Andrew, I would like to make a remark.
You correctly state that cultural sensibilities on this topic differ around the world. However, that does not excuse calling those sensibilities "irrational" and "lacking in substance" (inconsistent is fair enough). Clearly, you belong to the group of people who have no problem at all with these images, and PM belongs to the group that has huge problems with them. The mere fact that you two disagree should not lead to the conclusion that we should not think about a way of addressing the problem for the people on PM's side of the spectrum.
I think you could draw a comparison between people who have significant problems with these images, and are therefore unable (or less able) to use Wikipedia, and people who have technical issues because they do not want to download a piece of proprietary software. We care a lot about the latter group; why dismiss even the idea of caring about the first? Because we do not belong to it?
Some people do indeed think that ancient pornography should be hidden as well, by the way, although I do get your point: sometimes there is clearly an educational purpose involved, and the images add value.
Now, let it be clear that I am not at all advocating getting rid of the images, or of any free content. However, if it would suit a significant group of people, we could consider making them a little less prominently accessible. Please speak up if the following procedure would make no sense at all to you:
0) think about whether we want to help this group of people with significant problems (if it exists) in the first place
1) research, or find research on, how large the group of people is that has significant problems with this issue (I define "significant" here as "having the impact that, because of this, they will visit Wikipedia less frequently or not at all")
2) consider which approaches would be possible
3) research which of these approaches would help to shrink the group of people having significant problems with this issue
4) consider whether this has any negative impact on the people not having these significant problems
5) balance these advantages and disadvantages
Let's not jump to 5) immediately.
To get back to PM's original question, I am actually not sure whether the advisory board has people on it who would be helpful on this specific topic. Angela, could you advise on this?
Perhaps this topic could, however, be better approached through the often-mentioned Strategy Process. Philippe, do you have a suggestion for how this could be incorporated?
Thanks,
Lodewijk
Hoi, Thank you, effe, for your analysis. I do agree that this can be considered something that some people feel strongly about, and also something that would entice certain groups of people to use Wikipedia more freely. Growing our community of readers and editors is a priority.
The problem is that, while private musings is quite outspoken about the priority of his concern, we can only spend effort on one issue at a time, and while I sympathise, I would not give it priority, because I favour effort on the issue that has my concern: as long as our imagery is searchable and usable only by people who speak English, I think that this trumps the strategic value of prudery.
When a solution is to be found for either of the two issues, there will be a need for substantial investment of resources. There is no obvious and sensible solution that will be accepted for the concern of private musings. It will even be hard to come up with an acceptable default position; more likely there will be different default positions that can be chosen by communities. It is even questionable whether such positions can be found; I expect that the best that can be had is a bad compromise for everyone.
Multilingual support is easy: either you wish for it or you don't; there is no halfway house. The good thing is, the ability to search in other languages will not detract from the ability to search in English.
We do not have even a glimmer of what can technically be done to address private musings' concerns, and we do not know what would be acceptable to our communities. So let us work on the technical aspects of multilingual support and, at the same time, divine a workable compromise for imagery that shows people in the flesh and as biological entities. Once we have multilingual support sorted out and have grown our audience even further, we may have something practical to do that will have some support. Thanks, GerardM
The New York City public library system - and I would imagine most municipal library systems in general - is filled with underage interns (or pages, or whatever they're called now) who play a not insignificant role in curating collections that contain material every bit as explicit as the examples given here, including computers that offer unrestricted access to the breadth of the internet, which contains material very, very much more explicit than the examples given here. And libraries are not age-segregated or censored in the manner you describe. Wikipedia is an educational endeavor, not the MPAA, and the constant re-flourishing of this topic under varying guises - particularly when the consensus of community standards on things like pearl necklace, Virgin Killer &c &c has been demonstrated time and again - by the same old hands quite frankly makes this list a chore to read.
FMF
"It is absolutely not the job of the Wikimedia Foundation, nor the Wikimedia community, to supervise a child's internet access and/or usage" Frankly, I dont think that is what I read in PMs post which started this discussion.
In many countries it is the responsibility of parents for their childs behaviour, inlcuding their behavious on internet. However, also in many countries it is the responsibility of volunteer organizations to that under age volunteers do while they are active as a volunteer for that organization. In that respect Wikimedia foundation may be held responsible for what minors during their vi\olunteer acticvities for wikimedia do and see.
Viewn as such, it might indeed be a responsibility for the foundation, and not for an individual wiki.
i wish you health and happiness, teun spaans
Andrew Garrett wrote:
[...]
I agree that a common sense approach is warranted. In large measure, applying complex controls on child viewing is totally unrealistic. We would begin with the problem of defining what is too young. On another topic, underage drinking, it is relatively far easier to define the offending act, but the age at which drinking is permitted still varies widely from one jurisdiction to another. So what age is appropriate for viewing such material? 12? 16? 18? 21? And even if we agree on an age, except for the few self-identified individuals, how are we to know what someone's age really is? Those who are too young very quickly learn that lying is a valuable skill founded upon necessity.
Not many years ago in a bible-belt suburb there was a very loud campaign to block books that depicted same sex parents from a school library. There was no question of those parents engaging in sexual activity in the books, only a depiction that they could be loving and committed parents just as much as opposite sex parents. The aim of the books was to combat the development of homophobia among children of "normal" parents. Yes, that is at the other extreme from the raunchy photos that are most often complained about, but that merely illustrates the problem of definition.
As is often stated WMF is an ISP, and not a publisher. The more it seeks to control content, the more it acquires characteristics of a publisher. Indeed as an ISP it must respond to specific legal demands to remove certain material, but random complaints are not legal demands. Perhaps at the same time those complainers should be asking why murder is so much more socially acceptable on TV than consensual sex.
The responsibility of parents remains paramount ... even if some are incapable of exercising that responsibility. It would also be irresponsible if parents with the means to provide internet access exercised control to the extent of raising internet illiterates incapable of functioning in a wired world. What teachers and other public institutions can do has severe limitations. The sad unavoidable fact is that the seamier side of life exists. A parent does not protect his child by pretending to him that such things don't happen. More is accomplished by directing him toward a mature attitude.
Ec
Ray,
you seem to me to be essentially discussing the 'user' perspective on Wikipedia - whilst it's my view that the foundation and the projects could (and should) do more to allow things like descriptive image filtering for users (I think it would drive participation in places like schools and libraries), I'm also interested in discussing the perspective of the 'participant' in the project.
I think there are important duty of care issues for whoever is responsible for children's involvement in projects like Wikipedia, and I don't believe the foundation, and projects, should simply pass the buck of responsibility upstream to the parent. Encyclopedias are rightly exciting and interesting to children, and I think it's just reality that large numbers of participants are minors (wiki's fun, right! :-) - we really should at least talk about whether or not these participants are protected / treated / advised appropriately.
For example, it would be my advice to a minor that it's inappropriate for them to join this (not safe for work) discussion about whether or not to include 'hardcore photos' in the oral sex article ( http://en.wikipedia.org/wiki/Talk:Oral_sex#Hardcore_photos )
There are important ethical issues here (maybe legal ones too, I don't know) - I've tried to reach out to Volunteering Australia ( http://www.volunteeringaustralia.org/html/s01_home/home.asp ), who I hope may be able to offer some advice about good practice in working with volunteer kids etc., but I think this might go much further, much quicker, at a foundation level.
I'd like to see some concrete progress (a report, some ideas, anything really!) related to ensuring appropriate and adequate measures are in place to protect child participants in foundation projects. I've copied this message to Angela, who I hope I may persuade to raise this issue with the advisory board, and also to sj, who may be able to raise the issue with the board, or perhaps join this discussion to offer any ideas about handy next steps. Regardless, I'll hop back on this list following a meeting with Volunteering Australia, just in case they have any useful or interesting advice :-)
cheers,
Peter, PM.
2009/11/18 private musings thepmaccount@gmail.com:
Ray,
you seem to me to be essentially discussing the 'users' perspective on wikipedia - whilst it's my view that the foundation, and the projects could (and should) do more to allow things like descriptive image filtering for users (I think it would drive participation in places like schools, and librairies) I'm also interested in discussing the perspective of 'participant' in the project.
Given how disruptive "think of the children" can be, mere interest is not a valid reason for getting involved with this area.
I think there are important duty of care issues for whomever is responsible for children's involvement in projects like wikipedia,
That would be their parents.
and I don't believe the foundation, and projects, should simply pass the buck of responsibility upstream to the parent.
What you believe isn't relevant. The responsibility is with the parents, not us.
Encyclopedia's are rightly exciting and interesting to children, and I think it's just reality that large numbers of participants are minors (wiki's fun, right! :-) - we really should at least talk about whether or not these participants are protected / treated / advised appropriately.
Per current US law, yes. If the law changes, we can reconsider our activities.
for example, it would be my advice to a minor that it's inappropriate for them to join this (not safe for work discussion) about whether or not to include 'hardcore photos' in the oral sex article ( http://en.wikipedia.org/wiki/Talk:Oral_sex#Hardcore_photos )
Please provide a list of your qualifications to provide such advice.
There are important ethical issues here (maybe legal ones too, I don't know) - I've tried to reach out to Volunteering Australia ( http://www.volunteeringaustralia.org/html/s01_home/home.asp ) who I hope may be able to offer some advice about good practice in working with volunteer kids etc.
Wikipedia does not answer to Australian law. Please provide a transcript of what you have said.
but I think this might be able to go much further much quicker on a foundation level.
I'd like to see some concrete progress (a report, some ideas, anything really!) related to ensuring appropriate and adequate measures are in place to protect child participants in foundation projects.
They are.
On Tue, Nov 17, 2009 at 10:05 PM, Ray Saintonge saintonge@telus.net wrote:
As is often stated WMF is an ISP, and not a publisher.
Stating it often doesn't make it true. The WMF is quite clearly a publisher. It even admitted as much when it exercised the GFDL clause purporting to allow "any World Wide Web server that publishes copyrightable works and also provides prominent facilities for anybody to edit those works" to "republish" Wikipedia (et al.) under CC-BY-SA. Anyone who says the WMF is not a publisher is just plain wrong.
So state it as much as you want. The WMF is a publisher. Under Section 230 of the CDA it most likely won't be treated as a publisher, but that doesn't mean it isn't a publisher.
On Tue, Nov 17, 2009 at 9:27 PM, Anthony wikimail@inbox.org wrote:
[...]
The section 230 that would seem to matter here?
The WMF has all sorts of roles, depending on who you are, how you look at it, and what your perspective is (and what day of the month it is, etc). When referring to legal issues, one has to remain domain-specific when using specific terms in a legal sense.
George Herbert wrote:
[...]
It's also quite unsettled what Section 230 protections consist of to begin with. Some U.S. courts have applied them *extremely* broadly. One still-current Circuit Court precedent, which is binding in the district where the Wikimedia servers are located, is _Batzel v. Smith_ (9th Circuit, 2003), which holds that a blogger who reposts material emailed to him, even though he chooses which emails to republish, is entitled to Section 230 protection by virtue of the mere fact that the material he publishes originates ultimately with his "users", and is not something he personally authored. It's hard to imagine any Wikimedia Foundation activity w.r.t. Wikipedia that doesn't meet at least the _Batzel_ standard, apart from Wikimedia Foundation employees literally inserting original content into Wikipedia articles while on the clock. If the ultimate source of the content is elsewhere, regardless of what editorial or publishing decisions are made in the middle, it's Section-230-protected under _Batzel_. Of course, _Batzel_ might be wrong and overturned in the future, which is the risk of relying too much on law in this as-yet-unsettled area...
-Mark
There are two possible discussions:
1) A discussion about the legal requirements - please leave this to the legal experts. I'm confident that Mike Godwin keeps an eye on it, and if he doesn't, you could solicit the advice of a legal expert and bring that advice to him or to the WMF ED/board.
2) A discussion on whether we want to make Wikimedia more accessible to people having significant problems with a category of content - that discussion can be held here, if the necessary data is found (as laid out in a previous email).
best, eia
This discussion seems to have branched out somewhat. Peter's concern was that underage admins shouldn't be involved in the maintenance of sexually explicit images. OK, so, legalese aside -
* You could put in place a vetting process for admins akin to what we have for OTRS - real names, ages, etc.
** Can't see this happening, can you? 1,701 admins on the English Wikipedia alone - more for Commons, where other images are stored. Many are unwilling, for what I consider to be practical reasons, to identify themselves even to the foundation.
** Docs can be faked - in the absence of face-to-face verification of identity, there's not a lot you can do about underage admins.
* I don't suppose it's a great deal of reassurance, but I've interacted with a fair few underage admins regarding issues like these, and most take a fairly clinical approach to it all. I realise this is a subjective opinion, so I won't push this too much.
* The fact that these images exist at all will be a perennial concern. I dislike the argument that if WP doesn't host them, the kiddies will be bound to find them somewhere else on the Internet - but this *is* true. I'm fairly sure most of us on this list grew up without the Internet and still saw our share of explicit material before the age of 18. There really isn't a lot you can do.
* I don't think that this sort of moral concern should be completely trashed outright, as some of the previous posters have done, but our readership and user base are both so vast that there's honestly not a lot that can be done. This sounds a bit like admitting defeat - I'm trying not to make it out like that. But Wikimedia sites are unable to tailor-make their content to specific users' needs to the extent that would be necessary to satisfy most of these concerns. (Or we could, and, as someone pointed out, you'd have Simple English WP). Home Internet filters tackle this more effectively (and smart kids will still override these).
For what it's worth, my young teenaged brother is still watched like a hawk by my parents during his limited Internet usage hours. Parents *can* take the responsibility if they are concerned enough.
On Wed, Nov 18, 2009 at 7:58 AM, Delirium delirium@hackish.org wrote:
If the ultimate source of the content is elsewhere, regardless of what editorial or publishing decisions are made in the middle, it's Section-230-protected under _Batzel_. Of course, _Batzel_ might be wrong and overturned in the future, which is the risk of relying too much on law in this as-yet-unsettled area...
All of which is irrelevant to my point. The Wikimedia Foundation is a publisher - just like Batzel was a publisher, just like AOL is a publisher, just like Craigslist is a publisher, just like roommates.com is a publisher, just like eBay is a publisher. Given the mission of the foundation, being a publisher is inescapable.
Using "we can't be a publisher" as an excuse for knowingly and intentionally providing children with access to hard-core pornography without so much as even checking with their parents to see how they feel about the issue, is just pathetic. People have gone to jail for less.
And frankly, whether or not the WMF can do anything about the issue is also irrelevant. The WMF isn't the only body that makes the rules of the Wikimedia websites. We do.
This isn't necessarily directed at you, Mark, as you don't seem to have declared your position on the matter. But all these strawmen about cultural relevance and homophobia and publisher vs. ISP and parental responsibility are just hiding the real issue. Do we want to be a place where children can come to view pictures of "cock and ball torture", or don't we?
Don't try to tell me we can ban by-permission-only images which aren't released under a free license but we can't ban images of "cock and ball torture". We can, if we want to. Stop hiding behind irrelevant issues and defend providing access to these images, on its own merits, if that's what you suggest we ought to do. I don't see the argument for it. I certainly won't be knowingly and intentionally providing my children with access to these images. Some people do really stupid things - they don't need these images to teach them that.
Andrew Garrett agarrett@wikimedia.org wrote:
I'm not sure where you get the idea that it's somehow inappropriate for minors to be viewing or working on images depicting human nudity and sexuality. Cultural sensibilities on this matter are inconsistent, irrational and entirely lacking in substance.
Andrew, consider the way that a reasonable parent (often conservative) would look at that statement. We don't in fact show too much in the way of "sexuality" - and the truth is that we censor images all the time, and quite *rightly* so. I recall in particular some comments by one of our founders when he unilaterally deleted an image of someone stimulating (or pretending to stimulate) their own penis with their mouth. To me, the poster of the image was thinking more about his perceived freedom to gratuitously express his own notions of gratuitousness. The deleter, on the other hand, was simply thinking about what is or is not actually good for the project - a subjective consideration, true, but nevertheless one which we need to make all the time.
Consider also that the distinction you refer to as "irrational and entirely lacking in substance" is just your own POV. I do not agree with it, because it's inaccurate: the distinction itself is *arbitrary*, and it is due to related conceptual *ambiguity* that this arbitrariness manifests itself in ways that appear "irrational and entirely lacking in substance." Thus while the "inconsisten[cy]" (and what about international law *is* consistent?) does create legal and moral *ambiguity*, that is not to say the results are actually irrational. The current standard is a distinction between maturity (18+) and minority (17-), and it exists for reasons that vastly exceed the scope of this discussion. We can agree that, in reality, issues involving "minor" children are rarely treated the same as those involving "minor" teenagers. But that is not to say we can ignore concerns regarding the former just to give support to a concept of greater freedom for the latter.
Just as the excessively prudish camp creates ambiguities with their language, the excessively libertine camp does much the same thing. Any substantive discussion requires resolving those ambiguities through being clear. There are interesting philosophical ideas at work here as well, and the fact of the matter is that we do delete articles all the time for being "un-encyclopedic" - the debates around whether images are "encyclopedic" have largely shifted to Commons, which has a much broader purpose - perhaps one that does not match that of the encyclopedia.
-Stevertigo
The Foundation, Commons and the English Wikipedia typically address problems associated with minors by refusing to engage as a group. Some individuals advise children not to put personally identifying information on their userpage, but that is advice haphazardly given and no effort is made to systematically identify situations where it would be useful. That one problem is a microcosm for the whole spectrum of "children" issues throughout Wikimedia - we encourage individual editors to advise other editors when they might be endangering themselves, but we don't allow (and often refuse even to discuss) more proactive solutions.
Outstanding problems that have been identified in the past:
* Access of minor readers to sexually explicit material
* Involvement of minor participants / administrators in the administration of sexually explicit content
* Sexually explicit imagery that features or may feature models under the age of 18
Our responses to these problems have never been more sophisticated than "Wikimedia is not censored." Perhaps it's assumed that by refusing to budge from this absolute position, we avoid a war by inches where we would ultimately be forced to cave to all cultural sensitivities. Instead of evaluating what our responsibilities should be, what action we ought to take, we limit ourselves only to what we *must* do by law. I think that's a mistake.
I'm not sure we can do much about minor readers and participants, except perhaps putting certain types of content behind a warning wall that can be easily bypassed. The verification and consent models used in the web industry are designed around limiting liability; they don't need to be (and consequently are not) very effective. Adopting one of these models may not make sense for Wikimedia, but it certainly makes sense to have a discussion about it. Geni and Andrew's comments strike me as an attempt to foreclose any discussion.
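To make the "warning wall" idea concrete, here is a rough client-side sketch of what an easily bypassed click-through could look like. This is purely illustrative and not an existing MediaWiki or Wikimedia feature; the data-content-warning attribute and the localStorage key are invented for the example, and which files would carry the marker would of course be a community decision.

// Hypothetical sketch (TypeScript): hide any image marked with a
// data-content-warning attribute behind a click-through notice.
// The reader's choice is stored in localStorage, so the "wall" is
// deliberately trivial to bypass and makes no attempt at age verification.

const STORAGE_KEY = "contentWarningAcknowledged"; // per-browser, easily cleared

function alreadyAcknowledged(): boolean {
  return window.localStorage.getItem(STORAGE_KEY) === "yes";
}

function wrapWithWarning(img: HTMLImageElement): void {
  const notice = document.createElement("button");
  notice.textContent =
    "This file may contain sexually explicit content. Click to view.";
  notice.addEventListener("click", () => {
    window.localStorage.setItem(STORAGE_KEY, "yes");
    notice.replaceWith(img); // put the original image back in place
  });
  img.replaceWith(notice);
}

function applyWarningWall(): void {
  if (alreadyAcknowledged()) {
    return; // the reader has clicked through before; show everything
  }
  document
    .querySelectorAll<HTMLImageElement>("img[data-content-warning]")
    .forEach(wrapWithWarning);
}

document.addEventListener("DOMContentLoaded", applyWarningWall);

The point of such a sketch is only to show that a warning layer of this kind signals content without verifying anything about the reader - which is exactly why it limits liability rather than access.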
On the other hand, we certainly can do more on policing the sexually explicit imagery on Commons against possible violations of child pornography and privacy laws. We may not *have* to do this, but we ought to. There is at least one large category of images, from a specific photographer, where it has long been suspected that some models are underage. The only verification effort we make now is on licensing, but I think we ought to require actual model releases on sexually explicit photographs. We will gain far more by protecting the safety and privacy of image subjects than we stand to lose in the volume of explicit photos.
Nathan
+1. Not sure what I can add to that, other than that I agree completely. We have great nuance in our debates about copyright and take consummate care when concerns are raised on that front. But when concerns are raised in other areas (such as this one) we often tend towards extreme positions characterised by a refusal to engage with the issue and by simplistic shutdowns. I have no answer or particular axe to grind on this topic, but I do think it is worth consideration.
Nathan's response has got to be the most well-written thing I've seen on foundation-l in a long time.
-Liam [[witty lama]]
wittylama.com/blog Peace, love & metadata
There are a number of problems with these statements.
One - the Foundation exists to host and legally protect the encyclopedia, not direct it in all matters. Most policy flows up rather than down. Things which would grossly embarrass or endanger the encyclopedia are an exception, but no good case has been made here for that.
On en.wp this topic has been addressed repeatedly - there is (near) universal support for enforcing legal requirements and restrictions to the degree that they are felt or found to apply. Past that, there's at least an arguable consensus that WP:NOTCENSORED is the policy the community supports.
Is it worse for a 15-year-old (or 17-year-old, or 13-year-old) to participate in discussions about or administrative actions regarding an image or article with mature content, compared to merely being able to view the image or article?
The latter is widely felt to be a parental control issue. Why not the former?
I believe that advocates of a change are both taking this to the wrong venue and failing to explain how the level of access currently under debate is fundamentally different from basic access to view images or read articles. If there is an argument to be made that there's a qualitative difference or a legal difference, then that is an appropriate topic for policy discussions on en.wp.
The burden of proof for showing that this is the sort of policy issue the Foundation must, by its nature, intervene in has not been met, nor is it being specifically argued. If you feel that it's true, you need to argue specifically to that point.
-george
Right now I'm just going to quote a bit from the "General Policy" page of the Huntsville-Madison County Public Library system in Alabama. Not because they're special, but because this anecdotal sample is fairly representative of the policies of public information resources everywhere, not just here in liberal Gomorrah:
"Parents/guardians are responsible for their minor (under the age of 17) children's use of the Library's resources and facilities. This includes using the Internet at any of the Library locations. Parents who believe that their children cannot responsibly use the Library's Internet access are requested to monitor their children's Internet use."
FMF