I was looking over old discussions, and wondered: who originally came up with the notion that the "principle of least surprise" should apply to educational content? If it existed before Wikimedia, who introduced it to the image filter discussion, on what rationale?
[Personally I think it's an inanity - an education that doesn't turn your head upside down might as well be basket weaving - and it's too easily applied to shocking and outrageous concepts that children shouldn't be exposed to, like homosexuality or rights for minorities - but I could of course be convinced I'm wrong.]
- d.
Not sure, but I think it's the principle of least /astonishment/ - which may be an important difference...
Richard Symonds, Wikimedia UK
On 13 June 2012 21:30, David Gerard dgerard@gmail.com wrote:
I was looking over old discussions, and wondered: who originally came up with the notion that the "principle of least surprise" should apply to educational content? If it existed before Wikimedia, who introduced it to the image filter discussion, on what rationale?
[Personally I think it's an inanity - an education that doesn't turn your head upside down might as well be basket weaving - and it's too easily applied to shocking and outrageous concepts that children shouldn't be exposed to, like homosexuality or rights for minorities - but I could of course be convinced I'm wrong.]
- d.
On 13 June 2012 21:32, Richard Symonds richard.symonds@wikimedia.org.uk wrote:
Not sure, but I think it's the principle of least /astonishment/ - which may be an important difference...
Pretty sure it doesn't make a difference for educational purposes. I think my objection stands in its entirety.
(I note that in interface design, "principle of least astonishment" is in opposition to "educating" the user. With educational materials, that is ahahaha indeed the point.)
- d.
My understanding of this line of argument was that images would be displayed where you would expect them to be displayed (e.g. the article on penis or vagina would naturally include a picture of a penis or vagina), but wouldn't be immediately displayed where you wouldn't expect them (e.g. if you want to find information on necklaces made of pearls).
Whether that is called 'principle of least surprise' or 'principle of least astonishment' or something else is semantics...
Thanks, Mike
On 13 Jun 2012, at 21:38, David Gerard wrote:
On 13 June 2012 21:32, Richard Symonds richard.symonds@wikimedia.org.uk wrote:
Not sure, but I think it's the principle of least /astonishment/ - which may be an important difference...
Pretty sure it doesn't make a difference for educational purposes. I think my objection stands in its entirety.
(I note that in interface design, "principle of least astonishment" is in opposition to "educating" the user. With educational materials, that is ahahaha indeed the point.)
- d.
On 13 June 2012 21:44, Michael Peel michael.peel@wikimedia.org.uk wrote:
My understanding of this line of argument was that images would be displayed where you would expect them to be displayed (e.g. the article on penis or vagina would naturally include a picture of a penis or vagina),
I don't recall this being conceded. (The discussions of image filter plans seemed to me to assume that images considered unsuitable would indeed be filtered in such places.)
So who first brought the phrase into the discussion?
- d.
Earliest I have it on a Wikimedia list is from WikiEn-L on 2/11/08 from Ian Woollard (written as principle of least surprise), in the context of a Muhammad images thread started by Jimbo -- but my logs only go back to the summer of 07.
On-wiki, I see it being used in naming convention arguments for years, as early as April 2005. I'm not sure when it made the transition from user interface design principle to a more general content principle, but it looks like (from a web search) it was commonly used for Ruby as early as 2002-2003.
On 13 June 2012 21:56, Nathan nawrich@gmail.com wrote:
Earliest I have it on a Wikimedia list is from WikiEn-L on 2/11/08 from Ian Woollard (written as principle of least surprise), in the context of a Muhammad images thread started by Jimbo -- but my logs only go back to the summer of 07.
Bingo - and he specifically invoked it to "minimise offence".
On-wiki, I see it being used in naming convention arguments for years, as early as April 2005.
Yeah, that's arguably a user interface issue (with arguments being somewhat alleviated by a forest of redirects). I see it's been commonly used around user interface issues in Wikimedia for many years.
- d.
David Gerard, 13/06/2012 23:02:
On-wiki, I see it being used in naming convention arguments for years, as early as April 2005.
Yeah, that's arguably a user interface issue (with arguments being somewhat alleviated by a forest of redirects). I see it's been commonly used around user interface issues in Wikimedia for many years.
And still you had reactions like this: http://lists.wikimedia.org/pipermail/wikien-l/2006-April/044011.html (notice the other big threads in the same month about images...).
Nemo
On 13 June 2012 22:02, David Gerard dgerard@gmail.com wrote:
On 13 June 2012 21:56, Nathan nawrich@gmail.com wrote:
Earliest I have it on a Wikimedia list is from WikiEn-L on 2/11/08 from Ian Woollard (written as principle of least surprise), in the context of a Muhammad images thread started by Jimbo -- but my logs only go back to the summer of 07.
Bingo - and he specifically invoked it to "minimise offence".
Sure, but it also applies to getting back what you expect.
A male heterosexual friend of mine typed the word "Boobs" into the Commons search engine a while back and came back with the page "Boobs on Bikes". It's not a matter of minimising offence; it's simply that if you type in one thing and get something else and rather surprising, that's a problem.
That a subset of that surprise happens to involve people getting offended doesn't mean that avoiding unnecessary surprise isn't a laudable goal.
There's surprise in the "reading a book and learning something new" sense, then there is surprise in the "being told that the book is on this shelf, but instead it's on a different shelf" sense. The two are rather different, and I fear some conflation is going on.
On Wed, Jun 13, 2012 at 1:44 PM, Michael Peel michael.peel@wikimedia.org.uk wrote:
My understanding of this line of argument was that images would be displayed where you would expect them to be displayed (e.g. the article on penis or vagina would naturally include a picture of a penis or vagina), but wouldn't be immediately displayed where you wouldn't expect them (e.g. if you want to find information on necklaces made of pearls).
Whether that is called 'principle of least surprise' or 'principle of least astonishment' or something else is semantics...
Thanks, Mike
That's exactly how I understand the idea as well.
As for where it came from -- from my imperfect memory, the idea has been kicking around in the English Wikipedia style guide and in Commons for some years (I found it in a style guide history in 2004, also cf Nathan's research).
In the context of this discussion, however, the "principle of least astonishment" had I believe been brought up early on; it was highlighted in the Harris report as a potentially useful concept for thinking about the whole range of issues around handling controversial content. This was actually a separate bullet point/idea from the recommendation to allow readers to hide images. They're not necessarily connected; overall I haven't heard a lot of complaints about trying to implement the principle of least astonishment, i.e. by improving search etc.
The concept itself, as a usability term, has been around for a while; there's a (not very good) article, which was started in 2002: http://en.wikipedia.org/wiki/Principle_of_least_astonishment I don't know when it came into use in the world at large.
-- phoebe
* Michael Peel wrote:
My understanding of this line of argument was that images would be displayed where you would expect them to be displayed (e.g. the article on penis or vagina would naturally include a picture of a penis or vagina), but wouldn't be immediately displayed where you wouldn't expect them (e.g. if you want to find information on necklaces made of pearls).
Did anyone argue for displaying images where they would not expect them?
I can't say who came up with it. The point at which I first became aware of it was the posts, and the consultation report series, on Meta. It may well have predated that, though, in which case I couldn't say.
Advanced search in old enwp and meta dumps, or in the mailing list archives, would be a way to explore earlier than that. The topic was only discussed _in depth_ in a limited number of places easily identified by search, the expressions are very distinctive, and a list of wiki pages or list threads can be searched fairly easily to find exact posts or dates (a rough sketch of such a search follows below this message).
FT2
On Wed, Jun 13, 2012 at 9:30 PM, David Gerard dgerard@gmail.com wrote:
I was looking over old discussions, and wondered: who originally came up with the notion that the "principle of least surprise" should apply to educational content? If it existed before Wikimedia, who introduced it to the image filter discussion, on what rationale?
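A rough sketch (in Python) of the kind of dump search suggested above: it streams a standard MediaWiki XML export and prints the title and timestamp of every revision whose text contains one of the distinctive phrases. The dump file name and the phrase list are illustrative assumptions, not references to an actual file.

import xml.etree.ElementTree as ET

DUMP = "enwiki-pages-meta-history.xml"   # illustrative path to a local dump
PHRASES = ("least astonishment", "least surprise")

def local_name(elem):
    # Dump elements carry an export-schema namespace; compare local names only.
    return elem.tag.rsplit("}", 1)[-1]

title = timestamp = None
for _, elem in ET.iterparse(DUMP, events=("end",)):
    name = local_name(elem)
    if name == "title":
        title = elem.text
    elif name == "timestamp":
        timestamp = elem.text
    elif name == "text" and elem.text:
        if any(phrase in elem.text for phrase in PHRASES):
            print(title, timestamp)
    elif name == "page":
        elem.clear()   # discard the finished page to keep memory bounded

The same approach works on downloaded mailing list archives, where a plain text search of the monthly files is enough.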
On 13 June 2012 21:30, David Gerard dgerard@gmail.com wrote:
I was looking over old discussions, and wondered: who originally came up with the notion that the "principle of least surprise" should apply to educational content? If it existed before Wikimedia, who introduced it to the image filter discussion, on what rationale?
It (the principle of least astonishment) derives from our redirect guidelines, where you are trying to decide between redirecting to an article and redirecting to a disambiguation page. It is also somewhat related to page naming.
[Personally I think it's an inanity - an education that doesn't turn your head upside down might as well be basket weaving - and it's too easily applied to shocking and outrageous concepts that children shouldn't be exposed to, like homosexuality or rights for minorities - but I could of course be convinced I'm wrong.]
I think you miss the point of the concept. The idea is not that, say, [[Marriage]] shouldn't contain information about homosexual marriages, heterosexual marriages, marriages of convenience or polygamous marriages, but that it probably shouldn't contain photos of marriage consummation.
[[Nude photography]], on the other hand, should have some nudity. But then it should also be more than 3 paragraphs long.
On 14 June 2012 12:52, geni geniice@gmail.com wrote:
I think you miss the point of the concept. The idea is not that, say, [[Marriage]] shouldn't contain information about homosexual marriages, heterosexual marriages, marriages of convenience or polygamous marriages, but that it probably shouldn't contain photos of marriage consummation.
As I have noted already, this idealised version is not how it was used when it was introduced to the discussion and is not how it's been used in the most recent round of it.
- d.
On 14 June 2012 14:45, David Gerard dgerard@gmail.com wrote:
As I have noted already, this idealised version is not how it was used when it was introduced to the discussion and is not how it's been used in the most recent round of it.
Looking at the timing of when the phrase appeared on the email list, I think you were physically present when it started being used in the context of dealing with controversial content. Certainly I can find it being used in that context before that London meetup that Dory Carr-Harris attended. And in that case at least the meaning was very much in the direction of not including controversial content unless there was a valid reason to do so. It was unrelated to an image filter.
Shocking images in [[Nanking Massacre]] are pretty much expected. [[People's Republic of China–Japan relations]] not so much. [[Agent orange]] is a more borderline case, but these things are never easy, as [[Wikipedia:LAME#Names]] shows.
On 14 June 2012 17:22, geni geniice@gmail.com wrote:
Shocking images in [[Nanking Massacre]] are pretty much expected. [[People's Republic of China–Japan relations]] not so much. [[Agent orange]] is a more borderline case, but these things are never easy, as [[Wikipedia:LAME#Names]] shows.
Yes, but this is called editorial judgement rather than something that can be imposed by filtering. (Although the board and staff claiming that editorial judgement they disagree with must just be trolling is how "principle of least surprise" becomes "we need a filter system".)
- d.
On 14 June 2012 18:01, David Gerard dgerard@gmail.com wrote:
Yes, but this is called editorial judgement
No, it's called censorship. Or at least it will be called censorship by enough people to make any debate not worth the effort.
rather than something that can be imposed by filtering.
True for wikipedia but commons in particular needs some way or another to provide more focused search results.
(Although the board and staff claiming that editorial judgement they disagree with must just be trolling is how "principle of least surprise" becomes "we need a filter system".)
Perhaps but I wasn't aware that their opinions were considered to be of any significance at this point.
Okay, they did block [[user:Beta_M]], but the fact that it very much came out of the blue shows how little consideration they are given these days.
The fact remains that anyone who actually wants a filter could probably put one together in the form of an Adblock Plus filter list within a few days. So far the only list I'm aware of is one I put together to filter out images of giant isopods.
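For what it's worth, a minimal sketch (in Python) of how such a list could be generated rather than written by hand: it asks the public Commons API for every file in a category and emits one Adblock Plus rule per image. The category name and the exact rule format are illustrative choices of mine, not a description of the isopod list mentioned above.

import requests  # third-party 'requests' package

API = "https://commons.wikimedia.org/w/api.php"

def adblock_rules(category):
    params = {
        "action": "query",
        "format": "json",
        "generator": "categorymembers",
        "gcmtitle": category,
        "gcmtype": "file",
        "gcmlimit": "500",
        "prop": "imageinfo",
        "iiprop": "url",
    }
    while True:
        data = requests.get(API, params=params).json()
        for page in data.get("query", {}).get("pages", {}).values():
            for info in page.get("imageinfo", []):
                # The hashed path fragment ("/a/ab/File.jpg") appears in both the
                # full-size URL and its scaled thumbnails, so a plain substring
                # rule restricted to image requests should catch both.
                fragment = info["url"].split("/wikipedia/commons", 1)[-1]
                yield fragment + "$image"
        if "continue" not in data:
            break
        params.update(data["continue"])

for rule in adblock_rules("Category:Isopoda"):   # stand-in for whatever one wants hidden
    print(rule)

Pasting the output into a custom Adblock Plus filter list hides those images client-side, without touching the sites themselves.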
On Thu, Jun 14, 2012 at 11:31 AM, geni geniice@gmail.com wrote:
On 14 June 2012 18:01, David Gerard dgerard@gmail.com wrote:
Yes, but this is called editorial judgement
No, it's called censorship. Or at least it will be called censorship by enough people to make any debate not worth the effort.
rather than something that can be imposed by filtering.
True for wikipedia but commons in particular needs some way or another to provide more focused search results.
(Although the board and staff claiming that editorial judgement they disagree with must just be trolling is how "principle of least surprise" becomes "we need a filter system".)
Perhaps but I wasn't aware that their opinions were considered to be of any significance at this point.
Okay, they did block [[user:Beta_M]], but the fact that it very much came out of the blue shows how little consideration they are given these days.
The fact remains that anyone who actually wants a filter could probably put one together in the form of an Adblock Plus filter list within a few days. So far the only list I'm aware of is one I put together to filter out images of giant isopods.
-- geni
If "Principle of least astonishment" means what it normally means, that being to make sensible UI decisions based upon what your average user would expect to happen, I'm all for it.
If "Principle of least astonishment" means what it's been co-opted to mean in this particular case, that people will somehow be "astonished" to see images of nude humans on human anatomy articles, or depictions of sex acts on articles about that particular act (though that's already off kilter, we already fail to use real images on those, instead preferring poor-quality line drawings), or images of Muhammad on the Muhammad article, we need a cluebat rather than a filter. Point those who scream in faux-outrage at finding media depicting ejaculation on that article, or Muhammad on that article, to the content disclaimer, tell them that yes, they will actually get an article on what they specifically look for one for, that yes, we use multimedia illustrations when we have appropriately licensed and relevant media, and move on.
Todd Allen
Am 14.06.2012 19:31, schrieb geni:
On 14 June 2012 18:01, David Gerard dgerard@gmail.com wrote:
Yes, but this is called editorial judgement
No, it's called censorship. Or at least it will be called censorship by enough people to make any debate not worth the effort.
It is called censorship right at the moment when useful illustrations are removed because of their shock value, while arguing with "the principle of XYZ" from a rather extreme position. Good editorial judgment would include such depictions if they further the understanding of a topic. But bad editorial judgment tends to exclude useful depictions and to include useless or unrelated depictions, shocking or not.
rather than something that can be imposed by filtering.
True for wikipedia but commons in particular needs some way or another to provide more focused search results.
I already made a workable suggestion for Commons, but the interest from any side was very low:
http://commons.wikimedia.org/wiki/Commons:Requests_for_comment/improving_sea...
Some seem not to want to give up the idea of filtering (labeling) and others seem not to care. Overall we have a proposal that would be workable, would benefit all users, and would not introduce any controversy or additional work once implemented.
(Although the board and staff claiming that editorial judgement they disagree with must just be trolling is how "principle of least surprise" becomes "we need a filter system".)
Perhaps but I wasn't aware that their opinions were considered to be of any significance at this point.
Okay, they did block [[user:Beta_M]], but the fact that it very much came out of the blue shows how little consideration they are given these days.
The fact remains that anyone who actually wants a filter could probably put one together in the form of an Adblock Plus filter list within a few days. So far the only list I'm aware of is one I put together to filter out images of giant isopods.
I argued at one point that if there were a strong need for such a filter, there would already be services in place that filter the content or images. So far I have seen some very weak approaches using the Google APIs, but no real filter lists. Judging from your approach of filtering out giant isopods, we see that there is no general rule for what should be filtered. Some dislike X, others Y, and the next one likes X and Y but not Z. Overall this results in the wish to have as many suitable filters as possible, which at the same time results in massive tagging work.
On 15 June 2012 13:15, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
I argued at one point that if there were a strong need for such a filter, there would already be services in place that filter the content or images. So far I have seen some very weak approaches using the Google APIs, but no real filter lists. Judging from your approach of filtering out giant isopods, we see that there is no general rule for what should be filtered. Some dislike X, others Y, and the next one likes X and Y but not Z. Overall this results in the wish to have as many suitable filters as possible, which at the same time results in massive tagging work.
I don't recall seeing any, but did anyone actually explain why the market had not provided a filtering solution for Wikipedia, if there's actually a demand for one?
(IIRC the various netnannies for workplaces don't filter Wikipedia, or do so only by keyword, i.e. [[Scunthorpe problem]]-susceptible, methods.)
I ask because of recent statements by board members that the filter is alive and well, and not at all dead.
- d.
On Friday, 15 June 2012 at 13:21, David Gerard wrote:
I don't recall seeing any, but did anyone actually explain why the market had not provided a filtering solution for Wikipedia, if there's actually a demand for one?
Market failures do sometimes exist.
Also, because as far as I can tell, the proposed filter isn't a NetNanny type thing, it's a "I don't want to see pictures of boobies" AdBlock type thing. Which is a different thing entirely.
Of course, there's some confusion here. Larry Sanger, for instance, is very very angry about how Wikipedia hasn't implemented a "filter", even though he seems slightly confused as to the difference between an AdBlock type filter and a NetNanny type filter.
Preventing people who don't want to see pictures of naked people from seeing pictures of naked people is a lot easier a task than preventing people who DO want to see pictures of naked people from doing so.
On Fri, Jun 15, 2012 at 8:27 AM, Tom Morris tom@tommorris.org wrote:
On Friday, 15 June 2012 at 13:21, David Gerard wrote:
I don't recall seeing any, but did anyone actually explain why the market had not provided a filtering solution for Wikipedia, if there's actually a demand for one?
Market failures do sometimes exist.
Also, because as far as I can tell, the proposed filter isn't a NetNanny type thing, it's a "I don't want to see pictures of boobies" AdBlock type thing. Which is a different thing entirely.
Of course, there's some confusion here. Larry Sanger, for instance, is very very angry about how Wikipedia hasn't implemented a "filter", even though he seems slightly confused as to the difference between an AdBlock type filter and a NetNanny type filter.
Preventing people who don't want to see pictures of naked people from seeing pictures of naked people is a lot easier a task than preventing people who DO want to see pictures of naked people from doing so.
Preventing, sure. But I think what you see as Sanger being confused about the difference between an AdBlock type filter and a NetNanny type filter is actually his desire for something which isn't either - a filter which parents can set up to prevent their children from inadvertently stumbling upon age-inappropriate materials.
As a parent I must say that there is certainly demand for this sort of thing. And I can think of many reasons why the market hasn't tackled this one. The copyleft license is near, if not at, the top of that list. Liability and other legal considerations would also be high up on the list.
On Fri, Jun 15, 2012 at 1:21 PM, David Gerard dgerard@gmail.com wrote:
On 15 June 2012 13:15, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
I argued at one point that if there were a strong need for such a filter, there would already be services in place that filter the content or images. So far I have seen some very weak approaches using the Google APIs, but no real filter lists. Judging from your approach of filtering out giant isopods, we see that there is no general rule for what should be filtered. Some dislike X, others Y, and the next one likes X and Y but not Z. Overall this results in the wish to have as many suitable filters as possible, which at the same time results in massive tagging work.
I don't recall seeing any, but did anyone actually explain why the market had not provided a filtering solution for Wikipedia, if there's actually a demand for one?
(IIRC the various netnannies for workplaces don't filter Wikipedia, or do so only by keyword, i.e. [[Scunthorpe problem]]-susceptible, methods.)
UK schools of course filter, but both the bestiality video and everything that comes up in a multimedia search for "male human" were accessible on computers in my son's school. Much to their surprise. The one thing their filter did catch was the masturbation videos category page in Commons.
I ask because of recent statements by board members that the filter is alive and well, and not at all dead.
Which board members other than Jimbo have said that?
Am 15.06.2012 23:22, schrieb Andreas Kolbe:
On Fri, Jun 15, 2012 at 1:21 PM, David Gerard dgerard@gmail.com wrote:
I don't recall seeing any, but did anyone actually explain why the market had not provided a filtering solution for Wikipedia, if there's actually a demand for one?
(IIRC the various netnannies for workplaces don't filter Wikipedia, or do so only by keyword, i.e. [[Scunthorpe problem]]-susceptible, methods.)
UK schools of course filter, but both the bestiality video and everything that comes up in a multimedia search for "male human" were accessible on computers in my son's school. Much to their surprise. The one thing their filter did catch was the masturbation videos category page in Commons.
That means they already found a solution to their problem that covers the whole web at once. As you might have noticed, it isn't perfect. I guess that it could be easily improved over time. But the image filter had a different goal. It wouldn't help the schools, since the content is still accessible. So why do we discuss schools and children all the time and speak about it as a net nanny?
On Saturday, 16 June 2012 at 20:21, Tobias Oelgarte wrote:
That means they already found a solution to their problem that covers the whole web at once. As you might have noticed, it isn't perfect. I guess that it could be easily improved over time. But the image filter had a different goal. It wouldn't help the schools, since the content is still accessible. So why do we discuss schools and children all the time and speak about it as a net nanny?
Don't you get it? An image filter you can trivially opt-out of by clicking the big button labelled "show image" is a perfect way of preventing children from getting to naughty pictures…
Seriously though, I'm slightly surprised that commercial censorware providers haven't bothered to add the nudey stuff from Commons. Pay a few bored minimum wage people to go through and find all the categories with the naughty stuff and stick all those images in their filter. It'd only take a few hours, given the extensive work already done by the Commons community neatly sorting things into categories with names like "Nude works including Muppets" and "Suggestive use of feathers" etc.
It's almost as if the censorware manufacturers are selling ineffective products to people who don't know any better, products that serve as a peace-of-mind placebo in place of effective access control. Oh, wait, that would be the inner cynic speaking.
-- Tom Morris http://tommorris.org/
Am 16.06.2012 23:36, schrieb Tom Morris:
On Saturday, 16 June 2012 at 20:21, Tobias Oelgarte wrote:
That means they already found a solution to their problem that covers the whole web at once. As you might have noticed, it isn't perfect. I guess that it could be easily improved over time. But the image filter had a different goal. It wouldn't help the schools, since the content is still accessible. So why do we discuss schools and children all the time and speak about it as a net nanny?
Don't you get it? An image filter you can trivially opt-out of by clicking the big button labelled "show image" is a perfect way of preventing children from getting to naughty pictures…
Is this irony? My comment included some irony as well. ;-)
How would a "show image" button protect children from getting to naughty pictures? The first thing a child would do is to press this button out of curiosity alone. Real child protection software is meant to hide such content without giving the child even the possibility to access such content. That is what a so called "net nanny" software will do, since it is usually meant to block access in case no parent is present and watching over their children exploring minefields. At least the adverts tell this great story.
Seriously though, I'm slightly surprised that commercial censorware providers haven't bothered to add the nudey stuff from Commons. Pay a few bored minimum wage people to go through and find all the categories with the naughty stuff and stick all those images in their filter. It'd only take a few hours, given the extensive work already done by the Commons community neatly sorting things into categories with names like "Nude works including Muppets" and "Suggestive use of feathers" etc.
Yes, they could do that. But the Internet is large. They usually use a combination of black- and white-listing, which is where the devil in the detail lies. Whitelisting delivers perfect results (as long as the content doesn't change overnight), but it is much more expensive, since every new page would need to be checked. Blacklisting is way easier, since it doesn't block access to new pages or images. But at the same time it has its flaws, because any unknown website (the biggest part) can be accessed regardless of content.
It's almost as if the censorware manufacturers are selling ineffective products to people who don't know any better, products that serve as a peace-of-mind placebo in place of effective access control. Oh, wait, that would be the inner cynic speaking.
Exactly that is the case. I have never seen "censorware" that works flawlessly (not even China can do this right). Either it allows too much (incomplete blacklist) or it is unnecessarily limited (incomplete whitelist, producing an angry mob). Additionally it has to suit the views of the parents and match the age of the child. The only "software" which does this perfectly is the brain of the parents, which tracks the actions of the child, stops them when necessary and gives useful advice (even better than Clippy).
nya~
On Saturday, 16 June 2012 at 23:51, Tobias Oelgarte wrote:
Am 16.06.2012 23:36, schrieb Tom Morris:
On Saturday, 16 June 2012 at 20:21, Tobias Oelgarte wrote:
That means they already found a solution to their problem that covers the whole web at once. As you might have noticed, it isn't perfect. I guess that it could be easily improved over time. But the image filter had a different goal. It wouldn't help the schools, since the content is still accessible. So why do we discuss schools and children all the time and speak about it as a net nanny?
Don't you get it? An image filter you can trivially opt-out of by clicking the big button labelled "show image" is a perfect way of preventing children from getting to naughty pictures…
Is this irony? My comment included some irony as well. ;-)
I should probably get a .uk domain name for my emails to remove any doubt as to whether I'm being ironic and/or dryly sarcastic.
-- Tom Morris http://tommorris.org/
I have never seen a "censorware" that works flawlessly (not even china can do this right). Either it allows to much (incomplete blacklist) or it is unnecessary limited (incomplete whitelist producing angry mob). Additionally it has to suite the view of the parents and match the age of the child. The only "software" which does this perfectly is the brain of the parents that tracks the actions of the child, stops them when necessary and gives useful advice (even better then Clippy).
What parent tracks every action of their child? You seem to have a very unrealistic picture of how parenting works.
Am 17.06.2012 01:21, schrieb Anthony:
I have never seen a "censorware" that works flawlessly (not even china can do this right). Either it allows to much (incomplete blacklist) or it is unnecessary limited (incomplete whitelist producing angry mob). Additionally it has to suite the view of the parents and match the age of the child. The only "software" which does this perfectly is the brain of the parents that tracks the actions of the child, stops them when necessary and gives useful advice (even better then Clippy).
What parent tracks every action of their child? You seem to have a very unrealistic picture of how parenting works.
I guess I have to really wrap any comment inside the <sarcasm><irony><takeItNotToSerious> tag stack to avoid confusion...
* Tobias Oelgarte wrote:
Am 17.06.2012 01:21, schrieb Anthony:
I have never seen a "censorware" that works flawlessly (not even china can do this right). Either it allows to much (incomplete blacklist) or it is unnecessary limited (incomplete whitelist producing angry mob). Additionally it has to suite the view of the parents and match the age of the child. The only "software" which does this perfectly is the brain of the parents that tracks the actions of the child, stops them when necessary and gives useful advice (even better then Clippy).
What parent tracks every action of their child? You seem to have a very unrealistic picture of how parenting works.
I guess I have to really wrap any comment inside the <sarcasm><irony><takeItNotToSerious> tag stack to avoid confusion...
No, the Wikimedia Foundation should develop a personal sarcasm filter for this mailing list so nobody is surprised or confused by what they (don't) read here.
On Sat, Jun 16, 2012 at 9:48 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
Am 17.06.2012 01:21, schrieb Anthony:
I have never seen a "censorware" that works flawlessly (not even china can do this right). Either it allows to much (incomplete blacklist) or it is unnecessary limited (incomplete whitelist producing angry mob). Additionally it has to suite the view of the parents and match the age of the child. The only "software" which does this perfectly is the brain of the parents that tracks the actions of the child, stops them when necessary and gives useful advice (even better then Clippy).
What parent tracks every action of their child? You seem to have a very unrealistic picture of how parenting works.
I guess I have to really wrap any comment inside the <sarcasm><irony><takeItNotToSerious> tag stack to avoid confusion...
I still would have been confused. Still am, actually. Did this paragraph have a serious point at all? I hope so, because Wikipedia's porn problem is a serious issue.
Anthony, 17/06/2012 05:05:
I still would have been confused. Still am, actually. Did this paragraph have a serious point at all? I hope so, because Wikipedia's porn problem is a serious issue.
The point was, I think, that no "software" is perfect (not even parents' brain) and that parents can't rely on software too much. Not that hard to understand, hence please avoid off-topic (see subject) paternalism.
Nemo
Am 17.06.2012 09:11, schrieb Federico Leva (Nemo):
Anthony, 17/06/2012 05:05:
I still would have been confused. Still am, actually. Did this paragraph have a serious point at all? I hope so, because Wikipedia's porn problem is a serious issue.
The point was, I think, that no "software" is perfect (not even parents' brain) and that parents can't rely on software too much. Not that hard to understand, hence please avoid off-topic (see subject) paternalism.
Nemo
This interpretation is right, but also a bit incomplete. It also criticizes the "one hat suits everyone" approach. The reasons are:
a) Children are not all the same age. What should an 8-year-old see, and what a 16-year-old? I doubt that there is a good compromise between both ages, which is what I called black- and white-listing.
b) Also parents have different expectations depending on how they see their child or themselves.
c) The proposed filter would have affected all projects and therefore every culture the same way, ignoring cultural differences entirely.
This leaves the question: What is the prototype target group for the filter? If I remember correctly, this was never defined.
On Sun, Jun 17, 2012 at 3:11 AM, Federico Leva (Nemo) nemowiki@gmail.com wrote:
Anthony, 17/06/2012 05:05:
I still would have been confused. Still am, actually. Did this paragraph have a serious point at all? I hope so, because Wikipedia's porn problem is a serious issue.
The point was, I think, that no "software" is perfect (not even parents' brain) and that parents can't rely on software too much.
Is this supposed to be a parody of the people who point out the flaws in software solutions but fail to point out the flaws in non-software solutions?
Because it seemed to me to be an instance of it.
On Sun, Jun 17, 2012 at 3:11 AM, Federico Leva (Nemo) nemowiki@gmail.com wrote:
Not that hard to understand, hence please avoid off-topic (see subject) paternalism.
I still don't understand how calling something "perfect" when you are making an argument that it is the proper solution to a problem, is sarcasm/irony. Sarcasm/irony would be calling something perfect when you are making an argument that it is not the proper solution. Exaggerating a point on which you are incorrect is not sarcasm/irony.
In any case, I'm not sure where the paternalism is in my posts. Yes, I am giving the perspective of a father, but that is not paternalism.
No software is perfect. No solution is perfect. But don't let the perfect be the enemy of the good.
On 17 June 2012 13:21, Anthony wikimail@inbox.org wrote:
No software is perfect. No solution is perfect. But don't let the perfect be the enemy of the good.
You're assuming that a "good" exists for this function. This assumption is entirely unsubstantiated.
- d.
On Sun, Jun 17, 2012 at 9:14 AM, David Gerard dgerard@gmail.com wrote:
On 17 June 2012 13:21, Anthony wikimail@inbox.org wrote:
No software is perfect. No solution is perfect. But don't let the perfect be the enemy of the good.
You're assuming that a "good" exists for this function. This assumption is entirely unsubstantiated.
YouTube's age restricted content policy is "good". That is to say, it's not perfect, but it's a lot better than Wikipedia's policies. My kids are much more likely to run across hard core pornography while clicking around on Wikipedia than clicking around on YouTube. Personally I'd prefer they rely more on whitelisting than on blacklisting - but what they do is already a *lot* better than Wikipedia.
On Sun, Jun 17, 2012 at 9:30 AM, Anthony wikimail@inbox.org wrote:
On Sun, Jun 17, 2012 at 9:14 AM, David Gerard dgerard@gmail.com wrote:
On 17 June 2012 13:21, Anthony wikimail@inbox.org wrote:
No software is perfect. No solution is perfect. But don't let the perfect be the enemy of the good.
You're assuming that a "good" exists for this function. This assumption is entirely unsubstantiated.
YouTube's age restricted content policy is "good". That is to say, it's not perfect, but it's a lot better than Wikipedia's policies. My kids are much more likely to run across hard core pornography while clicking around on Wikipedia than clicking around on YouTube. Personally I'd prefer they rely more on whitelisting than on blacklisting - but what they do is already a *lot* better than Wikipedia.
World Book Encyclopedia was "good". I spent many days reading through the entries, performing the dead-tree equivalent of clicking on the links as I went from topic to topic. My parents didn't sit looking over my shoulder. It was an encyclopedia I could read on my own.
You want an explanation for why the market hasn't created a WBE equivalent based on Wikipedia (*)? The top answer is copyleft. (As suggested by Andrew Gray, technical/legal problems are another problem, but I think these issues pale in comparison to copyleft.)
(*) Actually I'm not sure the market hasn't created this. There certainly have been various projects which have attempted to create it. I'm not sure if any have succeeded, and my kids are not yet at the reading level where I need to spend much time looking for it.
On 17 June 2012 14:14, David Gerard dgerard@gmail.com wrote:
On 17 June 2012 13:21, Anthony wikimail@inbox.org wrote:
No software is perfect. No solution is perfect. But don't let the perfect be the enemy of the good.
You're assuming that a "good" exists for this function. This assumption is entirely unsubstantiated.
Well, the various attempts by colleges to block game sites were somewhat effective. And that did have the effect of freeing up more computers for actual college work.
On 15 June 2012 13:21, David Gerard dgerard@gmail.com wrote:
I don't recall seeing any, but did anyone actually explain why the market had not provided a filtering solution for Wikipedia, if there's actually a demand for one?
I think we had this conversation almost a year ago ;-)
http://lists.wikimedia.org/pipermail/wikimedia-l/2011-September/114562.html http://lists.wikimedia.org/pipermail/wikimedia-l/2011-September/114569.html http://lists.wikimedia.org/pipermail/wikimedia-l/2011-September/115530.html
are my comments from the last round.
In short: the almost complete absence of anyone doing *anything* clever in terms of reusing and repurposing our content strongly suggests that there are practical barriers to doing so in general, rather than the flaws with any specific model of what it is they want to do.
(Alternatively, it might suggest there's no demand at all for any meaningfully variant derivatives of Wikipedia, which is a demoralising thought..)
On 17 June 2012 14:50, Andrew Gray andrew.gray@dunelm.org.uk wrote:
In short: the almost complete absence of anyone doing *anything* clever in terms of reusing and repurposing our content strongly suggests that there are practical barriers to doing so in general, rather than the flaws with any specific model of what it is they want to do.
Which comes back to someone testing our practical forkability, then (as I've noted before) - arguably an important part of backup hygiene, but one which is in no way actually urgent at present.
- d.
On 17 June 2012 14:53, David Gerard dgerard@gmail.com wrote:
On 17 June 2012 14:50, Andrew Gray andrew.gray@dunelm.org.uk wrote:
In short: the almost complete absence of anyone doing *anything* clever in terms of reusing and repurposing our content strongly suggests that there are practical barriers to doing so in general, rather than the flaws with any specific model of what it is they want to do.
Which comes back to someone testing our practical forkability, then (as I've noted before) - arguably an important part of backup hygiene, but one which is in no way actually urgent at present.
I certainly don't think it's urgent to try now - I'm sanguine that the WMF WP we have now will be around for a second decade at least - but I do think it's important to remember when bringing up the issue of competitors.
As there are no major and well-used forks at all, we can't reasonably draw inferences of the desirability of a specific project from its non-existence - we simply don't have the information to make that conclusion. This applies whether the hypothetical fork is one using an image filter, one using stable versions, one using peer-review editorial control, one dynamically switching between varieties of English, or anything else...
On 17 June 2012 15:43, Andrew Gray andrew.gray@dunelm.org.uk wrote:
As there are no major and well-used forks at all, we can't reasonably draw inferences of the desirability of a specific project from its non-existence - we simply don't have the information to make that conclusion. This applies whether the hypothetical fork is one using an image filter, one using stable versions, one using peer-review editorial control, one dynamically switching between varieties of English, or anything else...
I haven't seen those being shouted for like this is. That is, there are people actually asking for this, and there aren't really for those other things. So I think my question - if this is so obviously the right thing, then where are the existing attempts? - still stands as relevant.
- d.
On Sun, Jun 17, 2012 at 10:48 AM, David Gerard dgerard@gmail.com wrote:
So I think my question - if this is so obviously the right thing, then where are the existing attempts? - still stands as relevant.
The fact that it is the right thing isn't obvious, and forking of free content is generally a last resort, when all else has failed. Those "recent statements by board members that the filter is alive and well" make a fork less likely, not more.
Am 17.06.2012 17:16, schrieb Anthony:
On Sun, Jun 17, 2012 at 10:48 AM, David Gerard dgerard@gmail.com wrote:
So I think my question - if this is so obviously the right thing, then where are the existing attempts? - still stands as relevant.
The fact that it is the right thing isn't obvious, and forking of free content is generally a last resort, when all else has failed. Those "recent statements by board members that the filter is alive and well" make a fork less likely, not more.
It wouldn't even need to be a complete fork. A whitelisted copy would most likely already be sufficient for your needs. It would automatically update any article on a whitelist after a quick review (like sighted revisions), or even entirely automatically for articles or images marked as unproblematic. There would be some programming work (a "confirm update" button), but overall it would be easy to implement and maintain. That way you could easily create a wiki suited to the needs of a special audience which is quickly updated and expanded to the latest versions. A subset of Wikipedia.
On Sun, Jun 17, 2012 at 1:04 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
It wouldn't even need to be a complete fork. A whitelisted copy would most likely already be sufficient for your needs. It would automatically update any article on a whitelist after a quick review (like sighted revisions), or even entirely automatically for articles or images marked as unproblematic. There would be some programming work (a "confirm update" button), but overall it would be easy to implement and maintain. That way you could easily create a wiki suited to the needs of a special audience which is quickly updated and expanded to the latest versions. A subset of Wikipedia.
I don't see how that isn't a fork. And I don't think it would be easy to implement or to maintain. Citizendium tried to do this without even doing the automatic updating part, and they quickly decided that it was more trouble than it was worth.
Maybe things have gotten better since then. Maybe they have gotten worse. I don't know. Is there even a way to export an article, including (recursively) all the templates it depends on?
Am 18.06.2012 00:40, schrieb Anthony:
On Sun, Jun 17, 2012 at 1:04 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
It wouldn't even need to be a complete fork. A whitelisted copy would most likely already be sufficient for your needs. It would automatically update any article on a whitelist after a quick review (like sighted revisions), or even entirely automatically for articles or images marked as unproblematic. There would be some programming work (a "confirm update" button), but overall it would be easy to implement and maintain. That way you could easily create a wiki suited to the needs of a special audience which is quickly updated and expanded to the latest versions. A subset of Wikipedia.
I don't see how that isn't a fork. And I don't think it would be easy to implement or to maintain. Citizendium tried to do this without even doing the automatic updating part, and they quickly decided that it was more trouble than it was worth.
Maybe things have gotten better since then. Maybe they have gotten worse. I don't know. Is there even a way to export an article, including (recursively) all the templates it depends on?
Every stupid bot could do this. There is no out-of-the-box solution at the moment, but the effort to set up something like this would be minimal compared to anything else.
I would say that Citizendium failed because they did no automatic updating. What I have in mind is a delayed mirror with update control. It is not meant to be edited by hand. It is a subset of the current content, selected by the host (one or many users) of the page himself. It is essentially a whitelist for Wikipedia that only contains selected/checked content. That way a "children's wiki" could easily be created, by not including any unwanted content, while the effort stays minimal. (No more effort than to create your own book from a list of already written articles.)
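To make the idea concrete, here is a rough sketch of such a "delayed mirror with update control", under my own assumptions about storage (a local JSON file standing in for the mirror) and with an interactive prompt standing in for the "confirm update" button. Whitelisted pages are fetched from the live API; a new revision only reaches the mirror once a reviewer accepts it.

import json
import requests  # third-party 'requests' package

API = "https://en.wikipedia.org/w/api.php"
WHITELIST = ["Marriage", "Nude photography"]   # illustrative page titles
MIRROR_FILE = "mirror.json"

def latest_revision(title):
    data = requests.get(API, params={
        "action": "query", "format": "json", "formatversion": "2",
        "titles": title, "prop": "revisions",
        "rvprop": "ids|content", "rvslots": "main",
    }).json()
    rev = data["query"]["pages"][0]["revisions"][0]
    return rev["revid"], rev["slots"]["main"]["content"]

def update_mirror():
    try:
        with open(MIRROR_FILE) as f:
            mirror = json.load(f)
    except FileNotFoundError:
        mirror = {}
    for title in WHITELIST:
        revid, text = latest_revision(title)
        if mirror.get(title, {}).get("revid") == revid:
            continue  # the mirror already has the latest reviewed revision
        # The "confirm update" button: a human decides whether the new
        # revision is acceptable before it is copied into the mirror.
        if input(f"Accept revision {revid} of [[{title}]]? [y/N] ").lower() == "y":
            mirror[title] = {"revid": revid, "wikitext": text}
    with open(MIRROR_FILE, "w") as f:
        json.dump(mirror, f)

if __name__ == "__main__":
    update_mirror()

Whether that still counts as a fork is, as Anthony notes above, a fair question; the point of the sketch is only that the moving parts are small.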
On Monday, 18 June 2012 at 02:44, Tobias Oelgarte wrote:
Every stupid bot could do this. There is no out-of-the-box solution at the moment, but the effort to set up something like this would be minimal compared to anything else.
I would say that Citizendium failed because they did no automatic updating. What I have in mind is a delayed mirror with update control. It is not meant to be edited by hand. It is a subset of the current content, selected by the host (one or many users) of the page himself. It is essentially a whitelist for Wikipedia that only contains selected/checked content. That way a "children's wiki" could easily be created, by not including any unwanted content, while the effort stays minimal. (No more effort than to create your own book from a list of already written articles.)
{{sofixit}}
If all the people in favour of filters had spent their time building them rather than arguing about them, we would have had a wide array of different solutions, without any politics or drama.
That said, if people want to filter Wikipedia, a client-side solution rather than a filtered mirror is preferable. If a filtered mirror were to come into existence and become popular, this would mean that people would just filter all of main Wikipedia, which would prevent people from editing Wikipedia. A client-side solution means they are still looking at wikipedia.org just without naughty pics and doesn't interfere with editing. It also reduces the need for any servers.
On 18 June 2012 08:00, Tom Morris tom@tommorris.org wrote:
{{sofixit}} If all the people in favour of filters had spent their time building them rather than arguing about them, we would have had a wide array of different solutions, without any politics or drama.
The problem there is the insistence of filter proponents (from board down) that it *has* to be done on the sites themselves, with any post-site solution being considered unsuitable. Why is not clear to me either.
- d.
Am 18.06.2012 09:21, schrieb David Gerard:
On 18 June 2012 08:00, Tom Morris tom@tommorris.org wrote:
{{sofixit}} If all the people in favour of filters had spent their time building them rather than arguing about them, we would have had a wide array of different solutions, without any politics or drama.
The problem there is the insistence of filter proponents (from board down) that it *has* to be done on the sites themselves, with any post-site solution being considered unsuitable. Why is not clear to me either.
- d.
I guess Tom misunderstood my comment. I wrote down a simple plan for how an external solution could work and how to minimize the effort to maintain it. If there is a community (it might overlap with our community) that would run such a "filter portal" (or even multiple portals), then it should be even more effective than if we implemented filters inside Wikipedia itself. They could really block images and create a child-safe zone according to their own definition, while we could continue as usual without the burden of having to avoid conflicts.
On 18 June 2012 12:29, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
I guess Tom misunderstood my comment. I wrote down a simple plan for how an external solution could work and how to minimize the effort to maintain it. If there is a community (it might overlap with our community) that would run such a "filter portal" (or even multiple portals), then it should be even more effective than if we implemented filters inside Wikipedia itself. They could really block images and create a child-safe zone according to their own definition, while we could continue as usual without the burden of having to avoid conflicts.
The Board acted according to the Harris report, which just said to do it on the site itself:
http://meta.wikimedia.org/wiki/2010_Wikimedia_Study_of_Controversial_Content...
It's still not clear to me (looking over part two or part one) why it has to be on the site itself and no post-site solution is acceptable. Presumably someone interested can dredge through part one and pick out the sentences that back this position as opposed to post-site filtering.
- d.
On 18 June 2012 12:39, David Gerard dgerard@gmail.com wrote:
On 18 June 2012 12:29, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
I guess Tom misunderstood my comment. I wrote down a simple plan for how an external solution could work and how to minimize the effort to maintain it. If there is a community (it might overlap with our community) that would run such a "filter portal" (or even multiple portals), then it should be even more effective than if we implemented filters inside Wikipedia itself. They could really block images and create a child-safe zone according to their own definition, while we could continue as usual without the burden of having to avoid conflicts.
The Board acted according to the Harris report, which just said to do it on the site itself:
http://meta.wikimedia.org/wiki/2010_Wikimedia_Study_of_Controversial_Content...
It's still not clear to me (looking over part two or part one) why it has to be on the site itself and no post-site solution is acceptable. Presumably someone interested can dredge through part one and pick out the sentences that back this position as opposed to post-site filtering.
Utility; hiding a filter on a lower-order site does not make it useful. Incorporating it into the main site (preferably client-side) makes it the most accessible for our community.
Tom
On 18 June 2012 12:41, Thomas Morton morton.thomas@googlemail.com wrote:
On 18 June 2012 12:39, David Gerard dgerard@gmail.com wrote:
The Board acted according to the Harris report, which just said to do it on the site itself: http://meta.wikimedia.org/wiki/2010_Wikimedia_Study_of_Controversial_Content... It's still not clear to me (looking over part two or part one) why it has to be on the site itself and no post-site solution is acceptable. Presumably someone interested can dredge through part one and pick out the sentences that back this position as opposed to post-site filtering.
Utility; hiding a filter on a lower-order site does not make it useful. Incorporating it into the main site (preferably client-side) makes it the most accessible for our community.
That's not from the Harris report. What was the justification in the report?
- d.
On 18 June 2012 12:42, David Gerard dgerard@gmail.com wrote:
On 18 June 2012 12:41, Thomas Morton morton.thomas@googlemail.com wrote:
On 18 June 2012 12:39, David Gerard dgerard@gmail.com wrote:
The Board acted according to the Harris report, which just said to do it on the site itself:
http://meta.wikimedia.org/wiki/2010_Wikimedia_Study_of_Controversial_Content...
It's still not clear to me (looking over part two or part one) why it has to be on the site itself and no post-site solution is acceptable. Presumably someone interested can dredge through part one and pick out the sentences that back this position as opposed to post-site filtering.
Utility; hiding a filter on a lower-order site does not make it useful. Incorporating it into the main site (preferably client-side) makes it the most accessible for our community.
That's not from the Harris report. What was the justification in the report?
Because they were investigating solutions to problems *on* Wikipedia. Seems rather obvious ;)
Or perhaps you didn't read the parts in full; this, for example:
For example, all of these sites, as WMF pages do, have internally-generated policies that determine what content is permitted on their sites at all.
Or
However, on every one of these sites, they also employ a series of user-controlled options (options designed by the site) that allow users to tailor their viewing experiences to their individual needs. Unique among these sites, at the moment, Wikimedia projects employ no such options.
I'm not sure where you are leading with this line of argument.. but it seems to be down a black hole :)
Tom
On Mon, Jun 18, 2012 at 3:21 AM, David Gerard dgerard@gmail.com wrote:
On 18 June 2012 08:00, Tom Morris tom@tommorris.org wrote:
{{sofixit}} If all the people in favour of filters had spent their time building them rather than arguing about them, we would have had a wide array of different solutions, without any politics or drama.
The problem there is the insistence of filter proponents (from board down) that it *has* to be done on the sites themselves, with any post-site solution being considered unsuitable. Why is not clear to me either.
Where do you run the filter? I suppose a sophisticated parent could set up a firewall and a proxy on his home network, but many families don't even have a spare computer to act as the firewall, let alone the technical know-how to run the thing.
Am 18.06.2012 09:00, schrieb Tom Morris:
On Monday, 18 June 2012 at 02:44, Tobias Oelgarte wrote:
Every stupid bot could do this. There is no out-of-the-box solution at the moment, but the effort to set up something like this would be minimal compared to anything else.
I would say that Citizendium failed because they did no automatic updating. What I have in mind is a delayed mirror with update control. It is not meant to be edited by hand. It is a subset of the current content, selected by the host (one or many users) of the page himself. It is essentially a whitelist for Wikipedia that only contains selected/checked content. That way a "children's wiki" could easily be created, by not including any unwanted content, while the effort stays minimal. (No more effort than to create your own book from a list of already written articles.)
{{sofixit}}
If all the people in favour of filters had spent their time building them rather than arguing about them, we would have had a wide array of different solutions, without any politics or drama.
That said, if people want to filter Wikipedia, a client-side solution rather than a filtered mirror is preferable. If a filtered mirror were to come into existence and become popular, this would mean that people would just filter all of main Wikipedia, which would prevent people from editing Wikipedia. A client-side solution means they are still looking at wikipedia.org just without naughty pics and doesn't interfere with editing. It also reduces the need for any servers.
I never meant that we should host or create such a solution on our own. Every external "force" which sees a need to do this could do it for itself. I'm really not interested in implementing a filter on Wikipedia itself. If there is a large enough group of readers that wants to have its own "view" of Wikipedia, then this would be a practical way to go. It would not make much of a difference whether it is installed locally or as a web service. The web-based solution would only have the advantage that it could "entertain" an open or partially closed community that selects the content.
To clarify: I'm against any kind of filtering done by the WMF or our community itself. If others want to, then they can do that by using and filtering our content on their own.
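For what it's worth, a rough sketch of the fetching side of such a delayed whitelist mirror could look like the following, using the standard MediaWiki action API. The article list, output directory and user agent below are made-up placeholders, and real update control (revision tracking, scheduling, link rewriting) is left out:

    # Sketch only: pull the rendered HTML of a hand-picked article list
    # from the MediaWiki API and store it locally as a static "mirror".
    import pathlib
    import requests

    API = "https://en.wikipedia.org/w/api.php"
    WHITELIST = ["Dinosaur", "Horse", "Veterinary medicine"]   # hypothetical selection
    OUT = pathlib.Path("mirror")                               # hypothetical output directory

    def fetch_article_html(title):
        """Return the parsed HTML of one article via action=parse."""
        resp = requests.get(API, params={
            "action": "parse", "page": title, "prop": "text",
            "format": "json", "formatversion": 2,
        }, headers={"User-Agent": "whitelist-mirror-sketch/0.1 (example)"})
        resp.raise_for_status()
        return resp.json()["parse"]["text"]

    def update_mirror():
        OUT.mkdir(exist_ok=True)
        for title in WHITELIST:
            html = fetch_article_html(title)
            (OUT / (title.replace(" ", "_") + ".html")).write_text(html, encoding="utf-8")

    if __name__ == "__main__":
        update_mirror()

Run on a schedule, that is already the skeleton of a selective, automatically updated mirror; everything beyond it is editorial policy about what goes on the whitelist.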
On 18 June 2012 08:00, Tom Morris tom@tommorris.org wrote:
On Monday, 18 June 2012 at 02:44, Tobias Oelgarte wrote:
Every stupid bot could do this. There is no "running out of the box" solution at the moment, but the effort to set up something like this would be minimal compared to anything else.
I would say that Citizendium failed because it had no automatic updating. What I have in mind is a delayed mirror with update control. It is not meant to be edited by hand. It is a subset of the current content, selected by the host of the page (one or many users). It is essentially a whitelist for Wikipedia that only contains selected/checked content. That way a "children's wiki" could easily be created, by not including any unwanted content, while the effort stays minimal. (No more effort than to create your own book from a list of already written articles.)
{{sofixit}}
If all the people in favour of filters had spent their time building them rather than arguing about them, we would have had a wide array of different solutions, without any politics or drama.
That said, if people want to filter Wikipedia, a client-side solution rather than a filtered mirror is preferable. If a filtered mirror were to come into existence and become popular, this would mean that people would just filter all of main Wikipedia, which would prevent people from editing Wikipedia. A client-side solution means they are still looking at wikipedia.org just without naughty pics and doesn't interfere with editing. It also reduces the need for any servers.
The technical solution is a fairly trivial part of the problem; a client-side filter could probably be put together in a few days IMO.
The *hard* problem is convincing the "not censored" abusers that it's a useful feature for our community.
Tom
Am 18.06.2012 13:52, schrieb Thomas Morton:
On 18 June 2012 08:00, Tom Morristom@tommorris.org wrote:
On Monday, 18 June 2012 at 02:44, Tobias Oelgarte wrote:
Every stupid bot could do this. There is no "running out of the box" solution at the moment, but the effort to set up something like this would be minimal compared to anything else.
I would say that Citizendium failed because it had no automatic updating. What I have in mind is a delayed mirror with update control. It is not meant to be edited by hand. It is a subset of the current content, selected by the host of the page (one or many users). It is essentially a whitelist for Wikipedia that only contains selected/checked content. That way a "children's wiki" could easily be created, by not including any unwanted content, while the effort stays minimal. (No more effort than to create your own book from a list of already written articles.)
{{sofixit}}
If all the people in favour of filters had spent their time building them rather than arguing about them, we would have had a wide array of different solutions, without any politics or drama.
That said, if people want to filter Wikipedia, a client-side solution rather than a filtered mirror is preferable. If a filtered mirror were to come into existence and become popular, this would mean that people would just filter all of main Wikipedia, which would prevent people from editing Wikipedia. A client-side solution means they are still looking at wikipedia.org just without naughty pics and doesn't interfere with editing. It also reduces the need for any servers.
The technical solution is a fairly trivial part of the problem; a client-side filter could probably be put together in a few days IMO.
The *hard* problem is convincing the "not censored" abusers that it's a useful feature for our community.
Tom
It is not convincing since it interferes with the work of our editors that aren't interested in such a feature. If we tag images inside the project itself, then we impose our judgment onto it, while ignoring or separating it from the context it is used in. The first proposal (the referendum) mentioned various tagging options/categories that would have to be maintained by the community, despite existing and huge backlogs. Additionally, we are a multicultural project with quite different viewpoints, and one which accepts different viewpoints (the main difference between Flickr and Co). The result will be a huge amount of discussions about whether to tag an image or not. This leads me to the simple conclusion that it isn't worth the effort, especially if the filter is advertised to make Wikipedia a safe place for children, while everyone (including children) can disable it at any time.
Separate projects that only focus on one task (providing a whitelisted view, an automatically updated subset of Wikipedia) would not be a burden for the community, or at least not for anyone not interested in or opposed to filtering. Additionally, such a project could define its own strict rules and could even hide images and articles entirely, depending on its goal.
But I have to add that the WMF should not be part of these projects. These projects would define their own rules, like Flickr and Co.
It is not convincing since it interferes with the work of our editors
that aren't interested in such a feature.
Seems unlikely. Although please feel free to expand on this with specifics.
If we tag images inside the project itself then we impose our judgment onto it, while ignoring or separating it from the context it is used in.
And yet you allow that we use editorial judgement in articles. This is no different, it gives a further tool for editorial decisions to be made.
The first proposal (referendum) mentioned various tagging options/categories that would have to be maintained by the community, despite existing and huge backlogs.
A reasonable argument; but almost everything adds to our backlog anyway.
Additionally we are a multi culture project with quite different view
points and which accepts different view points (main difference between Flickr and Co).
This is an argument for an opt-in filter.
The result will be a huge amount of discussions about whether to tag an image or not.
Not if well designed. And at the moment we have big discussions about whether to include images or not.
This leads me to the simple conclusion that it isn't worth the effort, especially if the filter is advertised to make Wikipedia a safe place for children, while everyone (including children) can disable it at any time.
"Think of the children" is not really an argument I ascribe to. And not really one other proponents of the filter, by my observation, ascribe to either.
It mostly seems to be brought up by opponents to try and invalidate arguments.
Separate projects that only focus on one task (providing a whitelisted view, an automatically updated subset of Wikipedia) would not be a burden for the community or at least for everyone not interested in or against filtering. Additionally it could define its own strict rules and could even hide images and articles entirely depending on its goal.
Please note we define community in significantly different ways. My "community" includes a minority, us, who edit and maintain the project. And also the vast majority who merely read and use the project.
Our goal as maintainers for this main community should be:
- Maximise the ability of individuals to access content by...
- Minimising the road blocks (social, political, etc.) to accessing content
A significant portion of the filter discussion is predicated on our internal prejudices and POV - basically navel gazing - with a wide rejection of the idea that a multi-cultural society exists.
A non-WMF filtering project would not be useful to our community due to the chicken/egg seeding problem.
Tom
This leads me to the simple conclusion that it isn't worth the effort, especially if the filter is advertised to make Wikipedia a safe place for children, while everyone (including children) can disable it at any time.
"Think of the children" is not really an argument I ascribe to. And not really one other proponents of the filter, by my observation, ascribe to either.
It mostly seems to be brought up by opponents to try and invalidate arguments.
No, the goal of making Wikipedia a safe place for children is the genesis of the filter. It has since been watered down via design by committee into some sort of "let's make people double click before they can see the porn", but there certainly are some who have stuck by their principles, on both sides of the argument.
Am 18.06.2012 15:06, schrieb Thomas Morton:
It is not convincing since it interferes with the work of our editors
that aren't interested in such a feature.
Seems unlikely. Although please feel free to expand on this with specifics.
Any tagging by non-neutral definitions would interfere with the project. It's like creating categories named "bad images", "uninteresting topics" or "not for ethnic minority X".
If we tag images inside the project itself then we impose our judgment onto it, while ignoring or separating it from the context it is used in.
And yet you allow that we use editorial judgement in articles. This is no different, it gives a further tool for editorial decisions to be made.
Editorial judgment is based on how to wrap up a topic in a nice way without making one's own judgment about the topic. A hard job to do, but that is the goal.
If I were to write the article "pornography", then I would have to think about what should be mentioned inside this article because it is important, and which parts are not relevant enough or should be put in separate sections to elaborate on them in further detail. This is entirely different from saying "pornography is good or evil" or "this pornographic practice is good or evil and that's why it should be mentioned or excluded".
There is a difference between the relevance of a topic and the attitude toward a topic. The whole image filter idea is based on the latter and not to be confused with editorial judgment.
The first proposal (referendum) mentioned various tagging options/categories that would have to be maintained by the community, despite existing and huge backlogs.
A reasonable argument; but almost everything adds to our backlog anyway.
I would have nothing against additional work if I could see the benefits. But in this case I see some good points and I also see a list of bad points. At best it might be a very tiny improvement that comes along with a huge load of additional work, while other parts could be improved with little extra work and be a true improvement. If we had nothing better to do, then I would say "yes, let's try it". But at the moment it is a plain "no, other things have to come first".
Additionally we are a multi culture project with quite different view
points and which accepts different view points (main difference between Flickr and Co).
This is an argument for an opt-in filter.
Don't confuse opt-in and opt-out if a filter is implemented on an external platform. There is no opt-in or opt-out for Wikipedia as long as WP isn't blocked and the filter is the only access to Wikipedia. <contains some irony>We have the long story that parents want their children to visit Wikipedia without coming across controversial content, which they apparently do every time they search for something entirely unrelated.</contains some irony> In this case an opt-in (to view) filter actually makes sense. Otherwise it doesn't.
The result will be a huge amount of discussions about whether to tag an image or not.
Not if well designed. And at the moment we have big discussions about whether to include images or not.
We have such discussions. But I'm afraid that most of them do not circle around the benefits of the image for the article, but around the latter part that I mentioned above (editorial judgment vs attitude judgment).
Believe me or not: if we introduce such tagging, then the discussions will only be about personal attitudes towards an image, ignoring the context and its educational benefits entirely.
This leads me to the simple conclusion that it isn't worth the effort, especially if the filter is advertised to make Wikipedia a safe place for children, while everyone (including children) can disable it at any time.
"Think of the children" is not really an argument I ascribe to. And not really one other proponents of the filter, by my observation, ascribe to either.
It mostly seems to be brought up by opponents to try and invalidate arguments.
I don't think that we need this argument, since the filter can't replace parents anyway. But it is a constant part of the discussions, with various exaggerated examples that can be seen in bold at Jimmy's talk page even right at this moment. For example:
"Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content."[1]
[1] http://en.wikipedia.org/wiki/User_talk:Jimbo_Wales#UK_law
Separate projects that only focus on one task (providing a whitelisted view, an automatically updated subset of Wikipedia) would not be a burden for the community or at least for everyone not interested in or against filtering. Additionally it could define its own strict rules and could even hide images and articles entirely depending on its goal.
Please note we define community in significantly different ways. My "community" includes a minority, us, who edit and maintain the project. And also the vast majority who merely read and use the project.
Our goal as maintainers for this main community should be:
- Maximise the ability of individuals to access content by...
- Minimising the road blocks (social, political, etc.) to accessing content
A significant portion of the filter discussion is predicated on our internal prejudices and POV - basically navel gazing - with a wide rejection of the idea that a multi-cultural society exists.
A non-WMF filtering project would not be useful to our community due to the chicken/egg seeding problem.
It is a chicken/egg problem. One part of our community (including readers) dislikes tagging/filtering and sees it as (or as the tool for) the creation of road blocks that don't exist at the moment. A second part of our community wants it to be more conservative, in fear that it might be the deciding factor that could create road blocks. I already mentioned this above in the "benefits vs effort" section.
On 18 June 2012 15:16, Tobias Oelgarte tobias.oelgarte@googlemail.comwrote:
Am 18.06.2012 15:06, schrieb Thomas Morton:
It is not convincing since it interferes with the work of our editors
that aren't interested in such a feature.
Seems unlikely. Although please feel free to expand on this with specifics.
Any tagging by non-neutral definitions would interfere with the project. It's like creating categories named "bad images", "uninteresting topics" or "not for ethnic minority X".
Of course; but that is predicated on a bad process design. Solution: design an appropriate process.
If we tag images inside the project itself then we impose our judgment
onto it, while ignoring or separating it from the context it is used in.
And yet you allow that we use editorial judgement in articles. This is no different, it gives a further tool for editorial decisions to be made.
Editorial judgment is based on how to wrap up a topic in a nice way without making one's own judgment about the topic. A hard job to do, but that is the goal.
If I were to write the article "pornography", then I would have to think about what should be mentioned inside this article because it is important, and which parts are not relevant enough or should be put in separate sections to elaborate on them in further detail. This is entirely different from saying "pornography is good or evil" or "this pornographic practice is good or evil and that's why it should be mentioned or excluded".
There is a difference between the relevance of a topic and the attitude toward a topic. The whole image filter idea is based on the latter and not to be confused with editorial judgment.
Pornography articles, as it stands, have a community-implemented "filter" as it is, which is the tradition that articles are illustrated with graphics, not photographs. So the example is a poor one, because we already have a poor man's filter :)
Similarly the decision "does this image represent hardcore porn, softcore porn, nudity or none of the above" is an editorial one. Bad design process would introduce POV issues - but we are plagued with them anyway. If anything this gives us an opportunity to design and trial a process without those issues (or at least minimising them).
The first proposal (referendum) mentioned various tagging
options/categories that would have to be maintained by the community, despite existing and huge backlogs.
A reasonable argument; but almost everything adds to our backlog anyway.
I would have nothing against additional work if I could see the benefits. But in this case I see some good points and I also see a list of bad points. At best it might be a very tiny improvement that comes along with a huge load of additional work, while other parts could be improved with little extra work and be a true improvement. If we had nothing better to do, then I would say "yes, let's try it". But at the moment it is a plain "no, other things have to come first".
Additionally we are a multi culture project with quite different view
points and which accepts different view points (main difference between Flickr and Co).
This is an argument for an opt-in filter.
Don't confuse opt-in and opt-out if a filter is implemented on an external platform. There is no opt-in or opt-out for Wikipedia as long as WP isn't blocked and the filter is the only access to Wikipedia. <contains some irony>We have the long story that parents want their children to visit Wikipedia without coming across controversial content, which they apparently do every time they search for something entirely unrelated.</contains some irony> In this case an opt-in (to view) filter actually makes sense. Otherwise it doesn't.
We may be confusing opt in/out between us. The filter I would like to see is optional to enable (and then stays enabled) and gives a robust method of customising the level and type of filtering.
The result will be a huge amount of discussions about whether to tag an
image or not.
Not if well designed. And at the moment we have big discussions about whether to include images or not.
We have such discussions. But I'm afraid that most of them do not circle around the benefits of the image for the article, but the latter part that i mentioned above (editorial judgment vs attitude judgment).
Filtering images would resolve most of these issues.
Believe me or believe me not. If we introduce such tagging then the discussions will only be about personal attitude towards an image, ignoring the context and its educational benefits entirely.
We successfully tag images as pornographic, apparently without drama, already. So I find this scenario unlikely.
This leads me to the simple conclusion that it isn't worth the effort,
especially if the filter is advertised to make Wikipedia a safe place for children, while everyone (including children) can disable it at any time.
"Think of the children" is not really an argument I ascribe to. And not
really one other proponents of the filter, by my observation, ascribe to either.
It mostly seems to be brought up by opponents to try and invalidate arguments.
I don't think that we need this argument, since the filter can't replace parents anyway. But it is a constant part of the discussions, with various exaggerated examples that can be seen in bold at Jimmy's talk page even right at this moment. For example:
"Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content."[1]
[1] http://en.wikipedia.org/wiki/User_talk:Jimbo_Wales#UK_law
Separate projects that only focus on one task (providing a whitelisted
view, an automatically updated subset of Wikipedia) would not be a burden for the community or at least for everyone not interested in or against filtering. Additionally it could define its own strict rules and could even hide images and articles entirely depending on its goal.
Please note we define community in significantly different ways. My
"community" includes a minority, us, who edit and maintain the project. And also the vast majority who merely read and use the project.
Our goal as maintainers for this main community should be:
- Maximise the ability of individuals to access content by...
- Minimising the road blocks (social, political, etc.) to accessing
content
A significant portion of the filter discussion is predicated on our internal prejudices and POV - basically navel gazing - with a wide rejection of the idea that a multi-cultural society exists.
A non-WMF filtering project would not be useful to our community due to the chicken/egg seeding problem.
It is a chicken/egg problem. One part of our community (including readers) dislikes tagging/filtering and sees it as (or as the tool for) the creation of road blocks that don't exist at the moment. A second part of our community wants it to be more conservative, in fear that it might be the deciding factor that could create road blocks. I already mentioned this above in the "benefits vs effort" section.
We don't have much data on what our readers want; but a not insignificant portion of them, at least, are concerned with controversial images (nudity, Mohammed, etc.). I fully advocate finding out what the community thinks; but when I raised this issue before it was snorted at with something along the lines of "the readers aren't the driving force here".
*sigh*
Tom
Am 18.06.2012 16:31, schrieb Thomas Morton:
On 18 June 2012 15:16, Tobias Oelgartetobias.oelgarte@googlemail.comwrote:
Any tagging by non-neutral definitions would interfere with the project. It's like creating categories named "bad images", "uninteresting topics" or "not for ethnic minority X".
Of course; but that is predicated on a bad process design. Solution: design an appropriate process.
So far I have not seen any indication of an effort to design an appropriate process. If there is such design work in progress, I would be really interested in what the current ideas look like and whether they are more convincing than the latest proposals (e.g. the referendum), which only touched the surface and ignored many potential issues.
Editorial judgment is based on how to wrap up a topic in a nice way without making one's own judgment about the topic. A hard job to do, but that is the goal.
If I were to write the article "pornography", then I would have to think about what should be mentioned inside this article because it is important, and which parts are not relevant enough or should be put in separate sections to elaborate on them in further detail. This is entirely different from saying "pornography is good or evil" or "this pornographic practice is good or evil and that's why it should be mentioned or excluded".
There is a difference between the relevance of a topic and the attitude toward a topic. The whole image filter idea is based on the latter and not to be confused with editorial judgment.
Pornography articles, as it stands, have a community-implemented "filter" as it is. Which is the tradition that articles are illustrated with graphics, not photographs. So the example is a poor one; because we already have a poor man's filter :)
Similarly the decision "does this image represent hardcore porn, softcore porn, nudity or none of the above" is an editorial one. Bad design process would introduce POV issues - but we are plagued with them anyway. If anything this gives us an opportunity to design and trial a process without those issues (or at least minimising them).
That is already a sad thing, but it does not apply to all language versions. Some only use these illustrations since they are more suitable to illustrate the term or practice, others because of the "community-implemented filter", and it might vary from article to article.
You make me interested in hearing what a good design could look like.
I would have nothing against additional work if I could see the benefits. But in this case I see some good points and I also see a list of bad points. At best it might be a very tiny improvement that comes along with a huge load of additional work, while other parts could be improved with little extra work and be a true improvement. If we had nothing better to do, then I would say "yes, let's try it". But at the moment it is a plain "no, other things have to come first".
Don't confuse opt-in and opt-out if a filter is implemented on an external platform. There is no opt-in or opt-out for Wikipedia as long as WP isn't blocked and the filter is the only access to Wikipedia. <contains some irony>We have the long story that parents want their children to visit Wikipedia without coming across controversial content, which they apparently do every time they search for something entirely unrelated.</contains some irony> In this case an opt-in (to view) filter actually makes sense. Otherwise it doesn't.
We may be confusing opt in/out between us. The filter I would like to see is optional to enable (and then stays enabled) and gives a robust method of customising the level and type of filtering.
While I'm personally not against filtering on a personal level, someone will still have to deal with it (open design question).
We have such discussions. But I'm afraid that most of them do not circle around the benefits of the image for the article, but the latter part that i mentioned above (editorial judgment vs attitude judgment).
Filtering images would resolve most of these issues.
I think it would just reset the borders, but it won't take long until new lines are drawn and the discussions will continue. Now it is "OMG vs WP:NOT CENSORED"; later it will be "OMG vs use the filter". But at the same time we will have new discussions regarding the filter itself (open design question).
Believe me or believe me not. If we introduce such tagging then the discussions will only be about personal attitude towards an image, ignoring the context and its educational benefits entirely.
We successfully tag images as pornographic, apparently without drama, already. So I find this scenario unlikely.
No. We don't tag images _as_ pornographic. We tag them _as related to_ pornography. Just take a look at the category pornography at Commons.
http://commons.wikimedia.org/wiki/Category:Pornography
This applies to terms like violence and other stuff as well.
It is a chicken/egg problem. One part of our community (including readers) dislikes tagging/filtering and sees it as (or the tool for) the creation of road blocks that don't exist at the moment. A second part of our community wants it to be more conservative in fear that it might the deciding factor that could create road blocks. I already mentioned it above in the "benefits vs effort" section.
We don't have much data on what our readers want; but a not insignificant portion of them, at least, are concerned with controversial images (nudity, Mohammed, etc.). I fully advocate finding out what the community thinks; but when I raised this issue before it was snorted at with something along the lines of "the readers aren't the driving force here".
*sigh*
I asked for the same thing and got no response as well. We had the referendum, which had big flaws,[1] but not a single neutral survey directed at the readers, in light of the fact that our community is most likely biased...
[1] explained at length at http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en
On Mon, Jun 18, 2012 at 4:18 PM, Tobias Oelgarte < tobias.oelgarte@googlemail.com> wrote:
Am 18.06.2012 16:31, schrieb Thomas Morton:
We don't have much data on what our readers want; but a not insignificant
portion of them, at least, are concerned with controversial images
(nudity, Mohammed, etc.). I fully advocate finding out what the community thinks; but when I raised this issue before it was snorted at with something along the lines of "the readers aren't the driving force here".
*sigh*
I asked for the same thing and got no response as well. We had the
referendum which had big flaws,[1] but not a single neutral survey directed at the readers under the light that our community is most likely biased...
[1] explained at length at http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en
That's one point we seem to agree on. I asked the same question a year ago – why did nobody survey the reading public, or the donors?
A
On Mon, Jun 18, 2012 at 3:16 PM, Tobias Oelgarte < tobias.oelgarte@googlemail.com> wrote:
Am 18.06.2012 15:06, schrieb Thomas Morton:
I don't think that we need this argument, since the filter can't replace parents anyway. But it is a constant part of the discussions, with various exaggerated examples that can be seen in bold at Jimmy's talk page even right at this moment. For example:
"Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content."[1]
[1] http://en.wikipedia.org/wiki/User_talk:Jimbo_Wales#UK_law
The issue there is that on the one hand, the Foundation's fundraising materials advertise Wikipedia as being God's gift for children, especially underprivileged children, through official fundraiser "stories" like these*:
"Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content."
"Wikipedia has been a wonderful recourse for my children and me to learn new terms, knowledge, and culture background as an immigrant family. It is a safe and trustworthy website for children to do their research."
"Thanks to websites like 'Wikipedia', children of all ages can continue their endeavor in learning."
"We are a family that live in the interior of Brazil in a very poor state. We have opened a learning center and work with local children from nearby villages. Wikipedia is INVALUABLE for this work."
"I worked for a non-profit in India and even the poorest children who were receiving education there knew about Wikipedia and were familiar with the site."
So that's one half of the story. The other half of the story is that the community says the exact opposite: Wikipedia is not for children, but for adults, and only a moron or a bad parent would let their children go on Wikipedia unsupervised. Go figure.
Andreas
On Tue, Jun 19, 2012 at 11:22 AM, Andreas Kolbe jayen466@gmail.com wrote:
On Mon, Jun 18, 2012 at 3:16 PM, Tobias Oelgarte < tobias.oelgarte@googlemail.com> wrote:
Am 18.06.2012 15:06, schrieb Thomas Morton:
I don't think that we need this argument, since the filter can't replace parents anyway. But it is a constant part of the discussions, with various exaggerated examples that can be seen in bold at Jimmy's talk page even right at this moment. For example:
"Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content."[1]
[1] http://en.wikipedia.org/wiki/User_talk:Jimbo_Wales#UK_law
The issue there is that on the one hand, the Foundation's fundraising materials advertise Wikipedia as being God's gift for children, especially underprivileged children, through official fundraiser "stories" like these*:
"Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content."
"Wikipedia has been a wonderful recourse for my children and me to learn new terms, knowledge, and culture background as an immigrant family. It is a safe and trustworthy website for children to do their research."
"Thanks to websites like 'Wikipedia', children of all ages can continue their endeavor in learning."
"We are a family that live in the interior of Brazil in a very poor state. We have opened a learning center and work with local children from nearby villages. Wikipedia is INVALUABLE for this work."
"I worked for a non-profit in India and even the poorest children who were receiving education there knew about Wikipedia and were familiar with the site."
So that's one half of the story. The other half of the story is that the community says the exact opposite: Wikipedia is not for children, but for adults, and only a moron or a bad parent would let their children go on Wikipedia unsupervised. Go figure.
Andreas
Only a moron or a bad parent would let their children go -on the Internet-, unsupervised, Wikipedia or otherwise. Teaching your children to use the Internet responsibly is no different than teaching them to drive-at first, you have them watch you, then you let them start taking the wheel with you watching closely, then as they gain experience, maybe they can take short drives on quiet roads alone, and then on from there. Throwing your kids on the Internet without giving them any idea of what to expect is like handing them the keys when they've never been on the road before.
My oldest kid is kind of in the intermediate stage right now-she can use the Net, but I check in reasonably frequently. As she continues to use it responsibly, the frequency of those checks will drop gradually, until one day she knows how to properly and safely use it with no supervision. My youngest is still at the stage where if she wants to get online, I'm sitting right next to her. My middle one can very briefly go online alone to a few sites I've already agreed to, and I check up on her a lot.
But the whole point is, that's -my- job, not anyone else's, just like it's my job to teach them how to drive, not everyone else's to get the hell off the road before they start to. Why are we figuring this to be any different? The world isn't always safe for children, and it is the job of -parents- to keep children away from areas unsuitable for them, and to alert them to the type of things they might encounter, not the job of everyone else to make sure the whole earth is covered in safety plastic and rubber bumpers.
On Tue, Jun 19, 2012 at 1:52 PM, Todd Allen toddmallen@gmail.com wrote:
My middle one can very briefly go online alone to a few sites I've already agreed to, and I check up on her a lot.
Is Wikipedia one of those few sites?
But the whole point is, that's -my- job, not anyone else's, just like it's my job to teach them how to drive, not everyone else's to get the hell off the road before they start to. Why are we figuring this to be any different?
Well, surely it is different. If you leave your keys in your car with the car running, and my ten year old hops in and takes it for a joyride, you don't think you're partially responsible for what happens?
The world isn't always safe for children, and it is the job of -parents- to keep children away from areas unsuitable for them, and to alert them to the type of things they might encounter, not the job of everyone else to make sure the whole earth is covered in safety plastic and rubber bumpers.
The question, really, is whether or not Wikipedia (or, at least, a cordoned off section of Wikipedia) wants to be one of those safe places.
Personally, all I'm saying is that it would be nice if it did. Some others are saying that, if Wikipedia chooses not to be a place which is safe for children, then Wikipedia shouldn't be marketed to children - that the fundraisers shouldn't advertise Wikipedia as being a project which benefits children. And I think they have a good point.
And actually, I have to nit-pick and say that it isn't *only* my job and "not anyone else's". It's also the job of others who have *chosen* to help me with it. I think that's an important point, because the vast majority of us are *not* saying that Wikipedia *has to* choose to facilitate the creation of an educational resource for children. We're saying you *should* choose to do so.
On Tue, Jun 19, 2012 at 9:23 PM, Anthony wikimail@inbox.org wrote:
On Tue, Jun 19, 2012 at 1:52 PM, Todd Allen toddmallen@gmail.com wrote:
My middle one can very briefly go online alone to a few sites I've already agreed to, and I check up on her a lot.
Is Wikipedia one of those few sites?
Yes, actually, along with several other educational ones, some with children's games, her school website, etc. The chances that she would randomly stumble across a sexual image on Wikipedia are -vanishingly- slim, and quite realistically, if it were to happen, I would much rather it occur in the context of a dispassionate article giving a frank but rather dry account of what it means, than a porn site with flashing banners and descriptions designed to shock, titillate, etc. Her main interest is in dinosaurs, horses, and veterinary medicine, though-not exactly controversial sections of the project.
But the whole point is, that's -my- job, not anyone else's, just like it's my job to teach them how to drive, not everyone else's to get the hell off the road before they start to. Why are we figuring this to be any different?
Well, surely it is different. If you leave your keys in your car with the car running, and my ten year old hops in and takes it for a joyride, you don't think you're partially responsible for what happens?
My ten year old kid isn't stupid enough to do that. If yours is, you failed long before they got in the driver's seat. So no, I wouldn't particularly feel responsible-if your kid is that immature and prone to rash behavior, you shouldn't have let them out of your sight. If mine did that, I would absolutely feel fully responsible for it-ten years is plenty of time that she should know that's an extremely dangerous thing to do.
The world isn't always safe for children, and it is the job of -parents- to keep children away from areas unsuitable for them, and to alert them to the type of things they might encounter, not the job of everyone else to make sure the whole earth is covered in safety plastic and rubber bumpers.
The question, really, is whether or not Wikipedia (or, at least, a cordoned off section of Wikipedia) wants to be one of those safe places.
And like I said, and have seen with my own kids, the vast majority of it is. I would wager that a far higher percentage of Wikipedia is "child-safe" than the percentage of the Internet at large. I have no problem recommending that my kids go read a Wikipedia article on something they're curious about, and then go look at the sources cited in it for more information.
Personally, all I'm saying is that it would be nice if it did. Some others are saying that, if Wikipedia chooses not to be a place which is safe for children, then Wikipedia shouldn't be marketed to children
- that the fundraisers shouldn't advertise Wikipedia as being a
project which benefits children. And I think they have a good point.
If someone wants to make a Kidopedia, with everything nuked out that they consider child-unfriendly, more power to them. They're welcome to host that wherever they like. They could even work at having the project in language aimed more at children, and perhaps making a point to cite children's education sources in articles in addition to newspapers, science journals, etc. This is free content, and someone's absolutely welcome to go and do that.
But that's not -this- project; its aim is to be comprehensive. The two are mutually exclusive, because the real world is not always pleasant or child-safe.
And actually, I have to nit-pick and say that it isn't *only* my job and "not anyone else's". It's also the job of others who have *chosen* to help me with it. I think that's an important point, because the vast majority of us are *not* saying that Wikipedia *has to* choose to facilitate the creation of an educational resource for children. We're saying you *should* choose to do so.
If you hire a babysitter, sure, it becomes their job-they accepted it as such. The same if you have family, etc., who help with your children, as well as teachers and the like who voluntarily assume responsibility for your child while in their care. That's fine. But you shouldn't be able to force total strangers to accept the responsibility of supervising your children because you can't be bothered to do it, and you certainly shouldn't be able to insist that public places be childproofed.
Now if someone wants to take on that Kidopedia project, hey-all they need is a DB dump, a webhost, and the time to nuke out whatever they don't want. Given our categorization system, the time part's probably not even as onerous as it sounds at first. That's the whole point of free content-anyone here yowling that there should be such a thing can go make that thing, any day they want! If no one wants to do it, despite the fact that they don't need anyone's consent at all to get started on it right this minute, guess it's not that big a deal after all, is it? But that's certainly not what I want to see -this- project changed into.
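(Purely to illustrate how little machinery that takes: a rough sketch of a category-based include/exclude check against the standard MediaWiki API might look like the following. The blocked-category keywords are made-up placeholders, not a suggested policy, and a real fork would run something like this over a dump rather than the live API.)

    # Sketch only: decide whether to carry an article over into a filtered
    # fork by checking its categories against the fork owner's block list.
    import requests

    API = "https://en.wikipedia.org/w/api.php"
    BLOCKED_CATEGORY_KEYWORDS = ["pornography", "sexual slang"]   # hypothetical block list

    def article_categories(title):
        """Return the category titles of one article via prop=categories."""
        resp = requests.get(API, params={
            "action": "query", "prop": "categories", "titles": title,
            "cllimit": "max", "format": "json", "formatversion": 2,
        }, headers={"User-Agent": "fork-filter-sketch/0.1 (example)"})
        resp.raise_for_status()
        pages = resp.json()["query"]["pages"]
        return [c["title"] for c in pages[0].get("categories", [])]

    def include_in_fork(title):
        cats = " ".join(article_categories(title)).lower()
        return not any(keyword in cats for keyword in BLOCKED_CATEGORY_KEYWORDS)

    # e.g. include_in_fork("Gel bracelet")  # True or False depending on its categories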
And no, I don't think we should do so. I think we should facilitate the creation of a comprehensive educational resource. What parts of that resource parents will allow their children to look at is up to the parents, right squarely where that decision belongs. Adults, on the other hand, should have the option of looking up any topic they like, and finding it covered frankly and fully, as should any parents who perhaps have children with questions on sex and sexuality, and might like to have a resource that discusses such things in a neutral, educational tone, but still covers the topic frankly and without treating it as disgusting or shameful. That's the project I think we should choose to continue building.
On Wed, Jun 20, 2012 at 1:06 AM, Todd Allen toddmallen@gmail.com wrote:
On Tue, Jun 19, 2012 at 9:23 PM, Anthony wikimail@inbox.org wrote:
On Tue, Jun 19, 2012 at 1:52 PM, Todd Allen toddmallen@gmail.com wrote:
My middle one can very briefly go online alone to a few sites I've already agreed to, and I check up on her a lot.
Is Wikipedia one of those few sites?
Yes, actually, along with several other educational ones, some with children's games, her school website, etc. The chances that she would randomly stumble across a sexual image on Wikipedia are -vanishingly- slim,
Really? How old are we talking about?
And what do you mean "randomly stumble across"? I don't think it would be random. It would be one link leads to another, leads to another, leads to another...
Also, how do you deal with the external links? Do you have any type of blocking software set up, or does your daughter recognize the different shades of blue and know that she's not allowed to click on the blues of a lighter shade without permission?
and quite realistically, if it were to happen, I would much rather it occur in the context of a dispassionate article giving a frank but rather dry account of what it means, than a porn site with flashing banners and descriptions designed to shock, titillate, etc.
Wikipedia is better than a porn site. But "better than a porn site" doesn't mean it's necessarily a place I'd like my child to go to to learn about a sexual topic.
Her main interest is in dinosaurs, horses, and veterinary medicine, though-not exactly controversial sections of the project.
You've never gotten any of the "tough questions"? The ones that I claim, and you don't seem to deny, are not best answered by a Wikipedia article.
But the whole point is, that's -my- job, not anyone else's, just like it's my job to teach them how to drive, not everyone else's to get the hell off the road before they start to. Why are we figuring this to be any different?
Well, surely it is different. If you leave your keys in your car with the car running, and my ten year old hops in and takes it for a joyride, you don't think you're partially responsible for what happens?
My ten year old kid isn't stupid enough to do that. If yours is, you failed long before they got in the driver's seat.
Well, I don't have a ten-year-old kid, let alone one that would hop into a car and go for a joyride. But hypothetically speaking, maybe s/he has a mental disability which is not a failure of mine at all.
So no, I wouldn't particularly feel responsible-if your kid is that immature and prone to rash behavior, you shouldn't have let them out of your sight.
Well, first of all, every parent has to, at some point, let their kid out of their sight (if nothing else, at some point they have to sleep). So, the failure is not necessarily that of the parent. It could be the failure of the baby-sitter, or the failure of the school bus driver, or the action of a kidnapper, or any of a number of other possibilities.
But, in any case, my point is not that the current caregiver of the child is not at fault. My point is that the person who left their car running, unattended, with the doors unlocked, in it is *also* at fault.
The law would certainly agree with me on this. I guess you would disagree with this aspect of the law?
The question, really, is whether or not Wikipedia (or, at least, a cordoned off section of Wikipedia) wants to be one of those safe places.
And like I said, and have seen with my own kids, the vast majority of it is. I would wager that a far higher percentage of Wikipedia is "child-safe" than the percentage of the Internet at large.
Well, yes, if you go by word count or article count. If you go by number of pageviews, I'm not so sure. There are large portions of Wikipedia which are perfectly safe for children, and also completely ignored by almost everyone.
I have no problem recommending that my kids go read a Wikipedia article on something they're curious about, and then go look at the sources cited in it for more information.
So, you'd let them go on the Internet unsupervised.
If someone wants to make a Kidopedia, with everything nuked out that they consider child-unfriendly, more power to them. They're welcome to host that wherever they like. They could even work at having the project in language aimed more at children, and perhaps making a point to cite children's education sources in articles in addition to newspapers, science journals, etc. This is free content, and someone's absolutely welcome to go and do that.
But that's not -this- project, its aim is to be comprehensive.
What exactly do you mean by "this project"? Are you talking about Wikipedia, or about WMF in general?
WMF already does have a Kidopedia of sorts - Wikijunior.
If you hire a babysitter, sure, it becomes their job-they accepted it as such. The same if you have family, etc., who help with your children, as well as teachers and the like who voluntarily assume responsibility for your child while in their care. That's fine. But you shouldn't be able to force total strangers to accept the responsibility of supervising your children because you can't be bothered to do it,
Of course not. This is, in fact, what I've said.
and you certainly shouldn't be able to insist that public places be childproofed.
Insist how? I certainly should be able to verbally insist that public places be childproofed.
As for using force (via government), I'd say that a place which is open to the general public, especially one which markets itself to children, has a duty of care to its visitors. That said, I don't think Wikipedia is breaching this duty of care, as I don't think the harm rises to the level of "foreseeable" which justifies government action.
Now if someone wants to take on that Kidopedia project, hey-all they need is a DB dump, a webhost, and the time to nuke out whatever they don't want.
We've had this discussion, though if you'd like to revisit it we can do that.
Forking Wikipedia is not easy.
Given our categorization system, the time part's probably not even as onerous as it sounds at first.
Choosing which articles to *add* would probably not be that bad (the problem with that route would be technical/legal). Choosing which articles to *delete* would be quite time-consuming if you wanted to get them all.
If no one wants to do it, despite the fact that they don't need anyone's consent at all to get started on it right this minute, guess it's not that big a deal after all, is it?
Again, this has already been discussed. 1) It's not a very easy thing to do; and 2) the license makes it hard to profit off doing it. Because of 2, it would more likely be a non-profit organization that does it. Citizendium, quite infamously, tried. And it failed for many reasons which need not be replicated in the next attempt. But I'm not sure the whole idea is particularly great.
I think you'd have an easier time creating a child-friendly encyclopedia from scratch than by using Wikipedia. Maybe along with some hand copy/pasting from Wikipedia, though this raises tricky issues of how to handle attribution (issues which were partly resolved by that point).
This seems to be the approach of Wikijunior, which they've snuck into the WMF via Wikibooks (as many projects which raise the ire of some Wikipedians have done).
I think we should facilitate the creation of a comprehensive educational resource. What parts of that resource parents will allow their children to look at is up to the parents, right squarely where that decision belongs.
I agree with that. The decision of what parts of Wikipedia parents should allow their children to look at is up to the parents.
I just think there should be tools which help parents implement those decisions. And I think WMF is in the best position to provide them.
In fact, I would argue that providing the information which can be used by these tools would be part of the comprehensiveness of a comprehensive educational resource. A comprehensive educational resource would be an educational resource which can be used by all people, not just adults. And that means different content for different people.
Adults, on the other hand, should have the option of looking up any topic they like, and finding it covered frankly and fully, as should any parents who perhaps have children with questions on sex and sexuality, and might like to have a resource that discusses such things in a neutral, educational tone, but still covers the topic frankly and without treating it as disgusting or shameful. That's the project I think we should choose to continue building.
I definitely disagree on the "neutral" part. In fact, I'd say "neutral" is directly opposed to "educational". When answering questions from children on sex and sexuality, parents should definitely teach their children about right and wrong. In fact, you seem to agree with this yourself, as you suggest that these parents want to teach their children that sex and sexuality is *not* disgusting or shameful. The position that sex and sexuality is not disgusting or shameful is not one that is "neutral".
On Wed, Jun 20, 2012 at 6:06 AM, Todd Allen toddmallen@gmail.com wrote:
Yes, actually, along with several other educational ones, some with children's games, her school website, etc. The chances that she would randomly stumble across a sexual image on Wikipedia are -vanishingly- slim, ...
There is another aspect to this, which is that Wikipedia presently gives undue weight to the weird, bizarre and even the completely made-up. To give an example: every kid will look up the word fuck at some point in their lives. Wikipedia offers, at the bottom of that article, the sexual slang template
http://en.wikipedia.org/wiki/Template:Sexual_slang
with links to (partly illustrated) articles on a whole slew of weird and obscure practices, while missing out many of the slang terms ordinary people actually use in the bedroom. Basically, it's Urban Dictionary written for the lulz, rather than sex education.
Even the article on the humble gel bracelet
http://en.wikipedia.org/wiki/Gel_bracelet
contains more about a sexual urban legend than anything else, and it too comes with a template offering helpful links to Wikipedia's bizarre world of sex.
Larry recently illustrated another way in which kids can come across Wikimedia's wealth of sexual media:
http://www.youtube.com/watch?v=uE4Z9qunAc4
As Seth Finkelstein pointed out the other day, there is opposition to pornography both from the right, on a family values basis, and from the left, from feminists countering male bias. These are quite separate, but equally valid concerns.
It's not for nothing for example that Anita Sarkeesian's article was vandalised with porn. Male-fantasy porn expresses male dominance; in this case, it was used to emphatically reassert that dominance, because Sarkeesian had threatened it. It's as symbolic as the babe calendar on the office wall: it signals that women don't have much to say in that office, and can be greeted with cat calls or put-downs.
I am not against pornography per se. I just wish that if the projects have it, they'd handle it responsibly, the way everybody else does quite naturally. That means with respect for subject privacy, gender issues, child protection issues, and so forth. Just be professional about it and follow best practice.
Andreas
On Wed, Jun 20, 2012 at 1:57 PM, Andreas Kolbe jayen466@gmail.com wrote:
On Wed, Jun 20, 2012 at 6:06 AM, Todd Allen toddmallen@gmail.com wrote:
Yes, actually, along with several other educational ones, some with children's games, her school website, etc. The chances that she would randomly stumble across a sexual image on Wikipedia are -vanishingly- slim, ...
There is another aspect to this, which is that Wikipedia presently gives undue weight to the weird, bizarre and even the completely made-up. To give an example: every kid will look up the word fuck at some point in their lives. Wikipedia offers, at the bottom of that article, the sexual slang template
http://en.wikipedia.org/wiki/Template:Sexual_slang
with links to (partly illustrated) articles on a whole slew of weird and obscure practices, while missing out many of the slang terms ordinary people actually use in the bedroom. Basically, it's urban dictionary, written for the lulz, rather than sex education.
{{sofixit}}, just like any area with NPOV/undue weight issues.
Even the article on the humble gel bracelet
http://en.wikipedia.org/wiki/Gel_bracelet
contains more about a sexual urban legend than anything else, and it too comes with a template offering helpful links to Wikipedia's bizarre world of sex.
It's well known for that. Like it or not, that's the aspect of them that most sources write about. That's not in that case undue weight, it's -due- weight.
Larry recently illustrated another way in which kids can come across Wikimedia's wealth of sexual media:
Good for him. Care to summarize his argument? I don't particularly care to watch his video, or for him in general after the OHNOESVIRGINKILLERIMAGE!!! hysteria a while back.
As Seth Finkelstein pointed out the other day, there is opposition to pornography both from the right, on a family values basis, and from the left, from feminists countering male bias. These are quite separate, but equally valid concerns.
And like anything, we should catalog and report on the debate over the issue in articles about it, accurately summarizing reliable sources with due weight for each position, without as a project actually taking a position ourselves.
It's not for nothing for example that Anita Sarkeesian's article was vandalised with porn. Male-fantasy porn expresses male dominance; in this case, it was used to emphatically reassert that dominance, because Sarkeesian had threatened it. It's as symbolic as the babe calendar on the office wall: it signals that women don't have much to say in that office, and can be greeted with cat calls or put-downs.
Alright, so someone is both a vandal and a jerk. I'm not seeing the relevance in that, to a discussion about having sexual images in articles where they -are- germane and on topic. Could you please clarify that?
I am not against pornography per se. I just wish that if the projects have it, they'd handle it responsibly, the way everybody else does quite naturally. That means with respect for subject privacy, gender issues, child protection issues, and so forth. Just be professional about it and follow best practice.
You are, of course, starting from the presumption that the way you want to do it -is- the "responsible" way, or what have you. I have no problem with developing best practices, and certainly I don't think anyone will argue that we should host or retain porn or near-porn involving kids, but you want a very strict practice. A lot of us disagree with that, and really don't want to treat such images significantly differently from others, so long as they clearly involve adults. I think we could also develop privacy best practices, such that the subject of a photo must either: a) be unidentifiable (or rendered unidentifiable), b) show clear awareness that they are being photographed, or c) give an explicit release. But I don't think c) is necessary if a) or b) is satisfied.
Andreas
On Wed, Jun 20, 2012 at 6:03 PM, Todd Allen toddmallen@gmail.com wrote:
{{sofixit}}, just like any area with NPOV/undue weight issues.
"The next day someone will fix it back." - Douglas Hofstadter
Good for him. Care to summarize his argument? I don't particularly care to watch his video, or for him in general after the OHNOESVIRGINKILLERIMAGE!!! hysteria a while back.
Yeah, it's pretty bad.
You are, of course, starting from the presumption that the way you want to do it -is- the "responsible" way, or what have you.
As opposed to what, assuming that the way we want to do it is the irresponsible way?
If I thought the way I wanted to do something was irresponsible, I wouldn't want to do it that way any more!
I have no problem with developing best practices, and certainly I don't think anyone will argue that we should host or retain porn or near-porn involving kids
Certainly some people will argue this. I believe that, fortunately, most of them are banned, though.
On Wed, Jun 20, 2012 at 4:46 PM, Anthony wikimail@inbox.org wrote:
On Wed, Jun 20, 2012 at 6:03 PM, Todd Allen toddmallen@gmail.com wrote:
{{sofixit}}, just like any area with NPOV/undue weight issues.
"The next day someone will fix it back." - Douglas Hofstadter
Such is the nature of this project. If no one ever did anything because of that possibility, no one would ever do anything at all.
Good for him. Care to summarize his argument? I don't particularly care to watch his video, or for him in general after the OHNOESVIRGINKILLERIMAGE!!! hysteria a while back.
Yeah, it's pretty bad.
You are, of course, starting from the presumption that the way you want to do it -is- the "responsible" way, or what have you.
As opposed to what, assuming that the way we want to do it is the irresponsible way?
If I thought the way I wanted to do something was irresponsible, I wouldn't want to do it that way any more!
I perhaps wasn't clear. My problem was in the phrasing of "We should do this the responsible way," followed by a description of the way Andreas wanted to do it. The trouble is, that's essentially starting from the premise that there's agreement on both sides on what the responsible way -is-, and one side is arguing to be irresponsible. That is, of course, not the case. Rather, many of us believe that it would be irresponsible to implement censorship on an uncensored, comprehensive educational project.
I have no problem with developing best practices, and certainly I don't think anyone will argue that we should host or retain porn or near-porn involving kids
Certainly some people will argue this. I believe that, fortunately, most of them are banned, though.
Uh...wow. One would hope so. I don't believe that's very common, though. Certainly no one I've heard arguing against censorship is in favor of that.
On Thu, Jun 21, 2012 at 10:01 AM, Todd Allen toddmallen@gmail.com wrote:
On Wed, Jun 20, 2012 at 4:46 PM, Anthony wikimail@inbox.org wrote:
On Wed, Jun 20, 2012 at 6:03 PM, Todd Allen toddmallen@gmail.com wrote:
{{sofixit}}, just like any area with NPOV/undue weight issues.
"The next day someone will fix it back." - Douglas Hofstadter
Such is the nature of this project. If no one ever did anything because of that possibility, no one would ever do anything at all.
Well, it's not just that it's possible, it's that I judge the probability to be very high.
Rather, many of us believe that it would be irresponsible to implement censorship on an uncensored, comprehensive educational project.
I have no problem with developing best practices, and certainly I don't think anyone will argue that we should host or retain porn or near-porn involving kids
Certainly some people will argue this. I believe that, fortunately, most of them are banned, though.
Uh...wow. One would hope so. I don't believe that's very common, though. Certainly no one I've heard arguing against censorship is in favor of that.
But a policy against porn or near-porn involving kids *is* censorship, is it not?
On Thu, Jun 21, 2012 at 10:18 AM, Anthony wikimail@inbox.org wrote:
On Thu, Jun 21, 2012 at 10:01 AM, Todd Allen toddmallen@gmail.com wrote:
On Wed, Jun 20, 2012 at 4:46 PM, Anthony wikimail@inbox.org wrote:
On Wed, Jun 20, 2012 at 6:03 PM, Todd Allen toddmallen@gmail.com wrote:
{{sofixit}}, just like any area with NPOV/undue weight issues.
"The next day someone will fix it back." - Douglas Hofstadter
Such is the nature of this project. If no one ever did anything because of that possibility, no one would ever do anything at all.
Well, it's not just that it's possible, it's that I judge the probability to be very high.
Then, if your proposed change is opposed by a significant number of people, it would tend to indicate it has not gained consensus. That, too, is the nature of the beast, when working on a project like this. I think we've all had an idea we strongly believe to be right fail to gain the consensus that would be needed to implement it.
Rather, many of us believe that it would be irresponsible to implement censorship on an uncensored, comprehensive educational project.
I have no problem with developing best practices, and certainly I don't think anyone will argue that we should host or retain porn or near-porn involving kids
Certainly some people will argue this. I believe that, fortunately, most of them are banned, though.
Uh...wow. One would hope so. I don't believe that's very common, though. Certainly no one I've heard arguing against censorship is in favor of that.
But a policy against porn or near-porn involving kids *is* censorship, is it not?
I suppose in the most technical sense it is, but that's a question of very settled and tested law, unlike 2257. That's more like forbidding copyvios--copyright law, while complex, is fairly stable and well tested. In a very technical sense, forbidding penis vandalism is censorship, but I think most of us know the difference. Putting a picture of a penis on the article about a political candidate or sports team is unacceptable, putting a picture of a penis on the "Penis" article is much more likely to be done in good faith.
On Thu, Jun 21, 2012 at 12:28 PM, Todd Allen toddmallen@gmail.com wrote:
On Thu, Jun 21, 2012 at 10:18 AM, Anthony wikimail@inbox.org wrote:
On Thu, Jun 21, 2012 at 10:01 AM, Todd Allen toddmallen@gmail.com wrote:
On Wed, Jun 20, 2012 at 4:46 PM, Anthony wikimail@inbox.org wrote:
On Wed, Jun 20, 2012 at 6:03 PM, Todd Allen toddmallen@gmail.com wrote:
{{sofixit}}, just like any area with NPOV/undue weight issues.
"The next day someone will fix it back." - Douglas Hofstadter
Such is the nature of this project. If no one ever did anything because of that possibility, no one would ever do anything at all.
Well, it's not just that it's possible, it's that I judge the probability to be very high.
Then, if your proposed change is opposed by a significant number of people, it would tend to indicate it has not gained consensus.
Heh. Sorry, I have to laugh any time I hear a...person heavily versed in Wikipedia-speak...use the word consensus.
That, too, is the nature of the beast, when working on a project like this. I think we've all had an idea we strongly believe to be right fail to gain the consensus that would be needed to implement it.
Certainly. And when this happens, sometimes we write about it, and then someone says "so fix it", and we say "the next day someone will fix it back".
You seem to be making the assumption that Wikipedia's notion of "consensus" is the proper way to write an encyclopedia. I by no means am accepting that assumption.
But a policy against porn or near-porn involving kids *is* censorship, is it not?
I suppose in the most technical sense it is, but that's a question of very settled and tested law, unlike 2257.
So, the only reason kiddie porn isn't allowed (*) is that it's illegal?
(*) Notwithstanding Virgin Killer, and perhaps a few other examples, anyway.
In a very technical sense, forbidding penis vandalism is censorship, but I think most of us know the difference. Putting a picture of a penis on the article about a political candidate or sports team is unacceptable, putting a picture of a penis on the "Penis" article is much more likely to be done in good faith.
What if it's a picture of the penis of the political candidate?
You seem to think there's a clear line to be drawn that everyone agrees upon. But clearly there isn't. Some people think the line should be drawn in one place, and some people think it should be drawn in another.
On Thu, Jun 21, 2012 at 10:55 AM, Anthony wikimail@inbox.org wrote:
On Thu, Jun 21, 2012 at 12:28 PM, Todd Allen toddmallen@gmail.com wrote:
On Thu, Jun 21, 2012 at 10:18 AM, Anthony wikimail@inbox.org wrote:
On Thu, Jun 21, 2012 at 10:01 AM, Todd Allen toddmallen@gmail.com wrote:
On Wed, Jun 20, 2012 at 4:46 PM, Anthony wikimail@inbox.org wrote:
On Wed, Jun 20, 2012 at 6:03 PM, Todd Allen toddmallen@gmail.com wrote:
{{sofixit}}, just like any area with NPOV/undue weight issues.
"The next day someone will fix it back." - Douglas Hofstadter
Such is the nature of this project. If no one ever did anything because of that possibility, no one would ever do anything at all.
Well, it's not just that it's possible, it's that I judge the probability to be very high.
Then, if your proposed change is opposed by a significant number of people, it would tend to indicate it has not gained consensus.
Heh. Sorry, I have to laugh any time I hear a...person heavily versed in Wikipedia-speak...use the word consensus.
That's the way the project works. You or I can love it, or hate it, or rail against it, but that's the reality. If you'd like to propose a different mechanism, you can. But I think that the consensus mechanism, for all its faults, has produced a very remarkable end product.
Any system we use is going to be imperfect. Perhaps consensus is the least imperfect one.
That, too, is the nature of the beast, when working on a project like this. I think we've all had an idea we strongly believe to be right fail to gain the consensus that would be needed to implement it.
Certainly. And when this happens, sometimes we write about it, and then someone says "so fix it", and we say "the next day someone will fix it back".
You seem to be making the assumption that Wikipedia's notion of "consensus" is the proper way to write an encyclopedia. I by no means am accepting that assumption.
What would you propose as a superior mechanism, then? That's not a rhetorical or sarcastic question; maybe we could do better. But you haven't said how.
But a policy against porn or near-porn involving kids *is* censorship, is it not?
I suppose in the most technical sense it is, but that's a question of very settled and tested law, unlike 2257.
So, the only reason kiddie porn isn't allowed (*) is that it's illegal?
(*) Notwithstanding Virgin Killer, and perhaps a few other examples, anyway.
Child porn is illegal, that's been upheld by the Supreme Court repeatedly, end of discussion. If 2257 were similarly upheld to apply even in circumstances of educational/artistic work, I suppose we'd similarly have to follow it like it or not, but it is untested in such areas, and I suspect the SC would find it massively overbroad, especially as it relates to subjects not identifiable at all.
But even in a hypothetical (and highly unlikely) world where child porn was legal, a privacy issue exists there that does not exist in adult nude or sexual images, since children are incapable of giving real consent to participation in such a thing due to lack of maturity, whereas adults can and often do give informed consent to participation in photographed or filmed nudity or sexuality. I think that, too, would allow us to draw a distinction between sexual images of children and those of adults, since those of a child would be -by definition- taken without the subject's genuine consent.
In a very technical sense, forbidding penis vandalism is censorship, but I think most of us know the difference. Putting a picture of a penis on the article about a political candidate or sports team is unacceptable, putting a picture of a penis on the "Penis" article is much more likely to be done in good faith.
What if it's a picture of the penis of the political candidate?
I can -conceive- of a case where that would be appropriate, such as if the candidate were a member of a hypothetical "Porn Party" and freely released such an image, and that release resulted in substantial source coverage of that particular image. In that case, we of course should show it, since the article will have a section with reliably sourced commentary on it. But since no such thing really exists, such an image would be of little to no relevance to the article. In that case, we're not disallowing it because it's a penis, we're disallowing it because it's irrelevant. But if somehow it were extremely relevant to the article, I'd see no problem including it. In every case I know of, though, a candidate penis photo would be just as irrelevant as a macro photo of a few hairs on the candidate's head.
You seem to think there's a clear line to be drawn that everyone agrees upon. But clearly there isn't. Some people think the line should be drawn in one place, and some people think it should be drawn in another.
That goes back up to the above. When disagreement happens, we discuss it and come to consensus, if we can. If no consensus can be reached for an exception in a particular circumstance, standing policy (in this case, NOTCENSORED) serves as a fallback/baseline, and we go with that. Did you have another suggestion for a better process?
On Thu, Jun 21, 2012 at 1:20 PM, Todd Allen toddmallen@gmail.com wrote:
On Thu, Jun 21, 2012 at 10:55 AM, Anthony wikimail@inbox.org wrote:
Heh. Sorry, I have to laugh any time I hear a...person heavily versed in Wikipedia-speak...use the word consensus.
That's the way the project works. You or I can love it, or hate it, or rail against it, but that's the reality.
Sometimes I, and sometimes others, are going to write about the results of it, okay?
So, the only reason kiddie porn isn't allowed (*) is that it's illegal?
Child porn is illegal, that's been upheld by the Supreme Court repeatedly, end of discussion.
Well, moreover, it's illegal almost everywhere. So yeah, putting it on Wikipedia wouldn't be pragmatic.
But I'm just wondering if there's a principled reason for the ban in addition to the pragmatic one.
But even in a hypothetical (and highly unlikely) world where child porn was legal, a privacy issue exists there that does not exist in adult nude or sexual images, since children are incapable of giving real consent to participation in such a thing due to lack of maturity, whereas adults can and often do give informed consent to participation in photographed or filmed nudity or sexuality. I think that, too, would allow us to draw a distinction between sexual images of children and those of adults, since those of a child would be -by definition- taken without the subject's genuine consent.
Many images on Wikipedia have been taken without the subject's genuine consent. So surely that isn't the issue.
What if it's a picture of the penis of the political candidate?
I can -conceive- of a case where that would be appropriate
So, commons is fine, I guess.
In every case I know of, though, a candidate penis photo would be just as irrelevant as a macro photo of a few hairs on the candidate's head.
Convent pornography, cock and ball torture, and hogtie bondage, though. These are things that are relevant.
Or is it okay if, instead of putting the penis picture on [[Candidate Whatever]], we put it in [[Candidate Whatever's Penis]]?
You seem to think there's a clear line to be drawn that everyone agrees upon. But clearly there isn't. Some people think the line should be drawn in one place, and some people think it should be drawn in another.
That goes back up to the above. When disagreement happens, we discuss it and come to consensus, if we can.
And what is "consensus"?
If no consensus can be reached for an exception in a particular circumstance, standing policy (in this case, NOTCENSORED) serves as a fallback/baseline, and we go with that.
So, things are included (under NOTCENSORED), unless there is consensus to not include it?
Did you have another suggestion for a better process?
Yes, but first let me get a complete description of the current process (starting with answers to the above questions).
On Thu, Jun 21, 2012 at 1:46 PM, Anthony wikimail@inbox.org wrote:
Many images on Wikipedia have been taken without the subject's genuine consent. So surely that isn't the issue.
In case you need an example, http://en.wikipedia.org/wiki/File:LeonardGSiffleet.jpg
On Thu, Jun 21, 2012 at 6:46 PM, Anthony wikimail@inbox.org wrote:
Many images on Wikipedia have been taken without the subject's genuine consent. So surely that isn't the issue.
Many are transferred to Commons from Flickr without the uploader's consent which, in the case of sexually explicit photos taken in a private location, should always be sought before doing the transfer.
Unfortunately, that's another rule more honoured in the breach than in the observance on Commons. (Note that even if the image doesn't show a face, the Commons page always includes a link to the person's Flickr stream, thus identifying them.)
Incidentally, a Commons copyright specialist is currently being banned for nominating admins' copyright violations for deletion, even though the vast majority of his deletions have always turned out to be correct ... the administrators are feeling "harassed" by having their copyright violations nominated and say he's doing it because he doesn't like them, and that it's bad for community relations.
http://commons.wikimedia.org/wiki/Commons:Administrators%27_noticeboard/User...
You couldn't make this stuff up. Not unless you were William Golding, that is.
2012/6/21 Andreas Kolbe jayen466@gmail.com:
Incidentally, a Commons copyright specialist is currently being banned for
"copyright specialist"? Is this supposed to be a joke? A 4th degree sargasm? An alien way of defining a "specialist"? Or anything else?
Yann
(cut nonsense rhetoric about the PK affair).
On Thu, Jun 21, 2012 at 2:10 PM, Andreas Kolbe jayen466@gmail.com wrote:
On Thu, Jun 21, 2012 at 6:46 PM, Anthony wikimail@inbox.org wrote:
Many images on Wikipedia have been taken without the subject's genuine consent. So surely that isn't the issue.
Many are transferred to Commons from Flickr without the uploader's consent which, in the case of sexually explicit photos taken in a private location, should always be sought before doing the transfer.
Well, first of all, why?
Secondly, I'm not talking just about sexually explicit photos. Wikipedia has photos of people being or about to be [[behead]]ed, [[torture]]d, [[kidnap]]ped, [[assassination]]ed, etc. I checked, and there's no photograph of someone being [[rape]]d, just paintings, but it's probably just a matter of time.
On Thu, Jun 21, 2012 at 2:22 PM, Anthony wikimail@inbox.org wrote:
Secondly, I'm not talking just about sexually explicit photos. Wikipedia has photos of people being or about to be [[behead]]ed, [[torture]]d, [[kidnap]]ped, [[assassination]]ed, etc. I checked, and there's no photograph of someone being [[rape]]d, just paintings, but it's probably just a matter of time.
No photo on the [[child abuse]] article either. Is this for pragmatic reasons (no free photo available), or reasons of principle?
If someone added a photo of child abuse on the [[child abuse]] article, and if it did not have any copyright issues, would it be kept unless there was a "consensus" to delete it?
On Thu, Jun 21, 2012 at 7:22 PM, Anthony wikimail@inbox.org wrote:
Well, first of all, why?
Secondly, I'm not talking just about sexually explicit photos. Wikipedia has photos of people being or about to be [[behead]]ed, [[torture]]d, [[kidnap]]ped, [[assassination]]ed, etc. I checked, and there's no photograph of someone being [[rape]]d, just paintings, but it's probably just a matter of time.
Well, Todd has certainly said on-wiki in the past that he would not see a problem in Wikipedia using a video of rape to illustrate an article on the topic, provided it were appropriately licensed and did not raise privacy concerns (for example if the persons shown were no longer alive). He and I have discussed this at length before, together with Jimbo, but I don't think either of us has been able to change the other's mind. :)
Many Wikipedians generally argue that because Wikipedia is not censored, it should always be appropriate to show an image or video of what the article is about. According to this reasoning, an ideal article about rape would show a video of rape. An article on suicide would have embedded videos of people killing themselves. An article on marriage would show a video of a marriage's consummation. An article on fatal car accidents would show a video of a fatal car crash. An article on Russian roulette would show someone playing it. And so forth.
This argument is not motivated by a desire to educate, or by educational competence for that matter.
On Thu, Jun 21, 2012 at 3:38 PM, Andreas Kolbe jayen466@gmail.com wrote:
Well, Todd has certainly said on-wiki in the past that he would not see a problem in Wikipedia using a video of rape to illustrate an article on the topic, provided it were appropriately licensed and did not raise privacy concerns (for example if the persons shown were no longer alive).
So would the same argument apply to child porn, if the child is dead, and if it weren't illegal?
The current situation seems to be that photos of child abuse are legal (and are allowed on Wikipedia), and photos of sexual abuse are legal (and are allowed on Wikipedia), but photos of child sexual abuse are illegal (and aren't on Wikipedia except for a few disputed cases).
On 21.06.2012 22:24, Anthony wrote:
On Thu, Jun 21, 2012 at 3:38 PM, Andreas Kolbe jayen466@gmail.com wrote:
Well, Todd has certainly said on-wiki in the past that he would not see a problem in Wikipedia using a video of rape to illustrate an article on the topic, provided it were appropriately licensed and did not raise privacy concerns (for example if the persons shown were no longer alive).
So would the same argument apply to child porn, if the child is dead, and if it weren't illegal?
The current situation seems to be that photos of child abuse are legal (and are allowed on Wikipedia), and photos of sexual abuse are legal (and are allowed on Wikipedia), but photos of child sexual abuse are illegal (and aren't on Wikipedia except for a few disputed cases).
Can you point me to any examples of real "child abuse", "sexual abuse" or of "child sexual abuse"?
On Thu, Jun 21, 2012 at 4:44 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
Can you point me to any examples of real "child abuse", "sexual abuse" or of "child sexual abuse"?
On Wikipedia? On Commons? Anywhere?
For "child sexual abuse", I was referring mainly to the Virgin Killer image (and as I said, whether or not the image constitutes this is disputed).
For "child abuse", see http://en.wikipedia.org/wiki/File:Erichsen_Abused_San_or_Nama_child_prisoner...
For "sexual abuse", a simple search came up with http://en.wikipedia.org/wiki/File:AG-10.jpg (which isn't on the English Wikipedia except through image search, but is on other language Wikipedias.
On 21.06.2012 22:51, Anthony wrote:
On Thu, Jun 21, 2012 at 4:44 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
Can you point me to any examples of real "child abuse", "sexual abuse" or of "child sexual abuse"?
On Wikipedia? On Commons? Anywhere?
Do I really need to answer this question, depending on where we discuss?
For "child sexual abuse", I was referring mainly to the Virgin Killer image (and as I said, whether or not the image constitutes this is disputed).
You call the Virgin Killer image "child sexual abuse"? Truly?
For "child abuse", see http://en.wikipedia.org/wiki/File:Erichsen_Abused_San_or_Nama_child_prisoner...
I don't see any problem with this image. It documents child abuse as a fact without advocating it.
For "sexual abuse", a simple search came up with http://en.wikipedia.org/wiki/File:AG-10.jpg (which isn't on the English Wikipedia except through image search, but is on other language Wikipedias.
I would be truly shocked if that image or another version of it isn't used.
Are those examples of images you find shocking, or that should not be shown on Wikipedia or hosted on Commons?
On Thu, Jun 21, 2012 at 5:48 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 21.06.2012 22:51, Anthony wrote:
On Thu, Jun 21, 2012 at 4:44 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
Can you point me to any examples of real "child abuse", "sexual abuse" or of "child sexual abuse"?
On Wikipedia? On Commons? Anywhere?
Do I really need to answer this question, depending on where we discuss?
Well, I still don't know the answer.
For "child sexual abuse", I was referring mainly to the Virgin Killer image (and as I said, whether or not the image constitutes this is disputed).
You call the Virgin Killer image "child sexual abuse"? Truly?
It depicts an instance of child sexual abuse, yes.
For "child abuse", see
http://en.wikipedia.org/wiki/File:Erichsen_Abused_San_or_Nama_child_prisoner...
I don't see any problem with this image. It documents child abuse as a fact without advocating it.
Okay, I don't understand your request.
I thought you wanted me to give you examples of these images.
For "sexual abuse", a simple search came up with http://en.wikipedia.org/wiki/File:AG-10.jpg (which isn't on the English Wikipedia except through image search, but is on other language Wikipedias.
I would be truly shocked if that image or another version of it isn't used.
"No pages on the English Wikipedia link to this file."
7 other Wikipedias do use the image
(Ah, going to [[Abu Ghraib torture and prisoner abuse]] I see why. The English Wikipedia is using the version of the image without the blur/censorship.)
Are those examples of images you find shocking, or that should not be shown on Wikipedia or hosted on Commons?
I was responding to your request to point you to examples.
On 22.06.2012 00:02, Anthony wrote:
On Thu, Jun 21, 2012 at 5:48 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 21.06.2012 22:51, Anthony wrote:
On Thu, Jun 21, 2012 at 4:44 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
Can you point me to any examples of real "child abuse", "sexual abuse" or of "child sexual abuse"?
On Wikipedia? On Commons? Anywhere?
Do I really need to answer this question, depending on where we discuss?
Well, I still don't know the answer.
Of course Wikimedia related...
For "child sexual abuse", I was referring mainly to the Virgin Killer image (and as I said, whether or not the image constitutes this is disputed).
You call the Virgin Killer image "child sexual abuse"? Truly?
It depicts an instance of child sexual abuse, yes.
I see a child, but I don't see sexual abuse. So I can't agree with you that it is an instance of child sexual abuse.
Are those examples of images you find shocking, or that should not be shown on Wikipedia or hosted on Commons?
I was responding to your request to point you to examples.
I should have written this question: Can you point me to examples of any of the previously mentioned abuses on Commons or Wikipedia that have no justification to be there?
On Thu, Jun 21, 2012 at 6:31 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
I see a child, but I don't see sexual abuse. So I can't agree with you that it is an instance of child sexual abuse.
As I said, it is disputed.
I should have written this question: Can you point me to examples of any of the previously mentioned abuses on Commons or Wikipedia that have no justification to be there?
I have no idea what the justification is for any particular image. My point was that Wikipedia contains plenty of images which were "taken without the subject's genuine consent".
I am not the one who said that Wikipedia may not contain images which were "taken without the subject's genuine consent". That was brought up by Todd Allen, and my purpose in showing that the images were in Wikipedia was to show that this is *not* a valid criterion.
I do indeed think that the proper criterion, at least for the adult version of Wikipedia, is whether or not the image is justified.
On 21 June 2012 20:38, Andreas Kolbe jayen466@gmail.com wrote:
Well, Todd has certainly said on-wiki in the past that he would not see a problem in Wikipedia using a video of rape to illustrate an article on the topic, provided it were appropriately licensed and did not raise privacy concerns (for example if the persons shown were no longer alive). He and I
You've already been caught once today making a highly distorted claim in this thread, so if you're going to make a claim like this you really need to supply the diffs.
- d.
On Thu, Jun 21, 2012 at 1:38 PM, Andreas Kolbe jayen466@gmail.com wrote:
On Thu, Jun 21, 2012 at 7:22 PM, Anthony wikimail@inbox.org wrote:
Well, first of all, why?
Secondly, I'm not talking just about sexually explicit photos. Wikipedia has photos of people being or about to be [[behead]]ed, [[torture]]d, [[kidnap]]ped, [[assassination]]ed, etc. I checked, and there's no photograph of someone being [[rape]]d, just paintings, but it's probably just a matter of time.
Well, Todd has certainly said on-wiki in the past that he would not see a problem in Wikipedia using a video of rape to illustrate an article on the topic, provided it were appropriately licensed and did not raise privacy concerns (for example if the persons shown were no longer alive). He and I have discussed this at length before, together with Jimbo, but I don't think either of us has been able to change the other's mind. :)
That was a highly theoretical scenario (and one you brought up for that reason, as I recall.) But in practice, we do have photos of victims at articles such as [[Rape of Nanking]] and [[Holocaust]]. Some of those photos are extremely disturbing. That's because the articles are about extremely disturbing subjects.
Many Wikipedians generally argue that because Wikipedia is not censored, it should always be appropriate to show an image or video of what the article is about. According to this reasoning, an ideal article about rape would show a video of rape.
It currently does. In this case, they're paintings rather than photos, but they certainly and graphically show the subject matter at hand.
An article on suicide would have embedded videos of people killing themselves.
For such a broad topic, I think we might want more general illustrations. But if we really did have such an image, of appropriate license and high quality, I could see considering it.
An article on marriage would show a video of a marriage's consummation.
No, it wouldn't. The consummation of a marriage is tangentially relevant. Photos of weddings and married couples in various cultures would be much more relevant. The meaning of "consummation" should be briefly touched on, but would not need anywhere near enough detail to be an illustrated section.
An article on fatal car accidents would show a video of a fatal car crash.
[[Vehicle accident]] currently includes photos of the aftermath of several car crashes, including a couple that look likely to have been fatal. If we had appropriately licensed video of a vehicle accident occurring, why on earth wouldn't we use it there?
An article on Russian roulette would show someone playing it. And so forth.
Given that it's illegal in many areas, I would not hold out a high likelihood of us seeing someone voluntarily release a video of it. But let us presume that someone did. Isn't that exactly what the article is about?
This argument is not motivated by a desire to educate, or by educational competence for that matter.
Andreas, I realize we disagree on this in a lot of ways, but I think anyone who works on this project has a desire to educate. I think we can discuss this without questioning one another's motives or calling people incompetent.
On Thu, Jun 21, 2012 at 5:10 PM, Todd Allen toddmallen@gmail.com wrote:
But in practice, we do have photos of victims at articles such as [[Rape of Nanking]] and [[Holocaust]]. Some of those photos are extremely disturbing. That's because the articles are about extremely disturbing subjects.
So legal + no "consensus" to delete = keep.
Fortunately "consensus" doesn't mean consensus. Unfortunately, it means something closer to mob rule.
An article on marriage would show a video of a marriage's consummation.
No, it wouldn't. The consummation of a marriage is tangentially relevant. Photos of weddings and married couples in various cultures would be much more relevant. The meaning of "consummation" should be briefly touched on, but would not need anywhere near enough detail to be an illustrated section.
Why not? The consummation of a marriage certainly deserves a section in an adult version of an encyclopedia article on marriage. I don't think there should be a photograph of a consummation in Wikipedia, but then I don't think there should be a photograph of a rape in Wikipedia either, even in an adult version. (For one thing, neither illustration would do anything to enhance one's knowledge of the topic.)
But what if some people want a photo and some don't? No "consensus", so we leave the photo in, right?
An article on Russian roulette would show someone playing it. And so forth.
Given that it's illegal in many areas, I would not hold out a high likelihood of us seeing someone voluntarily release a video of it. But let us presume that someone did. Isn't that exactly what the article is about?
Yes, it's exactly what the article is about.
But the article being about something does not mean there should be a video of it.
Again, I don't see what a video adds to one's understanding of the topic.
This argument is not motivated by a desire to educate, or by educational competence for that matter.
Andreas, I realize we disagree on this in a lot of ways, but I think anyone who works on this project has a desire to educate.
Well, no, not everyone who works on "this project" does. But the personal attack on you was inappropriate.
On Thu, Jun 21, 2012 at 10:10 PM, Todd Allen toddmallen@gmail.com wrote:
That was a highly theoretical scenario (and one you brought up for that reason, as I recall.) But in practice, we do have photos of victims at articles such as [[Rape of Nanking]] and [[Holocaust]]. Some of those photos are extremely disturbing. That's because the articles are about extremely disturbing subjects.
Those photos are fine, and are found in reliable sources.
Many Wikipedians generally argue that because Wikipedia is not censored, it should always be appropriate to show an image or video of what the article is about. According to this reasoning, an ideal article about rape would show a video of rape.
It currently does. In this case, they're paintings rather than photos, but they certainly and graphically show the subject matter at hand.
They do not. They do not even show a disrobed male. They are a far cry from the alternative we're discussing – and good job too.
An article on suicide would have embedded videos of people killing themselves.
For such a broad topic, I think we might want more general illustrations. But if we really did have such an image, of appropriate license and high quality, I could see considering it.
I know you could. :) Again, unprecedented in educational sources, and for good reason. Try finding a publisher who will let you edit a book on suicide for them with that editorial approach.
An article on marriage would show a video of a marriage's consummation.
No, it wouldn't. The consummation of a marriage is tangentially relevant. Photos of weddings and married couples in various cultures would be much more relevant. The meaning of "consummation" should be briefly touched on, but would not need anywhere near enough detail to be an illustrated section.
The consummation of a marriage is tangentially relevant? *Tangentially?*
An article on fatal car accidents would show a video of a fatal car crash.
[[Vehicle accident]] currently includes photos of the aftermath of several car crashes, including a couple that look likely to have been fatal. If we had appropriately licensed video of a vehicle accident occurring, why on earth wouldn't we use it there?
A number of reasons, one of them reader psychology. A normal human being would react with shock, concern and compassion for the people whose deaths they just witnessed, and would probably be put out of the mood to read the article. Websites put together by competent educators don't feature such videos. I realise that what educational sources put together by qualified experts do is irrelevant to the average unqualified Wikipedian.
An article on Russian roulette would show someone playing it. And so forth.
Given that it's illegal in many areas, I would not hold out a high likelihood of us seeing someone voluntarily release a video of it. But let us presume that someone did. Isn't that exactly what the article is about?
Sigh. I think this is roughly where we stopped two years ago. :)
This argument is not motivated by a desire to educate, or by educational competence for that matter.
Andreas, I realize we disagree on this in a lot of ways, but I think anyone who works on this project has a desire to educate. I think we can discuss this without questioning one another's motives or calling people incompetent.
On Thu, Jun 21, 2012 at 5:43 PM, Andreas Kolbe jayen466@gmail.com wrote:
On Thu, Jun 21, 2012 at 10:10 PM, Todd Allen toddmallen@gmail.com wrote:
Those photos are fine, and are found in reliable sources.
Alright, so we at least found a starting point we can agree on. I'll say that's something.
They do not. They do not even show a disrobed male. They are a far cry from the alternative we're discussing – and good job too.
I think they serve the purpose. I imagine in many cases, it would be possible to do it like that, especially in articles on very general topics.
I know you could. :) Again, unprecedented in educational sources, and for good reason. Try finding a publisher who will let you edit a book on suicide for them with that editorial approach.
Books are very often image-light, given the publishing costs. Wikipedia is not a book.
The consummation of a marriage is tangentially relevant? *Tangentially?*
In terms of a full article on marriage, to take up a slot for an image, when we can generally only fit 10 or so images into even a long article? No, it would not make that cut. Again, photos of weddings, married couples, etc., in different cultures, would be far more instructive than a photo of consummation (especially given that marriage consummation doesn't really look visually any different than sex any other time). You're trying very hard to set up that straw man, but it is a straw man.
A number of reasons, one of them reader psychology. A normal human being would react with shock, concern and compassion for the people whose deaths they just witnessed, and would probably be put out of the mood to read the article.
Wait wait wait. Above, we discussed the war atrocity photos, and you were perfectly fine with them. Do you not think such photos would cause people to react with shock, concern, and compassion? They do for me! But they appropriately illustrate the topic. Difficult subjects may have difficult images accompanying them, but what would one really expect to find?
And in this case, driver's education classes, literature on the topic, and so on, routinely show photos of drunk driving crashes. Driver safety material often shows photos of crashes. So you can't even use the "that's not common practice" argument there. If we follow what you assert to be common practice above, why wouldn't we follow it here?
Websites put together by competent educators don't feature such videos. I realise that what educational sources put together by qualified experts do is irrelevant to the average unqualified Wikipedian.
Seems this bunch of incompetent, untrained fools has put together one of the most astonishing, comprehensive, and widely utilized educational tools in the history of the world. Maybe it was time for a little unconventional thinking. Wikipedia was not built by deferring blindly to "experts," and I wouldn't say it's turned out too badly so far.
Sigh. I think this is roughly where we stopped two years ago. :)
2012/6/21 Andreas Kolbe jayen466@gmail.com:
Incidentally, a Commons copyright specialist is currently being banned for nominating admins' copyright violations for deletion, even though the vast majority of his deletions have always turned out to be correct ... the administrators are feeling "harassed" by having their copyright violations nominated and say he's doing it because he doesn't like them, and that it's bad for community relations.
http://commons.wikimedia.org/wiki/Commons:Administrators%27_noticeboard/User...
You couldn't make this stuff up. Not unless you were William Golding, that is.
When the argument strays so far from anything even remotely connected to reality, I understand that it gets difficult to reach agreement about practical and down-to-earth issues, like nudity images.
Yann
On Thu, Jun 21, 2012 at 12:10 PM, Andreas Kolbe jayen466@gmail.com wrote:
On Thu, Jun 21, 2012 at 6:46 PM, Anthony wikimail@inbox.org wrote:
Many images on Wikipedia have been taken without the subject's genuine consent. So surely that isn't the issue.
Many are transferred to Commons from Flickr without the uploader's consent which, in the case of sexually explicit photos taken in a private location, should always be sought before doing the transfer.
Unfortunately, that's another rule more honoured in the breach than in the observance on Commons. (Note that even if the image doesn't show a face, the Commons page always includes a link to the person's Flickr stream, thus identifying them.)
Incidentally, a Commons copyright specialist is currently being banned for nominating admins' copyright violations for deletion, even though the vast majority of his deletions have always turned out to be correct ... the administrators are feeling "harassed" by having their copyright violations nominated and say he's doing it because he doesn't like them, and that it's bad for community relations.
http://commons.wikimedia.org/wiki/Commons:Administrators%27_noticeboard/User...
You couldn't make this stuff up. Not unless you were William Golding, that is.
This thread isn't about copyvios, and I don't want to get too far afield, but I think it does kind of show the thought process here sometimes. From my read of the discussions with that editor, as well as the incident discussion you linked, he is being blocked not for the deletion nominations themselves, but for making them disruptively, both by targeting editors he disagrees with and by being abusive during the process. As a parallel on Wikipedia, if someone has a disagreement with another editor, and proceeds to nominate 10 of their articles for deletion with the deletion rationale "Delete this crap by that moron", that person could be sanctioned even if all 10 articles really -do- need to be deleted. I don't know if that's really the case, nor do I feel like reviewing his contributions in enough detail to find out, but the block discussion is absolutely -not- talking about what you said it was.
On Thu, Jun 21, 2012 at 7:50 PM, Todd Allen toddmallen@gmail.com wrote:
This thread isn't about copyvios, and I don't want to get too far afield, but I think it does kind of show the thought process here sometimes. From my read of the discussions with that editor, as well as the incident discussion you linked, he is being blocked not for the deletion nominations themselves, but for making them disruptively, both by targeting editors he disagrees with and by being abusive during the process. As a parallel on Wikipedia, if someone has a disagreement with another editor, and proceeds to nominate 10 of their articles for deletion with the deletion rationale "Delete this crap by that moron", that person could be sanctioned even if all 10 articles really -do- need to be deleted. I don't know if that's really the case, nor do I feel like reviewing his contributions in enough detail to find out, but the block discussion is absolutely -not- talking about what you said it was.
Notability is different from copyright. Copyright is fundamental. When editors in Wikipedia have pointed out multiple copyright violations or plagiarisms by administrators (we have had examples, up to and including arbitrators), they have not been subject to threats, blocks and bans. I don't think this sort of thing would fly in the English Wikipedia – not with copyright violations.
Non-notable articles, perhaps, especially if the nomination were accompanied by abuse. But I am honestly not aware of Pieter ever having nominated a file with the reasoning "Delete this crap by that moron". These are your words. And I *am* aware of admins continuously picking on him and ganging up on him. This is not the first time this situation has arisen.
If a file is a copyright violation, it is a copyright violation.
On 21.06.2012 21:55, Andreas Kolbe wrote:
On Thu, Jun 21, 2012 at 7:50 PM, Todd Allen toddmallen@gmail.com wrote:
This thread isn't about copyvios, and I don't want to get too far afield, but I think it does kind of show the thought process here sometimes. From my read of the discussions with that editor, as well as the incident discussion you linked, he is being blocked not for the deletion nominations themselves, but for making them disruptively, both by targeting editors he disagrees with and by being abusive during the process. As a parallel on Wikipedia, if someone has a disagreement with another editor, and proceeds to nominate 10 of their articles for deletion with the deletion rationale "Delete this crap by that moron", that person could be sanctioned even if all 10 articles really -do- need to be deleted. I don't know if that's really the case, nor do I feel like reviewing his contributions in enough detail to find out, but the block discussion is absolutely -not- talking about what you said it was.
Notability is different from copyright. Copyright is fundamental. When editors in Wikipedia have pointed out multiple copyright violations or plagiarisms by administrators (we have had examples, up to and including arbitrators), they have not been subject to threats, blocks and bans. I don't think this sort of thing would fly in the English Wikipedia – not with copyright violations.
Non-notable articles, perhaps, especially if the nomination were accompanied by abuse. But I am honestly not aware of Pieter ever having nominated a file with the reasoning "Delete this crap by that moron". These are your words. And I *am* aware of admins continuously picking on him and ganging up on him. This is not the first time this situation has arisen.
If a file is a copyright violation, it is a copyright violation.
I tend not to get involved in that issue. But from what I have noticed, you put Pieter in a very different light than I would. Knowing that you are unhappy with Commons, even dragging it down to a personal level, it isn't really surprising to me to read a comment like this.
I have to agree with Todd's view that Pieter used deletion requests against opponents on Commons in a very unconvincing fashion, only hunting for pictures from these users. I also agree that an (un)justified deletion request is a separate issue from "stalking" opponents and making deletion requests purely to annoy them.
The core problem here is that the Board is not alive and well. The Board of Trustees is dead in their shoes. What precisely are they *Trustees* of?
Jussi, I'm not finding the post you are replying to; what's the context here?
On Sat, Jun 23, 2012 at 3:29 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
The core problem here is that the Board is not alive and well. The Board of Trustees is dead in their shoes. What precisely are they *Trustees* of?
--
Jussi-Ville Heiskanen, ~ [[User:Cimon Avaro]]
Child porn is illegal, that's been upheld by the Supreme Court repeatedly, end of discussion. If 2257 were similarly upheld to apply even in circumstances of educational/artistic work, I suppose we'd similarly have to follow it like it or not, but it is untested in such areas, and I suspect the SC would find it massively overbroad, especially as it relates to subjects not identifiable at all.
2257 is also about child porn, because without age records there is often no way of telling whether a cropped shot shows a minor or an adult, and no way for the reader to tell whether or not they are looking at a picture or video of a minor.
US-based adult sites use compliance statements for equivalent material. They seem to be more responsible and law-abiding than the Wikimedia community, which presents its material on a top-5 website.
On 18/06/2012 7:52 AM, Thomas Morton wrote:
The *hard* problem is convincing the "not censored" abusers that it's a useful feature for our community.
You're begging the question that it /is/ a useful feature.
-- Coren / Marc
On Sun, Jun 17, 2012 at 9:44 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 18.06.2012 00:40, Anthony wrote:
Is there even a way to export an article, including (recursively) all the templates it depends on?
Every stupid bot could do this. There is no solution that runs out of the box at the moment, but the effort to set up something like this would be minimal compared to anything else.
Have you ever tried to do this? It's not as easy as you are making it sound, at least it wasn't as of a few years ago, because Mediawiki is tightly coupled to the specific database structure it uses.
I would say that Citizendium failed because they did no automatic updating.
Well, I'm not talking about why Citizendium failed, as that became apparent much later. I'm talking about why they dropped the "progressive fork" parts, which happened pretty early on. The fact of the matter is that forking Wikipedia and cleaning it up is more difficult than just starting from scratch using Wikipedia as a reference (possibly copy/pasting large portions as you go).
What I have in mind is a delayed mirror with update control. It is not meant to be edited by hand.
Yes. This simplifies some things, and it makes other things impossible (e.g. if you want to remove one line from an article, you're stuck with removing the entire article; if you want to remove one link from a template, you're stuck with removing every article which includes that template, or includes a template which includes that template, etc.)
And considering the heavy use of templates which are Wikipedia-specific, presumably you're going to allow for *some* hand-editing.
It is a subset of the current content, selected by the host of the page (one or many users) himself. It is essentially a whitelist for Wikipedia that only contains selected/checked content. That way a "children's Wiki" could easily be created, by not including any unwanted content, while the effort stays minimal. (No more effort than creating your own book from a list of already written articles.)
Right, well, I thought this too, until I tried to do it.
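For what it's worth, the first step of pulling an article together with the templates it transcludes is scriptable against the public MediaWiki API. A minimal sketch, assuming the standard query/templates and query/export modules; "Banana", the output filename, continuation handling and the 50-titles-per-request batching are all simplifying assumptions:

    # Sketch: fetch one article plus the templates it transcludes via the
    # MediaWiki API. Continuation, batching of titles (max ~50 per request)
    # and error handling are deliberately left out; "Banana" is a placeholder.
    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def transcluded_templates(title):
        """Titles of templates transcluded by `title` (first 500 only)."""
        params = {
            "action": "query",
            "titles": title,
            "prop": "templates",
            "tllimit": "500",
            "format": "json",
        }
        pages = requests.get(API, params=params).json()["query"]["pages"]
        return [t["title"] for p in pages.values() for t in p.get("templates", [])]

    def export_xml(titles):
        """Raw XML export (wikitext plus metadata) for a list of titles."""
        params = {"action": "query", "titles": "|".join(titles),
                  "export": 1, "exportnowrap": 1}
        return requests.get(API, params=params).text

    if __name__ == "__main__":
        article = "Banana"
        titles = [article] + transcluded_templates(article)
        with open("banana_with_templates.xml", "w", encoding="utf-8") as f:
            f.write(export_xml(titles))

Getting that export to actually render outside Wikipedia is the hard part described above, since the templates in turn depend on parser functions, extensions and site configuration.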
On 18.06.2012 14:49, Anthony wrote:
On Sun, Jun 17, 2012 at 9:44 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 18.06.2012 00:40, Anthony wrote:
Is there even a way to export an article, including (recursively) all the templates it depends on?
Every stupid bot could do this. There is no solution that runs out of the box at the moment, but the effort to set up something like this would be minimal compared to anything else.
Have you ever tried to do this? It's not as easy as you are making it sound, at least it wasn't as of a few years ago, because Mediawiki is tightly coupled to the specific database structure it uses.
You don't need to interact with the database of Wikipedia itself. You can use the MediaWiki API, which is quite stable and sufficient for this task. I'm not talking about a complete mirror; I'm talking about a filtered _view_ of Wikipedia. You type in "http://www.mysavewiki.com/Banana" and the server delivers the recently approved and cached version of the article from Wikipedia if "Banana" is whitelisted.
I would say that Citizendium failed because they did no automatic updating.
Well, I'm not talking about why Citizendium failed, as that became apparent much later. I'm talking about why they dropped the "progressive fork" parts, which happened pretty early on. The fact of the matter is that forking Wikipedia and cleaning it up is more difficult than just starting from scratch using Wikipedia as a reference (possibly copy/pasting large portions as you go).
I'm not talking about a fork or an improved Wikipedia; I'm talking about a restricted and checked view. All article work will still be done on Wikipedia itself.
What I have in mind is a delayed mirror with update control. It is not meant to be edited by hand.
Yes. This simplifies some things, and it makes other things impossible (e.g. if you want to remove one line from an article, you're stuck with removing the entire article; if you want to remove one link from a template, you're stuck with removing every article which includes that template, or includes a template which includes that template, etc.)
And considering the heavy use of templates which are Wikipedia-specific, presumably you're going to allow for *some* hand-editing.
That would be something other than what I had in mind, and would extend the functionality of the (proposed) filter considerably. I intended flagged revisions together with whitelisting for some kind of special audience, not a fork-like wiki that (partially) modifies the content itself.
It is a subset of the current content, selected by the host of the page (one or many users) himself. It is essentially a whitelist for Wikipedia that only contains selected/checked content. That way a "children's Wiki" could easily be created, by not including any unwanted content, while the effort stays minimal. (No more effort than creating your own book from a list of already written articles.)
Right, well, I thought this too, until I tried to do it.
I was thinking about a first step: how someone could look at Wikipedia through a basic filter without the need to interfere with the project itself. As far as I can see this is the goal of the filter approach, while eliminating the side effects or at least keeping them minimal.
On Mon, Jun 18, 2012 at 9:25 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 18.06.2012 14:49, Anthony wrote:
Have you ever tried to do this? It's not as easy as you are making it sound, at least it wasn't as of a few years ago, because Mediawiki is tightly coupled to the specific database structure it uses.
You don't need to interact with the database of Wikipedia itself. You can use the MediaWiki API, which is quite stable and sufficient for this task. I'm not talking about a complete mirror, I'm talking about a filtered _view_ of Wikipedia. You type in "http://www.mysavewiki.com/Banana" and the server delivers the most recently approved and cached version of the article from Wikipedia, if "Banana" is whitelisted.
Are you talking about "remote loading" (http://en.wikipedia.org/wiki/Wikipedia:Mirrors_and_forks#Remote_loading)? That's a good way to get your IP address banned.
On 19.06.2012 01:39, Anthony wrote:
On Mon, Jun 18, 2012 at 9:25 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 18.06.2012 14:49, Anthony wrote:
Have you ever tried to do this? It's not as easy as you are making it sound, at least it wasn't as of a few years ago, because Mediawiki is tightly coupled to the specific database structure it uses.
You don't need to interact with the database of Wikipedia itself. You can use the MediaWiki API, which is quite stable and sufficient for this task. I'm not talking about a complete mirror, I'm talking about a filtered _view_ of Wikipedia. You type in "http://www.mysavewiki.com/Banana" and the server delivers the most recently approved and cached version of the article from Wikipedia, if "Banana" is whitelisted.
Are you talking about "remote loading" (http://en.wikipedia.org/wiki/Wikipedia:Mirrors_and_forks#Remote_loading)? That's a good way to get your IP address banned.
No, I'm not talking about remote loading; I'm talking about caching. The server hosts the current version itself and only fetches it for a manual update. To be informed that a new version of a page exists, it could listen to the recent changes on the IRC channel. If it did remote loading it would also pick up temporary vandalism, which is as undesirable as remote loading itself.
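Roughly, the notification side could look like the sketch below: connect to the public recent-changes feed on irc.wikimedia.org, watch #en.wikipedia, and whenever a whitelisted article changes, queue it for manual re-approval rather than updating automatically. The nick, the whitelist and the handling of the feed's line format are assumptions; the feed itself is the standard Wikimedia RC channel.

import re
import socket

WHITELIST = {"Banana"}      # articles the filtered view has approved
pending_review = set()      # titles waiting for a human to re-approve

sock = socket.create_connection(("irc.wikimedia.org", 6667))
sock.sendall(b"NICK filterbot-demo\r\n")
sock.sendall(b"USER filterbot-demo 0 * :filtered-view updater\r\n")

buffer = b""
joined = False
while True:
    data = sock.recv(4096)
    if not data:
        break
    buffer += data
    *lines, buffer = buffer.split(b"\r\n")
    for raw in lines:
        line = raw.decode("utf-8", errors="replace")
        if line.startswith("PING"):
            sock.sendall(line.replace("PING", "PONG").encode() + b"\r\n")
            continue
        if not joined and " 001 " in line:      # registered: now join the feed
            sock.sendall(b"JOIN #en.wikipedia\r\n")
            joined = True
            continue
        # the feed wraps the changed page title in [[...]] plus mIRC colour
        # codes; strip the codes before matching
        text = re.sub(r"\x03\d{0,2}(?:,\d{1,2})?|[\x02\x0f\x16\x1f]", "", line)
        match = re.search(r"\[\[([^\[\]]+)\]\]", text)
        if match and match.group(1) in WHITELIST:
            pending_review.add(match.group(1))
            print("queued for manual re-approval:", match.group(1))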
On Mon, Jun 18, 2012 at 7:48 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 19.06.2012 01:39, Anthony wrote:
On Mon, Jun 18, 2012 at 9:25 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 18.06.2012 14:49, Anthony wrote:
Have you ever tried to do this? It's not as easy as you are making it sound, at least it wasn't as of a few years ago, because Mediawiki is tightly coupled to the specific database structure it uses.
You don't need to interact with the database of Wikipedia itself. You can use the MediaWiki API, which is quite stable and sufficient for this task. I'm not talking about a complete mirror, I'm talking about a filtered _view_ of Wikipedia. You type in "http://www.mysavewiki.com/Banana" and the server delivers the most recently approved and cached version of the article from Wikipedia, if "Banana" is whitelisted.
Are you talking about "remote loading" (http://en.wikipedia.org/wiki/Wikipedia:Mirrors_and_forks#Remote_loading)? That's a good way to get your IP address banned.
No, I'm not talking about remote loading; I'm talking about caching. The server hosts the current version itself and only fetches it for a manual update.
Okay, so the server "hosts the current version itself". Presumably they are going to use MediaWiki to do this, unless you are suggesting that they write their own custom wiki parser. The MediaWiki they are running "is tightly coupled to the specific database structure it uses", at least it was as of a few years ago. What I mean by this is that the parser loads from the database as it's parsing. It's not (1) load everything you need from the database, and then (2) parse it. You have to parse it in order to figure out what to load, and parsing it means running MediaWiki. I believe there were some efforts made to fix this, so maybe it has been fixed. I haven't looked at it in years.
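A toy illustration of that coupling (this is not MediaWiki code, and it deliberately ignores parameters, parser functions and tag hooks): which templates a page needs only becomes known while you are already expanding it, so a static "export everything, then render" pipeline ends up either running MediaWiki or re-implementing this interleaved load-and-parse step.

import re

def load_wikitext(title, store):
    # stand-in for the database lookup MediaWiki performs mid-parse
    return store.get(title, "")

def expand(wikitext, store, depth=0):
    # naive {{Template}} expansion; each expansion can reveal further pages
    # that have to be loaded, which is why load and parse cannot be separated
    if depth > 40:          # crude protection against template loops
        return wikitext
    def replace(match):
        name = "Template:" + match.group(1).strip()
        return expand(load_wikitext(name, store), store, depth + 1)
    return re.sub(r"\{\{([^{}|]+)\}\}", replace, wikitext)

# hypothetical mini-wiki: the need for Template:Infobox only surfaces while
# expanding Template:Infobox fruit
store = {
    "Banana": "{{Infobox fruit}} A banana is a berry. {{Citation needed}}",
    "Template:Infobox fruit": "{{Infobox}} (fruit variant)",
    "Template:Infobox": "[infobox markup]",
    "Template:Citation needed": "[citation needed]",
}
print(expand(store["Banana"], store))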
On Mon, Jun 18, 2012 at 9:25 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 18.06.2012 14:49, Anthony wrote:
And considering the heavy use of templates which are Wikipedia-specific, presumably you're going to allow for *some* hand-editing.
That would be something other than what I had in mind and would extend the functionality of the (proposed) filter by far. I intended flagged revisions together with whitelisting for some kind of special audience, not a fork-like wiki that (partially) modifies the content itself.
"This article needs additional citations for verification. Please help improve this article by adding citations to reliable sources. Unsourced material may be challenged and removed. (May 2007)"
Are you going to include that template or not? If so, where are you going to link "improve this article" to?
Andrew Gray, 17/06/2012 15:50:
In short: the almost complete absence of anyone doing *anything* clever in terms of reusing and repurposing our content strongly suggests that there are practical barriers to doing so in general, rather than flaws in any specific model of what it is they want to do.
A filtered mirror is not something clever and we have plenty of mirrors.
Nemo
On 17.06.2012 21:41, Federico Leva (Nemo) wrote:
Andrew Gray, 17/06/2012 15:50:
In short: the almost complete absence of anyone doing *anything* clever in terms of reusing and repurposing our content strongly suggests that there are practical barriers to doing so in general, rather than flaws in any specific model of what it is they want to do.
A filtered mirror is not something clever and we have plenty of mirrors.
Nemo
May I ask why?
On 14 June 2012 18:01, David Gerard dgerard@gmail.com wrote:
On 14 June 2012 17:22, geni geniice@gmail.com wrote:
Shocking images in [[Nanking Massacre]] are pretty much expected. [[People's Republic of China–Japan relations]] not so much. [[Agent orange]] is a more borderline case, but these things are never easy, as [[Wikipedia:LAME#Names]] shows.
Yes, but this is called editorial judgement rather than something that can be imposed by filtering. (Although the board and staff claim that
This falls into the trap of presuming (a) that there is one approach to "editorial judgement of acceptability" that is common to all readers, *and* (b) that it's the same as the editorial judgement currently provided by our community of editors.
I'm not confident that (a) is a reliable assumption - neutrality is a matter of presenting all sides, and so we can achieve it, while this sort of editorial judgement is basically binary and so much harder to equivocate on. Even if it is, (b) certainly has problems - while our community strives to be neutral, I doubt anyone would claim it does not start off with fairly heavy biases, from demography as much as anything else.
Least surprise is one way to try and get around this problem of not relying on the community's own judgement in all edge cases; I'm not sure it's the best one, but I'm not sure leaving it out is any better.
On 14 June 2012 20:36, Andrew Gray andrew.gray@dunelm.org.uk wrote:
Least surprise is one way to try and get around this problem of not relying on the community's own judgement in all edge cases; I'm not sure it's the best one, but I'm not sure leaving it out is any better.
The present usage (to mean "you disagree with our editorial judgement therefore you must be a juvenile troll") is significantly worse.
- d.
On 14 June 2012 16:19, David Gerard dgerard@gmail.com wrote:
On 14 June 2012 20:36, Andrew Gray andrew.gray@dunelm.org.uk wrote:
Least surprise is one way to try and get around this problem of not relying on the community's own judgement in all edge cases; I'm not sure it's the best one, but I'm not sure leaving it out is any better.
The present usage (to mean "you disagree with our editorial judgement therefore you must be a juvenile troll") is significantly worse.
I'm not entirely certain that you've got the "usage" case correct, David. An example would be that one should not be surprised/astonished to see an image including nudity on the article [[World Naked Gardening Day]], but the same image would be surprising on the article [[Gardening]].
The Commons parallel would be that an image depicting nude gardening would be appropriately categorized as [[Cat:Nude gardening]], but would be poorly categorized as [[Cat:Gardening]]. One expects to see a human and gardening but not nudity in the latter, and humans, gardening, *and* nudity in the former.
Now, in fairness, we all know that trolling with images has been a regular occurrence on many projects for years, much of it very obviously trolling, but edge cases can be more difficult to determine. Thus, the more neutral principle of least astonishment ("would an average reader be surprised to see this image on this article?/in this category?") comes into play. I'd suggest that the principle of least astonishment is an effort to assume good faith.
Risker
On Thu, Jun 14, 2012 at 11:40 PM, Risker risker.wp@gmail.com wrote:
On 14 June 2012 16:19, David Gerard dgerard@gmail.com wrote:
On 14 June 2012 20:36, Andrew Gray andrew.gray@dunelm.org.uk wrote:
Least surprise is one way to try and get around this problem of not relying on the community's own judgement in all edge cases; I'm not sure it's the best one, but I'm not sure leaving it out is any better.
The present usage (to mean "you disagree with our editorial judgement therefore you must be a juvenile troll") is significantly worse.
I'm not entirely certain that you've got the "usage" case correct, David. An example would be that one should not be surprised/astonished to see an image including nudity on the article [[World Naked Gardening Day]], but the same image would be surprising on the article [[Gardening]].
The Commons parallel would be that an image depicting nude gardening would be appropriately categorized as [[Cat:Nude gardening]], but would be poorly categorized as [[Cat:Gardening]]. One expects to see a human and gardening but not nudity in the latter, and humans, gardening, *and* nudity in the former.
Now, in fairness, we all know that trolling with images has been a regular occurrence on many projects for years, much of it very obviously trolling, but edge cases can be more difficult to determine. Thus, the more neutral principle of least astonishment ("would an average reader be surprised to see this image on this article?/in this category?") comes into play. I'd suggest that the principle of least astonishment is an effort to assume good faith.
Risker
There is a serious issue here. "least astonishment" is very much distinct from "least offence". We don't guarantee the latter, and never should. The former was hijacked by a silly board resolution, and should be rescinded.
On 14.06.2012 22:40, Risker wrote:
On 14 June 2012 16:19, David Gerard dgerard@gmail.com wrote:
On 14 June 2012 20:36, Andrew Gray andrew.gray@dunelm.org.uk wrote:
Least surprise is one way to try and get around this problem of not relying on the community's own judgement in all edge cases; I'm not sure it's the best one, but I'm not sure leaving it out is any better.
The present usage (to mean "you disagree with our editorial judgement therefore you must be a juvenile troll") is significantly worse.
I'm not entirely certain that you've got the "usage" case correct, David. An example would be that one should not be surprised/astonished to see an image including nudity on the article [[World Naked Gardening Day]], but the same image would be surprising on the article [[Gardening]].
The Commons parallel would be that an image depicting nude gardening would be appropriately categorized as [[Cat:Nude gardening]], but would be poorly categorized as [[Cat:Gardening]]. One expects to see a human and gardening but not nudity in the latter, and humans, gardening, *and* nudity in the former.
Now, in fairness, we all know that trolling with images has been a regular occurrence on many projects for years, much of it very obviously trolling, but edge cases can be more difficult to determine. Thus, the more neutral principle of least astonishment ("would an average reader be surprised to see this image on this article?/in this category?") comes into play. I'd suggest that the principle of least astonishment is an effort to assume good faith.
Risker
You gave a nice description of how it should be applied in the right way. But the usual interpretation I found in recent discussions was something like this:
"We don't need to show naked people inside the article [[World Naked Gardening Day]]. It would be an offense against any reader that doesn't want to see naked people. It also might it be dangerous to read this article in public. ..."
Together with the usual pointy strong wording, it becomes something like this:
"Wikipedia dishes out porn. We need an image filter. Protect the children..."