This week, the Wikimedia Foundation Board of Trustees unanimously passed a resolution addressing the issue of controversial content on the projects. The Board also unanimously passed a resolution addressing images of identifiable, living people on the projects. The resolutions are posted at:
http://wikimediafoundation.org/wiki/Resolution:Controversial_content
http://wikimediafoundation.org/wiki/Resolution:Images_of_identifiable_people
These topics have been the subject of active debate on the Projects, and particularly on Commons, for a long time. Last June, following extensive community debate, the Wikimedia Foundation Board requested the Executive Director undertake a study of the issue of controversial content on the projects, acknowledging the difficulty of the issue (http://wikimediafoundation.org/wiki/Resolution:Commissioning_Recommendations...). Robert and Dory Harris were commissioned to do this study, which they did on meta in consultation with the community, publishing recommendations in September 2010. Their report is available at: http://meta.wikimedia.org/wiki/2010_Wikimedia_Study_of_Controversial_Content
At its October 2010 meeting, the Board was presented with this report. The Board discussed the recommendations in depth, and developed a working group to act on them. The working group's report was presented at the Board's next in-person meeting, in March 2011; and these resolutions were subsequently drafted and voted on. The working group report has also been posted on meta, at: http://meta.wikimedia.org/wiki/Controversial_content/Board_report
Note that the controversial content resolution uses the term "curation." We are using this term to refer to all aspects of managing images and other content on our projects, including recruiting and acquiring contributions and uploading, categorizing, placement of images in articles and other pages (including gallery pages and the main page), featuring or highlighting, flagging for improvement, and deletion and removal. All of our projects are curated in line with broad editorial principles; this is an essential feature that distinguishes our projects from indiscriminate or general-purpose repositories.
Not all of the Harris recommendations are addressed in this resolution. In particular:
* At this time, we refer the recommendation to create a WikiJunior project to the editing community; the Board would like to see demonstrated community support before creating such a project.
* In agreement with the Harris report, we do not recommend that changes be made to current editing and/or filtering regimes surrounding text in Wikimedia projects; we feel editorial mechanisms regarding text are working well.
Finally, we urge that the community, the Foundation and the Wikimedia movement continue to discuss the appropriate scope of Commons for fulfilling Wikimedia's mission; this is a difficult and important question.
Thank you to everyone who has worked on this issue, and special thanks to Robert and Dory Harris for their hard work.
-- Phoebe Ayers, on behalf of the Board working group and the Board
Forgive me if I find these resolutions rather toothless; this is another in a string of board resolutions that simply "urge the projects." I'd love to understand what the Board thinks such resolutions will accomplish. I understand there are legal constraints on the ability of the Foundation to exercise control over content, but these "urgings" are weakly stated even compared to other Foundation content resolutions (cf. the licensing policy). Statements of principle are all well and good, but when community positions are firmly entrenched they are likely to have little impact. In the area of images of identifiable subjects, I suspect the WMF will need to be sued over a particularly vicious set of circumstances before substantial progress will be made, and that is disappointing.
Nathan
On 1 June 2011 21:35, Nathan nawrich@gmail.com wrote:
Forgive me if I find these resolutions rather toothless; this is another in a string of board resolutions that simply "urge the projects." I'd love to understand what the Board thinks such resolutions will accomplish. I understand there are legal constraints on the ability of the Foundation to exercise control over content, but these "urgings" are weakly stated even compared to other Foundation content resolutions (cf. the licensing policy). Statements of principle are all well and good, but when community positions are firmly entrenched they are likely to have little impact. In the area of images of identifiable subjects, I suspect the WMF will need to be sued over a particularly vicious set of circumstances before substantial progress will be made, and that is disappointing.
I agree. This is a very disappointing conclusion to a very long and expensive process. The only real resolution was to create a software feature that was proposed many times before the formal process started and had quite wide support too, so could have just been implemented.
The reason we needed this complex process is that the community was unable to sort the problem out on its own. A few urgings aren't going to make any difference. That kind of urging only works when it comes from an individual or group with a lot of trust and respect, one that people think probably knows better than they do. I'm sorry to say that the WMF board is not such a group (although that's mostly because the community largely doesn't know anything about them, rather than because they've done anything wrong).
On 1 June 2011 21:35, Nathan nawrich@gmail.com wrote:
Forgive me if I find these resolutions rather toothless; this is another in a string of board resolutions that simply "urge the projects." I'd love to understand what the Board thinks such resolutions will accomplish.
It says very effectively (I thought) to the censorious: "We have given your position a great deal of due careful consideration, and urge you to go away. The issue is dead." Of course, I could just be projecting my own feelings onto it. In which case, it's even better.
It should be thought of, IMO, as being primarily for external consumption.
- d.
I took it as stating the obvious, what we all knew to expect, with one bonus:
1. *"The Wikimedia Foundation Board affirms that Wikimedia projects are not censored"* is stated first, and as a positive affirmative assertion. Everything else follows that. I like that establishment of basic principle as being first and clearly said, before discussing the minority cases of contentious material. It gives something firm to stand on as a principle, if in years to come, this or that group or country tries to limit or manipulate. A positive statement's useful that way.
It then says what we all pretty much knew it would.
2. Actively manage uploaded files to make sure they are relevant and appropriate to our mission and criteria; and
3. Develop a way that people who don't want to see certain types of image can collapse or not see them without other people being affected.
No surprises, much as anyone expected. Endorsement of not-very-contentious conclusion.
FT2
On Wed, Jun 1, 2011 at 10:52 PM, David Gerard dgerard@gmail.com wrote:
On 1 June 2011 21:35, Nathan nawrich@gmail.com wrote:
Forgive me if I find these resolutions rather toothless; this is another in a string of board resolutions that simply "urge the projects." I'd love to understand what the Board thinks such resolutions will accomplish.
It says very effectively (I thought) to the censorious: "We have given your position a great deal of due careful consideration, and urge you to go away. The issue is dead." Of course, I could just be projecting my own feelings onto it. In which case, it's even better.
It should be thought of, IMO, as being primarily for external consumption.
On Wed, Jun 1, 2011 at 5:52 PM, David Gerard dgerard@gmail.com wrote:
On 1 June 2011 21:35, Nathan nawrich@gmail.com wrote:
Forgive me if I find these resolutions rather toothless; this is another in a string of board resolutions that simply "urge the projects." I'd love to understand what the Board thinks such resolutions will accomplish.
It says very effectively (I thought) to the censorious: "We have given your position a great deal of due careful consideration, and urge you to go away. The issue is dead." Of course, I could just be projecting my own feelings onto it. In which case, it's even better.
It should be thought of, IMO, as being primarily for external consumption.
- d.
I expect and hope that the WMF board is a little more honest and straightforward than that would suggest. The resolution could be read as CYA - an intentionally deflective statement with no concrete impact. But I'd prefer to assume that the Board expects some sort of gradual adjustment in procedures enacted by local projects on their own initiative, using the resolution as ammunition in the inevitable debates. Hard to be sure, so I'd rather hear it from the horse's mouth.
On 1 June 2011 23:03, Nathan nawrich@gmail.com wrote:
I expect and hope that the WMF board is a little more honest and straightforward than that would suggest. The resolution could be read as CYA - an intentionally deflective statement with no concrete impact.
I think that opening line is pretty damn clear and concrete.
"Hi, we're that 800lb gorilla. Today we actually have a clear opinion to state." I think you're greatly underestimating the impact that alone will have.
What bits are you finding insufficiently clear or strongly worded?
- d.
On Wed, Jun 1, 2011 at 3:03 PM, Nathan nawrich@gmail.com wrote:
The resolution could be read as CYA - an intentionally deflective statement with no concrete impact.
I feel that basically _is_ the role of the board. I feel like my dream board personified is a little like a judo master. When pushed to rule on a dispute, usually they should pull that energy and productively deflect it back to the community.
No concrete impact is all we need from the board on this. Status quo works great.
(also, there is the enactment of "Not Censored" at the foundation level. )
Alec
On Wed, Jun 1, 2011 at 3:17 PM, Alec Conroy alecmconroy@gmail.com wrote:
I feel that basically _is_ the role of the board. I feel like my dream board personified is a little like a judo master. When pushed to rule on a dispute, usually they should pull that energy and productively deflect it back to the community.
That is a lovely metaphor, though I occasionally feel in this role more like a student of Zen (complete with koans to study!)
I will say that the Board drafted these resolutions with good faith and a great deal of care, and the one thing I would ask as you debate them is to consider them as a whole. We think all of the principles we articulate are important, and have implications for how we manage our content. And a few of you have noted that these ideas are not new; of course that's true. We are simply building on the work that many community members have done over the years on this difficult problem.
If there are specific questions for the board, I or other trustees can try to answer them; but there are of course many areas of debate and opinion where we can only speak for ourselves individually. "The board's opinion", such as it is, is expressed in the resolutions.
best, -- phoebe
On 2 June 2011 00:00, phoebe ayers phoebe.wiki@gmail.com wrote:
I will say that the Board drafted these resolutions with good faith and a great deal of care, and the one thing I would ask as you debate them is to consider them as a whole. We think all of the principles we articulate are important, and have implications for how we manage our content. And a few of you have noted that these ideas are not new; of course that's true. We are simply building on the work that many community members have done over the years on this difficult problem.
The principles are, indeed, important. However, they are generally quite uncontroversial. There are vocal minorities that disagree with them, but I think the vast majority of the community agrees with the principles you have outlined, and that has been clear from numerous discussions over the past couple of years. We didn't really have a problem with the principles; we had a problem with doing something about them. It's doing something about them that you haven't helped with.
"Wikimedia projects are curated and edited collections, according to certain principles: namely, we host only content that is both free and educational in nature."
So the Board said that Wikinews is out of scope. Its nature is informational, not educational.
On 6/1/2011 2:03 PM, Milos Rancic wrote:
"Wikimedia projects are curated and edited collections, according to certain principles: namely, we host only content that is both free and educational in nature."
So the Board said that Wikinews is out of scope. Its nature is informational, not educational.
I'm sorry, but I don't understand what distinction you're trying to make. In this context, those look like synonyms to me.
--Michael Snow
On 06/01/2011 11:05 PM, Michael Snow wrote:
On 6/1/2011 2:03 PM, Milos Rancic wrote:
"Wikimedia projects are curated and edited collections, according to certain principles: namely, we host only content that is both free and educational in nature."
So the Board said that Wikinews is out of scope. Its nature is informational, not educational.
I'm sorry, but I don't understand what distinction you're trying to make. In this context, those look like synonyms to me.
If so, I am fine with it. What do Board members mean by that?
On Wed, Jun 1, 2011 at 2:14 PM, Milos Rancic millosh@gmail.com wrote:
On 06/01/2011 11:05 PM, Michael Snow wrote:
On 6/1/2011 2:03 PM, Milos Rancic wrote:
"Wikimedia projects are curated and edited collections, according to certain principles: namely, we host only content that is both free and educational in nature."
So the Board said that Wikinews is out of scope. Its nature is informational, not educational.
I'm sorry, but I don't understand what distinction you're trying to make. In this context, those look like synonyms to me.
If so, I am fine with it. What do Board members mean by that?
Hi Milos,
We meant what is stated there: that Wikimedia project content should be at a minimum both free and educational in nature. (In general, you can assume that language in resolutions like this is intentional). However, you can also safely assume that the Board did not specifically discuss the scope of Wikinews when writing this resolution; we were focused on the topic at hand. I personally think there is a very valid argument to be made that Wikinews, like most news sources and like the rest of our projects, is educational (as well as possessing other qualities, such as the more general quality of being informative, which also arguably applies to all of our projects).
If people want to discuss these subtleties of language (or the scope of wikinews) in depth however, a separate thread might be best.
-- phoebe
On 06/01/2011 11:56 PM, phoebe ayers wrote:
We meant what is stated there: that Wikimedia project content should be at a minimum both free and educational in nature. (In general, you can assume that language in resolutions like this is intentional). However, you can also safely assume that the Board did not specifically discuss the scope of Wikinews when writing this resolution; we were focused on the topic at hand. I personally think there is a very valid argument to be made that Wikinews, like most news sources and like the rest of our projects, is educational (as well as possessing other qualities, such as the more general quality of being informative, which also arguably applies to all of our projects).
Thanks for the clarification. According to the last sentence, I would say that everything is fine for now, at least.
On 1 June 2011 22:05, Michael Snow wikipedia@frontier.com wrote:
On 6/1/2011 2:03 PM, Milos Rancic wrote:
"Wikimedia projects are curated and edited collections, according to certain principles: namely, we host only content that is both free and educational in nature."
So the Board said that Wikinews is out of scope. Its nature is informational, not educational.
I'm sorry, but I don't understand what distinction you're trying to make. In this context, those look like synonyms to me.
They aren't synonyms, although there is a lot of overlap. To inform means to provide information to. Educating involves understanding and skills too. I think Wikinews can both inform and educate, though.
Information is educational. When I read Wikinews, it educates me as to significant matters going on in the world, and provides other related resources.
FT2
On Wed, Jun 1, 2011 at 10:03 PM, Milos Rancic millosh@gmail.com wrote:
"Wikimedia projects are curated and edited collections, according to certain principles: namely, we host only content that is both free and educational in nature."
So the Board said that Wikinews is out of scope. Its nature is informational, not educational.
On 06/01/2011 11:11 PM, FT2 wrote:
Information is educational. When I read Wikinews, it educates me as to significant matters going on in the world, and provides other related resources.
I fully agree with you. Any information is educational; whether it belongs on a particular project just depends on that project's scope. For example, you don't want to put Shakespeare's works on Wikipedia, because the proper place for them is Wikisource. A particular ant colony is educational and could be an interesting photo subject, but it is not likely to get its own article on Wikipedia. And so on.
But why, then, did the Board decide to make the "educational" component mandatory in its statement? If there is no difference between "informational" and "educational", the statement "we host only content that is both free and educational in nature" doesn't make much sense, as it would amount to "we only host content which is free" (which is already well known), since "content" is a more precise synonym for "information" (to be precise, "content" could be interpreted as a "set of information" or so).
So, I would like to know the distinction between "informational" and "educational" as interpreted by Board members; or whether it is, as you and Michael said, just a less common reading in which the two are synonyms.
On Wed, Jun 1, 2011 at 5:33 PM, Milos Rancic millosh@gmail.com wrote:
I fully agree with you. Any information is educational; whether it belongs on a particular project just depends on that project's scope. For example, you don't want to put Shakespeare's works on Wikipedia, because the proper place for them is Wikisource. A particular ant colony is educational and could be an interesting photo subject, but it is not likely to get its own article on Wikipedia. And so on.
But why, then, did the Board decide to make the "educational" component mandatory in its statement? If there is no difference between "informational" and "educational", the statement "we host only content that is both free and educational in nature" doesn't make much sense, as it would amount to "we only host content which is free" (which is already well known), since "content" is a more precise synonym for "information" (to be precise, "content" could be interpreted as a "set of information" or so).
So, I would like to know the distinction between "informational" and "educational" as interpreted by Board members; or whether it is, as you and Michael said, just a less common reading in which the two are synonyms.
I doubt the language selection was parsed to such a degree. Whatever the difference, it's minor, and I seriously doubt they meant to exclude Wikinews (or, for that matter, the huge volume of data hosted on all the projects that is meta-content rather than outward-facing educational material) from the umbrella mission of the WMF. Seems like there are more substantial questions about the resolution the Board could address.
Nathan
On Wed, Jun 1, 2011 at 2:33 PM, Milos Rancic millosh@gmail.com wrote:
On 06/01/2011 11:11 PM, FT2 wrote:
Information is educational.
I fully agree with you. Any information is educational;
I also strongly agree that any information is educational. The two terms are synonymous.
We use 'educational' as a fig leaf that hides a fundamental schism over the future of the movement. There are two very different interpretations of "educational" and thus two fundamentally different visions of the road ahead.
One view holds that information is inherently educational, and in an ideal world, we'd host all the information we could. (with obvious restrictions like legality and not-being-evil). In the future, we will expand the diversity of both our audience AND our content.
A second camp views "educational" in a very narrow "classroom" way. To this group, most information is "garbage/trivial/cruft", not "educational". In an ideal world, our movement would include ONLY the very best educational materials. In the future, we will expand the diversity of our audience while honing our content to only the very best.
All the same, I'm glad to read that mass content deletion has been taken off the table.
Alec
On 1 June 2011 16:17, phoebe ayers phoebe.wiki@gmail.com wrote:
This week, the Wikimedia Foundation Board of Trustees unanimously passed a resolution addressing the issue of controversial content on the projects. The Board also unanimously passed a resolution addressing images of identifiable, living people on the projects. The resolutions are posted at: http://wikimediafoundation.org/wiki/Resolution:Controversial_content
http://wikimediafoundation.org/wiki/Resolution:Images_of_identifiable_people
I think the more important part of this announcement is the resolution on images of identifiable people, and it is this section that requires considerably more self-examination on the part of every project that hosts or uses images.
Commons has a guideline on the subject, found here: http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people
This is a starting point for the discussion. In particular, I think that the Board in its resolution is looking specifically at the uploading of images by third party editors/users who are not the subject of the image, nor its creator, nor the person who has claimed the right to it. (The most obvious example is images from Flickr, but there are many other "resource" sites.) This, of course, does not exempt users who upload images that they create or own. The resolution and (where applicable) guidelines do place an important onus on both the uploader and the project to ensure that personality rights have been appropriately confirmed. The resolution places this obligation on a near-equal footing to ensuring that copyright status is appropriate to the project.
It may also be worth noting that the term "identifiable" is used. Unusual physical structures, jewelry, tattoos or other features may render the subject of an image identifiable even if the facial features are not included in the image.
It should probably be emphasized that this would apply equally to projects that host "fair use" or other images, and is not simply an expectation on Commons.
Risker/Anne
On Wed, Jun 1, 2011 at 6:53 PM, Risker risker.wp@gmail.com wrote:
I think the more important part of this announcement is the resolution on images of identifiable people, and it is this section that requires considerably more self-examination on the part of every project that hosts or uses images.
Commons has a guideline on the subject, found here: http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people
This is a starting point for the discussion. In particular, I think that the Board in its resolution is looking specifically at the uploading of images by third party editors/users who are not the subject of the image, nor its creator, nor the person who has claimed the right to it. (The most obvious example is images from Flickr, but there are many other "resource" sites.) This, of course, does not exempt users who upload images that they create or own. The resolution and (where applicable) guidelines do place an important onus on both the uploader and the project to ensure that personality rights have been appropriately confirmed. The resolution places this obligation on a near-equal footing to ensuring that copyright status is appropriate to the project.
It may also be worth noting that the term "identifiable" is used. Unusual physical structures, jewelry, tattoos or other features may render the subject of an image identifiable even if the facial features are not included in the image.
It should probably be emphasized that this would apply equally to projects that host "fair use" or other images, and is not simply an expectation on Commons.
Risker/Anne
I agree that, for me at least, the identifiable-image issue is much more important. A lot less has been done there by the projects themselves than in the realm of controversial text, and there is considerable risk of harm at stake. While the board resolution does urge some appropriate action on the part of the projects, it doesn't require it. In the absence of a firm commitment from the board to respect the principles and spirit of model release laws, I have no reason to expect that any project, particularly Commons, will undertake a re-evaluation of their policies. I'm most particularly disheartened that the board specifically endorsed having uploaders affirm subject permission; that kind of requirement will do nothing to stop bad-faith uploaders.
What I'd ask the Board is this: what do you expect the impact of such a resolution (referring again specifically to the image content resolution) will be? By restating the ideology that the projects are not censored in one resolution, and merely "urging" a minimal standard of care in the other, is it not likely that the status quo will reign and we'll be in the same position years from now absent some other motivating event? Is it really going to take a series of Seigenthaler moments to spur substantial change?
On Wed, Jun 1, 2011 at 4:38 PM, Nathan nawrich@gmail.com wrote:
What I'd ask the Board is this: what do you expect the impact of such a resolution (referring again specifically to the image content resolution) will be? By restating the ideology that the projects are not censored in one resolution, and merely "urging" a minimal standard of care in the other, is it not likely that the status quo will reign and we'll be in the same position years from now absent some other motivating event?
The important point is that it's not the role of the board to change the status quo of a specific project in dramatic ways-- it's their job to speak up for what they think the project should be doing.
Non-notable people shouldn't be shown on WM against their will; that isn't controversial. There are a lot of details to work out about when it's reasonable to infer consent and when it's not, but that's a debate for the leadership of Commons.
So long as a project stays within the law, doesn't grossly misuse their resources, and isn't "evil", it is free to make mistakes.
Alec
Risker, 02/06/2011 00:53:
I think the more important part of this announcement is the resolution on images of identifiable people [...]
I agree. It's also the first time (if memory serves me well and if I understand it correctly) that the board asks for a specific content policy of a specific project to be changed in some direction: «Strengthen [...] the current Commons guideline», compared to «_continue to practice_ rigorous active curation of content» in the other resolution. I'm not sure I like it, although the spirit of the resolutions is balanced and agreeable.
It should probably be emphasized that this would apply equally to projects that host "fair use" or other images, and is not simply an expectation on Commons.
That's not what the resolution says, though. I think that it would be more interesting to have some clear legal guideline to understand what's /legal/ in different countries (at least the most important ones, or the countries whose citizens most frequently ask the WMF to delete images), because this is something the community is often not able to produce, and the WMF has the indisputable right to keep the projects lawful (at least in some countries, which are tough to define; see e.g. the quite generic draft http://meta.wikimedia.org/wiki/Legal/Legal_Policies#Applicable_Law). Instead, we'll now have a «Consent of the subject (who is a non-public figure) is required even for photographs taken in public places in the following countries [...] (incomplete list)», a "citation needed" in the "Legal issues" section, and finally some links to random websites about some (very few) countries. I don't expect this to improve much on Commons, not to speak of other projects; I know people who work in television companies, and it's clear that even professionals don't know all the details of the law, because it's just too complex.
Nemo
On Thu, Jun 2, 2011 at 1:28 AM, Federico Leva (Nemo) nemowiki@gmail.com wrote:
Risker, 02/06/2011 00:53:
I think the more important part of this announcement is the resolution on images of identifiable people [...]
I agree. It's also the first time (if memory serves me well and if I understand it correctly) that the board asks for a specific content policy of a specific project to be changed in some direction: «Strengthen [...] the current Commons guideline», compared to «_continue to practice_ rigorous active curation of content» in the other resolution. I'm not sure I like it, although the spirit of the resolutions is balanced and agreeable.
It should probably be emphasized that this would apply equally to projects that host "fair use" or other images, and is not simply an expectation on Commons.
That's not what the resolution says, though.
Hi Nemo and all,
I don't want to get into wikilawyering (that dread disease!), but the resolution does state that we urge the global community to: "Ensure that all projects that host media have policies in place regarding the treatment of images of identifiable living people in private situations."
In other words, we recognize that these concerns apply to any project that hosts images. However, clearly the vast majority of our photos are on Commons, and Commons already has an example of such a policy (which inspired this resolution); that policy could serve as a model for policies on other projects.
I think that it would be more interesting to have some clear legal guideline to understand what's /legal/ in different countries (at least the most important ones, or the countries whose citizens most frequently ask the WMF to delete images), because this is something the community is often not able to produce, and the WMF has the indisputable right to keep the projects lawful (at least in some countries, which are tough to define; see e.g. the quite generic draft http://meta.wikimedia.org/wiki/Legal/Legal_Policies#Applicable_Law).
I agree this would be helpful. However, I think the spirit of the consent guideline, and of this resolution, is more in line with other editorial policies -- it's a common-sense guideline that is meant to encourage special editorial care under particular circumstances, and we are stating that being freely licensed by itself is not enough in these cases. This is part of being a curated collection.
There are parallels here with the BLP resolution (http://wikimediafoundation.org/wiki/Resolution:Biographies_of_living_people), and those are intentional. In that resolution, we don't tell the editorial community that they need to understand every detail of libel law in every country; rather, simply that particular editorial care should be exercised in those cases. Similarly, image copyright law is exceedingly complex (though if anyone can figure it out I bet it's Wikimedians) but that's not quite the point: this is an editorial discussion. I think Risker laid out some of the issues involved very well.
-- phoebe
This is an essay. Maybe someone will find it useful.
For a number of reasons which it is not appropriate to address here, three weeks ago I voluntarily left Russian Wikipedia, which had been my home wiki for four years, and decided to turn to low-key activity on articles in English Wikipedia. I had of course participated in all these strategy discussions, had seen the user statistics, had heard the reasoning about the "low-hanging fruit", and was under the impression that there was not much work left to do on en.wp, and that I would have difficulty finding topics where I could easily contribute. I had very little previous experience with en.wp, mainly inserting interwiki links and images and correcting typos and factual mistakes.
Well, for the first article I just happened to notice an important topic with no English article. I created an article on the famous Russian architect Yakov Bukhvostov. That took me several days, and I collected some material which could be interesting for the usability team (what difficulties a novice, even one with extended experience of editing a Wikimedia project, can face), but this is not my point. In the meanwhile I got a welcome notice on my talk page which, in particular, contained a link to WikiProject Russia. I was not going to participate in the project, but followed the link out of curiosity, thinking that the project might have a list of important missing articles, or important articles for improvement, or something like that. Indeed, it had a number of lists, and, looking through them, I noticed an article on Solvychegodsk listed. Solvychegodsk is a historical town remotely located in the north of Russia, which I happened to visit in 2005 and which I still know pretty well. I followed the link and found an article which was pretty reasonable, by no means a stub, but where I could immediately see what information I could add. Well, I added the information, extending the article a bit, then followed the link to Kotlassky District - the second-order administrative unit, similar to a county in the US, in which Solvychegodsk is located. The article was shorter than my sentence above describing the district, and had three templates: one saying it is a stub, another that it is too short, and a third that it lacks geographic coordinates. In two or three days, I extended it from 850 bytes to 10 kbytes, using Russian sources found on the internet (I also have some books at home, but I did not use them for this particular article) and cleaning up some Commons categories.
Then I noticed that out of the 20 districts of Arkhangelsk Oblast (to which Solvychegodsk belongs), 10 did not have articles, 9 were in a pitiful state, and 1 I had just extended. I started working on them, and then noticed that some important related articles were missing as well - for instance, I created two articles on district centers, two on (big) rivers located in these districts, and even one on an artist who had an estate in one of the districts. In two weeks I created or considerably improved about 15 articles. At first I tried to translate parts of the articles from Russian Wikipedia, but quickly discovered that they contain some factual errors and in most cases are also not very extensive, so the articles I created are in most cases better than the articles in ru.wp. I have written all of the articles myself, and used exclusively Russian sources (for most if not all of the subjects, English sources do not exist).
From what I can see, nobody else is working on this class of articles. I had some help from user Ezhiki in the beginning, and the help was very much appreciated: instead of going to my talk page and trying to explain something to me, he just edited a couple of the pages I was working on, correcting some typical mistakes (templates, some standard names, spelling of Russian names, etc., which are by no means common knowledge) and referring to the relevant policies. On one occasion, I also asked the wikiproject about the usage of a certain template. All in all, there were no more than 10 edits by other users in "my" articles. Walking to the office today, I tried to calculate how many articles are waiting for my attention. Well, Russia has about 80 republics and oblasts, which amounts to, say, 2,000 districts. Most of them are stubs or non-existent. Including district centers, rivers, mountains, stubs on the towns, and some notable persons I would be interested in writing an article on, this must be no fewer than 3,000 articles. At my speed of 1 article per day (which I am not presently planning to increase), this would be 10 years of my work. Note that this is just a narrow topic which does not overlap with my professional interest (I am a theoretical physicist specializing in nanoscience). In this field, I am just an amateur (maybe slightly above the average level).
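As an aside, the back-of-envelope estimate above can be checked in a few lines; note that the districts-per-region average is my own assumption, inferred only from the "80 regions, say 2,000 districts" figure in the paragraph, not an official count:

```python
# Rough scale estimate of the backlog of Russia-related district articles,
# following the figures in the paragraph above.
regions = 80                 # republics, oblasts, etc.
districts_per_region = 25    # assumed average (hypothetical, not official data)
districts = regions * districts_per_region

related_articles = 3000      # districts plus centers, rivers, notable people
articles_per_day = 1

years_of_work = related_articles / articles_per_day / 365
print(districts)                # 2000
print(round(years_of_work, 1))  # 8.2 -- roughly a decade, as stated above
```

So "10 years of my work" is, if anything, a slight overestimate at one article per day, and the point stands.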
My conclusions from this two weeks' experience:
1. Maybe the really low-hanging fruit, almost at ground level, has been picked, but in the vast majority of articles there is much room for improvement. Note that I did not add anything special - only the basic info which you would expect to find in an encyclopaedia. I did not aim at GA or FA. I used maybe 10% of the information I had, and what I had I found on the internet.
2. I seem to be perfectly suited for these articles - I have a general interest in the topic, and I also know Russian and can work with Russian sources. On the other hand, I am not a native speaker, and I may leave some slight spelling errors, incorrect wording, etc. This may be a problem, and generally I am not sure how it can be solved. However, on balance, I have probably created more useful content than trouble.
3. Even for a technically experienced user like myself, it is difficult to start contributing to the project. I was able to clear the barrier, but I am afraid many ordinary users would leave, not being able to understand the usage of templates and similar things. On the other hand, I got some necessary help, and I know where to ask if I need more.
4. Comparing the quality of this particular class of articles to Russian Wikipedia, I see both advantages and disadvantages. Obviously, there is a high chance that someone simply living in the district will add some info to the article in Russian Wikipedia. On the other hand, there are two major problems with these articles in Russian Wikipedia - copyright violations (big pieces are added to articles and stay there for years; things became considerably better with the implementation of flagged revisions, but the problem persists) and the addition of a large number of insignificant and often unsourced details, including spamming of local-interest websites. The English articles are completely free of these problems. I realize, though, that this line of reasoning cannot be generalized to all articles, since articles on other topics may have very different issues.
Summing up: there is plenty of room at the bottom (c) Richard Feynman.
Cheers Yaroslav
On Fri, Jun 3, 2011 at 2:02 PM, Yaroslav M. Blanter putevod@mccme.ru wrote:
this would be 10 years of my work. Note that this is just a narrow topic which does not overlap with my professional interest (I am a theoretical physicist specializing in nanoscience). In this field, I am just an amateur (maybe slightly above the average level).
You are interested in this topic; many users (or most, I am afraid) are not.
My conclusions from this two weeks' experience:
- Maybe the really low-hanging fruit, almost at ground level, has been picked, but in the vast majority of articles there is much room for improvement. Note that I did not add anything special - only the basic info which you would expect to find in an encyclopaedia. I did not aim at GA or FA. I used maybe 10% of the information I had, and what I had I found on the internet.
You have expert knowledge in this topic, namely knowledge of the Russian language. Most English Wikipedia editors don't.
There is plenty of low-hanging fruit in classical music articles (especially in Russian Wikipedia). Using the Grove Dictionary of Music and Musicians, one could add more than 1,000 paragraphs of text to Wikipedia just by retelling its content. Grove offers basic coverage of its topics, and it runs to 29 volumes; I think that describes the breadth of the field well. Anyone claiming there is nothing to write about there must be kidding.
The reasons nobody writes them are: 1. nobody cares; 2. nobody understands the terms (most of which are fairly easy to learn).
- I seem to be perfectly suited for these articles - I have a general interest in the topic, and I also know Russian and can work with Russian sources. On the other hand, I am not a native speaker, and I may leave some slight spelling errors, incorrect wording, etc. This may be a problem, and generally I am not sure how it can be solved. However, on balance, I have probably created more useful content than trouble.
The very fact that you were able to write this paragraph suggests that your English is good enough for writing English Wikipedia articles. Grammar, unless it is horrible, is only an issue for FAs and GAs, and there you can get someone to help you.
- Even for a technically experienced user like myself, it is difficult to start contributing to the project. I was able to clear the barrier, but I am afraid many ordinary users would leave, not being able to understand the usage of templates and similar things. On the other hand, I got some necessary help, and I know where to ask if I need more.
Please share your findings with our usability team. Even if the team does not exist anymore (I am not completely sure), they would be glad to help.
- Comparing the quality of this particular class of articles to Russian Wikipedia, I see both advantages and disadvantages. Obviously, there is a high chance that someone simply living in the district will add some info to the article in Russian Wikipedia. On the other hand, there are two major problems with these articles in Russian Wikipedia - copyright violations (big pieces are added to articles and stay there for years; things became considerably better with the implementation of flagged revisions, but the problem persists) and the addition of a large number of insignificant and often unsourced details, including spamming of local-interest websites. The English articles are completely free of these problems. I realize, though, that this line of reasoning cannot be generalized to all articles, since articles on other topics may have very different issues.
I assume that this is because they are less popular. Articles about US towns may have the same problems.
-Victor
You are interested in this topic; many users (or most, I am afraid) are not.
You have expert knowledge in this topic, namely knowledge of the Russian language. Most English Wikipedia editors don't.
There is plenty of low-hanging fruit in classical music articles (especially in Russian Wikipedia). Using the Grove Dictionary of Music and Musicians, one could add more than 1,000 paragraphs of text to Wikipedia just by retelling its content. Grove offers basic coverage of its topics, and it runs to 29 volumes; I think that describes the breadth of the field well. Anyone claiming there is nothing to write about there must be kidding.
The reasons nobody writes them are:
- Nobody cares;
- Nobody understands the terms (most of which are fairly easy to learn).
Well, I think you have just confirmed my points. For a given topic, most of the readers and most of the editors show no interest. I am interested in maybe 10 topics but do not care about 90 more. This is perfectly fine. Moreover, I am in this context just an average Wikimedian, who happens to have just one advantage - speaking a foreign language (with respect to the project I am contributing to). There are plenty of such people walking around. And I did not have any difficulty finding a topic where I can contribute, where very few people can contribute, and which is in principle needed (not a luxury or an extra). It would be a different story if I were contributing to the topic of my professional interest, but as it is I could easily find a dozen more topics I can work on. (Maybe when I retire and Wikipedia is still around, I will also work on those.) Which means there is plenty of fruit around, and much of this fruit is actually low-hanging. That was my point.
As for usability: the last time I checked, the usability wiki was dead, as was WikiProject Usability on en.wp. If someone can show me an appropriate place to list my issues (meaning there is somebody there who can use them), I will do it; otherwise I will not bother to spend an hour describing them.
Cheers Yaroslav
On 3 June 2011 14:54, Yaroslav M. Blanter putevod@mccme.ru wrote:
As for usability: the last time I checked, the usability wiki was dead, as was WikiProject Usability on en.wp. If someone can show me an appropriate place to list my issues (meaning there is somebody there who can use them), I will do it; otherwise I will not bother to spend an hour describing them.
wikitech-l is a place where the people who would deal with such issues are reading. But foundation-l is not a bad place either, IMO.
- d.
On 06/03/11 3:46 AM, Victor Vasiliev wrote:
On Fri, Jun 3, 2011 at 2:02 PM, Yaroslav M. Blanter putevod@mccme.ru wrote:
this would be 10 years of my work. Note that this is just a narrow topic which does not overlap with my professional interest (I am a theoretical physicist specializing in nanoscience). In this field, I am just an amateur (maybe slightly above the average level).
You are interested in this topic; many users (or most, I am afraid) are not.
It can often be better to write about something where one is NOT a professional. It takes special skill to write about professional areas in a way that the general public will understand, even in one's native language.
My conclusions from this two weeks' experience:
- Maybe the really low-hanging fruit, almost at ground level, has been picked, but in the vast majority of articles there is much room for improvement. Note that I did not add anything special - only the basic info which you would expect to find in an encyclopaedia. I did not aim at GA or FA. I used maybe 10% of the information I had, and what I had I found on the internet.
You have expert knowledge in this topic, namely knowledge of the Russian language. Most English Wikipedia editors don't.
There is plenty of low-hanging fruit in classical music articles (especially in Russian Wikipedia). Using the Grove Dictionary of Music and Musicians, one could add more than 1,000 paragraphs of text to Wikipedia just by retelling its content. Grove offers basic coverage of its topics, and it runs to 29 volumes; I think that describes the breadth of the field well. Anyone claiming there is nothing to write about there must be kidding.
The reasons nobody writes them are:
- Nobody cares;
- Nobody understands the terms (most of which are fairly easy to learn).
Most unilingual English editors are surprised by the vast quantity of low-hanging fruit. Out of curiosity, I once looked up the fairly common Spanish name "Reyes" in the original 70-volume "Enciclopedia universal ilustrada". I found 30 individuals there with that simple, uncompounded surname. Only two of them appeared in the English Wikipedia, and only one of the two in the Spanish Wikipedia. Could something similar be said of the Great Soviet Encyclopedia? I have previously looked at a couple of short encyclopedic works in Russian, one relating to hockey and one to movies. Both addressed their subject as it related to the United States as well as their own country. Comparably sized American publications would leave the reader wondering whether there even was such a thing as a Russian film industry. The dismissive attitude that Russian works were written by Communists wears quite thin in subjects that are inherently apolitical.
I assume that this is because they are less popular. Articles about US towns may have the same problems.
In the earliest days, the starting articles for most small US towns were botted in. Many people complained about this at the time.
Ec
You have expert knowledge in this topic, namely knowledge of the Russian language. Most English Wikipedia editors don't.
<...>
Most unilingual English editors are surprised by the vast quantity of low-hanging fruit. Out of curiosity, I once looked up the fairly common Spanish name "Reyes" in the original 70-volume "Enciclopedia universal ilustrada". I found 30 individuals there with that simple, uncompounded surname. Only two of them appeared in the English Wikipedia, and only one of the two in the Spanish Wikipedia. Could something similar be said of the Great Soviet Encyclopedia? I have previously looked at a couple of short encyclopedic works in Russian, one relating to hockey and one to movies. Both addressed their subject as it related to the United States as well as their own country. Comparably sized American publications would leave the reader wondering whether there even was such a thing as a Russian film industry. The dismissive attitude that Russian works were written by Communists wears quite thin in subjects that are inherently apolitical.
Indeed, 7 out of the 19 articles I created over three weeks are in the Great Soviet Encyclopaedia (and about a dozen more among those I expanded, as well). One of them (on the folklorist Anna Astakhova, which I started today) does not exist in Russian Wikipedia. Admittedly, all of them (in the GSE, not those I created) are very short and hardly useful for creating a Wikipedia article - although the GSE gives clear proof of notability, just in case.
Cheers Yaroslav
There is certainly a lot of low hanging fruit.
I don't think we've covered more than a few percent of the topics currently considered notable. We could still grow by a factor of 10 or more just by covering things that others have already included in existing summaries and references.
But the parallel to Feynman's idea of "the bottom" might be verifiable and locally important/notable things which until now have not been included in encyclopedias for reasons of size and cost. At that level, we could grow by another few orders of magnitude while still organizing and sharing valuable knowledge.
On Wed, Jun 8, 2011 at 6:05 AM, Ray Saintonge saintonge@telus.net wrote:
Most unilingual English editors are surprised by the vast quantity of low hanging fruit. Out of curiosity at one time I looked up the fairly common Spanish name "Reyes" in the original 70 volume "Enciclopedia universal ilustrada". I found 30 individuals there with that simple uncompounded surname. Only two of these appeared in the English Wikipedia, and only one of the two in the Spanish Wikipedia.
Remembering a discussion about Wiki Loves Monuments, I believe there are around 1M European monuments... most don't have an article in any language, or any mention at all in any Wikipedia. And yet there are public lists that briefly indicate their existence, importance, and basic information.
It might be useful to expand lists of "topics in <foo>" from various existing data sources, such as publishers' or libraries' lists of published works or art; major non-English encyclopedias; major specialist encyclopedias; and lists of monuments or public works.
SJ
On Wed, Jun 01, 2011 at 01:17:15PM -0700, phoebe ayers wrote:
This week, the Wikimedia Foundation Board of Trustees unanimously passed a resolution addressing the issue of controversial content on the projects. The Board also unanimously passed a resolution addressing images of identifiable, living people on the projects. The resolutions are posted at:
http://wikimediafoundation.org/wiki/Resolution:Controversial_content
Re:
"We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable readers to easily hide images hosted on the projects that they do not wish to view, either when first viewing the image or ahead of time through preference settings. We affirm that no image should be permanently removed because of this feature, only hidden; that the language used in the interface and development of this feature be as neutral and inclusive as possible; that the principle of least astonishment for the reader is applied; and that the feature be visible, clear and usable on all Wikimedia projects for both logged-in and logged-out readers."
At the time, this point looked pretty uncontroversial, especially in context. However, I feel that most currently proposed mechanisms for implementing this point actually (indirectly) violate the other points in the resolution.
To wit, the proposed implementation of a category system for controversial content (required for many plausible implementations of this point) is exploitable by third parties and/or can lead to in-community conflicts, depending on the exact implementation chosen.
Such exploits and/or conflicts could indirectly end up censoring Wikipedia, and/or end up violating the Neutral Point of View founding principle.
Also, the consultation with the community is currently rather heavy-handed, by which I mean that the power balance might not be in favor of those who are most affected by the implementation.
This is something that should certainly be watched carefully, and perhaps further amendment, clarification, or retraction by the foundation might be needed.
sincerely, Kim Bruning
On Fri, Aug 19, 2011 at 9:03 AM, Kim Bruning kim@bruning.xs4all.nl wrote:
Thanks Kim; I agree there's a lot of room to figure out the best way to do this, and problems with possible interpretations or implementations. That's part of the thought behind putting this up for another round of discussion (albeit in a different manner than the other rounds).
As for the power balance issue: this tool is ultimately for the readers. We don't have a good way for readers to vote, though. And I am also personally sympathetic to the idea that the stakeholders -- i.e. the editing community -- should be the ones to vote anyway. We did set a very low suffrage bar for this vote (10 edits, in good standing): I think it might be the lowest ever, actually. I think one thing that will come out of this, which I'm really happy about, is that we will learn a lot more about a broadly consultative vote and how to do it well.
best, phoebe
On 19/08/2011 2:46 PM, phoebe ayers wrote:
I think one thing that will come out of this, which I'm really happy about, is that we will learn a lot more about a broadly consultative vote and how to do it well.
I think that the first thing that should be learned -- and indeed that should have been learned /before/ this farce -- is that begging the question in a "referendum" is fundamentally dishonest.
I was oh so very pleased to learn that I get to give my opinion on insignificant implementation details of a "feature" that stands in opposition to everything Wikipedia stands for which is going to be committed against us whether we like it or not.
-- Coren / Marc
On 19 August 2011 20:50, Marc A. Pelletier marc@uberbox.org wrote:
I was oh so very pleased to learn that I get to give my opinion on insignificant implementation details of a "feature" that stands in opposition to everything Wikipedia stands for which is going to be committed against us whether we like it or not.
If it gets a really low score on the 0-10 "do you want this?" scale, it is likely to be referred back to the Board before proceeding (with "the community hates this idea *this much*"). So it's still killable.
(0 might be "let's publicise http://en.wikipedia.org/wiki/Help:Options_to_not_see_an_image ".)
- d.
On Fri, Aug 19, 2011 at 09:33:07PM +0100, David Gerard wrote:
On 19 August 2011 20:50, Marc A. Pelletier marc@uberbox.org wrote:
I was oh so very pleased to learn that I get to give my opinion on insignificant implementation details of a "feature" that stands in opposition to everything Wikipedia stands for which is going to be committed against us whether we like it or not.
If it gets a really low score on the 0-10 "do you want this?" scale, it is likely to be referred back to the Board before proceeding (with "the community hates this idea *this much*"). So it's still killable.
Actually, there's no "do you want this" question.
It's all more along the lines of "Choose! Choose the form of your destructor!" [1]
So I picked the Stay-Puft Marshmallow Man, as that's the most harmless thing that I could think of.
sincerely, Kim Bruning
[1] http://en.wikiquote.org/wiki/Ghostbusters
On Fri, Aug 19, 2011 at 11:46:45AM -0700, phoebe ayers wrote:
Thanks Kim; I agree there's a lot of room to figure out the best way to do this, and problems with possible interpretations or implementations. That's part of the thought behind putting this up for another round of discussion (albeit in a different manner than the other rounds).
I appreciate the thought. :-)
What I think we all want to do is approach the problem from all angles, and converge on a working answer. (Or at least converge on "We've done our best at least")
As for the power balance issue: this tool is ultimately for the readers. We don't have a good way for readers to vote, though. And I am also personally sympathetic to the idea that the stakeholders -- i.e. the editing community -- should be the ones to vote anyway. We did set a very low suffrage bar for this vote (10 edits, in good standing): I think it might be the lowest ever, actually.
I feel that things like the voting population or the suffrage bar are not so relevant to the outcome of a vote. The questions, the counting method, and the interpretation of the numbers tend to have a lot more influence.
I think one thing that will come out of this, which I'm really happy about, is that we will learn a lot more about a broadly consultative vote and how to do it well.
You're so enthusiastic about this! Can we chat online at some point?
In any case, as long as we have another round after this, possibly things will end up ok in the long term.
sincerely, Kim Bruning
-- "be professional, be polite, have a plan to reach consensus with everyone you meet"
On Fri, Aug 19, 2011 at 06:03:24PM +0200, Kim Bruning wrote:
To wit, the proposed implementation of a category system for controversial content (required for many plausible implementations of this point) is exploitable by 3rd parties and/or can lead to in-community conflicts; depending on the exact chosen implementation.
I would like to expand on this point.
Apparently, the creation of such categorisation schemes has been discussed by the American Library Association as far back as 1951.
A relevant paragraph from their conclusion (last amended Jan 19, 2005):
"Labels on library materials may be viewpoint-neutral directional aids that save the time of users, or they may be attempts to prejudice or discourage users or restrict their access to materials. When labeling is an attempt to prejudice attitudes, it is a censor's tool. The American Library Association opposes labeling as a means of predisposing people's attitudes toward library materials." [1]
I assume that the board was previously not aware of the central role such a categorisation scheme would take in any form of practicable image filter.
sincerely, Kim Bruning
[1] http://www.ala.org/Template.cfm?Section=interpretations&Template=/Conten...
On Tue, Aug 23, 2011 at 16:20, Kim Bruning kim@bruning.xs4all.nl wrote:
The Board was aware of that, as Robert Harris's first report included very similar text from the Canadian library association.
On Tue, Aug 23, 2011 at 05:21:23PM +0200, Milos Rancic wrote:
The Board was aware of that, as Robert Harris's first report included very similar text from the Canadian library association.
I would then like to point out that there is no practical way to make a value-neutral categorisation scheme to use for filtering.
You can't have your cake and eat it too. Either the scheme is neutral *or* it contains a value judgement so as to be usable for filtering; logic holds that there is no middle ground here. Further, any value judgement, once made, cannot be culturally neutral in practice.
I'm going to assume in good faith that the board was not aware of this minor flaw. But I can't really stretch it much further.
Could board members please chime in and elucidate?
sincerely, Kim Bruning
I suspect the only answer you are going to get is that some people or places would need it. What you will not get is any explanation for why the WMF itself should do it, instead of letting those who want it do it as they wish outside Wikipedia, as our licensing permits. (You might get the reason that we can do it more gently, and perhaps we could, but then it would hardly satisfy those who really want to censor. Anyone on the Board who wants to pursue that route is free to set up an organization to enable their own gentle value judgments.) If someone wants to make what amounts to a skin for whatever purpose, there is nothing to stop them. We are right to have a license that permits this, even if 95% of us disapprove of their purposes. They can use whatever they might want from the neutral tags we should be applying in our categorization, or devise whatever scheme of their own they find appropriate for expressing their own values.
Our principles of freedom for the use of our material are so broad that they make no value judgments at all, and include even the freedom to censor.
foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
On Tue, Aug 23, 2011 at 8:14 AM, Kim Bruning kim@bruning.xs4all.nl wrote:
On Tue, Aug 23, 2011 at 05:21:23PM +0200, Milos Rancic wrote:
The Board was aware of that, as Robert Harris's first report included very similar text from the Canadian library association.
I would then like to point out that there is no practical way to make a value-neutral categorisation scheme to use for filtering.
This seems like an over-hasty statement. There are many possible categorization schemes that are neutral; the ALA in fact makes that distinction itself, since libraries (obviously) use all kinds of labeling and categorization schemes all the time. The ALA and other library organizations have taken a stand against censorious and non-neutral labeling, not all labeling. If you keep reading the ALA page you linked, it says that labels are inappropriate when "the prejudicial label is used to warn, discourage or prohibit users or certain groups of users from accessing the material" -- e.g. a label that reads "not appropriate for children". That does not mean that picture books for kids, or mystery novels, or large-print books, aren't labeled as such in every public library in the country -- and that is the difference between informative and prejudicial labeling.
The ALA also makes a point of stating that materials should be on open shelves and accessible to everyone regardless of labeling -- this comes out of, among other things, the once-common practice of not allowing children in the adult section of the library. The natural equivalent for us I think is to make sure that all materials we host are accessible to everyone regardless of any label, which is certainly a principle we have and continue to uphold.
The Board didn't specify any particular mechanism or system in our resolution. What we did was to ask for a particular kind of feature and spell out some principles for its development. We talked about neutral language in the interface, and our intent was exactly that distinction I noted between informative and prejudicial -- we do not wish to set up a system that privileges certain value judgments about content. We wish *readers to have a choice* when they use our projects -- one they do not have now unless they are remarkably technically inclined and forward-looking.
We didn't address the categorization system in particular because frankly, it's not our business. It's the community's, and tech's. And the Trustees didn't all agree on whether we thought categorization as proposed in the first draft of the system was the best idea, anyway; some of us thought it was appropriately in line with the principle of least astonishment, and some of us thought it could lead to problems. But we did come to consensus on the high-level idea as expressed in the resolution, and we agreed and understood that the ideas around how to implement it would have to iterate, with reevaluation along the way. But after all, developing informative, neutral and useful systems for organizing information is something that the Wikimedia projects have become world-famous for -- so if anyone can do it I have faith that we can :)
As I told DGG, there's a lot of caveats in that resolution. And those caveats are there for a reason. It should not be extrapolated that the Board as a whole *actually* supports a particular, or different, or more censorious, filtering scheme. What we want is for people to easily be able to hide images for themselves if they don't want to see them when using our projects. (And we also want other things, like better tools for Commons, that are expressed in other parts of that resolution.)
I know we are all looking forward to seeing the referendum results, and the data from it will need to be carefully considered. In the meantime I am glad to see more discussion of this, but I am remembering that it is a stressful topic!
best, -- phoebe
On Thu, Aug 25, 2011 at 12:10:12PM -0700, phoebe ayers wrote:
On Tue, Aug 23, 2011 at 8:14 AM, Kim Bruning kim@bruning.xs4all.nl wrote:
I would then like to point out that there is no practical way to make a value-neutral categorisation scheme to use for filtering.
This seems like an over-hasty statement. There are many possible categorization schemes that are neutral;
Labels designed for other purposes need not be prejudicial, I agree. But then I don't think they would be (as) suitable for use by a filter.
organizations have taken a stand against censorious and non-neutral labeling, not all labeling. If you keep reading the ALA page you linked, it says that the kind of labels that are not appropriate are when "the prejudicial label is used to warn, discourage or prohibit users or certain groups of users from accessing the material" -- e.g. a label that reads "not appropriate for children".
Well, as far as I can tell, any label that suggests "... and you might want to filter this" falls under this definition of a prejudicial label that is used to warn and discourage users (and may be used by 3rd parties to prohibit users).
Am I missing something?
The Board didn't specify any particular mechanism or system in our resolution.
Fair enough. So if we can get much of what some people want without resorting to labelling, that'd be ok too?
But after all, developing informative, neutral and useful systems for organizing information is something that the Wikimedia projects have become world-famous for -- so if anyone can do it I have faith that we can :)
I'm not sure I like the idea of developing an informative and useful system for non-neutrality. I know we're not *deliberately* trying to do that, but the discussion *does* keep crossing that line accidentally, and I get queasy easily. ^^;;
As I told DGG, there's a lot of caveats in that resolution. And those caveats are there for a reason. It should not be extrapolated that the Board as a whole *actually* supports a particular, or different, or more censorious, filtering scheme. What we want is for people to easily be able to hide images for themselves if they don't want to see them when using our projects. (And we also want other things, like better tools for Commons, that are expressed in other parts of that resolution.)
*nod* I understand what is wanted.
I just think that part of the discussion should be about the actual practical feasibility of this aim within the limits of our foundation objectives. We've been asking the wrong questions at the wrong times.
I know we are all looking forward to seeing the referendum results, and the data from it will need to be carefully considered.
The data needs to be very, very carefully piped to /dev/null.
The list of issues with the 'referendum' is too long to fit in the margin of this foundation-l post. Chief among those issues, however, is that it is not a referendum. [1]
To wit: it doesn't ask whether people accept or reject this proposal.
An example of something closer to a referendum on the topic can be found at de.wikipedia at [2].
In the meantime I am glad to see more discussion of this, but I am remembering that it is a stressful topic!
I think we're trying to fit too many angels on the heads of our pins. I'm kind of worried, at what point will the angels fall off? O:-)
sincerely, Kim Bruning
[1] "A referendum (also known as a plebiscite or a ballot question) is a direct vote in which an entire electorate is asked to either accept or reject a particular proposal." Referendum. (2011, August 23). In Wikipedia, The Free Encyclopedia. Retrieved 22:06, August 25, 2011, from http://en.wikipedia.org/w/index.php?title=Referendum&oldid=446246705
[2] http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C...
Phoebe, I ask you once more: how do you go from the statement that filtering should be available to those who want to use it, to the statement that the WMF should provide filtering? It's like going from the statement that people should be able to use Wikipedia content for political purposes, to the statement that the WMF should do so. Or to make it plainer, that people who find Wikipedia articles appropriate for advocating their religious beliefs may use the content for that purpose, to that the WMF should find some universally acceptable sets of spiritual beliefs, and use its content to advocate them. Taking one of the proposed possibilities (probably the one that instigated this), providing for censoring images on the grounds of sexual content is doing exactly that for views on sexual behavior. We're officially saying that X is content you may find objectionable, but Y isn't. That's making an editorial statement about what is shown on X and Y. We can make a descriptive statement, as libraries do (in this case, perhaps that X shows naked human female breasts, and Y shows male ones), but not an editorial one, that one and not the other is likely to be objectionable. Anyone is certainly free to make such an assertion, but not the Foundation.
I want to ask you something else. It's been suggested several times at various places that the present resolution is justified as a compromise to prevent a considerably more repressive form of censorship. I'm not asking who, though I can guess one or two of them from their previous public statements, but I think it would be very enlightening to know those positions. Perhaps you could, however, say what those proposals were. I am not going to put you on the spot by asking whether, if such a view had been the majority, you would still have voted for it to preserve unanimity. I am however going to ask whether the fact that such proposals were entertained shows the validity of the argument that we're on a slippery slope. Once you admit censorship, it's hard to limit it; once you admit POV editing, it inevitably develops into arrant promotionalism. Censorship is inherently POV editing.
David
On Fri, Aug 26, 2011 at 6:45 AM, David Goodman dggenwp@gmail.com wrote:
I want to ask you something else. It's been suggested several times at various places that the present resolution is justified as a compromise to prevent a considerably more repressive form of censorship.
This implies that the proposed image hiding feature is a less repressive form of censorship. I do not see the proposed feature as censorship - all the images remain on the site. Nothing is removed. Nothing is suppressed. Everything remains. [1]
I am however
going to ask whether the fact that such proposals were entertained, shows the validity of the argument that we're on a slippery slope.
Are we truly on a slippery slope with 'informative labelling' with neutral language? Or can this be considered another aspect of curation?
Once you admit censorship, it's hard to limit it; once you admit POV editing, it inevitable develops into arrant promotionalism. Censorship is inherently POV editing.
Are we really admitting censorship via the front or even through the back door through the image hiding feature?
If everything remains on the site, and you and I can continue to see everything that exists just as we do today, how are we 'admitting censorship'? I have read the comments on meta, about the possibility of this opening doors to government requests for removal of content - that, in my view would be censorship. The Board resolution affirms that "Wikimedia projects are not censored." [2]
Cheers Bishakha
[1] http://en.wikipedia.org/wiki/Censorship [2] http://wikimediafoundation.org/wiki/Resolution:Controversial_content
On 26 August 2011 08:55, Bishakha Datta bishakhadatta@gmail.com wrote:
Are we truly on a slippery slope with 'informative labelling' with neutral language? Or can this be considered another aspect of curation?
We have a category system. Modulo idiots (the danger of a wiki is that people can edit it), it mostly works.
How neutral can a "block any image in these categories" system be?
- d.
On Fri, Aug 26, 2011 at 10:15:48AM +0100, David Gerard wrote:
On 26 August 2011 08:55, Bishakha Datta bishakhadatta@gmail.com wrote:
Are we truly on a slippery slope with 'informative labelling' with neutral language? Or can this be considered another aspect of curation?
We have a category system. Modulo idiots (the danger of a wiki is that people can edit it), it mostly works.
How neutral can a "block any image in these categories" system be?
In that scenario: On day 1, the system, and categories, will be entirely neutral. However the categories that are blessed (cursed) by the image-hiding system are now potentially non-neutral.
Due to the way wikis work: by day ~365, the categories will very likely be non-neutral in practice. We'll then be in the same situation as if we had started out with non-neutral categories in the first place (verboten).
sincerely, Kim Bruning
On Fri, Aug 26, 2011 at 01:25:32PM +0530, Bishakha Datta wrote:
On Fri, Aug 26, 2011 at 6:45 AM, David Goodman dggenwp@gmail.com wrote:
I want to ask you something else. It's been suggested several times at various places that the present resolution is justified as a compromise to prevent a considerably more repressive form of censorship.
This implies that the proposed image hiding feature is a less repressive form of censorship. I do not see the proposed feature as censorship - all the images remain on the site. Nothing is removed. Nothing is suppressed. Everything remains.
The image hiding feature itself is not a form of censorship, as far as I'm aware.
The data used to feed the image hiding feature can be classified as a "censorship tool" (Source: ALA... Read The Fine Thread for details).
Even if we *never* build the image hider itself, but just prepare special categories for it, we would be participating in (stages of) censorship.
sincerely, Kim Bruning
I think there are definitely some neutral criteria which might be applicable. And maybe there are some criteria which are harder to neutralize (yeah, i know - has a different meaning :) )
Take for example nudity. It should be possible to create categories like "Images that show a vagina" or "Images that show a penis", which can even be subcategorized into "(...) as main topic of the picture" or "(...) as detail of the picture". It will require some work and thinking by neutrality thinkers like you, but it should be possible. And I'm confident that you and the likes of you will stay close to the topic to help us remember that we should make it as objective as possible.
The next step is that someone can use these neutral categories to choose what he/she wants or does not want to see. For example, maybe someone has a fear of elevators, so that person can hide all images in the category "images that show an elevator".
Violence is definitely a topic harder to define objectively - but I'm confident we'll find a way to do that. If people have problems with that, we shouldn't change the categories (we could add more), but they should change their filter, and choose other categories to hide/show.
The only truly non-neutral part could be where we suggest which categories someone might want to hide. Or packages of categories.
Lodewijk
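[Editorial note: the mechanism Lodewijk describes -- a reader picks categories to hide for themselves, and images in those categories are simply not rendered for that reader -- can be sketched in a few lines. This is purely illustrative; the `Image` class and `visible_images` function are invented for the sketch and do not correspond to any actual MediaWiki code or API.]

```python
# Minimal sketch of a per-user, category-based image filter.
# All names here are hypothetical; nothing below reflects MediaWiki internals.

from dataclasses import dataclass, field


@dataclass
class Image:
    title: str
    categories: set = field(default_factory=set)


def visible_images(images, hidden_categories):
    """Return the images that share no category with the user's hide-list.

    Nothing is removed from the collection itself; the reader merely
    chooses not to render some of it -- the opt-in, reversible behavior
    discussed upthread.
    """
    hidden = set(hidden_categories)
    return [img for img in images if not (img.categories & hidden)]


images = [
    Image("Elevator_interior.jpg", {"Images that show an elevator"}),
    Image("Mona_Lisa.jpg", {"Paintings by Leonardo da Vinci"}),
]

# A reader with a fear of elevators hides that one category:
shown = visible_images(images, {"Images that show an elevator"})
print([img.title for img in shown])  # -> ['Mona_Lisa.jpg']
```

Note that the sketch is value-free only to the extent that the category names themselves are descriptive rather than prejudicial, which is exactly the point of contention in this thread.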
If this should succeed I shall work as I do now, in other areas. I want to add content and keep out spam, not to dispute whether, for example, "images that show a human penis" should include ones where the anatomical details are blurred, or only the outline visible. There is no point in discussing the details of censorship with censors; there is a point in discussing the concept of censorship with the people who are inclined to support it.
Labeling designed to accommodate censorship is censorship, as Kim says. This labeling is proposed to be done on the basis not of the regular Commons categories, but of special ones designed for the purpose; and not by the regular WP editors, but by a special committee. (There is a valid argument that the present manner of categorizing images needs some major improvements.) As Lodewijk says, anyone who wants to make use of these categories, for any purpose, is free to do so outside WP. If they want to design a filter imposed on access to WP using them, they are free to do so. If they want to use their own categories for this, they are free to do so. If they want to use computer image analysis for this, they are free to do so. I personally consider these at best unproductive things to do, but anyone else is free to think & act otherwise.
The key question remains: ''Why on Wikipedia'', when it even gives the appearance of being opposed to our principles?
As for the slippery slope, Kim gives one way it can happen. There are others, which I think are pretty obvious to those who would support them. It would take very little to change the wording or appearance of the button to make it more obtrusive, or to initially hide the image. It would be easily possible to have the hide preference panel set to hide particular classes of images unless changed, instead of being blank. It would take the flip of a single bit to change the default to "hide," whether for anon users, or everyone. It wouldn't be that hard to make changing the default for some classes of images a two-step process, with the second being "are you sure?", or even "are you of legal age in your jurisdiction?". All of these steps are under the control of the people who imposed the system in the first place.
This is why I asked the question: what more drastic proposals were being supported at the board? The very fact that they were suggested at the board level implies there are some there who would do these things, and proves the slippery slope argument to be real. Eventually we may not have someone as sensible as Phoebe, and the others who feel as she does, to stop them (but as they are not commenting, it is not appropriate to name them--I give them my apologies). Now, if I am wrong, and there were not any more drastic alternatives considered, I will need to retract this--but it was described as a compromise.
On 26 August 2011 16:06, David Goodman dggenwp@gmail.com wrote:
This labeling is proposed to be done on the basis not of the regular commons categories, but of special ones designed for the purpose; not on the regular WP editors, but a special committee.
Ooh, *really*. Then this initiative will be bitterly resisted at every turn.
- d.
On 26 August 2011 12:35, Kim Bruning kim@bruning.xs4all.nl wrote:
This implies that the proposed image hiding feature is a less repressive form of censorship. I do not see the proposed feature as censorship - all the images remain on the site. Nothing is removed. Nothing is suppressed. Everything remains.
The image hiding feature itself is not a form of censorship, as far as I'm aware of.
Just as an interesting point I've not seen mentioned yet: ar.wp has an image-hiding feature, implemented using a template (قالب:إخفاء صورة) and which effectively conceals the image until the user clicks to display.
It's manually added to pages, is currently used in ~100 (predominantly medical/sexual?) articles, and has been used for approximately three years. I'm not aware of any other projects currently using a similar one, but it doesn't seem to have caused the end of the world there :-) My Arabic is basically nonexistent, so while I can tell there *are* some past discussions about it, I've no idea what they were saying. Anyone?
On Sat, Aug 27, 2011 at 11:40:24PM +0100, Andrew Gray wrote:
On 26 August 2011 12:35, Kim Bruning kim@bruning.xs4all.nl wrote:
This implies that the proposed image hiding feature is a less repressive form of censorship. I do not see the proposed feature as censorship - all the images remain on the site. Nothing is removed. Nothing is suppressed. Everything remains.
The image hiding feature itself is not a form of censorship, as far as I'm aware of.
I'm not aware of any other projects currently using a similar one, but it doesn't seem to have caused the end of the world there :-)
Right, because the image hiding feature itself is not a problem. No one has a problem with it AFAIK. No issues have been raised with the concept of hiding images in this thread IIRC.
The only thing ar could improve is to have the feature for all images, not just those with templates. (This would fix a potential minor exploit).
(Resummarizing thread: The novel proposal is to have an image filter that would *hide images by category*. And the problem there is the categories themselves, because categories used in an image filter are by definition non-neutral. (They're saying "you might want to filter this", and that's non-neutral.) The ALA has -- for something like half a century -- said that this kind of categorization is "a tool for censorship".)
sincerely, Kim Bruning
On 26 August 2011 02:15, David Goodman dggenwp@gmail.com wrote:
make it plainer, that people who find Wikipedia articles appropriate for advocating their religious beliefs may use the content for that purpose, to that the WMF should find some universally acceptable sets of spiritual beliefs, and use its content to advocate them. Taking one of the proposed possibilities (probably the one that instigated this), providing for censoring images on the grounds of sexual content is doing exactly that for views on sexual behavior. We're officially saying that X is content you may find objectionable, but Y isn't. That's making an editorial statement about what is shown on X and Y.
I've finally twigged what's worrying me about this discussion.
We're *already* making these editorial statements, deciding what is and isn't appropriate or offensive for the readers on their behalf, and doing it within articles on a daily basis.
When we, as editors, consider including a contentious image, we have a binary choice - do it or don't do it. It's not like text, where we can spend a nice meandering paragraph weighing the merits of position A and position B and referring in passing to position C; the picture's there or it isn't, and we've gone with the "inclusionist" or the "exclusionist" position. At the moment, there is a general consensus that, more or less, we prefer including images unless there's a problem with them, and when we exclude them, we do so after an editorial discussion, guided by policy and determined by our users on the basis of what they feel is appropriate, offensive, excessively graphic, excessively salacious, etc.
In other words, we decide whether or not to include images, and select between images, based on our own community standards. These aren't particularly bad as standards go, and they're broadly sensible and coherent and clear-headed, but they're ours; they're one particular perspective, and it is inextricably linked to the systemic bias issues we've known about for years and years. This is a bit of a weird situation for us to be in. We can - and we do - try hard to make our texts free of systemic bias, of overt value judgements, and so forth, and then we promptly have to make binary yes-or-no value judgements about what is and isn't appropriate to include in them. As Kim says upthread somewhere, these judgements can't and won't be culturally neutral.
(To use a practical example, different readers in different languages get given different sets of images, handled differently, in comparable Wikipedia articles - sometimes the differences are trivial, sometimes significant. Does this mean that one project is neutral in selection and one not? All sorts of cans of worms...)
As such, I don't think considering this as the first step towards censorship, or as a departure from initial neutrality, is very meaningful; it's presuming that the alternative is reverting to a neutral and balanced status quo, but that never really existed. The status quo is that every reader, in every context, gets given the one particular image selection that a group of Wikipedians have decided is appropriate for them to have, on a take-it-or-leave-it basis...
On Sat, Aug 27, 2011 at 10:19:45PM +0100, Andrew Gray wrote:
On 26 August 2011 02:15, David Goodman dggenwp@gmail.com wrote:
make it plainer, that people who find Wikipedia articles appropriate for advocating their religious beliefs may use the content for that purpose, to that the WMF should find some universally acceptable sets of spiritual beliefs, and use its content to advocate them. Taking one of the proposed possibilities (probably the one that instigated this), providing for censoring images on the grounds of sexual content is doing exactly that for views on sexual behavior. We're officially saying that X is content you may find objectionable, but Y isn't. That's making an editorial statement about what is shown on X and Y.
I've finally twigged what's worrying me about this discussion.
We're *already* making these editorial statements, deciding what is and isn't appropriate or offensive for the readers on their behalf, and doing it within articles on a daily basis.
What's worrying me about your discussion is that you're differentiating between users who are readers and users who are editors. You've given up on wikipedia being a wiki? ;-)
As such, I don't think considering this as the first step towards censorship, or as a departure from initial neutrality, is very meaningful; it's presuming that the alternative is reverting to a neutral and balanced status quo, but that never really existed.
I'm not saying there are no (minor) issues with neutrality as it stands. We work hard on that every day. I don't believe that just because we're not 100% perfect means that we can just give up and throw out NPOV entirely!
The status quo is that every reader, in every context, gets given the one particular image selection that a group of Wikipedians have decided is appropriate for them to have, on a take-it-or-leave-it basis...
The status quo is that anyone can edit an article or enter a discussion about the article. I don't understand why you would say it is take-it-or-leave-it. I can still, today, as an anon, remove or add images as I see fit. This is permitted and even encouraged, provided that what I am doing is sane (And thus most likely meets consensus).
sincerely, Kim Bruning
On Sun, Aug 28, 2011 at 11:04 AM, Kim Bruning kim@bruning.xs4all.nl wrote:
I can still, today, as an anon, remove or add images as I see fit. This is permitted and even encouraged, provided that what I am doing is sane (And thus most likely meets consensus).
Tried it lately?
pb
___________________
Philippe Beaudette
Head of Reader Relations
Wikimedia Foundation, Inc.
415-839-6885, x 6643
philippe@wikimedia.org
On Thu, Aug 25, 2011 at 10:10 PM, phoebe ayers phoebe.wiki@gmail.com wrote:
This seems like an over-hasty statement. There are many possible categorization schemes that are neutral; the ALA in fact makes that distinction itself, since libraries (obviously) use all kinds of labeling and categorization schemes all the time. The ALA and other library organizations have taken a stand against censorious and non-neutral labeling, not all labeling. If you keep reading the ALA page you linked, it says that the kind of labels that are not appropriate are when "the prejudicial label is used to warn, discourage or prohibit users or certain groups of users from accessing the material" -- e.g. a label that reads "not appropriate for children". That does not mean that picture books for kids, or mystery novels, or large-print books, aren't labeled as such in every public library in the country -- and that is the difference between informative and prejudicial labeling.
Would I be incorrect in pointing out that American public libraries routinely exclude world-famous children's book author Astrid Lindgren's children's books, because to puritanical minds a man who can elevate himself with a propeller beanie, and look into children's rooms thereby, smacks too much of pedophilia?
On Wed, Sep 21, 2011 at 6:31 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Uh... yes, you would be incorrect? I certainly checked out Astrid Lindgren books from the public library when I was a kid. I have never heard of them getting challenged in the US. Citation needed?
The ALA maintains a list of books that do get routinely challenged in US libraries here: http://www.ala.org/ala/issuesadvocacy/banned/frequentlychallenged/index.cfm. Note, this just means someone *asked* for the book to be removed from the public or school library, not that it actually was; libraries generally stand up to such requests.
Also note that challenges are typically asking for the book to be removed from the library altogether -- restricting access to it for everyone in the community -- as opposed to simply not looking at it yourself or allowing your own kids to check it out. It's the 'removal for everyone' part that is the problem; the issue here is freedom of choice: people should have the right to read, or not read, a particular book as they see fit.
-- phoebe
On Wed, Sep 21, 2011 at 5:53 PM, phoebe ayers phoebe.wiki@gmail.com wrote:
The Wikipedia article does mention the controversy, but omits the fact that several libraries did in fact pull the books from their inventory...
http://en.wikipedia.org/wiki/Karlsson-on-the-Roof
On 21.09.2011 17:21, Jussi-Ville Heiskanen wrote:
Most of the very popular books were removed due to other problems. Some had a format or case that would not suit (Madonna, for example). Others would be bought and immediately "sold out". It's simply not the job of a library to stock bestsellers as giveaways as soon as they come out. That is often misinterpreted as banning books. It just means that some books are bought later on, once the hype has settled down.
On 21.09.2011 16:53, phoebe ayers wrote:
As described multiple times earlier.
That is not the main problem. The categorization of the content _by ourselves_ is the problem. It runs strongly against the basic rules that made Wikipedia motivating and big. Your advocacy means more harm than benefit for the project. We would waste an enormous effort, open new battlefields apart from the content- and article-related discussions, and open the door to censorship. We would set an example that censorship or self-censorship is needed! Is that what you are trying to achieve?
It's your basic philosophy that sucks. It's _not_ the choice of the reader to hide images he doesn't like. It's the choice of the reader to hide images that others don't like! Now get a cup of tea and think about it.
Tobias
On 21.09.2011 at 17:36, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
It's the bad double-think that sucks. In most cases pictures give no necessary information in an article, or they represent no NPOV information at all. They just illustrate. No piece of information would be missing if the pictures were linked instead of shown. Often it is sheer chance which picture is chosen for an article.
But you are right. The basic conflict is philosophical. The question behind it is: shall we continue as tough guys with porn pictures, no limits and no rules, as everything started, or shall we include more sensitive people, women and nations?
Shall our knowledge come rudely, in one step, to everybody, or shall we try to reach more people by taking steps of least astonishment towards the same truth, at a pace everybody can live with?
For me this discussion is hypocritical. Don't hide yourself behind the "choice of the reader". The writers of an article choose alone. They choose words, order and content. The pictures are in most cases the least important of these. So every article hides a lot of information the writers chose not to show. That's normal. And they normally flippantly forget to write in a style the more sensitive can live with, that's all.
On 21.09.2011 18:31, Kanzlei wrote:
It's the bad double-think that sucks. In most cases pictures give no necessary information in an article, or they represent no NPOV information at all. They just illustrate. No piece of information would be missing if the pictures were linked instead of shown. Often it is sheer chance which picture is chosen for an article.
For the same reason you could write articles consisting only of links, since writing the article itself would "represent no NPOV information at all". Do you really believe the nonsense you just wrote down?
But you are right. The basic conflict is philosophical. The question behind it is: shall we continue as tough guys with porn pictures, no limits and no rules, as everything started, or shall we include more sensitive people, women and nations?
We already include them. The problem isn't some articles. The problem is the knowledge needed to participate in an encyclopedia that forces you to understand a complete syntax before you even know what you're doing. That makes us geeky, not our content. And then there is this claim:
"tough guys with porn pictures, no limits and no rules".
Sorry, I won't comment on this. It's just so out of place, such complete nonsense and strong wording.
Shall our knowledge come rudely, in one step, to everybody, or shall we try to reach more people by taking steps of least astonishment towards the same truth, at a pace everybody can live with?
We have no problem reaching people. We have a problem letting them participate. The images aren't the issue. The main issue is the editor and overall project climate. Aggressive people, who use one false claim after another or who would append {{citation needed}} after every word, are the ones that drive authors away. Just let people do as they please, and don't tell them what they shouldn't look at. That is their own decision. The WMF should provide them tools to edit and to discuss, but not tools to hide the actual content.
For me this discussion is hypocritical. Don't hide yourself behind the "choice of the reader". The writers of an article choose alone. They choose words, order and content. The pictures are in most cases the least important of these. So every article hides a lot of information the writers chose not to show. That's normal. And they normally flippantly forget to write in a style the more sensitive can live with, that's all.
Whoever writes articles in "a style the more sensitive can live with" should just leave the project. That would be a bending of facts and a clear violation of NPOV.
On 9/21/2011 7:53 AM, phoebe ayers wrote:
I'm unable to find a source on this that doesn't appear to be relying on the Wikipedia article in the first place. The supposed rationale seems to be that Karlsson is sort of subversive, if you will, and the books might undermine traditional concepts of authority (for people of a certain era, maybe it also didn't help that the books were popular in the USSR). It's possible that somebody somewhere did question its inclusion once, which could be true of just about any book. Even if so, nothing suggests that the concern had anything to do with encouraging or catering to pedophiles. Were that the issue, I would have thought The Brothers Lionheart a more obvious target, seeing as how it has young boys bathing nude in a river (the scene is illustrated - child porn!), and I've never heard of it being banned either.
--Michael Snow
On 21.09.2011 18:56, Michael Snow wrote:
There might be a simple reason for that. Some nude boys bathing in a river has nothing to do with pornography, and therefore nothing to do with child pornography. That is a simple fact widely ignored in many discussions by fundamentalists, who claim that any depiction of a nude body is sexual and porn. Not even the law agrees with this extreme point of view.