In a message dated 5/24/2009 12:11:40 PM Pacific Daylight Time, thomas.dalton@gmail.com writes:
At any rate, the person would have to sue the editor, not the project, and the editor could stand on the basis of simply quoting the PDR.
Could they sue other people that have edited the article without fixing the mistake? What about someone that reverted vandalism to that sentence, thus putting back the incorrect information? We can't rely on the law only holding the person directly responsible liable.>>
I don't think you would agree if this logic were extended to all articles.
Am I responsible, when fixing the birthplace of George Bush, for the fact that someone else, in another section of that article, has said "He killed his parents when he was three"?
No, I'm not responsible for that. I'm solely responsible for the edits I make, not those of others.
Similar to reverting vandalism. If the previous version was incorrect, then the responsibility rests on whoever put that into the article in the first place. Not on any subsequent editor. We are not all experts in what the PDR does and doesn't say. But any of us can fix spelling errors in an article. That does not mean that we must know and approve the entire article and be responsible for it, simply because we are changing something of little consequence in it.
That's true for all articles, not just ones on drugs.
Will Johnson
On Sun, May 24, 2009 at 11:46 PM, WJhonson@aol.com wrote:
In a message dated 5/24/2009 12:11:40 PM Pacific Daylight Time, thomas.dalton@gmail.com writes:
At any rate, the person would have to sue the editor, not the project, and the editor could stand on the basis of simply quoting the PDR.
Could they sue other people that have edited the article without fixing the mistake? What about someone that reverted vandalism to that sentence, thus putting back the incorrect information? We can't rely on the law only holding the person directly responsible liable.>>
I don't think you would agree if this logic were extended to all articles.
Disagree.
Am I responsible, when fixing the birthplace of George Bush, for the fact that someone else, in another section of that article, has said "He killed his parents when he was three"?
Fixing birthplace, maybe not. But reverting vandalism is different.
No, I'm not responsible for that. I'm solely responsible for the edits I make, not those of others.
If you revert to a version that includes stuff previously taken out by another editor, then you are re-instating the material that was removed. That is why I always check a diff of what changes have been made before, or just after, saving. That is also why I argue against bot-like blanket reversion of contributions of banned users without manual checking. If they removed vandalism, we can't blindly revert that.
Similar to reverting vandalism. If the previous version was incorrect, then the responsibility rests on whoever put that into the article in the first place. Not on any subsequent editor.
With vandalism, I think there is a duty of care to check the recent history and go back to the last version before the vandalism started. Sometimes you have to stop and look quite carefully, but if you don't, who else will?
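(For anyone who wants to script that check rather than eyeball it, here is a rough sketch against the standard MediaWiki query API. It only finds the cut-off for a run of edits by a single account; judging what actually counts as vandalism is still a manual job, and the helper name is made up for the example.)

# Rough sketch: find the newest revision that is NOT by the most recent
# editor, i.e. the last version before that account's run of edits.
# Illustrative only; "last_good_revision" is a made-up name.
import requests

API = "https://en.wikipedia.org/w/api.php"

def last_good_revision(title, depth=50):
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvlimit": depth,
        "rvprop": "ids|user|timestamp|comment",
        "format": "json",
        "formatversion": 2,
    }
    revs = requests.get(API, params=params).json()["query"]["pages"][0]["revisions"]
    suspect = revs[0]["user"]      # the account whose edits you are about to revert
    for rev in revs:               # revisions come back newest first
        if rev["user"] != suspect:
            return rev["revid"]    # last version before that account's edits
    return None                    # the whole recent history is theirs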
So many times I've seen Twinkle and Huggle users only revert the last bit of vandalism and ignore the previous 3 or 4 edits that also added vandalism. It makes the Twinkle and Huggle users look really, really silly. They end up saving an article with blatant vandalism that they would have seen had they looked at it for even a few seconds.
The scenario where you spot a single mistake and go in and change it is somewhat different. Reading and checking the whole of an article is not always feasible. But I would be happier if there were a tick box to be updated by trusted editors that said "I've read the whole of this article and it looks OK". After months and years of nothing but vandalism addition and reverts, it is easy for stuff to creep in without being spotted. Sometimes every article needs someone to step back, read the whole thing, make what overall changes are needed, and tick the box saying "an editor has read and checked the whole article".
Carcharoth
On Mon, May 25, 2009 at 12:12 AM, Carcharoth carcharothwp@googlemail.com wrote:
With vandalism, I think there is a duty of care to check the recent history and go back to the last version before the vandalism started. Sometimes you have to stop and look quite carefully, but if you don't, who else will?
I agree. Quite often vandals will come in and keep making vandal edits until they are stopped. It only needs some other user to make a routine edit in the middle for the reverter to miss the earlier edits, which might mean that the article will be left with vandalism that appears to have been accepted as valid. It's always worth investigating the history, and also whether the account or IP has vandalised anything else, when doing a vandal revert.
Sam Blacketer (2009/5/25):
Quite often vandals will come in and keep making vandal edits until they are stopped
I concur with that. When I come across an account behaving so, I yearn for a "revert last X edits" function.
Now that I think of it, I'm sure there is an administrator js block that enables one to do that. Is it VoiceOfAll's?
*AGK*
On an article, rollback will do that if there is a sequence of edits by a single editor and there are no intervening edits. If there are intervening edits, it's normally worth looking closer and checking what exactly to revert or change. I think you have to click rollback on the editor's contributions log, rather than the article, but I might be wrong there.
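(The same thing is exposed through the API as action=rollback, which undoes all consecutive top edits by one user on a page in one step. A very rough sketch follows, assuming a session that is already logged in with the rollback right; token handling has changed between MediaWiki versions, so treat the details as illustrative rather than definitive.)

# Rough sketch of rollback via the API; assumes "session" is already
# authenticated and holds the rollback right.
import requests

API = "https://en.wikipedia.org/w/api.php"
session = requests.Session()

def rollback(title, user, summary="Reverting consecutive edits"):
    # Fetch a rollback token, then undo every consecutive top edit by `user`.
    token = session.get(API, params={
        "action": "query", "meta": "tokens", "type": "rollback", "format": "json",
    }).json()["query"]["tokens"]["rollbacktoken"]
    return session.post(API, data={
        "action": "rollback", "title": title, "user": user,
        "summary": summary, "token": token, "format": "json",
    }).json()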
Carcharoth
On Mon, May 25, 2009 at 3:06 PM, agk agkwiki@googlemail.com wrote:
Sam Blacketer (2009/5/25):
Quite often vandals will come in and keep making vandal edits until they are stopped
I concur with that. When I come across an account behaving so, I yearn for a "revert last X edits" function.
Now that I think of it, I'm sure there is an administrator js block that enables one to do that. Is it VoiceOfAll's?
*AGK*
The questions of liability and encyclopedic nature are really tangential to the core reasons for the guideline. The text of the guideline and discussions about it have generally made no reference to whether the material is encyclopedic or whether legal ramifications exist for having the wrong information. Since many of the editors of drug information have some connection to the health care industry, whether as physicians, nurses, etc., the focus has understandably been on the potential for harming people who use incorrect information or misuse correct information. I haven't seen this problem adequately addressed here; it's roughly analogous to why we don't include instructions on how to make bombs. A specialist encyclopedia of explosives and ordnance might include information on how such weapons are built, but we don't. Similarly, medical references include information on lethal dosages and dangerous applications for drugs, but we don't.
Nathan
From my experience as a biomedical librarian, when I see someone say "the ordinary reader won't know how to use it," I see the continuation of guild mentality, the desire to keep information obscure to protect revenues and status.
We provide information on many potentially dangerous things. We do not provide detailed practical instructions. But the plain statement of normal mg/kg is not detailed instruction any more than is information on indications. If we give the information for vitamin requirements, we can give it for drugs. We often give LD50s, though sometimes inconspicuously and with an unfortunate tendency to give the values for rats even if the human value is known.
Basic information that anyone can understand is what is known to be safe, and what is known to be dangerous. The more directly we present it, the more we fulfill our mandate. NOT CENSORED, frankly, and that should settle it. Some people think it applies only to sexual images, but that's just a function of our culture's preoccupation with them. There are more important things to avoid censoring. If the information is known reliably, we have no justification for not publishing it. The very meaning of NOT CENSORED is that information is always preferred to ignorance. The key word is "always". The only restraint should be legal restrictions, which do not apply here. If it's verifiable, legal, and pertinent, and we do not state it, we are censoring.
David Goodman, Ph.D, M.L.S. http://en.wikipedia.org/wiki/User_talk:DGG
On Mon, May 25, 2009 at 11:22 AM, Nathan nawrich@gmail.com wrote:
The questions of liability and encyclopedic nature are really tangential to the core reasons for the guideline. The text of the guideline and discussions about it have generally made no reference to whether the material is encyclopedic or whether legal ramifications exist for having the wrong information. Since many of the editors of drug information have some connection to the health care industry, whether as physicians, nurses, etc., the focus has understandably been on the potential for harming people who use incorrect information or misuse correct information. I haven't seen this problem adequately addressed here; it's roughly analogous to why we don't include instructions on how to make bombs. A specialist encyclopedia of explosives and ordnance might include information on how such weapons are built, but we don't. Similarly, medical references include information on lethal dosages and dangerous applications for drugs, but we don't.
Nathan
On Mon, 25 May 2009, David Goodman wrote:
Basic information that anyone can understand is what is known to be safe, and what is known to be dangerous. The more directly we present it, the more we fulfill our mandate. NOT CENSORED, frankly, and that should settle it. Some people think it applies only to sexual images, but that's just a function of our culture's preoccupation with them. There are more important things to avoid censoring. If the information is known reliably, we have no justification for not publishing it. The very meaning of NOT CENSORED is that information is always preferred to ignorance. The key word is "always".
This is a prime example of how rules are taken to be everything on Wikipedia, and how common sense is ignored.
Wikipedia should not provide information that is likely to lead to harm. If there's a rule which says that we must provide it, then that rule is wrong. This is so even if the rule is called a "mandate". Mandates, rules, or whatever are never supposed to be applied without common sense.
This is actually similar to some BLP issues. We don't have an article on Brian Peppers because "not censored" doesn't mean that we shouldn't remove things that have impact on the real world.
2009/5/26 Ken Arromdee arromdee@rahul.net:
This is a prime example of how rules are taken to be everything on Wikipedia, and how common sense is ignored.
Wikipedia should not provide information that is likely to lead to harm.
That would require us to exclude information on rather a lot of ethnic conflicts.
If there's a rule which says that we must provide it, then that rule is wrong. This is so even if the rule is called a "mandate". Mandates, rules, or whatever are never supposed to be applied without common sense.
Why do you expect anyone else to follow your version of "common sense"?
2009/5/26 geni geniice@gmail.com:
2009/5/26 Ken Arromdee arromdee@rahul.net:
This is a prime example of how rules are taken to be everything on Wikipedia, and how common sense is ignored.
Wikipedia should not provide information that is likely to lead to harm.
That would require us to exclude information on rather a lot of ethnic conflicts.
Could you explain that one?
2009/5/26 geni geniice@gmail.com:
2009/5/26 Ken Arromdee arromdee@rahul.net:
This is a prime example of how rules are taken to be everything on Wikipedia, and how common sense is ignored.
Wikipedia should not provide information that is likely to lead to harm.
That would require us to exclude information on rather a lot of ethnic conflicts.
Could you explain that one?
I understood it well enough. Accurate information on a number of subjects is inflammatory. Imagine if the Chinese people actually had access to a video of soldiers machine-gunning Tiananmen protesters. I doubt if anyone eating at the McDonald's at the site would have much appetite.
On Tue, 26 May 2009, Fred Bauder wrote:
I understood it well enough. Accurate information on a number of subjects is inflammatory.
This is another example of being overly literal and avoiding common sense. Obviously, when I say Wikipedia should avoid harm, I don't mean it should avoid *any harm whatsoever*. Rather, it means that we need to think about how much harm something can do and not cause harm that is exceptionally acute when the benefit to the encyclopedia is relatively small. How do you figure this out? Well, you have to think--there's no rule for it.
On Tue, 26 May 2009, Fred Bauder wrote:
I understood it well enough. Accurate information on a number of subjects is inflammatory.
This is another example of being overly literal and avoiding common sense. Obviously, when I say Wikipedia should avoid harm, I don't mean it should avoid *any harm whatsoever*. Rather, it means that we need to think about how much harm something can do and not cause harm that is exceptionally acute when the benefit to the encyclopedia is relatively small. How do you figure this out? Well, you have to think--there's no rule for it.
You're preaching to the choir. Often when we want to do the right thing, we are confronted with a demand for a rule, or presented with one, typically "no censorship". There is no substitute for doing what is appropriate in the circumstances. Trying to codify that principle is futile, although Ignore all rules comes close.
Fred Bauder
On Tue, 26 May 2009, Fred Bauder wrote:
You're preaching to the choir. Often when we want to do the right thing, we are confronted with a demand for a rule, or presented with one, typically "no censorship". There is no substitute for doing what is appropriate in the circumstances. Trying to codify that principle is futile, although Ignore all rules comes close.
IAR is particularly subject to wikilawyering in this situation. It says that it applies when a rule prevents you from improving or maintaining Wikipedia. This can be easily interpreted to mean that any use of IAR must improve Wikipedia itself, and that considerations outside Wikipedia (such as BLP and other issues related to avoiding harm) are ineligible for IAR.
On Tue, 26 May 2009, Fred Bauder wrote:
You're preaching to the choir. Often when we want to do the right thing, we are confronted with a demand for a rule, or presented with one, typically "no censorship". There is no substitute for doing what is appropriate in the circumstances. Trying to codify that principle is futile, although Ignore all rules comes close.
IAR is particularly subject to wikilawyering in this situation. It says that it applies when a rule prevents you from improving or maintaining Wikipedia. This can be easily interpreted to mean that any use of IAR must improve Wikipedia itself, and that considerations outside Wikipedia (such as BLP and other issues related to avoiding harm) are ineligible for IAR.
Trying to do Biographies of living persons without a rule proved futile, so a written policy was created. We still don't have a corresponding policy for organizations. The underlying principle is don't hang an article on scraps of negative information, but you could write a book on the biographies on Wikipedia, and an even more interesting book if you collected all the half-cocked material we have excluded for one reason or another. Not a book you would want to publish or distribute in the UK, however.
Fred Bauder
2009/5/26 Fred Bauder fredbaud@fairpoint.net:
Trying to do Biographies of living persons without a rule proved futile, so a written policy was created.
Which only works because it's NPOV/NOR/V with (a working aim for) no eventualism whatsoever.
We still don't have a corresponding policy for organizations.
It'd need a bad example as compelling as Seigenthaler.
The underlying principle is don't hang an article on scraps of negative information, but you could write a book on the biographies on Wikipedia, and an even more interesting book if you collected all the half-cocked material we have excluded for one reason or another. Not a book you would want to publish or distribute in the UK, however.
*cough* indeed :-)
(The UK phone-call-receiving people do get calls or emails from UK article subjects, with varying degrees of legal threat attached. As per any customer support, solving the problem, or pointing them in the right direction to solve the problem, usually deals with things very nicely. I avoid editing legally problematic BLPs about UK-based subjects, but in almost all cases they're happy to have someone helping solve the problem. Pointing out that we take this very seriously when we're alerted to it helps a great deal too.)
- d.
On Tue, May 26, 2009 at 7:51 PM, Fred Bauder fredbaud@fairpoint.net wrote:
<snip>
you could write a book on the biographies on Wikipedia
[...]
Not a book you would want to publish or distribute in the UK, however.
Turning away from BLPs to featured articles, it is well-known that articles on people make up a large proportion of the articles on Wikipedia (last time I looked it was about 1 in 5). The number of featured articles that are biographies is another interesting stat, as is the number of featured articles we have that are BLPs.
Total number of articles: 2,893,595
Total number of articles on people: 673,918 (23.29% of all articles)
Total number of featured biographies: 618 (0.09% of biographies)
Total number of BLPs: 375,584 (55.73% of biographies)
Total number of featured BLPs: unknown
Can anyone work out that last figure?
A book consisting of the featured biographies might not be bad.
Carcharoth
Sources:
http://en.wikipedia.org/wiki/Special:Statistics
http://en.wikipedia.org/wiki/Category:Biography_articles_by_quality
http://en.wikipedia.org/wiki/Category:Living_people
http://en.wikipedia.org/wiki/Category:FA-Class_biography_articles
On Tue, May 26, 2009 at 10:41 PM, Carcharoth carcharothwp@googlemail.com wrote:
<snip>
Total number of articles: 2,893,595
Total number of articles on people: 673,918 (23.29% of all articles)
Total number of featured biographies: 618 (0.09% of biographies)
Total number of BLPs: 375,584 (55.73% of biographies)
Total number of featured BLPs: unknown
Worked out an approximation for the latter figure using the CatScan tool:
http://toolserver.org/~daniel/WikiSense/CategoryIntersect.php?wikilang=en&am...
Intersection of Category:FA-Class biography articles and Template:Blp = 169 articles
http://toolserver.org/~daniel/WikiSense/CategoryIntersect.php?wikilang=en&am...
Intersection of Category:FA-Class biography articles and Category:Biography articles of living people = 168 articles.
I sorted them in Excel and the total is 169. There are 36 articles on music groups. Three "other" articles (a list, a criminal trial article, and a summary-style daughter article). The other 130 articles are on living individuals. I'll throw a list up on-wiki somewhere and link from here.
http://en.wikipedia.org/wiki/User:Carcharoth/Featured_BLPs
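(For anyone without toolserver access, roughly the same intersection can be done straight against the API; a sketch below, using the two assessment categories named above. Both are talk-page categories, so the titles line up directly; the helper name is made up for the example.)

# Rough sketch: intersect two categories via the standard API instead of CatScan.
import requests

API = "https://en.wikipedia.org/w/api.php"

def category_members(category):
    # Collect every member title of a category, following API continuation.
    titles = set()
    params = {
        "action": "query", "list": "categorymembers",
        "cmtitle": category, "cmlimit": "max",
        "format": "json", "formatversion": 2,
    }
    while True:
        data = requests.get(API, params=params).json()
        titles |= {m["title"] for m in data["query"]["categorymembers"]}
        if "continue" not in data:
            return titles
        params.update(data["continue"])

fa_bios = category_members("Category:FA-Class biography articles")
blps = category_members("Category:Biography articles of living people")
print(len(fa_bios & blps))   # approximates the featured-BLP figure above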
Carcharoth
Sources:
http://en.wikipedia.org/wiki/Special:Statistics
http://en.wikipedia.org/wiki/Category:Biography_articles_by_quality
http://en.wikipedia.org/wiki/Category:Living_people
http://en.wikipedia.org/wiki/Category:FA-Class_biography_articles
2009/5/26 Ken Arromdee arromdee@rahul.net:
This is another example of being overly literal and avoiding common sense.
I'm not interested in the prejudices you acquired by the age of ten.
Obviously, when I say Wikipedia should avoid harm, I don't mean it should avoid *any harm whatsoever*.
Then don't say that.
Rather, it means that we need to think about how much harm something can do and not cause harm that is exceptionally acute when the benefit to the encyclopedia is relatively small. How do you figure this out? Well, you have to think--there's no rule for it.
Anything that doesn't present a significant chance of destroying the species cannot be considered exceptionally acute given how many things there are around that do carry that risk.
On Wed, 27 May 2009, geni wrote:
This is another example of being overly literal and avoiding common sense.
I'm not interested in the prejudices you acquired by the age of ten.
Obviously, when I say Wikipedia should avoid harm, I don't mean it should avoid *any harm whatsoever*.
Then don't say that.
That too is an example of being overly literal and avoiding common sense.
On 27/05/2009, Ken Arromdee arromdee@rahul.net wrote:
That too is an example of being overly literal and avoiding common sense.
Please explain how removing publicly available, legal, verifiable information from the wikipedia is common sense again?
I think this is madness. And further, I don't have to follow it anyway. You're espousing censorship, but it's a *core value* that the wikipedia is *not* censored. But it's only a *guideline* that we don't include typical dosages. Ergo: we don't have to follow it.
On Wed, 27 May 2009, Ian Woollard wrote:
Please explain how removing publicly available, legal, verifiable information from the wikipedia is common sense again?
Because whether it's common sense to remove the material doesn't depend on whether it's publicly available, legal, or verifiable.
(And anyway, it's only verifiable under ideal circumstances. If we have it, it will get vandalized. The vandalized version, of course, won't be verifiable, but it's still going to stick around for a while.)
I think this is madness. And further, I don't have to follow it anyway. You're espousing censorship, but it's a *core value* that the wikipedia is *not* censored.
IAR is a core value, and supersedes all other core values. It's never legitimate to say "we should ignore common sense because our core values don't allow for it".
On 28/05/2009, Ken Arromdee arromdee@rahul.net wrote:
On Wed, 27 May 2009, Ian Woollard wrote:
Please explain how removing publicly available, legal, verifiable information from the wikipedia is common sense again?
Because whether it's common sense to remove the material doesn't depend on whether it's publicly available, legal, or verifiable.
You didn't answer the question. I want to know why legal information that can be googled up in a minute or so shouldn't be in the wikipedia.
(And anyway, it's only verifiable under ideal circumstances.
Straw man.
If we have it, it will get vandalized.
Unlike... the rest of the wikipedia? And nobody ever checks for and removes vandalism of course.
The vandalized version, of course, won't be verifiable, but it's still going to stick around for a while.)
So on that 'logic' we should remove all information that even theoretically could be harmful from the wikipedia immediately, because ummm... it might get vandalised!
So I think we should start with the hydrogen article. Knowledge of hydrogen could get people killed! It's an EXPLOSIVE GAS!!!! We should definitely remove the flammability limits- it's heinous that people should know how much hydrogen you need to burn it!!! People could die.
Then there's all the metals. A lot of those are poisonous! Copper, lead, cadmium; somebody could poison somebody! People could die.
And the articles on flight, somebody might try to build an aircraft, and die!!! Aircraft pages need to go! People could die.
Do you want to do the AFDs or should I? I reckon we should have maybe 10-20% of the wikipedia left by the time we've finished: flower arranging (without using any of those dangerous pins though, you could prick your finger and get an infection and die) and so forth. I think we need to do away with all the geology articles, people might throw rocks at each other. Maybe drawing and stuff about crayons can stay, provided we can prove that people usually don't eat too many.
In fact, perhaps we need to just shut the whole wikipedia down- somebody could choke on the crayons. People could die.
I think this is madness. And further, I don't have to follow it anyway. You're espousing censorship, but it's a *core value* that the wikipedia is *not* censored.
IAR is a core value, and supersedes all other core values. It's never legitimate to say "we should ignore common sense because our core values don't allow for it".
Common sense is the *lowest* level of intelligence. Has anyone you know actually died or got injured from the wikipedia, ever?
The wikipedia itself is not common sense.
Common sense is the *lowest* level of intelligence. Has anyone you know actually died or got injured from the wikipedia, ever?
-- -Ian Woollard
I'm pretty sure we're partially to blame for a suicide or two, by users, not readers...
Fred
Common sense is the *lowest* level of intelligence. Has anyone you know actually died or got injured from the wikipedia, ever?
-- -Ian Woollard
on 5/29/09 8:30 AM, Fred Bauder at fredbaud@fairpoint.net wrote:
I'm pretty sure we're partially to blame for a suicide or two, by users, not readers...
If you are serious here, Fred, that is quite a statement. Please be careful with that thought.
Marc Riddell
Common sense is the *lowest* level of intelligence. Has anyone you know actually died or got injured from the wikipedia, ever?
-- -Ian Woollard
on 5/29/09 8:30 AM, Fred Bauder at fredbaud@fairpoint.net wrote:
I'm pretty sure we're partially to blame for a suicide or two, by users, not readers...
If you are serious here, Fred, that is quite a statement. Please be careful with that thought.
Marc Riddell
Interactions between less than perfect people and less than perfect organizations are complex. We can do our best to be as compassionate as possible in all interactions, but there can be a great deal of pain regardless. That is one reason to try to keep the door open even with editors that are troublesome and be forgiving of human weakness.
Fred Bauder
Common sense is the *lowest* level of intelligence. Has anyone you know actually died or got injured from the wikipedia, ever?
-- -Ian Woollard
on 5/29/09 8:30 AM, Fred Bauder at fredbaud@fairpoint.net wrote:
I'm pretty sure we're partially to blame for a suicide or two, by users, not readers...
If you are serious here, Fred, that is quite a statement. Please be careful with that thought.
Marc Riddell
on 5/29/09 9:06 AM, Fred Bauder at fredbaud@fairpoint.net wrote:
Interactions between less than perfect people and less than perfect organizations are complex. We can do our best to be as compassionate as possible in all interactions, but there can be a great deal of pain regardless. That is one reason to try to keep the door open even with editors that are troublesome and be forgiving of human weakness.
I agree with everything you say here, Fred. I was just stumbling over the use of the word "suicide" in this context. As far as the Project dealing "compassionately" with human interaction, I see no evidence of that. This brings up the old question: What is more important, the product or the people who create it? This Project has not successfully resolved that question.
Marc
on 5/29/09 9:06 AM, Fred Bauder at fredbaud@fairpoint.net wrote:
Interactions between less than perfect people and less than perfect organizations are complex. We can do our best to be as compassionate as possible in all interactions, but there can be a great deal of pain regardless. That is one reason to try to keep the door open even with editors that are troublesome and be forgiving of human weakness.
I agree with everything you say here, Fred. I was just stumbling over the use of the word "suicide" in this context. As far as the Project dealing "compassionately" with human interaction, I see no evidence of that. This brings up the old question: What is more important, the product or the people who create it? This Project has not successfully resolved that question.
Marc
That attitude is an ideal for both administrators and arbitrators, and ultimately comes from Jimbo, who is very patient and forgiving with troublesome behavior. While there are massive failures, I do see lots of evidence of patience and forgiveness.
Fred
On Fri, 29 May 2009, Ian Woollard wrote:
Please explain how removing publicly available, legal, verifiable information from the wikipedia is common sense again?
Because whether it's common sense to remove the material doesn't depend on whether it's publicly available, legal, or verifiable.
You didn't answer the question. I want to know why legal information that can be googled up in a minute or so shouldn't be in the wikipedia.
Because that's like saying "if everyone else litters, why shouldn't I litter, too". We have an obligation to avoid harm caused by us, even if other people may cause similar harm.
And anyway, the other Googleable sites
* are much less prone to vandalism and errors
* are less trusted by Internet users
* are much less *prominent*.
(And anyway, it's only verifiable under ideal circumstances.
Straw man.
It's not a straw man. You wanted to know why we should remove verifiable information. The answer is that if we have this particular verifiable information we will have time periods where it's vandalized (and therefore not verifiable at that moment).
If we have it, it will get vandalized.
Unlike... the rest of the wikipedia? And nobody ever checks for and removes vandalism of course.
If it gets vandalized and the vandalism is fixed, there's an interval of time when the vandalism is in existence. This is an acceptable cost if the article is about George Washington's birthdate; it's not an acceptable cost when someone could get hurt. Moreover, the time it takes to fix vandalism can vary greatly, and several factors make it more likely that vandalism will stick around on drug articles than on, say, the Obama article.
So on that 'logic' we should remove all information that even theoretically could be harmful from the wikipedia immediately, because ummm... it might get vandalised! So I think we should start with the hydrogen article. Knowledge of hydrogen could get people killed! It's an EXPLOSIVE GAS!!!! We should definitely remove the flammability limits- it's heinous that people should know how much hydrogen you need to burn it!!! People could die.
Chances are very low that someone who wants to burn hydrogen is going to go to Wikipedia to find out how much they need to burn. Likewise, chances are low that someone's going to use Wikipedia's information to build an aircraft. This is where the common sense comes in: some types of information are more likely than others, *in practice*, to be used in situations where someone can get hurt.
On 29/05/2009, Ken Arromdee arromdee@rahul.net wrote:
Chances are very low that someone who wants to burn hydrogen is going to go to Wikipedia to find out how much they need to burn. Likewise, chances are low that someone's going to use Wikipedia's information to build an aircraft.
The chances that somebody will solely consult the wikipedia for prescription information are very low also. The chances that somebody will be vandalising the article at the same time are enormously lower still, and that the prescribing person doesn't notice that it has been vandalised is lower again.
Even if that happens, it cannot be said that the wikipedia is at fault. We do not condone vandalism, nor do we condone using information based solely on the wikipedia in life-threatening scenarios; and I don't think that any other encyclopedia is different in this.
This is a ridiculous over-reaction to something that has never happened in real life, and is extremely unlikely to occur, and even if it did happen would not be the responsibility, in any sensible way, of the wikipedia.
This is where the common sense comes in: some types of information are more likely than others, *in practice*, to be used in situations where someone can get hurt.
Is there such a thing as a situation where somebody cannot get hurt?
And what about the potential uses of information that could save people's lives? One of the uses is to *check* a prescription, and this is a valid use that is much less likely to cause harm.
2009/5/29 Ian Woollard ian.woollard@gmail.com:
And what about the potential uses of information that could save people's lives? One of the uses is to *check* a prescription, and this is a valid use that is much less likely to cause harm.
For the sake of the record, I've ended up using a Wikipedia article to check a prescription - I'd been given an antibiotic which I'd seen mentioned as used in treatment of the condition, but at a dosage about eight times lower. It turned out - and our article explained quite clearly and with detail - that there were two treatment regimes; one is basically a "short sharp shock", and the other runs over a week. I'd been placed on the second, but had only seen reference to the first. Score one to Wikipedia; I felt quite reassured knowing that.
I can think of a number of cases where we could pose much more immediate risk to someone using Wikipedia as a quick-reference - household wiring, for example! "Oh, live is *blue*, right..."
To be honest, this worry seems a bit presumptive about the suggestibility of our users. On the whole, people are much more likely to ring up a pharmacy and say "excuse me, are you sure this instruction is right?" than they are to decide the writing on the bottle was clearly wrong and they should take twenty tablets each morning rather than two... do people *really* decide to self-medicate based entirely on one thing they read on the internet, and go off and acquire the medication and so on without ever noticing anything to the contrary?
On 26/05/2009, Ken Arromdee arromdee@rahul.net wrote:
Wikipedia should not provide information that is likely to lead to harm. If there's a rule which says that we must provide it, then that rule is wrong.
Uh huh. And if it also is possible to use the information to avoid harm? What if it's only a tiny amount of harm, should it be removed then? And if not, how much harm does it take, and who gets to judge?
In other words who died and made you head censor?
On Tue, 26 May 2009, Ian Woollard wrote:
Wikipedia should not provide information that is likely to lead to harm. If there's a rule which says that we must provide it, then that rule is wrong.
Uh huh. And if it also is possible to use the information to avoid harm? What if it's only a tiny amount of harm, should it be removed then?
There's an answer to this.
Think.
There's *no rule* you can use for this. You *have* to consider it case by case.
On 26/05/2009, Ken Arromdee arromdee@rahul.net wrote:
Wikipedia should not provide information that is likely to lead to harm. If there's a rule which says that we must provide it, then that rule is wrong.
Uh huh. And if it also is possible to use the information to avoid harm? What if it's only a tiny amount of harm, should it be removed then? And if not, how much harm does it take, and who gets to judge?
In other words who died and made you head censor?
-- -Ian Woollard
We're all censors, we just vary with respect to what we censor.
Fred Bauder
On 26/05/2009, Fred Bauder fredbaud@fairpoint.net wrote:
We're all censors, we just vary with respect to what we censor.
No, I don't think I am. I don't remove anything except that which is believed to be illegal in the state of Florida... which this isn't. That's not my censorship, that's Florida's.
You guys that are removing this information are setting yourselves up as censors. You're removing *legal* information from the wikipedia that could *save* lives (because it helps people check their prescriptions for errors).
It's specifically a censorship of the wikipedia, and for a fictitious reason that has never, to my knowledge, even happened in real life.
Fred Bauder
Nathan wrote:
A specialist encyclopedia of explosives and ordnance might include information on how such weapons are built, but we don't. Similarly, medical references include information on lethal dosages and dangerous applications for drugs, but we don't.
We do include detailed information on how weapons are built, though. There was a big argument a few years back about whether we ought to tone down the amount of coverage we give to details of how various nuclear bomb designs work (or at least are alleged to work, based on public information), but it was decided that including it was encyclopedic.
We don't include HOWTO style step-by-step instructions, of course, but we include all the details that are available, from assembly procedures to sizes of various parts, quantity and purity of fuel required, machining requirements, etc.
-Mark
Rollback definitely works on the article's diff page. Twinkle also does the same thing (assumes continued vandalism/agf) for all its various options.
~A
On Mon, May 25, 2009 at 10:20, Carcharoth carcharothwp@googlemail.com wrote:
On an article, rollback will do that if there is a sequence of edits by a single editor and there are no intervening edits. If there are intervening edits, it's normally worth looking closer and checking what exactly to revert or change. I think you have to click rollback on the editor's contributions log, rather than the article, but I might be wrong there.
Carcharoth
On Mon, May 25, 2009 at 3:06 PM, agk agkwiki@googlemail.com wrote:
Sam Blacketer (2009/5/25):
Quite often vandals will come in and keep making vandal edits until they are stopped
I concur with that. When I come across an account behaving so, I yearn for a "revert last X edits" function.
Now that I think of it, I'm sure there is an administrator js block that enables one to do that. Is it VoiceOfAll's?
*AGK*