A Wikipol is a political program. It is a collection of wiki pages that
describe the current positions of (web-based) political parties. The members of
the e-party can develop and update these wiki pages through amendments.
I've been increasingly concerned lately about Wikimedia's coverage of living
people, both within biographies of living people (BLPs) on Wikipedia, and in
coverage of living people in non-BLP text. I've asked the board to put this
issue on the agenda for the April meeting in Berlin, and I'm hoping there to
figure out some concrete next steps to support quality in this area. In
advance of that, I want to ask for input from you.
First, I'm going to lay out the scope of the problem as I see it. (If you're
already up to speed, you might want to skip that bit.) Then I'll lay out a
little of my thinking on how we could aim to improve. I would very much
appreciate any feedback from you, ideally here on this list, before the
April meeting :-)
(Please note that for convenience I'm going to use the phrase "BLP" as
shorthand for the whole issue of coverage of living people throughout all
Wikimedia projects. BLPs probably constitute the majority of that coverage,
but not all of it.)
Scope of the problem:
I am sure that BLP subjects have been complaining about their portrayals
since Wikipedia's very early days. And I am sure that BLPs have always
suffered from the same problems and errors that occur in all articles:
malicious vandalism, biased editing, lack of citations, and so on. However,
I am particularly worried about BLPs, for two reasons:
1. BLPs are, by definition, about living people. A mistake in an article
about the War of 1812 is too bad. A mistake in an article about a living
person could cause that person real-world harm. We don't want to do that.
2. I believe the risk of hurting people is greater than it used to be,
because Wikipedia is growing increasingly unignorable. People are using the
internet to check out job applicants, colleagues, dates - and we are the
first search result for many names.
As Wikipedia generally becomes bigger and smarter and more in-depth, its
credibility increases - and so the gap between what we aim to do and what we
actually achieve on many BLPs becomes ever more visible and disappointing.
This hurts our mission:
* We want to be taken seriously. Having a large number of influential,
accomplished people (the people who are typically subjects of BLPs)
distrusting or disliking us damages our credibility.
* We aspire to be neutral and accurate. We know that not all BLP
complainants share that goal - some simply want their BLP whitewashed. The
existence of unfounded complaints, though, doesn't undercut the seriousness
of the real problem: many BLPs are inaccurate, unfair and paint a distorted
picture of their subject. They are not up to Wikipedia's standards.
* And, as I said earlier, these are real people's lives. Neutrally-written,
sourced information that is unflattering to the subject of an article is
appropriate to an encyclopedia, but lies, nonsense, insinuations and
unbalanced portrayals are not.
So what can we do? Here are the things I am thinking about. I would love your input on all of them:
* Do we think the current complaints resolution systems are working? Is it
easy enough for article subjects to report problems? Are we courteous and
serious in our handling of complaints? Do the people handling complaints
need training/support/resources to help them resolve the problem (if there
is one)? Are there intractable problems, and if so, what can we do to solve
them? Some Wikimedia chapters have pioneered more systematic training of
volunteers to handle OTRS responses; should we try to scale up or replicate those efforts?
* Are there technical tools we could implement, that would support greater
quality in BLPs? For example – easy problem reporting systems,
particular configurations of Flagged Revs, etc.
* Wikimedians have developed lots of tools for preventing/fixing vandalism
and errors of fact. Where less progress has been made, I think, is on the
question of disproportionate criticism. It seems to me that the solution may
include the development of systems designed to expose particularly biased
articles to a greater number of people who can help fix them. But this is a
pretty tough problem and I would welcome people's suggestions for resolving it.
* The editors I've spoken with about BLPs are pretty serious about them –
they are generally conservative, restrained, privacy-conscious, etc. But I
wonder if that general attitude is widely-shared. If Wikipedia believes (as
is said in -for example- the English BLP policy) that it has a
responsibility to take great care with BLPs, should there be a
Wikipedia-wide BLP policy, or a projects-wide statement of some kind?
BLPs and our general effect on living people have been a tough problem for a
long time, and I think we need now to bring together the appropriate people
and resources, and hash through how to best make some progress on the
problem. I'd like to start that discussion here, now. I'd appreciate any
feedback from you all, before April. Please note I am deliberately not
asking questions about who should be responsible for what: chapters,
individual volunteers, the Wikimedia board or staff. We can figure that
part out later. Right now I'm mostly interested in what we should be doing.
David, please consider that English is not the native language of everybody.
It took me a lot of effort to learn your language; it would be appropriate if
you, from your side, tried to make it easier for non-natives to follow.
Instead of mocking GerardM's English, please do the following:
- use a common word instead of a difficult one, if it exists
- make no jokes or ironic references that require a specific national
background to understand
- be as short as possible to express your message
Nothing that cannot be asked of a good Wikipedian, is it? :-)
2009/3/5 David Gerard <dgerard(a)gmail.com>
> 2009/3/5 Gerard Meijssen <gerard.meijssen(a)gmail.com>:
> > It is not that I am not able to look up words in a dictionary. When an
> > excess of difficult words is used, the message is lost.
> None of these were excessively difficult, and now you know more English
> - d.
Ziko van Dijk
I'm breaking this specific idea out of the main thread, in order to focus on it.
There seems to be considerable support for adding some kind of "Report
a problem" link to pages, probably (though not necessarily) in the sidebar.
I'd like to give a little more thought to this idea, i.e. where we
want a link to go, and what we want it to do.
Personally I think if such a link simply mails OTRS, that would be
suboptimal. It would risk creating a lot of email volume for
relatively minor problems, and make it harder to differentiate
important issues from minor ones.
By preference, I'd like to see a link that goes to a simple page for
requesting help, with options such as "post a public message on the
talk page", "email a volunteer for help", etc. In principle, such a
page could even have a single text box for composing a message, a set
of instructions, and a dropdown list of actions to take ranging from a
talk page post to an OTRS email, etc. Reports of vandalism and other
simple problems might also be channeled automatically to one of the
existing onwiki noticeboards if the reporter is not asking for a private response.
Clicking on the "report a problem" link should automatically fill in
what page one came from. Better still, the "report a problem" link
might be modified based on indicators in the page, such as
"Category:Living people", in order to better prioritise and direct
correspondence. If the person reporting the problem does choose to
post publicly, the post could be flagged with something like
"Category:Unresolved problem reports", which might then be replaced
with "Category:Resolved problem reports" after it has been looked at
Ideally, I think problem reports should include the option of being
completely anonymous (though presumably with a CAPTCHA or other device
to limit spam posts).
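To make that concrete, here is a minimal sketch (in Python, purely for
illustration) of the routing described above. The "Special:ReportProblem"
form name, the queue labels and the URL parameters are all assumptions for
the sake of the example, not an existing MediaWiki feature.

    # Hypothetical sketch of the "report a problem" routing described above.
    # None of these names (Special:ReportProblem, queue labels, parameters)
    # exist in MediaWiki today; they are placeholders for illustration.
    from urllib.parse import urlencode

    def build_report_link(page_title, page_categories):
        """Build a prefilled 'report a problem' URL for the page being viewed."""
        params = {"page": page_title}
        # Pages in Category:Living people get flagged so their reports can be
        # prioritised and directed to the right correspondents.
        if "Category:Living people" in page_categories:
            params["priority"] = "blp"
        return "https://en.wikipedia.org/wiki/Special:ReportProblem?" + urlencode(params)

    def route_report(report_type, wants_private_reply):
        """Decide where a submitted report should end up."""
        if wants_private_reply:
            return "otrs-email"             # private correspondence queue
        if report_type == "vandalism":
            return "vandalism-noticeboard"  # existing on-wiki noticeboard
        # Default: a public talk-page post, tagged with something like
        # [[Category:Unresolved problem reports]] for follow-up.
        return "talk-page-post"

    if __name__ == "__main__":
        print(build_report_link("Example article", ["Category:Living people"]))
        print(route_report("factual error", wants_private_reply=False))

Either way, the point is that the link should carry enough context (the page
name, whether it is a BLP) for the report to land in the right place without
extra effort from the person reporting it.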
*phoebe ayers* phoebe.wiki at gmail.com writes:
I'm not sure there's any way to get a non-self-selected survey about anything
on the projects due to anonymity concerns.
I'm a 17-year veteran of implementing professional quantitative survey
research. Self-selection bias is a complicated area of study, but there are
some fairly accessible and intuitive techniques one may implement to create
a thoughtful survey of a target population which minimizes self-selection
bias concerns. This allows the stakeholders to focus on the challenge of
deriving meaning from the response data rather than feeling nausea over the sampling itself.
I am willing to give, pro bono, 45 minutes of telephone consulting time to
any Wikimedia Foundation staff member who is attached to this particular
survey project, on the condition that they will be open and attentive to the
possibility that a properly-designed and fairly-executed survey may not
return results that foster their preconceived desires to railroad through a
license migration (which, unfortunately, is my key takeaway from observing this discussion so far).
*Phil Nash* pn007a2145 at blueyonder.co.uk said:
Except, of course, that such a survey would arguably not have "preconceived
desires". So much for empiricism!
I offered to give some pro bono guidance on overcoming (to a degree)
self-selection bias, even among an anonymity-heightened population. I
didn't say that I would be involved in the actual design and execution of
the survey. So much for civility!
> This is a very small, self-selected sample; there would be
> no harm or cost associated with turning it on for a much larger
> percentage (or all) of logged-in users on the top-ten languages, not
> just English or German alone, which both have peculiarities associated
> with being the largest Wikipedia communities.
Is there a version of the survey that does *not* entail a self-selected
sample? The methodologist in me wants to know, because it seems to me that
selection bias is inherent in any survey of this sort. (What's more, it
seems fairly predictable in which direction that bias would skew results.)
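For what it is worth, one standard and fairly accessible mitigation is
post-stratification weighting: respondents are reweighted so that the sample
matches known population margins, for example edit-count strata taken from
server-side statistics. Neither poster named a specific technique, so the
sketch below (Python, with invented numbers) only illustrates the general
idea, not the actual methodology of this survey.

    # Illustrative post-stratification weighting. The strata, shares and
    # responses are invented; real population margins would have to come
    # from something like server-side editing statistics.
    from collections import Counter

    def poststratify(responses, population_shares):
        """Return one weight per respondent so weighted strata match the population.

        responses: list of (stratum, answer) pairs from the self-selected sample.
        population_shares: dict mapping stratum -> share of the real population.
        """
        sample_counts = Counter(stratum for stratum, _ in responses)
        n = len(responses)
        return [population_shares[s] / (sample_counts[s] / n) for s, _ in responses]

    if __name__ == "__main__":
        # Hypothetical: heavy editors are over-represented among respondents.
        responses = ([("heavy", "yes")] * 70 +
                     [("casual", "yes")] * 20 + [("casual", "no")] * 10)
        population = {"heavy": 0.2, "casual": 0.8}
        weights = poststratify(responses, population)
        support = sum(w for (s, a), w in zip(responses, weights) if a == "yes") / sum(weights)
        print(f"raw 'yes' share: 0.90, weighted 'yes' share: {support:.2f}")

This does not remove self-selection bias entirely (people who answer within a
stratum may still differ from those who do not), but it is one concrete,
low-cost way to make the direction and size of the skew visible.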
Ipatrol has just come onto IRC claiming that he has been told that the WMF is
hiring people to "validate" articles, and that the foundation is doing it in
secret by using thousands of IPs and academics. He claims that the WMF has
contracted colleges all across the US to recruit academics to
"validate" articles, and states that admins are involved in this 'cabal', or at least aware of it.
[[Afghan Hound]] is going to be rewritten soon by someone being paid
$1000/month from the foundation, according to his 'source'. His source is
apparently "an ex-teacher from Wright State University". According to
ipatrol, the source claims that they have thousands of closed proxies
operated through closed sites, and that the board are well aware of this.
Apparently "the site's director" does not know, however. Apparently they
showed ipatrol a pay stub from "Wikimedia Foundation". Apparently the person
can also modify the content of diffs, which of course would imply that the
person had root access.
Yes, this sounds like complete and utter nonsense to me too, but I figured I
should bring it up here since it is a pretty serious accusation (if
ridiculous). Ipatrol is apparently going to release the name of the person
to the WMF in private (naturally -l is not the best place to do such a
thing), so hopefully that will sort everything out.
Can anyone shed some light on whether this is even feasible? I really don't
see how it could be.
- Chris Down
Can anyone give me some ideas about my blog? http://jokarwilis2009.blogspot.com
It gets very low traffic.
From: Erik Moeller
To: Wikimedia Foundation Mailing List
ReplyTo: Wikimedia Foundation Mailing List
Subject: Re: [Foundation-l] Attribution survey, first results
Sent: Mar 4, 2009 10:15 AM
2009/3/3 Thomas Dalton <thomas.dalton(a)gmail.com>:
> Excellent. Getting some idea of community opinion is very important.
> However, has anyone carried out my suggestion of consulting with the
> CC lawyers?
We've been in repeated conversations with CC about the possible
attribution models. CC counsel has commented specifically that
attribution-by-URL is a permissible attribution model that is
consistent with the language and intent of CC-BY*.
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate