Pursuant to prior discussions about the need for a research
policy on Wikipedia, WikiProject Research is drafting a
policy regarding the recruitment of Wikipedia users to
participate in studies.
At this time, we have a proposed policy, and an accompanying
group that would facilitate recruitment of subjects in much
the same way that the Bot Approvals Group approves bots.
The policy proposal can be found at:
The Subject Recruitment Approvals Group mentioned in the proposal
is being described at:
Before we move forward with seeking approval from the Wikipedia
community, we would like additional input about the proposal,
and would welcome additional help improving it.
Also, please consider participating in WikiProject Research at:
University of Minnesota
I am doing a PhD project on online civic participation
(e-participation). As part of my research, I carried out a user
survey in which I asked how many people had ever edited or created a
page on a wiki. Now I would like to compare the results with the overall
rate of wiki editing/creation at the country level.
I've found some country-level statistics on Wikipedia Statistics (e.g.
3,000 editors of Wikipedia articles in Italy) but data for UK and
France are not available, since Wikipedia provides statistics by
language, not by country. I'm thus looking for statistics on the UK and
France (but am also interested in alternative ways of measuring wiki
editing/creation in Sweden and Italy).
I would be grateful for any tips!
Sunny regards, Alina
European University Institute
For the last week or so I have been getting the following error when trying to
use the http://wikidashboard.appspot.com/ tool: "403: User account
expired. The page you requested is hosted by the Toolserver user
wiki_researcher, whose account has expired. Toolserver user accounts are
automatically expired if the user is inactive for over six months. To
prevent stale pages remaining accessible, we automatically block
requests to expired content. If you think you are receiving this page in
error, or you have a question, please contact the owner of this
document: wiki_researcher [at] toolserver [dot] org. (Please do not
contact Toolserver administrators about this problem, as we cannot fix
it---only the Toolserver account owner may renew their account.)"
I've tried contacting the owner, and sent an email to PARC
<http://en.wikipedia.org/wiki/PARC_%28company%29> (it's their project,
per the logo on the project page) through their web form, but so
far, nothing. Can anyone help me contact them?
The tool is useful not only for research (I've used it, and I am sure
others here have too); it is also one of the tools used by Good Article
reviewers (and linked from
Why we allow toolserver tools used by the community to expire in such a
confusing way is beyond me.
Piotr Konieczny, PhD
WMF researchers have agreed to participate in an office hour about WMF research projects and methodologies.
The currently scheduled participants are:
* Aaron Halfaker, Research Analyst (contractor)
* Jonathan Morgan, Research Strategist (contractor)
* Evan Rosen, Data Analytics Manager, Global Development
* Haitham Shammaa, Contribution Research Manager
* Dario Taraborelli, Senior Research Analyst, Strategy
We'll meet on IRC in #wikimedia-office on April 22 at 1800 UTC. Please join us.
I'm starting a new project, a wiki search engine. It uses MediaWiki,
Semantic MediaWiki and other minor extensions, and some tricky templates.
I remember Wikia Search and how it failed. It had the mini-article thingy
for the introduction, and then a lot of links compiled by a crawler. Also
something similar to a social network.
My project idea (which still needs a cool name) is different. Although it
uses an introduction and images copied from Wikipedia, and some links from
the "External links" sections, that is only a start. The purpose is for the
community to add, remove, and order the results for each term, and to
create redirects for similar terms to avoid duplicates.
Why this? I think that Google PageRank isn't enough. It is frequently
abused by link farms, SEO spammers and other people trying to push their
websites up the rankings.
Search "Shakira" in Google, for example. You see 1) the official site, 2)
Wikipedia, 3) Twitter, 4) Facebook, then some videos, some news, some
images, Myspace. That wastes three or more results on obvious sites
(WP, TW, FB).
The wiki search engine puts these sites at the top, along with an
introduction and related terms, leaving all the space below for less
obvious but interesting websites. Also, if you search Google for
"semantic queries" like "right-wing newspapers", you won't find actual
newspapers but rather people and sites discussing right-wing newspapers.
Or latex and LaTeX are shown on the same results pages. These issues can
be resolved with disambiguation result pages.
How do we choose which results rank above or below others? The rules are
not fully designed yet, but we can put official sites in first place, then
.gov or .edu domains, which are important ones, then unofficial websites
and blogs, giving priority to the local language, etc., and reach consensus.
We can control aggressive spam with spam blacklists, semi-protect or protect
highly visible pages, and use bots or tools to check changes.
It obviously has a CC BY-SA license, and results can be exported. I think
this approach is the opposite of Google's today.
For weird queries like "Albert Einstein birthplace" we can redirect to the
most obvious results page (in this case Albert Einstein) using a hand-made
redirect or by software (some little change in MediaWiki).
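The software side of that redirect could be as simple as matching a query against the titles of existing results pages and sending the user to the longest match. A minimal sketch, with invented titles and queries:

```python
def resolve(query, known_titles):
    """Return the longest known results-page title that is a
    prefix of the query, or None if nothing matches."""
    q = query.lower()
    matches = [t for t in known_titles if q.startswith(t.lower())]
    return max(matches, key=len) if matches else None

titles = {"Albert Einstein", "Albert"}
target = resolve("Albert Einstein birthplace", titles)
```

Here "Albert Einstein birthplace" resolves to the "Albert Einstein" results page, as in the example above.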
You can check out an early alpha version here: http://www.todogratix.es (only
in Spanish for now, sorry), which I'm feeding with some bots.
I think it is an interesting experiment. I'm open to your questions.
Emilio J. Rodríguez-Posada. E-mail: emijrp AT gmail DOT com
Pre-doctoral student at the University of Cádiz (Spain)
Projects: AVBOT <http://code.google.com/p/avbot/> |
| WikiEvidens <http://code.google.com/p/wikievidens/> |
| WikiTeam <http://code.google.com/p/wikiteam/>
Personal website: https://sites.google.com/site/emijrp/
I am trying to gather some data for a new paper, but I wonder if there
is a more efficient way of doing so than by using Wikipedia's Special:Contribs.
I have a list of editors whose edits I'd like to analyze: I want
numbers on their contributions by namespace, and to specific groups of
pages (such as Wikipedia:Arbitration and its subpages, for example). In
other words, for a defined group of users, I would like to know whether
they have ever contributed to an arbitration page, and if they did, how
many edits they made.
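For what it's worth, once the contributions are fetched (for example via the MediaWiki API's list=usercontribs, which returns each edit's page title and namespace number), the counting itself is simple. A minimal sketch of that tallying step; the sample records are invented, and "arbitration page" is assumed here to mean a Wikipedia-namespace (ns 4) page whose title starts with "Wikipedia:Arbitration":

```python
from collections import Counter

def tally(contribs):
    """Count one user's edits per namespace, plus edits to
    Wikipedia:Arbitration and its subpages (namespace 4)."""
    by_namespace = Counter()
    arbitration = 0
    for c in contribs:
        by_namespace[c["ns"]] += 1
        if c["ns"] == 4 and c["title"].startswith("Wikipedia:Arbitration"):
            arbitration += 1
    return by_namespace, arbitration

sample = [
    {"ns": 0, "title": "History of Poland"},
    {"ns": 4, "title": "Wikipedia:Arbitration/Requests"},
    {"ns": 4, "title": "Wikipedia:Village pump"},
]

by_ns, arb = tally(sample)
```

Running this per user over a list of editors would give exactly the per-namespace and arbitration-page counts described above.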
I am assuming this wouldn't be that difficult for somebody who knows how
to run queries on the Wikipedia database, but I have never been able
to develop enough coding skill to do so. Still, if people could
direct me to a page with instructions on how to run a database query,
perhaps I can try to learn. That is, if the instructions have been made
friendlier to non-CS people; when I last researched this topic two or so
years ago, they were, IMHO, still beyond the means of a non-coder.
Alternatively, I can consider paying someone to run a number of such
queries for me, since I now even have a real research budget :)
Piotr Konieczny, PhD