Bernie Hogan is soliciting for a part-time Wikipedia researcher, via his
blog:
http://people.oii.ox.ac.uk/hogan/2012/05/job-opening-for-part-time-research…
======
Job opening for part time researcher on Wikipedia
We are looking for a part-time researcher for our Wikipedia project. It
should be someone with quantitative skills and a history of writing for an
academic audience (and some post-graduate training). We can be somewhat
flexible in terms of hours; that is, if you are interested in working near
full-time over the summer rather than half-time for six months, we can
negotiate.
Here is the post:
Part-time Research Assistant Fell Fund Project
OXFORD INTERNET INSTITUTE
Grade 6: Salary £26,004 – £31,020 p.a. (Variable hours – up to 18.5 hours a
week)
We are a leading international research Institute looking for a part-time
Research Assistant to carry out research into the geography and social
structure of Wikipedia in the Middle East and North Africa through
large-scale data analysis. The position will involve the analysis of the
corpus of Wikipedia text, user-pages and history files and the use of
statistical techniques to explain spatial and social patterns. Our research
question focuses on patterns of representation on Wikipedia as well as an
articulation of patterns of conflict and barriers to participation.
Based at our OII North office at 66 Banbury Road, this position is
available immediately for 6 months, with variable hours – up to 18.5 hours
a week.
Applications for this vacancy are to be made online. To apply for this role
and for further details, including a job description and selection
criteria, please click here
Only applications received before 12:00 midday on 1st June 2012 can be
considered. Interviews for those short-listed are currently planned to take
place in the week commencing 4th June 2012.
-------- Original Message --------
Subject: Re: [Wiki-research-l] Experimental study of informal rewards in peer production
Date: Thu, 26 Apr 2012 15:50:44 -0400
From: Michael Restivo <mike.restivo(a)gmail.com>
To: Chitu Okoli <Chitu.Okoli(a)concordia.ca>, Research into Wikimedia content and communities <wiki-research-l(a)lists.wikimedia.org>
Hi Chitu,
Yes, your conjecture is spot-on. Here is a more detailed response that I sent to Joseph. I tried sending this to the wiki-research-l but the email keeps bouncing back to me. If you're interested and willing to share it with the list, that would be acceptable to me.
We thought about this question quite extensively and there are a few reasons why we sampled the top 1% (which we didn't get around to discussing in this brief paper). First, because of the high degree of contribution inequality in Wikipedia's editing community, we were primarily interested in how status rewards affect the all-important core of highly-active editors. There is also a lot of turn-over in the long tail of the distribution, and even among the most active editors, there is considerable heterogeneity. Focusing on the most active users ensured us sufficient statistical power. (Post-hoc power analysis suggests that our sample size would need to be several thousand users in the 80-90th percentiles, and several hundred in the 90-99th percentiles, to discern an effect of the same strength.) Also, we considered the question of construct validity: which users are deserving (so to speak) of receiving an editing award or social recognition of their work?
You are right that it should be fairly easy to extend this analysis beyond just the top 1%, but just how wide a net to cast remains a question. The issue of power calculation and sample size becomes increasingly difficult to manage for lower deciles because of the power-law distribution. And I don't think it would be very meaningful to assess the effect of barnstars on the bottom half of the distribution, for example, for the substantive reasons I mentioned above. Still, I'd be curious to hear what you think, and whether there might be some variations on this experiment that could overcome these limitations.
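To give a sense of the arithmetic behind that power claim, here is a back-of-the-envelope sketch for a two-sample z-test, where the per-group sample size needed to detect a standardized effect d grows as 1/d^2. The effect sizes below are illustrative placeholders, not our study's estimates:

```python
import math
from statistics import NormalDist

def required_n(d, alpha=0.05, power=0.8):
    """Per-group n for a two-sample z-test to detect standardized effect d."""
    z = NormalDist().inv_cdf
    return math.ceil(2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2)

# Hypothetical effect sizes: larger among the most active editors,
# smaller in the long tail (illustrative values only).
print(required_n(0.5))   # -> 63 per group
print(required_n(0.1))   # -> 1570 per group
```

The 1/d^2 scaling is why an effect that is detectable with a few dozen highly active editors would require samples in the thousands among lower deciles.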
In terms of data dredging, that is always a concern and I completely understand where you are coming from. In fact, as both an author and a consumer of scientific knowledge, I'm rarely ever completely satisfied. For example, a related concern of mine is the file-drawer effect - when research produces null (or opposite) results and the authors consequently decide not to publish it.
In this case, I actually started this project with the hunch that barnstars would lead to a slight decline in editing behavior; my rationale was that rewards would act as social markers that editors' past work was sufficient to earn social recognition and hence receiving such a reward would signal that the editor had "done enough" for the time being. In addition to there being substantial support for this idea in the economics literature, this intuition stemmed from hearing about an (unpublished) observational study of barnstars by Gueorgi Kossinets (formerly at Cornell, now at Google) that suggested editors receive barnstars at the peak of their editing activity. Of course, we chose an experimental design precisely to help us to tease out the causal direction as well as what effect barnstars have for recipients relative to their unrewarded counterparts. We felt like no matter what we found - either a positive, negative, or even no effect - it would have been interesting
enough to publish, so hopefully that alleviates some of your concerns.
Please let me know if you have any other questions, and I'd love to hear your thoughts about potential follow-ups to this research.
Regards,
Michael
On Thu, Apr 26, 2012 at 3:30 PM, Chitu Okoli <Chitu.Okoli(a)concordia.ca <mailto:Chitu.Okoli@concordia.ca>> wrote:
One obvious issue is that it would be unethical to award barnstars to contributors who did not deserve them. However, the 1% most productive contributors, by definition, deserved the barnstars that the experimenter awarded them. Awarding barnstars to undeserving contributors for experimental purposes probably would not have gotten past the ethics review board so easily. As the article notes:
----------
This study's research protocol was approved by the Committees on Research Involving Human Subjects (IRB) at the State University of New York at Stony Brook (CORIHS #2011-1394). Because the experiment presented only minimal risks to subjects, the IRB committee determined that obtaining prior informed consent from participants was not required.
----------
This is my conjecture; I'd like to hear the author's comments.
~ Chitu
-------- Original Message --------
Subject: [Wiki-research-l] Experimental study of informal rewards in peer production
From: Joseph Reagle <joseph.2011(a)reagle.org <mailto:joseph.2011@reagle.org>>
To: michael.restivo(a)stonybrook.edu <mailto:michael.restivo@stonybrook.edu>
Cc: Research into Wikimedia content and communities <wiki-research-l(a)lists.wikimedia.org <mailto:wiki-research-l@lists.wikimedia.org>>
Date: 26 April 2012 11:42:01
In this [study](http://www.plosone.org/article/info:doi%2F10.1371%2Fjournal.pone.003…):
> Abstract: We test the effects of informal rewards in online peer production. Using a randomized, experimental design, we assigned editing awards or “barnstars” to a subset of the 1% most productive Wikipedia contributors. Comparison with the control group shows that receiving a barnstar increases productivity by 60% and makes contributors six times more likely to receive additional barnstars from other community members, revealing that informal rewards significantly impact individual effort.
I wonder why it is limited to the top 1%. I'd love to see the analysis repeated (it should be trivial) on each decile. Besides satisfying my curiosity, some rationale and/or discussion of other deciles would also address any methodological concern about data dredging.
--
Michael Restivo
Department of Sociology
Social and Behavioral Sciences S-433
Stony Brook University
Stony Brook, NY 11794
mike.restivo(a)gmail.com <mailto:mike.restivo@gmail.com>
Hi @all,
I am looking into cases of language switching, and I wish to find edits not made by bots.
- How can I do such a search in Wikipedia? Would I do this with a search involving a string similar to
[[xx:xxxxxxxx...]]?
- Where would I find the entry point for such a search?
Maybe there are better ideas about how to proceed ;-)
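For illustration, this is the kind of filtering I have in mind: exclude bot edits, then look for interlanguage-link markup ([[xx:Title]]) in the revision text. A rough sketch; the field names mirror the MediaWiki API's recentchanges output but are assumed here, and the API can also exclude bots server-side with rcshow=!bot:

```python
import re

# Interlanguage links look like [[xx:Title]], where xx is a language code.
INTERLANG = re.compile(r"\[\[([a-z]{2,3}(?:-[a-z]+)?):[^\]]+\]\]")

def non_bot_language_edits(revisions):
    """Keep revisions that (a) were not made by a bot and
    (b) contain interlanguage-link markup in their text."""
    hits = []
    for rev in revisions:
        if rev.get("bot"):          # bot edits carry a "bot" flag
            continue
        match = INTERLANG.search(rev.get("text", ""))
        if match:
            hits.append((rev["user"], match.group(1)))
    return hits

# Toy records in the assumed shape of API results:
sample = [
    {"user": "ExampleBot", "bot": True,  "text": "[[de:Beispiel]]"},
    {"user": "Alice",      "bot": False, "text": "[[fr:Exemple]] some prose"},
    {"user": "Bob",        "bot": False, "text": "plain edit, no links"},
]
print(non_bot_language_edits(sample))  # -> [('Alice', 'fr')]
```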
thanks & cheers,
Claudia
koltzenburg(a)w4w.net
Current historical research appears first in journal articles and
later in books (books take perhaps 3-5 years to get published in
history!). Conference papers are not so useful because historians do
not usually circulate them widely.
The latest edition of a textbook will try to reflect recent
scholarship, but history textbooks rarely have footnotes, so it is never
quite clear what or whom they are referencing.
Very few Wiki articles in history cite any journals, and the books
used tend to be out of date, or else well-known new books by famous
authors working at the Pulitzer-prize level--those prize books do get
cited. However, Wiki much less often cites monographs from
university presses. It is now possible to use Google and Amazon for
their excellent search and excerpt features--but those were not
available back in 2006-8, when most of the writing was done. In my
opinion, a way to attract professors is to encourage them to use their
classes to upgrade the scholarship in the Wiki articles. ~~~~
JSTOR reports there were about 300 articles on Shakespeare a year in
scholarly journals from 1997 to 2006; none of them is cited, nor any
since then, and only one from before then. This is typical as well of
political and military history. Wiki editors are not using scholarly
journals. I assume that is because they are unaware of them. ~~~~
www.jstor.org is available worldwide, but most of its 1000-or-so
journals are English-language journals (many published in the UK).
It is broadening its reach--it has added numerous Spanish, French,
and German journals (e.g. 14 titles that start with "Revista", 18
that start with "Revue", and 23 with "Zeitschrift"). About 18% of
the content is available free to individuals with no library access.
All the tables of contents are open access. Note that JSTOR has
competitors such as Project MUSE, with about 500 journals (especially
in the humanities). Several publishers are setting up their own
journal archives. I do not know of any comparable sites based in
Europe--but India has a major site.
At 12:42 AM 5/4/2012, you wrote:
>Hoi,
>How well is JSTOR available outside of the English speaking world...
>The notion that English is the only language with a Wikipedia is
>obviously wrong.
>Thanks,
>Gerard
Open access journals are not going anywhere soon and hypothetical
reliance on them would be a death sentence to Wikipedia because their
range of coverage is so narrow.
As for JSTOR, tens of millions of Wiki users get immediate full
access, as do patrons of 6500 libraries and practically all students
in higher education in the US, UK, Canada, etc. They have access but
do not use it when they edit for Wikipedia--I suggest they do not
know about it. (About half of Wikipedia editors are younger than 23,
suggesting that they are probably students with free access to
library services.)
As for scholarly articles, a common format is for the first part of
the article to summarize the state of the art and the remainder of
the article to present the author's original research. That first
part is especially valuable for Wiki editors. ~~~~
I am looking at the edit history of a number of major articles on
historical topics (in the English Wikipedia)
I find that most of the important writing was done in 2006-8.
Typically, the article reached maturity about 2008 and since then the
rate of editing has plunged. In most cases I see only minor or
maintenance editing since then. The new material since 2008 is
mostly cosmetic: illustrations still get added, lots of links are
made, new categories added, new lists are appended, vandalism is
removed. The citations are increasingly out of date. The articles
are long in the tooth.
Wiki is now resembling the old paper encyclopedias--they would get
old fast and need constant updating either through yearbooks or new editions.
Richard Jensen
MathJax [1] is now enabled site-wide as an opt-in preference. You can now see beautifully rendered, accessible, copy&pasteable and standard-compliant (MathML) formulas on Wikipedia, replacing the old TeX-rendered PNGs.
To enable MathJax on Wikipedia or any other Wikimedia project, go to your preferences > Appearance > Math and select "MathJax (experimental; best for most browsers)". A big thanks to the MathJax developers and all the folks who helped deploy it. [2]
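For example, with the preference enabled, standard `<math>` wikitext such as the following (a generic illustration, not tied to any particular article) is rendered as proper MathML rather than a PNG:

```
<math>\int_0^1 x^2 \, dx = \frac{1}{3}</math>
```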
Dario
[1] http://www.mathjax.org/
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=31406
I've been looking over a lot of history articles, and the typical
pattern in terms of edits is a bell-shaped curve with the peak around 2007.
For a good example, see Shakespeare:
http://toolserver.org/~tparis/articleinfo/index.php?article=William_Shakesp…
Look at the bar chart under "year counts".
By Nov 2007 the surge of editing had virtually ended. The article was
then 83kb in length... it had a small burst of growth in 2009,
reaching 100kb in June 2009; it is now 106kb long. Basically the
article was mostly finished in 2007 and has had little change in the
last 3 years. With a couple of minor exceptions, the youngest source
cited in the footnotes is from 2006. The newest item in the
bibliography is one book from 2007; I saw only one article from a
scholarly journal (from 1969). Maybe it's OK for a college freshman,
but an English major so unaware of recent scholarship would not get a
good grade.
Then look at the contributors:
http://toolserver.org/~daniel/WikiSense/Contributors.php?wikilang=en&wikifa…
Of the 9 editors with over 100 edits, only two have been active on
this article in 2012.
Shakespeare received 648,000 views in April 2012, compared to 585,000
in April 2010 and 575,000 in April 2008. As for the often-heard
fear that anyone can edit it, note that 1100 editors are watching
over that article and are alerted to any changes. However, none of
them has added anything from the ton of scholarship that has appeared
since 2006. ~~~~