[[User:Mauro742]] has produced some statistics:
http://meta.wikimedia.org/wiki/Usage_of_edit_summary_on_Wikipedia
There are great differences among Wikipedias.
Apparently, sk.wiki and pl.wiki are the only Wikipedias where standard
edit summary buttons are used, and where edit summary usage is about 90%.
What do you think? Are those buttons useful, or do they make edit
summaries more uniform and therefore pointless? Do they increase
usability? Should we introduce them elsewhere?
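For concreteness, the ratio behind such usage statistics can be sketched as below; the data and function name are illustrative assumptions, not the method actually used for the page linked above.

```python
# A minimal sketch of measuring edit-summary usage, assuming a list
# of (editor, summary) pairs extracted from a dump or the
# recent-changes feed. All names and data here are hypothetical.

def summary_usage(edits):
    """Return the fraction of edits that carry a non-empty summary."""
    if not edits:
        return 0.0
    with_summary = sum(1 for _, summary in edits if summary.strip())
    return with_summary / len(edits)

edits = [
    ("Alice", "fix typo"),
    ("Bob", ""),             # no summary given
    ("Carol", "/* Intro */ rewording"),
    ("Dave", "   "),         # whitespace-only counts as empty
]
print(f"{summary_usage(edits):.0%}")  # → 50%
```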
Nemo
Hello.
Thanks to feedback from Church_of_emacs and DerHexer, we fixed a bug in the slides of the flagged revisions presentation at Wikimania 2010.
http://www.slideshare.net/glimmerphoenix/flagged-revs
The bug is specifically in the graph of protection actions: the green and blue lines (slide 23) were mistakenly labeled as protections and unprotections of talk pages, while they actually correspond to user pages.
I also included Tobia's comments on the sudden rise in the green line:
"The reason for the sudden increase of page protections is mass-blocking
of open proxies, which was combined with creating the userpage of
blocked IPs with a template and protecting them".
Open sourcing research also works ;).
Best,
Felipe.
PS. I downloaded the file, and it has been updated on Slideshare, but the preview still displays the old version. I hope it is just a matter of time before the preview is updated as well.
Hi all,
I'm working on a study for which I'd like to know more about editors' watchlisting practices. Of course what I'd really like is to know who had what page on their watchlist when, but I understand the obvious privacy issues there. I assume those issues explain why that information is not (AFAIK) available in dumps etc.
I have read some great qualitative pieces which discuss watchlisting [e.g. 1], which are very helpful (please don't hesitate to suggest others), but haven't seen quantitative data, which our study calls for.
Failing exact data, what do we know about the distribution of practices of watchlisting?
Currently my plan is to assume that anyone who has edited an article in the past 6 months has it on their watchlist. That is obviously a very coarse assumption. If we had any empirical knowledge about these practices, I could instead use a distribution (e.g. editors have the page on their watchlist with some probability, varying with the number and tenure of their edits to that page). I also don't have any way to estimate whether someone who has never edited a page has it on their watchlist (or, assuming some do, whether there's any useful way to guess which pages they are likely to watch).
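The estimation approach described above could be sketched roughly as follows. The hard 6-month cutoff is from the plan; the base probability and per-edit bump are purely illustrative placeholders, not empirical values.

```python
# A hedged sketch of the watchlist estimate: treat any editor who
# edited the page in the last 6 months as a possible watcher, with
# a watch probability that grows with their number of recent edits.
# The weights (base, per_edit, cap) are made-up assumptions.

from datetime import datetime, timedelta

SIX_MONTHS = timedelta(days=182)

def watch_probability(edit_dates, now, base=0.5, per_edit=0.1, cap=0.95):
    """Estimate P(page is on this editor's watchlist).

    edit_dates: datetimes of this editor's edits to this page.
    Editors with no edits in the window get probability 0 here,
    reflecting that we have no signal at all for never-editors.
    """
    recent = [d for d in edit_dates if now - d <= SIX_MONTHS]
    if not recent:
        return 0.0
    # Base chance for any recent editor, plus a bump per extra edit.
    return min(cap, base + per_edit * (len(recent) - 1))

now = datetime(2010, 8, 1)
dates = [datetime(2010, 7, 1), datetime(2010, 6, 15), datetime(2009, 1, 1)]
print(watch_probability(dates, now))  # → 0.6
```

Replacing the hard cutoff with a decay over edit recency would be a natural refinement if empirical data ever becomes available.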
Grateful for any suggestions or reactions,
Thanks,
James Howison
[1]: Bryant, S., Forte, A., and Bruckman, A. (2005). Becoming Wikipedian: transformation of participation in a collaborative online encyclopedia. In Proceedings of the 2005 international ACM SIGGROUP conference on Supporting group work, page 10. ACM.
Some motivation for a proper WikiCite project. --sj
=============== Begin forwarded message ==================
"How citation distortions create unfounded authority: analysis of a
citation network"
http://www.bmj.com/cgi/content/full/339/jul20_3/b2680
Abstract:
Objective -To understand belief in a specific scientific claim by
studying the pattern of citations among papers stating it.
Design - A complete citation network was constructed from all PubMed
indexed English literature papers addressing the belief that β
amyloid, a protein accumulated in the brain in Alzheimer's
disease, is produced by and injures skeletal muscle of patients with
inclusion body myositis. Social network theory and graph theory were
used to analyse this network.
Main outcome measures - Citation bias, amplification, and invention,
and their effects on determining authority.
Results:
The network contained 242 papers and 675 citations addressing the
belief, with 220 553 citation paths supporting it. Unfounded authority
was established by citation bias against papers that refuted or
weakened the belief; amplification, the marked expansion of the belief
system by papers presenting no data addressing it; and forms of
invention such as the conversion of hypothesis into fact through
citation alone. Extension of this network into text within grants
funded by the National Institutes of Health
and obtained through the Freedom of Information Act showed the same
phenomena present and sometimes used to justify requests for funding.
Conclusion:
Citation is both an impartial scholarly method and a powerful form of
social communication. Through distortions in its social use that
include bias, amplification, and invention, citation can be used to
generate
information cascades resulting in unfounded authority of claims.
Construction and analysis of a claim specific citation network may
clarify the nature of a published belief system and expose distorted
methods of social citation.
--
Samuel Klein identi.ca:sj w:user:sj
Hi,
I've been working on a vandalism detection tool for Wikipedia and I am
currently looking for a list of spam words.
Basically, I am looking for a list of terms typically associated with
vandalism or spam.
Is anybody aware of such a resource?
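As a sketch of how such a list could be used once found: score an edit by the share of its tokens that appear in the word list. The words below are a tiny illustrative placeholder, not a real curated resource, and this is not Sérgio's actual tool.

```python
# A minimal spam-word matcher, assuming a curated word list exists.
# SPAM_WORDS here is a made-up placeholder for illustration only.

import re

SPAM_WORDS = {"viagra", "casino", "lol", "poop"}  # placeholder terms

def spam_score(text):
    """Return the fraction of tokens found in the spam word list."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in SPAM_WORDS)
    return hits / len(tokens)

print(spam_score("buy viagra at our casino"))  # → 0.4
```

In a real detector such a score would be one feature among many (edit size, anonymity, reverts), rather than a decision on its own.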
Thanks in advance for your comments.
Best regards,
--
Sérgio Nunes