Hi all,
*Who:* This mostly applies to people who have access to the stat1002 and
stat1003 statistics machines on the production cluster, and publish
datasets as static files.
*What:* We are no longer using datasets.wikimedia.org to serve static
datasets. We have set up a redirect, so requests like
https://datasets.wikimedia.org/$1 will be sent to
https://analytics.wikimedia.org/datasets/archive/$1. Most importantly,
publishing datasets is now much easier. Any files you put in
published-datasets on either machine:
stat1002:/a/published-datasets
stat1003:/srv/published-datasets
will be merged and served together at:
https://analytics.wikimedia.org/datasets/
One request as we all enjoy this much simpler process: let's use README
files in these directories to let future versions of us know what the
datasets are all about. That will make the repository more fun for others
to browse and ease future cleanups. Thank you!
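The publishing flow above can be sketched as follows. This is a minimal sketch: "my-dataset" and the file contents are made up, and a temporary directory stands in for the real published-datasets path so it is safe to try anywhere.

```python
from pathlib import Path

# Stand-in for /srv/published-datasets on stat1003 (or /a/published-datasets
# on stat1002); a /tmp path is used so this sketch can run anywhere.
base = Path("/tmp/published-datasets")
dataset = base / "my-dataset"  # "my-dataset" is a made-up example name
dataset.mkdir(parents=True, exist_ok=True)

# Any file placed under published-datasets gets merged and served under
# https://analytics.wikimedia.org/datasets/
(dataset / "data.tsv").write_text("date\tviews\n2017-05-01\t12345\n")

# The README requested above: tell future readers what the dataset is.
(dataset / "README").write_text(
    "Example dataset: made-up daily pageview counts.\n"
    "Contact: <your shell username>\n"
)

print(sorted(p.name for p in dataset.iterdir()))
# → ['README', 'data.tsv']
```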
*TODO*
If something of yours got lost, let us know; we have backups. If you had
stuff that we might have cleaned up, we put it in
/srv/otto-to-delete-datasets-cleanup and
/a/otto-to-delete-datasets-cleanup. Take a look there and move files as
you see fit into published-datasets.
*Context*
For a long time, publishing files from stat1002 and stat1003 was quite
painful: there were three folders, some on both boxes, some only on one
box, plus symlinks and rsyncs. It was bad. We talked to everyone who had
files in these folders and gathered consensus for this deprecation. If
this message catches you by surprise, please let us know which channel we
should use to reach you next time and we'll add it to our communication plan.
This work is tracked in T159409 <https://phabricator.wikimedia.org/T159409>
Hi, if anyone on this list was involved in today's survey, you lost my
response at:
"I am reading this article to *
- get an overview of the topic.
- get an in-depth understanding of the topic.
- look up a specific fact or to get a quick answer."
Making people choose one of those options loses anyone who is there to find
a typo or for any other reason.
> but that doesn't necessarily mean that we should use policy and admin
> tools instead of persuasion and other tools (such as content policies
> about verifiability and notability) to address them
...
> I had an experience myself when I made a statement to someone which from
> my perspective was a statement of fact, and the other party took it as an
> insult. I don't apologize for what I said since from my perspective it was
> valid, and the other party has not apologized for their reaction, but the
> point is that defining what constitutes a personal attack or harassment
> can be a very subjective business and I'm not sure to what extent I would
> trust an AI to evaluate what constitutes a personal attack or harassment
> in a wide range of contexts.
Hey Pine,
A little persuasive rhetoric from a friend here. :)
I do agree with you that talking about these things with one another is
probably more fruitful than Yet Another Policy. So how do we make space for
that? How do we encourage open, honest, emotionally available discussions
around what can be very hard conversations? Talking about feelings for many
cultures is still very difficult. Even here in the midwest United States,
guys talking about how they feel is still seen as effeminate by many.
Unfortunately.
How can we elevate awareness that, despite our intent, we can sometimes
insult people? Knowing how to discuss feelings, and being comfortable
doing so, may help greatly with both the perception and the actuality of
harassment on our projects.
I'm thinking of the example you give in my own experiences working with
folks in the movement. It's important to talk about these things and try to
figure out the nuance in our behaviors and how folks reading our often
public discourse can get an impression of us that isn't representative of
our individual selves or the movement as a whole.
Semi-related, I just read this interesting article about how to apologize.
I'm not trying to admonish you here! It just seemed relevant. :) How can we
build a toolkit of awareness for emotionally-connected responses like what
is expressed in this article around apologizing?
http://nymag.com/scienceofus/2017/06/these-apology-critics-want-to-teach-yo…
Yours,
Chris Koerner
Community Liaison - Discovery
Wikimedia Foundation
Hi Everyone,
The next Research Showcase will be live-streamed this Wednesday, June 21,
2017, at 11:30 AM PDT (18:30 UTC).
YouTube stream: https://www.youtube.com/watch?v=i2jpKRwPT-Q
As usual, you can join the conversation on IRC at #wikimedia-research. And,
you can watch our past research showcases here
<https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#June_2017>.
This month's presentations:
Title: Problematizing and Addressing the Article-as-Concept Assumption in
Wikipedia
By *Allen Yilun Lin*
Abstract: Wikipedia-based studies and systems frequently assume that each
article describes a separate concept. However, in this paper, we show that
this article-as-concept assumption is problematic due to editors’ tendency
to split articles into parent articles and sub-articles when articles get
too long for readers (e.g. “United States” and “American literature” in the
English Wikipedia). We present evidence that this issue can
have significant impacts on Wikipedia-based studies and systems and
introduce the sub-article matching problem. The goal of the sub-article
matching problem is to automatically connect sub-articles to parent
articles to help Wikipedia-based studies and systems retrieve complete
information about a concept. We then describe the first system to address
the sub-article matching problem. We show that, using a diverse feature set
and standard machine learning techniques, our system can achieve good
performance on most of our ground truth datasets, significantly
outperforming baseline approaches.
Title: Understanding Wikidata Queries
By *Markus Kroetzsch*
Abstract: Wikimedia provides a public service that lets anyone answer
complex questions over the sum of all knowledge stored in Wikidata. These
questions are expressed in the query language SPARQL and range from the
most simple fact retrievals ("What is the birthday of Douglas Adams?") to
complex analytical queries ("Average lifespan of people by occupation").
The talk presents ongoing efforts to analyse the server logs of the
millions of queries that are answered each month. It is an important but
difficult challenge to draw meaningful conclusions from this dataset. One
might hope to learn relevant information about the usage of the service and
Wikidata in general, but at the same time one has to be careful not to be
misled by the data. Indeed, the dataset turned out to be highly
heterogeneous and unpredictable, with strongly varying usage patterns that
make it difficult to draw conclusions about "normal" usage. The talk will
give a status report, present preliminary results, and discuss possible
next steps.
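The two example questions from the abstract can be written out as SPARQL. These are sketches only: wd:Q42 (Douglas Adams), wdt:P569 (date of birth), wdt:P570 (date of death) and wdt:P106 (occupation) are real Wikidata identifiers, and the endpoint URL is the public query service, but the queries are merely built as strings here, not sent.

```python
# Public Wikidata Query Service endpoint (not contacted in this sketch).
ENDPOINT = "https://query.wikidata.org/sparql"

# "What is the birthday of Douglas Adams?"
birthday_query = """
SELECT ?birthday WHERE {
  wd:Q42 wdt:P569 ?birthday .
}
"""

# "Average lifespan of people by occupation", approximated in whole years.
lifespan_query = """
SELECT ?occupationLabel (AVG(YEAR(?died) - YEAR(?born)) AS ?avgLifespan) WHERE {
  ?person wdt:P106 ?occupation ;
          wdt:P569 ?born ;
          wdt:P570 ?died .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
GROUP BY ?occupationLabel
"""
```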
--
Sarah R. Rodlund
Senior Project Coordinator-Product & Technology, Wikimedia Foundation
srodlund(a)wikimedia.org
For those interested in the mathematical side of Wikipedia / other math wikis.
CALL FOR ABSTRACTS (deadline: 30th June 2017)
ENABLING MATHEMATICAL CULTURES, University of Oxford, 5th-7th December 2017
This workshop celebrates the completion of the EPSRC-funded project
“Social Machines of Mathematics”, led by Professor Ursula Martin at
the University of Oxford. We will present research arising from the
project, and bring together interested researchers who want to build
upon and complement our work. We invite interested researchers from a
broad range of fields, including: Computer Science, Philosophy,
Sociology, History of Mathematics and Science, Argumentation theory,
and Mathematics Education. Through such a diverse mix of disciplines
we aim to foster new insights, perspectives and conversations around
the theme of Enabling Mathematical Cultures.
Our intention is to build upon previous events in the “Mathematical
Cultures” series. These conferences explored diverse topics concerning
the socio-cultural, historical and philosophical aspects of
mathematics. Our workshop will, likewise, explore the social nature of
mathematical knowledge production, through analysis of historical and
contemporary examples of mathematical practice. Our specific focus
will be on how social, technological and conceptual tools are
developed and transmitted, so as to enable participation in
mathematics, as well as the sharing and construction of group
knowledge in mathematics. In particular, we are interested in the way
online mathematics, such as that exhibited by the Polymath Projects,
MathOverflow and the arXiv, enables and affects mathematical
interactions and cultures.
We hereby invite the submission of abstracts of up to 500 words for
papers to be presented in approximately 30 minutes (plus 10 minutes
Q+A). The Enabling Mathematical Cultures workshop will have space on
Days 2 and 3 of the meeting for a number of accepted talks addressing
the themes of social machines of mathematics, mathematical
collaboration, mathematical practices, ethnographic or sociological
studies of mathematics, computer-assisted proving, and argumentation
theory as applied in the mathematical realm. Please send your
abstracts to Fenner.Tanswell(a)Gmail.com by the deadline of the 30th
June 2017.
The event takes place in the Mathematical Institute of the University
of Oxford on 5th, 6th and 7th December 2017, with a dinner on 5th
December and an informal supper on 6th December.
The focus of Day 1 will be on success, failure and impact of
foundational research with an emphasis on history and long term
development. Days 2 and 3 will focus on studies of contemporary and
prospective mathematical cultures from sociological, philosophical,
educational and computational perspectives.
Confirmed speakers include: Andrew Aberdein, Michael Barany, Alan
Bundy, Joe Corneli, Matthew Inglis, Lorenzo Lane, Ursula Martin, Dave
Murray-Rust, Alison Pease and Fenner Tanswell.
Organising Committee: Ursula Martin, Joe Corneli, Lorenzo Lane, Fenner
Tanswell, Sarah Baldwin, Brendan Larvor, Benedikt Loewe, Alison Pease
Further information will be added to the website at
https://enablingmaths.wordpress.com
Previous "Mathematical Cultures" events can be found here:
https://sites.google.com/site/mathematicalcultures/
Hi all,
We are preparing to conduct a study about WikiProject recommendations. The
goals of our study are (1) to understand the effectiveness of different
recommendation algorithms on recruiting new members to WikiProjects, and
(2) to evaluate the effectiveness of this intervention on engaging and
retaining Wikipedia newcomers.
In this study, we will recommend related editors to the organizers of
WikiProjects, and request them to approach and recruit the editors. We will
measure the actions and reactions of the organizers and editors for
evaluation. More details about our study can be found here on this meta-page
<https://meta.wikimedia.org/wiki/Research:WikiProject_Recommendation>.
While planning the experimental design, we thought to gather more thoughts
and suggestions from the community since this study would involve the
efforts of some Wikipedians, so we wanted to open it up. Also, if you know
of existing work or study in this area, please let us know. Thanks!
Sincerely,
Bowen
Hi Kerry,
Thanks for the feedback! I'll reply to a couple of your points below:
> One observation I would make is that like many education experiments, it
> does not control for (what I call) the "highly motivated researcher
> effect". What I've learned from a lifetime of "new ways to teach" is that
> the standard experiment is to parachute in a highly motivated researcher
> into the classroom to introduce the new method, collect data showing
> improved learning, and then advocate for the new method to be rolled out
> more widely. However, rolling out more widely involves taking regular
> teachers (good, bad, and in-between) to learn and apply a new approach, and
> techniques often fail in the face of teacher lack of enthusiasm to learn
> anything new, complaints it makes more demands on teachers to use the new
> method, etc.
This study was a survey of everyone who would participate across 270
courses and 6,700 students. Ninety different instructors took the survey
as well; many of them were returning instructors (there is data on that
in the instructor survey), but across the 270 courses there was a good
mix. Wiki Ed has data on that, and I could find it if you're really
interested, but our statistical analysis showed no variation between new
and returning instructors in the student responses for the instructors
polled - although a different design or question set with that in mind
could produce different results.
I like that suggestion and, in (hopefully) future iterations, we can spend
more thought on that particular piece when designing surveys (like
specifically tracking how many semesters the instructors have been active
with Wiki Ed).
> In this report it says " the program staff provide Wikipedia training and
> expertise so the faculty do not need to have any experience editing" which
> is a big red flag to me. It would be interesting to see the results in an
> experiment where you first train the faculty and then the faculty carry out
> the engagement with students. And then see the results in 3 years time when
> it's a case of "business as usual" rather than "the new thing".
>
New and returning instructors are encouraged to take (and re-take) a pretty
substantial training program (training module completion is listed in the
data set for both instructors and students). Faculty are "trained" but
there is a limit to the hands-on training that one can offer, of course.
> As a general comment, students like the variety of someone new in their
> classroom. Students do tend to learn more from "real world" assignments
> than "lab" assignments because the real world is more complex. However,
> staff and students are often reluctant to have real world assignments
> significantly influence end-of-term marks/grades because of the
> uncontrollable variables in the real world assignment that makes it
> difficult to assess the relative achievement of the students.
Absolutely - Joe Reagle and I just did a presentation for New Media
Consortium that recommended that you never grade students on "what remains
on Wikipedia" - Wiki Ed has some good suggestions for grading too.
> I would expect editing Wikipedia articles to suffer from this problem as
> each student will be working on different article(s) of different starting
> size and quality and with different levels of involvement and monitoring by
> other Wikipedians. It was not clear to me from the report if students were
> being assessed on this Wikipedia assignment and how important it was to
> their overall mark/grade.
This was something that, across 270 different classes, we could not control.
Additionally, it was part of the IRB approval that nothing in the study could
affect a student's grade directly (e.g., instructors couldn't give extra
credit for taking the survey, they couldn't require students to be present
during the focus groups, no grades were reported). This would have been a
violation of FERPA in the US, as well as many rules in Canada. So in the
end, there was no "assessment" - although each student's work can be traced
back to their course and folks can do their own marking and analysis
(whether through computational means or otherwise) and compare it to
overall data if they like.
In the end, I hope you can make use of the data - as I mentioned above and
a few times throughout the report and elsewhere, this research was meant to
intersect with a variety of research questions, and I've done my best to
open up the data so that folks can investigate their own questions (like
looking at quality of work or addressing content gaps) alongside some of
the student data.
best,
Zach
--------------------
Zachary J. McDowell, PhD
www.zachmcdowell.com
On Tue, Jun 20, 2017 at 8:00 AM, <
wiki-research-l-request(a)lists.wikimedia.org> wrote:
> Message: 4
> Date: Tue, 20 Jun 2017 17:25:51 +1000
> From: "Kerry Raymond" <kerry.raymond(a)gmail.com>
> To: "'Research into Wikimedia content and communities'"
> <wiki-research-l(a)lists.wikimedia.org>
> Subject: Re: [Wiki-research-l] Research about WikiProject
> Recommendation
> Message-ID: <010e01d2e996$73d6d130$5b847390$(a)gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> Looking at the list of WikiProjects you pointed at, they seem to be a
> mixture of what I would call "process" projects (e.g. Articles for
> Creation, Deletion Sorting) vs "content" projects (e.g. Military History,
> Television) vs a third group like "Women in Red" (which is part process,
> part content).
>
> Generally the "content" projects will tag Talk pages with their
> WikiProject Banner. But "process" projects don't seem to always do this.
> For example, I don't think Women in Red has a project banner generally,
> although I think they do tag articles that arise from specific
> Edit-a-thons. Some of the process projects seem to use hidden categories
> for their work.
>
> I would suggest only working with content projects initially. Content
> projects are more similar to one another in how they operate compared to
> process projects, and I think it is easier to judge if a user is showing an
> interest in a content project than in the process project because of
> standard use of content project banners on articles. So I think you can
> probably get a better understanding of whether the referral mechanism is
> working with content projects, whereas I think process projects have a lot
> of variability in them that may make it difficult to work out if you are
> seeing success or not.
>
> And at the end of the day, as an encyclopedia, we live or die on our
> content. Processes are (or at least should be) supportive of good content
> development but are a second-order effect.
>
> I can certainly see some issues arising from pointing newcomers at process
> projects as they are unlikely to be aware of the processes at that stage.
> And indeed some process projects do not accept new editors (think of
> Articles for Creation and new page patrolling). I'd see this as a second
> project if the content project referral mechanism seems to be working.
>
> Anyhow, that's my 10cc!
>
> Kerry
>
>
>
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 20 Jun 2017 11:02:21 +0100
> From: Jonathan Cardy <werespielchequers(a)gmail.com>
> To: Research into Wikimedia content and communities
> <wiki-research-l(a)lists.wikimedia.org>
> Subject: Re: [Wiki-research-l] Research about WikiProject
> Recommendation
> Message-ID: <33C5D254-B697-4F62-AF74-3948A75750B1(a)gmail.com>
> Content-Type: text/plain; charset=us-ascii
>
> Hi Bowen,
>
> If you are going to promote wikiprojects by recommendation then you need
> to test different styles of recommendation. Taking what may still be the
> two biggest wikiprojects, MILHIST and professional wrestling, what worked
> as an invitation for either might be quite different from what would work
> for Opera or chemistry. Tone of voice is important when you are seeking to
> entice volunteers.
>
> You also need to allow for the effect of different existing recruitment
> programs. These tend to be subtle, but they will vary, and that variation
> could mask your project's effect. The most obvious recruitment is via
> tagging of articles, and that isn't necessarily done by people who are
> active in the project concerned.
>
> Regards
>
> Jonathan
>
>
Hi Everyone,
For the last year I've been working on a fairly large (social science)
research project studying student learning outcomes using Wikipedia-based
assignments with the Wiki Education Foundation. This was a mixed-methods
study designed to address a variety of research questions and provide open
data for researchers to dig through, analyze, and utilize in whatever way
they deem fit.
Today I am happy to announce that the research report, the data, the
codebooks, and many other supporting documents have been released under an
open license.
The research report mostly summarizes the preliminary analysis (there were
a LOT of questions) of some of the qualitative and quantitative data, but
it is also meant to help understand the larger scope of the research
project as well. Although this is just a preliminary report, I am working
on a few journal publications with this data, so this should lead to more
than the report (on my end at least).
If you are interested in student learning, new users, information literacy,
or skills transfer, I hope this report and data set finds you well.
Blog post by LiAnna Davis on WMF Blog:
https://blog.wikimedia.org/2017/06/19/wikipedia-information-literacy-study/
Full data set (zip file):
https://github.com/WikiEducationFoundation/research
Research report (commons):
https://commons.wikimedia.org/wiki/File:Student_Learning_Outcomes_using_Wik…
best,
Zach
--------------------
Zachary J. McDowell, PhD
www.zachmcdowell.com
Hello,
Earlier this year Wikimedia UK conducted a survey of Welsh Wicipedia's
readers. We wanted to learn more about their demographics and why they
chose to read Welsh Wicipedia. 1,001 people filled in the survey and the
results are available on meta-wiki in English
<https://meta.wikimedia.org/wiki/Research:Readership_of_Welsh_Wicipedia>
and Welsh
<https://meta.wikimedia.org/wiki/Research:Canlyniadau_arolwg_o_ddarllenwyr_W…>
.
--
Richard Nevell
Project Coordinator
Wikimedia UK - sign up to our newsletter <http://eepurl.com/cnYOw5>
+44 (0) 20 7065 0921
Wikimedia UK is a Company Limited by Guarantee registered in England and
Wales, Registered No. 6741827. Registered Charity No.1144513. Registered
Office 4th Floor, Development House, 56-64 Leonard Street, London EC2A 4LT.
United Kingdom. Wikimedia UK is the UK chapter of a global Wikimedia
movement. The Wikimedia projects are run by the Wikimedia Foundation (who
operate Wikipedia, amongst other projects).
*Wikimedia UK is an independent non-profit charity with no legal control
over Wikipedia nor responsibility for its contents.*