Hi Pine,
thank you for your detailed response. Answers inline!
On Wed, Sep 24, 2014 at 8:24 PM, Pine W <wiki.pine(a)gmail.com> wrote:
Hi Anasuya,
Thanks for this announcement. It will be interesting to see what we learn
from this round.
I have a few questions.
* People may feel more comfortable with providing reports in languages that
are familiar to them. Are responses allowed in languages other than
English?
Yes, they are. Open-ended text responses may also be answered in other
languages. The survey relies mostly on free-text boxes to make the
process less rigid. While we may need to follow up for clarification after
translation, we are definitely willing to work with those reporting to make
the process as light as possible. Further, category names should be
reported as they appear in the project where they exist, not translated
into English.
* The "Start Date" and "End Date" fields allow free-form text. Will that
create any difficulties for the people who compile and analyze the
information from these reports?
Actually, the fields are open text because adding validation would make the
answer required and would not permit a reporter who does not have the
exact dates, or who has a date but not an hour, to advance through the rest
of the reporting form. Instead, we offer an open box, optional to complete,
and instruct reporters to use the MM/DD/YY:00:00 (UTC) format. We will
clean up the data along with the rest of the analysis.
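For anyone curious what that clean-up step might look like, here is a
minimal sketch (the function name and the fallback to manual review are
illustrative assumptions, not our actual analysis tooling):

```python
from datetime import datetime, timezone

def parse_report_date(raw):
    """Parse a free-text date in the suggested MM/DD/YY:00:00 (UTC) format.

    Returns a timezone-aware datetime, or None when the text does not
    match, so the row can be flagged for manual clean-up.
    """
    cleaned = raw.strip().replace("(UTC)", "").strip()
    # Accept the full suggested format, or a bare date with no hour.
    for fmt in ("%m/%d/%y:%H:%M", "%m/%d/%y"):
        try:
            return datetime.strptime(cleaned, fmt).replace(tzinfo=timezone.utc)
        except ValueError:
            continue
    return None  # free-form answer; needs manual review
```

So "09/30/14:00:00 (UTC)" and "09/30/14" both normalize cleanly, while a
free-form answer like "sometime last fall" simply gets flagged rather than
blocking the reporter.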
* What is the definition of "quality content"?
There are many definitions of quality content, many of which have no easy
measure. In this question set, we expect the listed indicators to assess
how much content was affected: number of bytes added (Bytes added, positive
sum in Wikimetrics), number of bytes removed (Bytes added, negative sum in
Wikimetrics), number of new articles created (Pages created,
namespace 0 in Wikimetrics), number of photos/media uploaded to Wikimedia
Commons, and number of Wikimedia pages improved.
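As a small illustration of the positive/negative byte-sum convention above
(a sketch only; the helper name is an assumption, not a Wikimetrics API):

```python
def summarize_byte_changes(byte_diffs):
    """Split per-edit byte deltas into the two Wikimetrics-style sums.

    "Bytes added, positive sum" counts only edits that grew a page;
    "Bytes added, negative sum" counts only edits that shrank one.
    """
    bytes_added = sum(d for d in byte_diffs if d > 0)
    bytes_removed = sum(d for d in byte_diffs if d < 0)
    return bytes_added, bytes_removed
```

For example, per-edit deltas of [120, -40, 300, -10] yield 420 bytes added
and -50 bytes removed, so growth and trimming are reported separately
rather than netted out.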
Later in the survey, along with other outcomes rather than outputs, we
also ask for metrics indicating “quality”. For images, these metrics
include the number of unique images used, the number of Wikimedia projects
using the images, and the number of images awarded “quality image,” “valued
image,” or “featured picture” status. For article content, these metrics
include the number of articles awarded “featured article” or
“quality article” status.
* The statement "Please upload your txt or csv file of your participant
usernames." raises an interesting privacy question. Should program leaders
be uploading those usernames if consent forms were not obtained? Also, do
different standards apply if, at the end of the Qualtrics report, the
program leader who is completing the survey selects the option to allow all
data from the survey form to be public?
Yes, that is an important clarification we can make. Importantly, opt-in
procedures should be followed when collecting usernames at in-person
events for use in Wikimetrics. Using this tool, which is housed on servers
in the US, automatically transfers data internationally (for anyone outside
of the U.S.) [1]. If the usernames for a program already exist publicly
(e.g., on an event page, or elsewhere online), we suggest an opt-out
procedure for Wikimetrics as well, but this is a courtesy, as in such a
case the data have already been exchanged internationally and publicly via
the internet. Regardless, no individual usernames will appear at any level
of reporting. However, we need to have the username list used for
reporting; this allows us to validate the data and to pull additional
metrics, if needed, as we proceed with analysis and reporting.
We have now added further clarification.
We have changed the language from:
Please upload your txt or csv file of your participant usernames.
to:
Please upload your txt or csv file of your participant usernames.
Note: If you have collected usernames from an in-person event as the only
record of participation, there are some important steps that may apply
[1]. If you have any questions or concerns about the confidentiality of
usernames, please reach out to the evaluation team and we can help you
determine what is appropriate.
* I am confused by the question that ends with the report. It starts with
the statement that "Although we will not share reporting data in an
identifiable way..." and then proceeds to ask if the person completing the
report will allow WMF to "share the name of my program along with my
reported data so that people can see how we did". Can you clarify this
situation?
In the last round of reporting, we used the standard privacy language to
assure complete anonymity of direct reporters, a general privacy decision
meant to maximize comfort and accuracy in reporting. When we published our
first round of reports with such a high level of anonymity, a number of
community members protested [2], and we agreed to allow reporters to choose
what level of anonymity they wanted. This round, reporters may choose (a)
to remain completely anonymous, (b) to share that their program
implementation exists in the pool of those reported, or (c) to allow their
line of data to be identifiable in the data tables found at the end of each
report [3].
In addition to clarifying that we mean Wikimetrics cohorts, we have also
expanded the privacy statement to be clearer. We amended the language from:
*Your privacy is important to us. We will only share your reporting data
anonymously unless you indicate that you prefer us to share your
program’s name.*
to:
Your privacy is important to us. By answering these questions, you permit
us to record your responses and agree to donate them to the public domain.
We will not publicly associate your program's name with your answers or
share your name or contact information, unless you permit us or disclosure
is required by law. Wikimedia is a worldwide organization. By answering our
questions, you consent to the transfer of your responses to the United
States and other places as may be necessary to carry out the objectives of
this project.
Thanks,
Pine
Thanks for bringing these points up. We are always working to improve our
tools and instructions!
Should anyone have further questions or comments regarding this second
round of voluntary programs data collection, please direct them either to
our portal talk page [4], or if related to specific text or items, you may
also comment directly on the reporting form preview [5].
On behalf of the Learning and Evaluation team,
*María Cruz* \\ Community Coordinator, PE&D Team \\ Wikimedia Foundation,
Inc.
mcruz(a)wikimedia.org | @marianarra_ <https://twitter.com/marianarra_>
Referenced links:
[1] https://meta.wikimedia.org/wiki/Grants:Evaluation/Wikimetrics/Forms
[2] https://meta.wikimedia.org/wiki/Grants_talk:Evaluation/Evaluation_reports/2…
[3] https://meta.wikimedia.org/wiki/Grants:Evaluation/Evaluation_reports/2013/E…
[4] https://meta.wikimedia.org/wiki/Grants_talk:Evaluation
[5] https://docs.google.com/document/d/1CG-K8I1d9JPqyRRyHTIQ5x7fASQXcHZsEMKutdA…
On Wed, Sep 24, 2014 at 4:29 PM, Anasuya Sengupta <asengupta(a)wikimedia.org>
wrote:
> Dear Wikimedian friends and colleagues,
> tl;dr We have just launched our second round of voluntary reporting. This
> is the most epic data collection and analysis of Wikimedia programs we've
> done so far as a movement, and all program leaders are invited to take
> part. :-) You can do so here:
> https://wikimedia.qualtrics.com/SE/?SID=SV_0B3azKpdZ7ggCtD (or get in
> touch with the L&E team for support).
> As we did in the Fall of 2013, we invite community members leading and
> evaluating Wikimedia programs to share their data with the rest of the
> movement (i.e., Edit-a-thons, Editing Workshops, On-wiki Writing
> Contests, Photo Events, etc.). Last year’s data was collected and
> analysed in a series of reports that was the beginning of telling the
> Wikimedia story of impact: the incredible work of over 60 program leaders
> implementing 119 programs or projects in 30 countries across the world.
> This helped us start building a set of good and best practices for
> effective programs across our movement.[1] This year’s data drive will be
> critical to help us continue to do and learn better from each other.
> To best prepare, program leaders can review the reporting items [1] and
> start gathering that data you have filed away about your programs since the
> last reporting round. We are looking for data on programs completed any
> time from September 1, 2013 through September 30, 2014. You can ask
> questions directly on the reporting form preview [2] or on our portal talk
> page [3]. If you are planning to report and may need support from us, do
> let us know so that we can help in any way needed.
> When ready, you will find the reporting collector at:
> https://wikimedia.qualtrics.com/SE/?SID=SV_0B3azKpdZ7ggCtD
> We also welcome your data in different formats. For example, if you have
> already reported data elsewhere, we are happy to work with you to make the
> process as easy as possible. Message eval(a)wikimedia.org and we can work
> out the easiest way to include your data.
> We are expanding the number of programs covered in the reporting this
> year, and extending the reporting window for some new programs, GLAM,
> and Wiki Loves Monuments. See the schedule below for reporting timelines
> for each program type.
> Data submission deadlines by program:
> Due by October 20th
> - Edit-a-thons/editing parties
> - Editing Workshops
> Due by November 3rd
> - On-wiki Writing Contests
> - Photo Events (Wiki Loves Earth, WikiExpeditions, WikiTakes, etc.)
> - Wikipedia Education Program
> Due by November 17th
> - Conferences
> - GLAM Content Donation
> - Hackathons
> - Wiki Loves Monuments (2013 and 2014)
> - Wikimedian in Residence
> Remember, reporting is voluntary but the more people do it, the better
> representation of programs we can make. This voluntary reporting allows us
> to come together and generate a bird’s eye view of programs [4]. We want to
> understand the impact of programs across different contexts, to examine
> both more broadly, and more deeply, what works best to meet our shared
> goals for Wikimedia and to, together, grow the awesome [5] in Wikimedia
> programs!
> On behalf of the Program Evaluation and Design team, thank you for your
> time and support in this initiative.
> Warmly,
> Anasuya
> Resource links:
> [1] https://meta.wikimedia.org/wiki/Grants:Evaluation/Evaluation_reports/2013
> [2] https://docs.google.com/document/d/1CG-K8I1d9JPqyRRyHTIQ5x7fASQXcHZsEMKutdA…
> [3] https://meta.wikimedia.org/wiki/Grants_talk:Evaluation
> [4] https://commons.wikimedia.org/wiki/File:Evaluation_Report_(beta)_Poster_Wik…
> [5] http://www.nbp.org/nbp/images/book_photos/MAG-AWESOME.jpg
> --
> *Anasuya Sengupta*
> *Senior Director of Grantmaking*
> *Wikimedia Foundation*
> Imagine a world in which every single human being can freely share in
> the sum of all knowledge. Help us make it a reality!
> Support Wikimedia <https://donate.wikimedia.org/>
> _______________________________________________
> Please note: all replies sent to this mailing list will be immediately
> directed to Wikimedia-l, the public mailing list of the Wikimedia
> community. For more information about Wikimedia-l:
> https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
> _______________________________________________
> WikimediaAnnounce-l mailing list
> WikimediaAnnounce-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikimediaannounce-l
_______________________________________________
Wikimedia-l mailing list, guidelines at:
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines
Wikimedia-l(a)lists.wikimedia.org
Unsubscribe:
https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
<mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>