Hi Kerry,
The research is at too early a stage for me to expand further, simply
because we don't know more yet. :) Some of my responses may look
unsatisfactory, but please keep in mind that's only because we've just
started.
I'll respond briefly to some of your comments.
On Thu, Jul 20, 2017 at 6:49 PM, Kerry Raymond <kerry.raymond(a)gmail.com> wrote:
Leila,
I am wondering if you can explain the project title "Voice and exit in a voluntary
work environment". I don't quite see the connection to the project as proposed
https://meta.wikimedia.org/wiki/Research:Voice_and_exit_in_a_voluntary_work…
We needed /a/ name for the meta page, and that's the one we liked back
when we started thinking about this research. It already doesn't
match the name of the program in the annual plan, and it may not match
the title of future work/publications related to it. The research
around this project may take many different directions, and that will
determine what we eventually call it.
On reading the project, I see two almost separate
items. One is the intent to survey all new users about their demographics. The second goal
here is to form newbie teams of women with similar interests, based on "20
questions".
Neither of these is a goal, and neither is an approach we have settled
on. This project is at a very early stage, and these are some ideas
about how we might use certain tools (like surveys) to understand the
space better, or to address the cold-start problem we will have in the
direction we're currently considering.
Regarding the demographics of new users: is this
intended to occur when they create a new account (rather than a new IP)? If so, will it be
optional? I guess my concern is people will back off from signing up, either because they
don't want to reveal the information, or because the process has just become too
heavyweight. From a privacy perspective (I presume there will be a privacy statement),
will the demographic survey remain linked to the user name? From the point of view of
the science, it would be good if it were, for tracking purposes, but that's also a possible
reason why people won't answer your questions if it is (or more to the point, if they
think it is). I know myself when organisations approach me for demographic information
(anonymously or linked to my username or real world identity), my reaction to such
requests tends to depend on how much I care about them (and how much I trust them). If I
am very involved in an organisation, I am generally happy to provide data that assists
them in the stated purpose because I want them to be successful. When I am marginally
engaged (the case with many a website that requires a signup), I am unlikely to provide
demographic information in general and almost certainly not at the point of signup.
These are good questions. We are thinking about all of these and more,
but because we know so little at this point, we can't fix the
direction and answer them. We will hopefully know more in six months.
(For example, asking about demographics directly is /one/ idea; when
to ask for that information, and whether to ask for it directly, are
still open questions.)
I assume the link between the two parts of the project
is that some/all of those new users whose demographic profile reveals they are women will
then be approached to form teams based on the 20 questions. Will that occur before their
first edit? I'm just thinking of the person sitting down to fix a spelling error going
through signup, demographic survey, invitation to be in a team before and possibly 20
questions before we let them do the edit they came to do. I guess I am fearful that the
experiment will drive women away if it is all too up-front heavy relative to the task they
came to do. Not in the interests of diversity.
Again, it's too early for me to share thoughts, because the research
has just started, and we may move away from these ideas. My suggestion
is that we wait until we can wrap our heads around this project a bit
more; and of course, your point about not being too up-front-heavy is
taken into account. :)
Also, the word "organic" was mentioned. Not
all new users are organic. Anyone who is signing up for a training class, edit-a-thon,
university class exercise etc is NOT organic.
It depends. You can think of the current way editors get added to
Wikipedia (as a whole) as an organic process, independent of whether
they join via an edit-a-thon (for example) or not, because this is how
a project like Wikipedia works: thousands of people around the world
work on bringing more people to it, and this is part of the system and
its operations. If we instead focus on a project X that doesn't have
many events for bringing people to edit Wikipedia, an edit-a-thon
would be considered something that brings people to that project
non-organically.
Can I ask that, when there is a research intervention,
reasonable steps are taken to ensure that non-organic new users are not caught up in it?
That means having some way to bypass the intervention and informing the course
instructors (it's a user right) well in advance so they can ensure their groups are
bypassed. Ditto any scheduled events/edit-a-thons. Mine are published on the Wikimedia
Australia website. When you have 2 hours to teach Wikipedia (the typical time slot I get
from organisations) and you have a prepared set of PPT slides, you want the Wikipedia
interface to follow the sequence you are expecting. Trainees are confused by buttons being
relabelled differently from the PPT slides, etc. And it's worse if it happens to only part
of the group, as they think they did something wrong. Anything that slows things up means
you don't get finished in two hours and training has failed its goals. I got caught
by the A/B testing of the Visual Editor with new users. At that time, I had never seen or used
the Visual Editor, and a proportion of my training class were being shown it. It was a
disaster; I nearly gave up training after that, it was just so embarrassing. I did not
know it was happening. Nor did I have any way to get those users back into the source
editor (which I was teaching at that time). While I think the VE is a good thing for
Wikipedia, that was NOT the way to experiment with it. Also with events, because of the
limit on signups per day from the same IP address, it is common to ask people to sign up
in advance for which you provide information on the process. So the bypass of the
intervention needs to be available for the signups occurring before the event, so I don't
think it is sufficient to just provide an "on the day" signup solution. It has
to work for the people doing it at their own desks days ahead. Given that the vast
majority of participants in my groups are women, I don't think it’s in the interests
of diversity to give them a bad experience by being inadvertently caught up in an
experiment.
This kind of project will need to be done in collaboration with the
communities involved. If we keep the communication and collaboration
channels active, we can give it an honest try to avoid an issue like
the one you mentioned (your slides not matching what people see, which
is a real problem, acknowledged). What method we use to avoid that
issue is something we can figure out together.
That being said, I also want to highlight that in large systems like
Wikipedia, where many people can change different components of the
system without centralized control, issues may arise no matter how
much we try to collaborate and communicate. If we want to learn the
system and try to address some of its issues, we need to accept that
some things may go wrong and we will need to fix them. What is
important is that we make an honest and informed effort to avoid them
as much as possible. We're committed to this.
Moving on to the newbie teams, how is this going to
work? How will they communicate?
It's too early to comment on this one as the direction is not fixed yet.
Will you tell them about the Visual Editor which is
NOT enabled by default for new users?
Same as above.
As a serious statement, if we want to increase female
participation in Wikipedia, enabling the VE by default for new users is a simple
intervention that will probably produce more results than any other simple intervention.
Please check:
https://meta.wikimedia.org/wiki/Research:VisualEditor%27s_effect_on_newly_r…
and see the video at
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#July_2015
for research on the impact of VE on specific outcomes (short-term
retention, productivity, ...).
Maybe at some point in the future of this research, we can look into
the interplay between simpler editing tools such as the VE and
wikitext.
Also, I do have some concerns about the
"framing" of the project. The framing is "women are failing at Wikipedia,
we have to fix the women". I suggest that this is analogous to the 1960s argument
that if women want to do men's jobs, they should learn to accept nude photos of women
in the lunchroom and locker room language. I would suggest an alternative framing that
"Wikipedia is failing women, we have to fix Wikipedia". The Wikipedia
environment is toxic and this is a massive turn-off to women. It is rude, it is
impersonal, it is arrogant. The fear of "creating a burden on the community"
illustrates this point nicely.
I'd like to stay away from both framings: Wikipedia is designed by
humans for humans, with the best of intentions. The system needs
improvements and we will be focusing on that. :)
In your paragraph, you refer to other important issues, for example
the toxic environment. That discussion is outside the scope of this
research (our goal is to focus on confidence); however, there are
other initiatives that focus on it: the research on harassment is one
example.
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2017-2018/…
Best,
Leila
Kerry
_______________________________________________
Wiki-research-l mailing list
Wiki-research-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l