Hi all,
With the start of the new fiscal year at the Wikimedia Foundation on July 1, the Research team has officially started work on Program 12: Growing contributor diversity. [1] Here are a few announcements and pointers about this program and the research and work that will go into it:
* We aim to keep the research documentation for this project on the corresponding research page on Meta. [2]
* Research tasks are hard to break down and track in task-tracking systems. That said, any task that we can break down and track will be documented under the corresponding Epic task on Phabricator. [3]
* The goals for this Program for July-September 2017 (Quarter 1) are captured on MediaWiki. [4] (The Phabricator epic will be updated with corresponding tasks as we start working on them.)
* Our three formal collaborators (cc-ed) will contribute to this program: Jérôme Hergueux from ETH, Paul Seabright from TSE, and Bob West from EPFL. We are thankful to them for agreeing to spend their time and expertise on this project in the coming year, and to those of you who have already worked with us while we were shaping the proposal for this project and are planning to continue your contributions to this program. :)
* I act as the point of contact for this research at the Wikimedia Foundation. Please feel free to reach out to me (directly, if it cannot be shared publicly) if you have comments or questions about the project in the coming year.
Best, Leila
[1] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Programs/Technology#Program_12:_Grow_contributor_diversity
[2] https://meta.wikimedia.org/wiki/Research:Voice_and_exit_in_a_voluntary_work_environment
[3] https://phabricator.wikimedia.org/T166083
[4] https://www.mediawiki.org/wiki/Wikimedia_Technology/Goals/2017-18_Q1#Research
--
Leila Zia
Senior Research Scientist
Wikimedia Foundation
This sounds like a great project. Forwarding.
Pine
Hoi,
Is this an English-only project?
Thanks,
GerardM
Hoi Gerard,
On Wed, Jul 19, 2017 at 9:37 PM, Gerard Meijssen gerard.meijssen@gmail.com wrote:
Hoi, Is this an English only project?
The short answer is: it's too early to give a response to this question. I'll expand below if you're interested in knowing more:
* We need to start small, see if we can show working results, and then try to go beyond small. :)
* We don't know the state of diversity across (or at least in a few of) the Wikipedia languages.
* We don't know the state of diversity across (or at least in a few of) the Wikimedia projects.
* We will probably need to start with a couple of languages that have a large pool of newcomers who are added to the system organically. What counts as "large" is something we don't have an answer for right now.
* We anticipate that an intervention aimed at boosting newcomers' self-confidence could have a heterogeneous impact on editor retention depending on the local culture. This can affect the choice of the languages we start this project in. (We are using plots such as http://www.worldvaluessurvey.org/images/Cultural_map_WVS6_2015.jpg for brainstorming about this topic.)
* And last but not least: our strong preference is to do this research in communities that acknowledge the lack of diversity as an issue for their community and would like to work with us on this problem. This is not a one-year project, and we need sustained collaboration between research, the communities involved, and tool/interface developers (inside and outside of the WMF).
All of the above will need to be taken into account when we choose the language/project.
I hope this helps.
Best, Leila
Leila,
I am wondering if you can explain the project title "Voice and exit in a voluntary work environment". I don't quite see the connection to the project as proposed:
https://meta.wikimedia.org/wiki/Research:Voice_and_exit_in_a_voluntary_work_environment
On reading the project page, I see two almost separate items. One is the intent to survey all new users about their demographics. The second is to form newbie teams of women with similar interests, based on "20 questions".
Regarding the demographics of new users: is this intended to occur when they create a new account (rather than a new IP)? If so, will it be optional? I guess my concern is that people will back off from signing up, either because they don't want to reveal the information, or because the process has just become too heavyweight. From a privacy perspective (I presume there will be a privacy statement), will the demographic survey remain linked to the user name? From the point of view of the science, it would be good if it were, for tracking purposes, but it's also a possible reason why people won't answer your questions if it is (or more to the point, if they think it is).

I know myself that when organisations approach me for demographic information (anonymously, or linked to my username or real-world identity), my reaction tends to depend on how much I care about them (and how much I trust them). If I am very involved in an organisation, I am generally happy to provide data that assists them in their stated purpose because I want them to be successful. When I am marginally engaged (the case with many a website that requires a signup), I am unlikely to provide demographic information in general and almost certainly not at the point of signup.
I assume the link between the two parts of the project is that some or all of those new users whose demographic profile reveals they are women will then be approached to form teams based on the 20 questions. Will that occur before their first edit? I'm just thinking of the person sitting down to fix a spelling error going through signup, a demographic survey, an invitation to be in a team, and possibly 20 questions before we let them do the edit they came to do. I guess I am fearful that the experiment will drive women away if it is all too up-front-heavy relative to the task they came to do. Not in the interests of diversity.
Also, the word "organic" was mentioned. Not all new users are organic. Anyone who is signing up for a training class, edit-a-thon, university class exercise, etc. is NOT organic. Can I ask that when there is a research intervention, reasonable steps are taken to ensure that non-organic new users are not caught up in it? That means having some way to bypass the intervention, and informing the course instructors (it's a user right) well in advance so they can ensure their groups are bypassed. Ditto any scheduled events/edit-a-thons; mine are published on the Wikimedia Australia website.

When you have 2 hours to teach Wikipedia (the typical time slot I get from organisations) and you have a prepared set of PPT slides, you want the Wikipedia interface to follow the sequence you are expecting. Trainees are confused by buttons being relabelled differently from the PPT slides, and it's worse if it happens to only part of the group, as they think they did something wrong. Anything that slows things up means you don't get finished in two hours and training has failed its goals.

I got caught by the A/B testing of the Visual Editor on new users. At that time, I had never seen or used the Visual Editor, and a proportion of my training class were being shown it. It was a disaster and I nearly gave up training after that; it was just so embarrassing. I did not know it was happening, nor did I have any way to get those users back into the source editor (which I was teaching at that time). While I think the VE is a good thing for Wikipedia, that was NOT the way to experiment with it.

Also, with events, because of the limit on signups per day from the same IP address, it is common to ask people to sign up in advance, for which you provide information on the process. So the bypass of the intervention needs to be available for the signups occurring before the event; I don't think it is sufficient to just provide an "on the day" signup solution.
It has to work for the people doing it at their own desks days ahead. Given that the vast majority of participants in my groups are women, I don't think it’s in the interests of diversity to give them a bad experience by being inadvertently caught up in an experiment.
Moving on to the newbie teams, how is this going to work? How will they communicate?
Will you tell them about the Visual Editor, which is NOT enabled by default for new users? As someone who has delivered training on both editors, the VE is an absolute winner for new users, particularly women. I could not do Wikipedia edit training in the source editor in 2 hours (minimum 4). But the first thing I have to get them to do after sign-in is to enable the VE (by the way, use the Editing mode "show both tabs", as there is some bug that locks you out of the VE sooner or later if you choose the other ones).

So while the VE is a winning strategy for training new users, there is a problem: it doesn't work on Talk pages or User Talk pages, which means that new VE users can't access the Teahouse or the Visual Editor Feedback page (as unbelievably dumb as that sounds!). I know from source editor training that new users don't grasp how Talk pages work. They live in a world of email, Facebook, Twitter and everything else, none of which uses anything as completely unstructured as Talk pages (unstructured in the sense that the tool -- the source editor -- has no built-in "reply" or "forward/share" as they are expecting).

As a consequence, I hand out my Wikimedia Australia business card to everyone who attends my events so they can email me with their questions and problems after the event. Even if they know about Talk, they mostly use my email address because they understand how to communicate via it. It also has the practical benefit that they can attach screenshots, which they cannot do in Talk (not because it's technically impossible but because they don't know about uploading images and how to add them to a Talk page; remember, it's not VE-enabled).
On the subject of VE and User Talk, can I ask you all to add {{VEFriendly}} to the top of your User Talk page. It allows VE users to write there using VE, which is a small friendly thing to do. See it on mine: https://en.wikipedia.org/wiki/User_talk:Kerry_Raymond
As a serious statement, if we want to increase female participation in Wikipedia, making the VE enabled as the default for new users is a simple intervention that will probably produce more results than any other simple intervention.
So if there are to be newbie teams, they will probably need to interact by other than Talk pages (whether they are VE Users or not). And teams should probably form around the editor they use (they can't help each other otherwise).
Also, dealing with newbies all the time, I can say that a lot of them do come along with a "mission" in their head, often involving a new article, often of doubtful notability. I don't teach how to create new articles in the 2-hour training; instead I explain why, as new users, they should not attempt it. Since they always ask, I say "have at least 100 edits to existing articles before trying to create a new article". Generally the new article is not promotional (as a lot of Articles for Creation submissions are); more often it's an obituary, school history, local club/society, etc. I find new people understand and agree with the "no advertising" principle; they are happy to add facts, not opinions, with citations about the school or the junior athletics club (generally citing the school history book they just wrote or the club's website), but they don't grasp notability at all well. It's worth being aware that a newbie team is likely to have one or more "mission-oriented" members, which might affect team dynamics. But I am a little unclear what the teams will work on (there was mention of suggestions in their topic space).
My sense of why new users give up is a combination of the mechanics of editing (which VE helps with) and finding their edits reverted (or substantially removed/changed), which can occur because of our myriad of policies (of which Articles for Creation is a "solution"), Manual of Style issues, WikiProject conventions, gatekeepers, and random WP:JUSTDONTLIKEIT reverters. These things are incredibly discouraging, both because they reject the effort made in the contribution and because it is rarely explained what they did wrong and HOW TO FIX it, in terms they understand and via communication mediums they understand and can respond in (given they generally don't understand Talk, and most don't understand edit summaries, where the reason is often hidden). Also, I do believe that there are people who will happily exploit the newbie lack of knowledge to enforce their views on things (WP:JUSTDONTLIKEIT). Where possible I try to put their User page and the articles I am aware they have been editing on my watchlist, so I can try to intervene to help them move forward with an explanation of what happened and what to do about it.

I see the risk for newbie teams as their combined lack of knowledge of policy and of how to deal with issues. For this reason alone, I am doubtful this intervention will work (but will be delighted if it does). I think new users (or teams of users) do need a more experienced mentor, and I do understand that this creates a workload on an existing community member.
Also, I do have some concerns about the "framing" of the project. The framing is "women are failing at Wikipedia, we have to fix the women". I suggest that this is analogous to the 1960s argument that if women want to do men's jobs, they should learn to accept nude photos of women in the lunchroom and locker-room language. I would suggest an alternative framing: "Wikipedia is failing women, we have to fix Wikipedia". The Wikipedia environment is toxic, and this is a massive turn-off to women. It is rude, it is impersonal, it is arrogant. The fear of "creating a burden on the community" illustrates this point nicely. If the community cared about attracting more editors (whether women or not), then they would (to some extent) be willing to mentor new contributors. The fact that the community appears to resent this as a "burden" suggests that the community does not care about attracting new editors. Indeed, I think for existing contributors, the clubhouse of Wikipedia (and their status within it) is probably more important than its wider mission.

I would be very much inclined to suggest that before embarking on a large-scale roll-out of the experiment, maybe just do some qualitative work first around forming some teams of new women users and see if it works.
Kerry
Hi Kerry,
Thanks for the comments. I hope that Leila will respond. I have a few thoughts:
1. Have you tried the New WikiText Editor (NWTE, which I want to call NEWT so that it's easily pronounceable), particularly on talk pages? I think that new users will find it to be considerably easier to use than "pure" wikimarkup.
2. I agree with the sentiment about VE on talk pages. Perhaps you could ask the WMF folks if there has been further thought about enabling VE on talk pages. The last time that I asked the answer was "no" and the evolutionary paths for talk pages are planned to be NWTE/NEWT and Flow.
3. I agree that more mentoring of newbies would be good, but there is finite human resource capacity among the more experienced editors and those editors already have plenty of work. The Wiki Ed Foundation and other organizations are increasingly providing paid staff time to mentor Wikimedians, and I think that this is a more realistic option than lecturing the existing volunteer community that we should be doing yet more work for free. WMF has plenty of money and it seems to me that spending some of that money on training and mentoring programs is probably worthwhile.
Pine
Pine,
NEWT is an acronym that has already been used on Wikipedia, and judging from multiple references in this week's RFB, the acronym is not forgotten even though that project was suspended several years ago.
Jargon is confusing and a barrier to onboarding newbies at the best of times, but jargon that requires a disambiguation page is probably worse.
Jonathan
Hi Kerry,
The research is at too early a stage for me to be able to expand more, simply because we don't know more. :) Some of my responses may look unsatisfactory, but please keep in mind that this is simply because we don't know more. We've just started.
I'll respond briefly to some of your comments.
On Thu, Jul 20, 2017 at 6:49 PM, Kerry Raymond kerry.raymond@gmail.com wrote:
Leila,
I am wondering if you can explain the project title "Voice and exit in a voluntary work environment". I don't quite see the connection to the project as proposed
https://meta.wikimedia.org/wiki/Research:Voice_and_exit_in_a_voluntary_work_environment
We needed /a/ name for the meta page, and that's what we liked back when we started thinking about this research. It already doesn't match the name of the program in the annual plan, and it may not match the title of future work or publications related to it. The research around this project may take many different directions, and that will determine what we eventually call it.
On reading the project, I see two almost separate items. One is the intent to survey all new users about their demographics. The second goal here is to form newbie teams of women based on a similar interests based on "20 questions".
Neither of these is a goal, and neither is an approach we have settled on. This project is at a very early stage, and these are some ideas about how we can use certain tools (like surveys) to understand the space better, or to address the cold-start problem we will have in the direction we're currently considering.
Regarding the demographics of new users. Is this intended to occur when they create a new account (rather than a new IP)? If so, will it be optional? [...]
These are good questions. We are thinking about all of these and more, but because we know so little at this point, we can't fix the direction and answer them. We will know more in 6 months, hopefully. (For example, the idea of asking about demographics directly is /one/ idea. When to ask for that information, and whether to ask for it directly, are still open questions.)
I assume the link between the two parts of the project is that some/all of those new users whose demographic profile reveals they are women will then be approached to form teams based on the 20 questions. Will that occur before their first edit? [...]
Again, it's too early for me to share thoughts, because the research has just started. We may move away from these ideas. My suggestion is that we wait until we can wrap our heads around this project a bit more; and of course, your point about not being too up-front-heavy is taken into account. :)
Also, the word "organic" was mentioned. Not all new users are organic. Anyone who is signing up for a training class, edit-a-thon, university class exercise etc is NOT organic.
It depends. You can think of the current way editors get added to Wikipedia (as a whole) as an organic process, independent of whether they join via an edit-a-thon (for example) or not, because this is how a project like Wikipedia works: thousands of people around the world work on bringing more people to it, and this is part of the system and its operations. If we focus on a project X which doesn't have a lot of events for bringing people to edit Wikipedia, an edit-a-thon would be considered something that brings people to the project non-organically.
Can I ask that when there is a research intervention, reasonable steps are taken to ensure that non-organic new users are not caught up in it. [...] Given that the vast majority of participants in my groups are women, I don't think it's in the interests of diversity to give them a bad experience by being inadvertently caught up in an experiment.
This kind of project will need to be done in collaboration with the communities involved. If we keep the communication and collaboration channels active, we can give an honest try to avoiding an issue like the one you mentioned (your slides not matching what people see, which is a real and acknowledged problem). What method we use to avoid that issue is something we can figure out together.
The above being said, I also want to highlight that in large systems like Wikipedia, where many people can change different components of the system without centralized control, issues may arise no matter how much we try to collaborate and communicate. If we want to learn the system and try to address some of its issues, we need to accept that some things may go wrong and we will need to fix them. What is important is that we make an honest and informed effort to avoid them as much as possible. We're committed to this.
Moving on to the newbie teams, how is this going to work? How will they communicate?
It's too early to comment on this one as the direction is not fixed yet.
Will you tell them about the Visual Editor which is NOT enabled by default for new users?
Same as above.
As a serious statement, if we want to increase female participation in Wikipedia, making the VE enabled as the default for new users is a simple intervention that will probably produce more results than any other simple intervention.
Please check https://meta.wikimedia.org/wiki/Research:VisualEditor%27s_effect_on_newly_re... and see the video at https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#July_2015 for research on the impact of VE on specific outcomes (short-term retention, productivity, ...).
Maybe at some point in the future of this research, we can look into the interplay between simpler editing tools such as VE versus Wikitext.
Also, I do have some concerns about the "framing" of the project. The framing is "women are failing at Wikipedia, we have to fix the women". [...] The Wikipedia environment is toxic and this is a massive turn-off to women. [...]
I'd like to stay away from both framings: Wikipedia is designed by humans for humans, with the best of intentions. The system needs improvements and we will be focusing on that. :)
In your paragraph, you refer to other important issues, for example the toxic environment. That discussion is outside the scope of this research (our goal is to focus on confidence); however, there are other initiatives that focus on it: the research on harassment is one example. https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2017-2018/F...
Best, Leila
Wiki-research-l mailing list Wiki-research-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wiki-research-l
Jonathan, I didn't realize that there was another use for NEWT. (: It's a little old, though, and it appears to be specific to English Wikipedia, and my personal feeling is that needing to disambiguate the editing interface from an old research project is the lesser evil compared to an acronym that can't be easily pronounced.
Kerry, I think that until and unless WMF enables VE for talk pages (which I hope that they will agree to experiment with doing), teaching new users to edit talk pages with NWTE/NEWT is the best option available on wikis that don't use Flow.
I'm sensitive to folks telling volunteers that we should be doing more unpaid work, hence my admittedly strong reaction to your proposal about mentoring. In an ideal world it would be nice to have more mentors, and I think that adding paid staff to do some mentoring work would be a good use of funds; the Wiki Ed Foundation seems to be successful with this approach, and some affiliates also have a variety of training programs that they run for Wikipedia editors and/or for others, such as school teachers who will then train others. I should also mention that WMF is developing training for functionaries, and WMF has provided grant or contract funding for projects where training is a focus, such as the English Wikipedia Teahouse, the Wikipedia Adventure, and the video series for which I'm largely responsible. So it seems that WMF is sometimes willing to provide funding for training, and I would encourage them to continue to do so.
Pine