When you make a dumb survey, the answers will be dumb, and so will the conclusions.
I am really frustrated with that survey... I wanted to help with my responses, but it seems that I belong to some unknown percentage of Wikimedia Commons users and contributors who don't fit into the survey.
So, let's start...
1. Yes, I am using Commons regularly.
2. There are many goals when I look for free media files on Commons, and I can't say which one is primary.
3. I participate in Wikimedia Commons sometimes.
4.
- 100-1000 edits
- 100-1000 uploads (actually, I don't have a clue how to find that out efficiently)
- I don't have a clue what percentage are my own works; let's say 10-50%
- And the winner: What is my reason to participate? And there is no *my* reason, even though Commons is the free content project? No reason to answer.
- Then, no possibility to say that I am definitely not working on "quality review, improvement & promotion of featured works", only "rarely", nor that I didn't upload any animation. BTW, all types of my participation on Commons are rare.
And, instead of spending a couple of minutes on that, I've spent 30 minutes writing this email.
I really don't know what answers such a survey can give. How many Commons participants have fewer than 100 edits? SQL queries may be more helpful.
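To illustrate (a sketch only; the table contents and numbers here are made up, though MediaWiki's real `user` table does, as far as I know, carry a running `user_editcount` column), a question like that falls straight out of the database:

```python
import sqlite3

# Toy stand-in for MediaWiki's `user` table, which keeps a running
# `user_editcount` per account. The rows below are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (user_name TEXT, user_editcount INTEGER)")
conn.executemany(
    "INSERT INTO user VALUES (?, ?)",
    [("Alice", 12), ("Bob", 450), ("Carol", 99), ("Dave", 3000)],
)

# "How many Commons participants have fewer than 100 edits?"
(low_activity,) = conn.execute(
    "SELECT COUNT(*) FROM user WHERE user_editcount < 100"
).fetchone()
print(low_activity)  # → 2 (Alice and Carol)
```

No survey needed for facts like these; a query against the live database gives an exact answer rather than a self-reported one.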
If it is meant to inform decisions, it is better not to use the results, because the survey may lead to very wrong conclusions. If it is for some student work, that is less of a problem.
I don't want to say that it is possible to make a perfect survey. However, about half of this one's questions and/or options are not so bright. Yes, there are options for "other" answers, but such answers will either be uninformative (cf. my second answer) or narrative and hard to categorize. Those who don't want to bother with the "other" option will give wrong answers.
Some brainstorming with other Wikimedians could help. It is better to have relevant answers and conclusions than fast ones.
Hi Milos,
Milos Rancic wrote:
- And the winner: What is my reason to participate? And there is no *my* reason, even Commons is the free content project? No reason to answer.
The survey was designed specifically to avoid general statements. As you very well state yourself, general statements would be quite useless, and we want to know the underlying goal of the user. In the case you raise, what is the reason why you participate in a free content project? (you can answer off-list). I believe the answer would be one of those offered in the survey (but of course, I may be wrong).
- Then, no possibility to say that I am definitely not working on "quality review, improvement & promotion of featured works"; just "rarely"; as well as that I didn't upload any animation.
I don't really see the difference from a design point of view.
A similar concern I have heard is that some radio buttons should be non-mutually exclusive choices (e.g. for the "what is the main reason" questions). I agree it is more difficult with radio buttons, because we ask the user to actually think about what their priority is. But then again, it is more useful from a design point of view.
One thing I realized, though, was that two questions had an ambiguous wording: people wonder why they have to give reasons for not using Commons, or not participating, despite the fact that they said they do. These questions should read « what is the main reason that limits or hinders your use/participation ». Unfortunately, we can't change the survey once it is running.
I really don't know which answers such survey is able to give? How many Commons participants have less than 100 edits? SQL queries may be more helpful.
All the work documents of the Multimedia Usability project are publicly available. You're welcome to read the following pages: http://usability.wikimedia.org/wiki/Multimedia:Preliminary_user_research http://usability.wikimedia.org/wiki/Multimedia:Initial_survey
I hope they will help you better understand the context of the survey.
Thanks for your constructive feedback. We may continue the discussion off-list if you wish.
2009/10/26 Guillaume Paumier gpaumier@wikimedia.org:
One thing I realized, though, was that two questions had an ambiguous wording: people wonder why they have to give reasons for not using Commons, or not participating, despite the fact that they said they do. These questions should read « what is the main reason that limits or hinders your use/participation ». Unfortunately, we can't change the survey once it is running.
Did you not test the survey? That should have been caught by a fairly small test run (a few dozen people).
Hi,
Thomas Dalton wrote:
2009/10/26 Guillaume Paumier gpaumier@wikimedia.org:
One thing I realized, though, was that two questions had an ambiguous wording: people wonder why they have to give reasons for not using Commons, or not participating, despite the fact that they said they do. These questions should read « what is the main reason that limits or hinders your use/participation ». Unfortunately, we can't change the survey once it is running.
Did you not test the survey? That should have been caught by a fairly small test run (a few dozen people).
Yes, we tested the survey, but with a smaller number of people, because of the time constraints (the multimedia usability project really started only 2 weeks ago, and the fundraising campaign is going to be the major focus of sitenotices from next week to January). And the problem didn't appear during the tests.
That said, although it may confuse a few people who take the survey, I don't expect this to have a huge impact on the results. People who don't understand the intended meaning of the question just select "other", as a quick check of the results so far confirms.
On Mon, Oct 26, 2009 at 1:53 PM, Guillaume Paumier gpaumier@wikimedia.org wrote:
Hi,
Thomas Dalton wrote:
2009/10/26 Guillaume Paumier gpaumier@wikimedia.org:
One thing I realized, though, was that two questions had an ambiguous wording: people wonder why they have to give reasons for not using Commons, or not participating, despite the fact that they said they do. These questions should read « what is the main reason that limits or hinders your use/participation ». Unfortunately, we can't change the survey once it is running.
Did you not test the survey? That should have been caught by a fairly small test run (a few dozen people).
Yes, we tested the survey, but with a smaller number of people, because of the time constraints (the multimedia usability project really started only 2 weeks ago, and the fundraising campaign is going to be the major focus of sitenotices from next week to January). And the problem didn't appear during the tests.
I got an error during the survey: Warning: unlink(/srv/org/wikimedia/survey/tmp/template_temp_090626181630.html) [function.unlink]: Permission denied in /srv/org/wikimedia/survey/common.php on line 6221
Didn't seem to affect the function, though.
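For what it's worth, that warning looks like the survey's cleanup code calls unlink() without guarding against permission failures on the temp directory. The general pattern for a best-effort cleanup (sketched here in Python rather than the survey's PHP, with a made-up helper name) is to treat expected failures as log material instead of surfacing them to the user:

```python
import os
import tempfile

def cleanup_temp_file(path):
    """Best-effort removal of a temporary file.

    A missing file counts as success; a permission error is logged
    for administrators rather than shown to the survey taker.
    """
    try:
        os.unlink(path)
        return True
    except FileNotFoundError:
        return True  # already gone; nothing to do
    except OSError as err:  # e.g. PermissionError on a read-only tmp dir
        print(f"cleanup warning (for the log, not the user): {err}")
        return False

# Demonstration on a throwaway file.
fd, path = tempfile.mkstemp()
os.close(fd)
removed = cleanup_temp_file(path)        # file exists: removed, True
removed_again = cleanup_temp_file(path)  # already gone: still True
```

The point is only that temp-file cleanup is incidental to the response being saved, so its failures should never reach the respondent.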
Cheers, Magnus
2009/10/26 Guillaume Paumier gpaumier@wikimedia.org:
Yes, we tested the survey, but with a smaller number of people, because of the time constraints (the multimedia usability project really started only 2 weeks ago, and the fundraising campaign is going to be the major focus of sitenotices from next week to January). And the problem didn't appear during the tests.
It would probably be better to take your time and do proper tests, even if you have to wait until after the fundraiser. It is very difficult to draw useful results from a poor survey, and if you haven't done enough testing to spot something as simple as that, then you haven't done enough testing to spot more serious problems. Who is your statistician? As far as I can tell, nobody on staff has significant training in statistics, so presumably you are using a consultancy firm? Did they not tell you that you needed to do more testing?
Hoi, Ehm, the statistics that we have are compiled by Erik Zachte. Qualifying our staff, and implicitly Erik, as lacking experience is... a bit off. It is not only the Commons project but also the Usability Initiative and the Strategy project that will rely largely on these numbers.
Having spent a considerable amount of time getting to grips with our statistics, I can tell you that the beauty of Erik's presentations is that they help you approach the data in different ways. For instance, the traffic statistics change daily, and consequently you see the order of the different Wikipedias in this chart change over time... but you have to get a feel for these dynamics. One nice touch is that the article statistics are in the same order. This gives a feel for the relation between traffic and article numbers... which is less obvious than I thought at first.
Anyway, I am sure that with Erik involved in statistics for Commons, we will get not only numbers but also presentations that are thought-provoking. Thanks, GerardM
2009/10/26 Thomas Dalton thomas.dalton@gmail.com
2009/10/26 Guillaume Paumier gpaumier@wikimedia.org:
Yes, we tested the survey, but with a smaller number of people, because of the time constraints (the multimedia usability project really started only 2 weeks ago, and the fundraising campaign is going to be the major focus of sitenotices from next week to January). And the problem didn't appear during the tests.
It would probably be better to take your time to do proper tests, even if you have to wait until after the fundraiser. It is very difficult to draw useful results from a poor survey and if you haven't done enough testing to spot something as simple as that then you haven't done enough testing to spot more serious problems. Who is your statistician? As far as I can tell, nobody on staff has significant training in statistics, so presumably you are using a consultancy firm? Did they not tell you you needed to do more testing?
foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
2009/10/26 Gerard Meijssen gerard.meijssen@gmail.com:
Hoi, Ehm, the statistics that we have are compiled by Erik Zachte.. qualifying our staff and implicitly Erik as lacking the experience is .... a bit off. It is not only the Commons project but also the Usability Initiative and the Strategy project that will rely largely on these numbers..
I'm not talking about the article numbers, user numbers, etc. statistics. I'm talking about this usability survey. Erik isn't on the usability team and even if he was helping with this his user page on enwiki doesn't mention any statistics training - his skills lie in gathering the statistics, not analysing them.
Hoi, Assume - ass u me... When you look at the presentation of the statistics, and when you consider the "score card" that was recently announced... I am happy to agree with you that it does not say on Erik's user page on en.wp that he has had any formal statistics training. Erik, for your information, is Dutch, and I think you assume that en.wp is Erik's main project.
That is fine and hardly relevant. The WMF staff includes someone whom I appreciate for his statistics work, and I expect that Erik will continue to be involved in any statistics work on any project. I assume this to be the case because new statistical data has to be integrated in one way or another, in order to ensure that we continue to record, report and analyse data even after the end of a time-boxed project. Thanks, GerardM
2009/10/26 Thomas Dalton thomas.dalton@gmail.com
2009/10/26 Gerard Meijssen gerard.meijssen@gmail.com:
Hoi, Ehm, the statistics that we have are compiled by Erik Zachte.. qualifying our staff and implicitly Erik as lacking the experience is .... a bit off. It is not only the Commons project but also the Usability Initiative and the Strategy project that will rely largely on these numbers..
I'm not talking about the article numbers, user numbers, etc. statistics. I'm talking about this usability survey. Erik isn't on the usability team and even if he was helping with this his user page on enwiki doesn't mention any statistics training - his skills lie in gathering the statistics, not analysing them.
2009/10/26 Gerard Meijssen gerard.meijssen@gmail.com:
Hoi, Assume - ass u me ... When you look at the presentation of the statistics, when you consider the "score card" that were recently announced. I am happy to agree with you that it does not say on Erik's user page on en.wp that he had any formal statistics training.. Erik for your information is Dutch and I think you assume that the en.wp is Erik's main project.
I relied on the enwiki page because a) it does list his training, and statistics isn't there, and b) his meta page directs people to his enwiki page (and his nlwiki page). The statistics are presented in a big table with a few bar charts; that doesn't require any training in statistics.
That is fine and hardly relevant. The WMF staff has someone who I appreciate for his statistics work and I expect that Erik will continue to be involved in any statistics work on any project. I assume this to be the case because new statistical data has to become integrated in one way or another in order to ensure that we continue to record, report and analyse data even after the end of a time-boxed project.
This is a survey, not data extracted from a database; why would it be continually recorded? You aren't making any sense.
I regard Erik highly for his skills in analyzing data which is already present. I am completely confident he can do some thought provoking analysis.
Unfortunately, the skills required for putting a good questionnaire together are different from those for analyzing the data.
The remarks I have seen so far are about the technical features of the questionnaire, the lack of testing, and the contents of the questions. Bad questions generate bad responses, and bad responses mean bad data. No doubt Erik will be able to make some use of these data, but would his talents not have been much more useful if the proper amount of thinking and testing had gone into this questionnaire?
The results of the usability project have disappointed many; see earlier discussions on this list. Will there be more to come from the usability project? What actions can be taken to correct the current questionnaire? Can it be stopped and replaced by a better set of questions later? What suggested the need for a Commons questionnaire in the first place? What analysis has already been done on this topic? What gave rise to the current set of questions?
teun
On Mon, Oct 26, 2009 at 6:36 PM, Gerard Meijssen gerard.meijssen@gmail.com wrote:
Hoi, Assume - ass u me ... When you look at the presentation of the statistics, when you consider the "score card" that were recently announced. I am happy to agree with you that it does not say on Erik's user page on en.wp that he had any formal statistics training.. Erik for your information is Dutch and I think you assume that the en.wp is Erik's main project.
That is fine and hardly relevant. The WMF staff has someone who I appreciate for his statistics work and I expect that Erik will continue to be involved in any statistics work on any project. I assume this to be the case because new statistical data has to become integrated in one way or another in order to ensure that we continue to record, report and analyse data even after the end of a time-boxed project. Thanks, GerardM
2009/10/26 Thomas Dalton thomas.dalton@gmail.com
2009/10/26 Gerard Meijssen gerard.meijssen@gmail.com:
Hoi, Ehm, the statistics that we have are compiled by Erik Zachte.. qualifying our staff and implicitly Erik as lacking the experience is .... a bit off. It is not only the Commons project but also the Usability Initiative and the Strategy project that will rely largely on these numbers..
I'm not talking about the article numbers, user numbers, etc. statistics. I'm talking about this usability survey. Erik isn't on the usability team and even if he was helping with this his user page on enwiki doesn't mention any statistics training - his skills lie in gathering the statistics, not analysing them.
Guillaume Paumier wrote:
One thing I realized, though, was that two questions had an ambiguous wording: people wonder why they have to give reasons for not using Commons, or not participating, despite the fact that they said they do. These questions should read « what is the main reason that limits or hinders your use/participation ». Unfortunately, we can't change the survey once it is running.
I don't regularly use Commons, but when I went to fill in the answer at "Other" for why I don't use it, the survey crashed.
Ec
As an example, let's have a look at the first two questions:
1) Do you use Wikimedia Commons at all? Choose one of the following answers:
* Yes, regularly
* Yes, sometimes
* No
Criticism: not exact. What is "regularly"? Once a day? Once a week? Once a month? 5% of all wiki edits? 25% of all wiki edits?
2) What is your main goal when you look for free media files on Commons? Choose one of the following answers:
* I look for media files to illustrate Wikipedia or another Wikimedia project.
* I look for media files to use online on another website.
* I look for media files to use offline (reports, presentations, homework).
* Other:
Criticism: many Wikipedians may choose option 1, looking for media files, as there is no option for "upload media files".
teun
On Mon, Oct 26, 2009 at 3:27 PM, teun spaans teun.spaans@gmail.com wrote:
Criticism: not exact. What is "regularly"? Once a day? Once a week? Once a month? 5% of all wiki edits? 25% of all wikiedits?
Sometimes it's good to keep it vague -- to get people's opinions on their own activity. Isn't the survey trying to gauge Commons participation and any barriers to it? If people feel they contribute very frequently to Commons, their participation isn't really hindered that much.
Criticism: Many wikipedians may choose 1), look for media files, as there is no option "upload media files".
They all say "look for media files"; the question is asking what your main goal is in reusing photos already there. Uploading new files doesn't really fit into the question.
2009/10/26 Casey Brown lists@caseybrown.org:
On Mon, Oct 26, 2009 at 3:27 PM, teun spaans teun.spaans@gmail.com wrote:
Criticism: not exact. What is "regularly"? Once a day? Once a week? Once a month? 5% of all wiki edits? 25% of all wikiedits?
Sometimes it's good to keep it vague -- to get people's opinions on their own activity. Isn't the survey trying to gauge Commons participation and any barriers to it? If people feel they contribute very frequently to Commons, their participation isn't really hindered that much.
I disagree. If you want opinions you have to ask for opinions. Asking vague questions just annoys people because they don't know what they are supposed to answer.
On Mon, Oct 26, 2009 at 12:52 PM, Thomas Dalton thomas.dalton@gmail.com wrote:
2009/10/26 Casey Brown lists@caseybrown.org:
On Mon, Oct 26, 2009 at 3:27 PM, teun spaans teun.spaans@gmail.com wrote:
Criticism: not exact. What is "regularly"? Once a day? Once a week? Once a month? 5% of all wiki edits? 25% of all wikiedits?
Sometimes it's good to keep it vague -- to get people's opinions on their own activity. Isn't the survey trying to gauge Commons participation and any barriers to it? If people feel they contribute very frequently to Commons, their participation isn't really hindered that much.
I disagree. If you want opinions you have to ask for opinions. Asking vague questions just annoys people because they don't know what they are supposed to answer.
There is more to it than that.
If you want to measure people's perceptions, you use qualitative judgment terms, e.g. "often", "regularly", "rarely".
If you want to measure concrete facts, you use quantified terms, e.g. "once per week", "more than 5 times per day", etc.
The latter gives direct information about participation, while the former gives a convolution of participation data with information about how people perceive their own participation. Either approach can be useful, but which one is used should be determined by a clear understanding of what it is the survey hopes to accomplish.
-Robert Rohde
2009/10/26 Robert Rohde rarohde@gmail.com:
If you want to measure people's perceptions, you use qualitative judgment terms, e.g. "often", "regularly", "rarely".
If you do that, you need to ask something like "How would you describe the frequency of your Commons use?" rather than "How often do you use Commons?". It needs to be made clear that you are asking for their opinion rather than a particular answer; otherwise people will be concerned that they don't know how you are defining the terms, and will therefore think they can't answer the question. I would be interested to know what proportion of people who click the link to the survey actually complete it.
On Mon, Oct 26, 2009 at 7:16 PM, Ray Saintonge saintonge@telus.net wrote:
I don't regularly use Commons, but when I went to fill in the answer at "Other" for why I don't use it the survey crashed.
I got an error message after submitting the page with text in the 'other' field too. I thought it might be because I had a '/' character in there. It did allow me to carry on and complete the survey, though.
I agree with others that the survey has problems.
On the question where it asks what my contributions are in terms of various file types, for most of them I had *never* contributed, so I checked the 'rarest' button. However, this meant I had nowhere to go but up when it came to pictures... I would say it's *extremely* rare for me to upload a picture, but I ticked 'quite rare' rather than 'very rare' so as to distinguish something I *had* done from the things I have *never* done.
On Mon, Oct 26, 2009 at 12:41 PM, Guillaume Paumier gpaumier@wikimedia.org wrote:
Hi Milos,
Milos Rancic wrote:
- And the winner: What is my reason to participate? And there is no *my* reason, even Commons is the free content project? No reason to answer.
The survey was designed specifically to avoid general statements. As you very well state yourself, general statements would be quite useless, and we want to know the underlying goal of the user. In the case you raise, what is the reason why you participate in a free content project? (you can answer off-list). I believe the answer would be one of those offered in the survey (but of course, I may be wrong).
Because I support free content projects [and because I am a Wikimedian]. (My visual and sound production is miserable, but I am actively supporting it by asking those with relevant production to free their work.)
- Then, no possibility to say that I am definitely not working on "quality review, improvement & promotion of featured works"; just "rarely"; as well as that I didn't upload any animation.
I don't really see the difference from a design point of view.
A similar concern I have heard is that some radio buttons should be non-mutually exclusive choices (e.g. for the "what is the main reason" questions). I agree it is more difficult with radio buttons, because we ask the user to actually think about what their priority is. But there again, it is more useful from a design point of view.
I understand the reason for mutually exclusive answers. They help make the survey shorter. And I understand what you are asking of a survey participant: to try to be as constructive as possible. However, there is a significant number of questions to which I can't give an honest answer if I want to be constructive. Giving an "other" answer is not constructive behavior, and I contribute music and photos rarely but no video at all, which means the form doesn't let me distinguish between those types of my behavior.
The survey software has a good option to fork survey paths. If someone answered that they've made more than, let's say, 1000 edits across Wikimedia projects, then such a person is probably willing to give more detailed answers. For a lot of us, spending 30 minutes helping to improve some Wikimedia project is not a lot. And with good questions and more possibilities (as you said, check boxes instead of radio buttons, as well as more options), you will get much better data.
For example, the first question may use check boxes ("what are the main reasons?") and the next one radio buttons ("which one of [your] main reasons is the most important?") or, better, a scale ("assign numbers between one and x to describe the importance of each reason").
The Wikimedia community is very complex, and it is probably impossible to make a simple survey that is useful. And there is a lot of room for fine-tuning surveys: you may give a casual user a simple set of questions and a contributor a more complex one.
Thanks for your constructive feedback. We may continue the discussion off-list if you wish.
There is a small number of communities that can give relevant input into research about themselves, and the Wikimedia community is one of them. So it is helpful to ask it for input. That is why I didn't ask privately who made the survey, and instead raised the question here. (And I know that my tone wasn't very nice, but, as I said, I was very frustrated with the survey.)
I'll contact you privately in the next couple of days about one research project which seems to fit the Usability project better than the Strategy TFs.
While I don't agree with most of the criticisms (this is a survey after all, not a philosophy thesis; asking extremely precise questions is not the point), the --/++ questions do seem like a bad design choice. There should at least be a clear indication of whether an absolute answer is expected (if I contribute very rarely to Commons, then I will do all the activities very rarely) or a relative one (most often/least often). If the two types of answers are mixed, the results will be meaningless.
Also, next time you should set up a test version of the survey for translators. Not only does this make their work easier, you get dozens of testers for free. E.g. when I translated the survey, I assumed the question about why you do not use Commons was an alternate path to the ones about the way you use it. If I had known it would be asked of everyone, I would have noted that it needs to be rephrased.
Milos Rancic wrote:
When you make a dumb survey, answers will be dumb, as well as conclusions.
I am really frustrated with that survey... I wanted to help with my responses, but it seems that I am inside of unknown percentage of Wikimedia Commons users and contributors which doesn't fit into the survey.
So, let's start...
1. Yes, I am using Commons regularly.
2. There are many goals when I look for free media files on Commons and I can't say which one is primary.
3. I am participating in Wikimedia Commons sometimes.
4.
- 100-1000 edits
- 100-1000 uploads (actually I don't have a clue how to find that fact efficiently)
- I don't have a clue which percentage are my own works, let's say 10-50%
- And the winner: What is my reason to participate? And there is no *my* reason, even Commons is the free content project? No reason to answer.
- Then, no possibility to say that I am definitely not working on "quality review, improvement & promotion of featured works"; just "rarely"; as well as that I didn't upload any animation. BTW, all types of my participation on Commons are rare.
And, instead of doing that in a couple of minutes, I've spent 30 minutes in writing this email.
I really don't know which answers such survey is able to give? How many Commons participants have less than 100 edits? SQL queries may be more helpful.
If it is for making some decisions, it is better not to use the results because survey may give very wrong conclusions. If it is for some student work, it may be better.
I don't want to say that it is possible to make a perfect survey. However, this one has ~50% of not so bright questions and/or options. Yes, there are options for "other" answers, but such answers will be not so informative (cf. my second answer) or they would be narrative and it won't be easy to categorize them. Those who don't want to bother themselves with "other" option will give wrong answers.
Some brainstorming with other Wikimedians could help. It is better to have relevant than fast answers and conclusions.
I think the survey is a good _start_ toward understanding basic user contribution levels, the challenges, and where the frustration lies when one tries to contribute to Commons. Thanks to active and supportive translators, the survey is offered in more than twenty languages. This will also help the multimedia usability project team understand the shortcomings in supporting non-English Commons, Wikipedia, and other Wikimedia project contributors, as well as potential future contributors.
Thanks,
- Naoko