To be clear: the benefits are not calculated here because a single upload is itself counted as a benefit.
What is missing is the volunteers' time.
If we account for the extra load on the community of checking photos or looking for copyright violations, this relation changes.
It is an important parameter, because a volunteer who has to filter a huge amount of poor photos or copyright violations has a negative impact on other activities.
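To give a minimal sketch of what I mean (all figures are hypothetical, chosen only to show how counting volunteer time changes the relation):

# All figures below are assumptions, only to illustrate how counting
# volunteer review time changes the apparent cost per upload.
grant_cost = 900.0                 # money spent on the contest (assumed)
uploads = 1000                     # uploads received (assumed)
review_minutes_per_upload = 2.0    # volunteer time to screen one photo (assumed)
volunteer_hour_value = 25.0        # notional value of one volunteer hour (assumed)

volunteer_cost = uploads * review_minutes_per_upload / 60.0 * volunteer_hour_value
total_cost = grant_cost + volunteer_cost

print(f"cost per upload, money only:      {grant_cost / uploads:.2f} $")
print(f"cost per upload, with volunteers: {total_cost / uploads:.2f} $")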
Basically, these parameters are not realistic. On 05 May 2015 at 13:59, "Ilario Valdelli" valdelli@gmail.com wrote:
It's unclear.
A good shot on the market costs more than 50 dollars.
Everything of high quality is a benefit.
So the aim is not to calculate a cost per shot, but the cost-benefit relation.
Basically the delta.
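A minimal sketch of that delta, assuming (only for illustration) that each usable photo is valued at the market price of a comparable shot:

# Hypothetical numbers: value each usable photo at the market price of a
# comparable shot, then compare against what the contest cost.
market_value_per_good_shot = 50.0   # assumed market price of one good shot
good_shots = 300                    # uploads of usable quality (assumed)
contest_cost = 5000.0               # total contest spending (assumed)

benefit = good_shots * market_value_per_good_shot
delta = benefit - contest_cost
print(f"benefit {benefit:.0f} $, cost {contest_cost:.0f} $, delta {delta:+.0f} $")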
If success has to be evaluated on the poor cost-per-upload relation, there is no point in holding a photo contest; an edit-a-thon would produce more results. On 05 May 2015 at 12:21, "Ivo Kruusamägi" ivo.kruusamagi@gmail.com wrote:
I've collected photos for Commons at a cost lower than $0.01 per image, so I don't like claims such as "A good shot by a professional artist doesn't cost 0.90 dollars". Considering that the average upload in WLM is usually of rather poor quality and will not find a place in an article, those things aren't that easily comparable.
I especially like the comment about Romaine, and I have taken a somewhat similar approach. Only if I am able to provide constant work for the newcomers is there some chance of keeping them with the program. Getting images and getting users are two rather different aims. I have also set my sights on getting quality images, since we have so many contributors per capita in Estonia that an increase there isn't very likely without some rather desperate means. But just focusing on images could help to get significantly better-quality contributions.
As for this evaluation, I'd actually like to see some selected examples explaining what others have done and what kinds of differences there are. For instance, if someone spends thousands of dollars on this campaign, I'd like to know where the money went, as I personally can't think of anywhere to spend that much. Or what kind of outreach approach was taken to achieve the x goals, etc.
Regards Ivo Kruusamägi
2015-05-05 11:06 GMT+03:00 Ilario Valdelli valdelli@gmail.com:
To clarify what I am saying:
https://meta.wikimedia.org/wiki/Grants:Evaluation/Evaluation_reports/2015/Wi...
The section "Content Production and Quality Improvement" does not say anything about the quality of the photos.
It's a photo contest, and a photo contest gives prizes to the best photos, not to the biggest uploaders.
This is an example of the divergence between the real aim of the projects and the measures used in the evaluation.
Probably there is a misunderstanding somewhere.
A good shot by a professional artist doesn't cost 0.90 dollars.
To measure success, the best approach is to consider that a good shot can cost around 50-100 dollars.
To those who say that the right measure is article coverage: I agree with them, but I also think there is no sense in having bad photos, even if these photos are not "descriptive".
Regards
On Tue, May 5, 2015 at 9:46 AM, Ilario Valdelli valdelli@gmail.com wrote:
Hi Lodewijk, it's not the first time I have said that the evaluation's measures capture quantities, not qualities.
If the aim of Wikimedia is also to improve quality, it's clear what direction the movement is taking.
I know that measuring quantities is easier, but that is not an evaluation; they are just numbers without a clear "strategy".
Regards
On Sat, May 2, 2015 at 1:26 PM, Lodewijk lodewijk@effeietsanders.org wrote:
Hi all,
it seems that the WMF evaluation department has once again put together an evaluation of Wiki Loves Monuments. Out of curiosity, were any of the organizers involved in this? A quick glance suggests some factual errors, and again a heavy reliance on treating WLM as a single consistent project that is similar in each country (while in reality it is a diverse collection of projects, tailored to the needs of each country by its community), along with a focus on number crunching.
Statements that begin with 'the average Wiki Loves Monuments implementation/contest' make my eyes bleed... Has anyone made a more thorough analysis of the report?
Best, Lodewijk
--
Ilario Valdelli
Wikimedia CH
Verein zur Förderung Freien Wissens
Association pour l'avancement des connaissances libre
Associazione per il sostegno alla conoscenza libera
Switzerland - 8008 Zürich
Wikipedia: Ilario https://meta.wikimedia.org/wiki/User:Ilario
Skype: valdelli
Facebook: Ilario Valdelli https://www.facebook.com/ivaldelli
Twitter: Ilario Valdelli https://twitter.com/ilariovaldelli
Linkedin: Ilario Valdelli http://www.linkedin.com/profile/view?id=6724469
Tel: +41764821371
http://www.wikimedia.ch
Wiki Loves Monuments mailing list WikiLovesMonuments@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikilovesmonuments http://www.wikilovesmonuments.org