In today's WMF Metrics and Activities Meeting [1], Jessie Wild's presentation, starting around 1:05:00, compared the meta-level grantmaking programs. The presentation is about 12 minutes long.
Jessie, I have two questions, and other people may want to ask questions as well.
1. I'm aware that Program Evaluation is examining the outcomes of conferences this year, and Jamie and I have discussed this in at least two places on Meta. I'm curious whether and how you plan to measure the online impact of conferences: not just what people and groups say they will do in post-conference surveys, but what they actually do online, in verifiable ways, in the subsequent 3-12 months.
2. You said in your presentation that there is no direct correlation between grant size and measurable online impact. From the slides around the 1:13-1:15 marks, it looks to me like the correlation is negative, meaning that smaller grants produced disproportionately more impact. Within IEG this occurred partly because we had some highly motivated and generous grantees who volunteered considerable time to work with modest amounts of money, and I don't think we should expect that level of generosity from all grantees. Still, I think that grantmaking committees may want (A) to take into account the level of motivation of grantees; (B) to consider breaking large block grants into discrete smaller projects with individual reporting requirements; and (C) for larger grants where there seem to be a lot of problems with reporting and a disappointing level of cost-effectiveness, to be more assertive about tying funding to demonstrated results and reliable, standardized reporting, with assistance from WMF. What do you think?
Thanks, Pine
[1] https://www.youtube.com/watch?v=2JbZ1uWoKEg&feature=youtu.be
Thanks for listening to the presentation, Pine!
There will be a more comprehensive analysis posted on Meta, but in the meantime to answer your questions:
> I'm aware that Program Evaluation is examining the outcomes of conferences this year, and Jamie and I have discussed this in at least two places on Meta. I'm curious whether and how you plan to measure the online impact of conferences: not just what people and groups say they will do in post-conference surveys, but what they actually do online, in verifiable ways, in the subsequent 3-12 months.
Jaime, I, and the others on the Grantmaking team are working on this together, and we are experimenting with some different ways of evaluating the work in the few months following the conferences. One small experiment, for example, is to run a cohort of Wikimania Scholarship recipients through Wikimetrics at different increments throughout the following year. This is something I have been curious to do for a long time, but I never had the tool to do it on an aggregate level!
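The cohort idea above can be sketched in plain code as well. The Python sketch below is purely illustrative - it is not Wikimetrics code, and the function name and window sizes are invented for this example. It assumes you have already obtained each cohort member's edit timestamps (e.g. from a Wikimetrics export or the MediaWiki API) and simply counts edits in 3-, 6-, and 12-month windows after the cohort start date:

```python
from datetime import datetime, timedelta

def window_counts(edit_timestamps, cohort_start, windows=(3, 6, 12)):
    """Count one user's edits within N-month (~30-day) windows after cohort_start.

    edit_timestamps: iterable of datetime objects, one per edit.
    Returns a dict mapping window length in months -> edit count.
    """
    counts = {}
    for months in windows:
        cutoff = cohort_start + timedelta(days=30 * months)
        counts[months] = sum(
            1 for t in edit_timestamps if cohort_start <= t < cutoff
        )
    return counts

# Hypothetical scholarship recipient: three edits after Wikimania 2014.
edits = [datetime(2014, 8, 15), datetime(2014, 11, 1), datetime(2015, 5, 1)]
print(window_counts(edits, datetime(2014, 8, 1)))  # {3: 1, 6: 2, 12: 3}
```

Summing these per-user dictionaries across the whole scholarship cohort would give the aggregate retention picture at each increment.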
> You said in your presentation that there is no direct correlation between grant size and measurable online impact. From the slides around the 1:13-1:15 marks, it looks to me like the correlation is negative, meaning that smaller grants produced disproportionately more impact. Within IEG this occurred partly because we had some highly motivated and generous grantees who volunteered considerable time to work with modest amounts of money, and I don't think we should expect that level of generosity from all grantees. Still, I think that grantmaking committees may want (A) to take into account the level of motivation of grantees; (B) to consider breaking large block grants into discrete smaller projects with individual reporting requirements; and (C) for larger grants where there seem to be a lot of problems with reporting and a disappointing level of cost-effectiveness, to be more assertive about tying funding to demonstrated results and reliable, standardized reporting, with assistance from WMF. What do you think?
Well, there are definite outliers, and the slides aggregate by program type rather than by size. So, for example, several of the IEG grants were much bigger than the majority of PEG grants. It's not exactly a negative correlation (at least, we can't say that definitively).
I absolutely agree with your (C) suggestion, and your (B) suggestion is very interesting too - we haven't discussed that one. It may be worth considering for larger project-based grants. For the annual plan grants we have something like this in the form of quarterly reports (and midpoint reports for IEG), so we do try to intervene with grantees if it looks like they are off-track.

As for (A), based on what we saw through our evaluation of IEG[1], motivation is definitely important, but the key difference for outlier performance came from those grantees who had *specific target audiences* identified, so they knew exactly who they wanted to be working with and how to reach those people. So I would want committees to take into account whether a grant has a specific target audience or a specific target topic area (for quality improvements, for example; we saw this for successful outreach in PEG grants[2]). More explicitly on motivation: while it is difficult to measure for new grantees, you can see a lot about someone's motivation and creativity from their past reports if they are a returning grantee. I would definitely encourage our committees to look back on past reports from returning grantees!
- Jessie
[1] https://meta.wikimedia.org/wiki/Grants:IEG/Learning/Round_1_2013/Impact
[2] https://meta.wikimedia.org/wiki/Grants:PEG/Learning/2013-14
Hi Jessie,
Thanks for the quick reply.
Issue 1 may be challenging to measure even with Wikimetrics. Can we talk about this during the Research Hackathon next week if we can set up a time off-list?
Thanks for the info about issue 2. I am grateful to learn that you did an evaluation of PEG. It is interesting to compare that evaluation with the evaluation of IEG. A number of grantmaking committee members and grantees will be at Wikimania and I hope the PED team will introduce themselves and be available to discuss these studies, especially if there is a plenary meeting of all the Meta grantmaking committee members who attend Wikimania.
Thanks very much,
Pine

On Jul 31, 2014 2:50 PM, "Jessie Wild" jwild@wikimedia.org wrote:
Another point to consider is that comparing grants that include staff compensation to grants that do not is necessarily tipping the scales. Volunteer time is a cost too (though borne by the volunteers themselves and not by the funder), and ignoring it in cost-benefit analysis will always give the impression that grants including staff are significantly less effective, whether or not they truly are.
It may make sense to ignore it if the funder is only interested in straight impact-for-dollars; it seems to me that WMF is a funder that cares about _movement resources_, including volunteer time, and not just dollars out of its own budget.
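To make this point concrete, here is a toy calculation with entirely made-up numbers; the $25/hour shadow price for volunteer time is an arbitrary assumption for illustration, not a WMF figure:

```python
def cost_per_unit_impact(grant_dollars, volunteer_hours, impact_units,
                         volunteer_hour_value=0.0):
    """Cost per unit of impact, optionally pricing volunteer time.

    With volunteer_hour_value=0 this is the dollars-only view;
    a nonzero value treats volunteer hours as a movement cost too.
    """
    total_cost = grant_dollars + volunteer_hours * volunteer_hour_value
    return total_cost / impact_units

# Hypothetical grants: a small volunteer-run one and a larger staffed one.
small = (2_000, 500, 100)    # $2k grant, 500 volunteer hours, 100 articles
staffed = (20_000, 50, 400)  # $20k grant, 50 volunteer hours, 400 articles

# Dollars-only: the small grant looks far more cost-effective.
print(cost_per_unit_impact(*small))          # 20.0 per article
print(cost_per_unit_impact(*staffed))        # 50.0 per article

# Price volunteer time at $25/hour and the ranking reverses.
print(cost_per_unit_impact(*small, 25.0))    # 145.0 per article
print(cost_per_unit_impact(*staffed, 25.0))  # 53.125 per article
```

The ranking flip is exactly the scale-tipping described above: ignoring volunteer time systematically flatters grants that rely on it.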
A.
On Thu, Jul 31, 2014 at 10:12 PM, Pine W wiki.pine@gmail.com wrote:
Good point, Asaf. Jessie, is there a way to take this into account when comparing costs and benefits?

I agree with Gerard that there can be survey or reporting fatigue, though I have yet to hear an IEG grantee complain. The APG application and reporting system seems more extensive, and I can see how it could discourage small orgs from seeking APG funding; on the other hand, there seem to be issues of cost-effectiveness and outcome reporting with some existing large APG grantees. Perhaps there should be an easier APG process for small orgs and more specific cost and outcome reporting requirements for large orgs.
Pine

On Aug 1, 2014 8:56 AM, "Asaf Bartov" abartov@wikimedia.org wrote:
Hoi, I have done a project, and there were two parts to it: the delivery of an input method and of a font for a script that did not have any Unicode font. At that time there was font functionality in MediaWiki, so it should have been a shoo-in. The cost of the project was relatively large because of the cost of producing a new font; in real-world terms, the font and input method were provided for a very low price.

Because of whatever internal issues, the font did not become available in MediaWiki. While waiting, the partner for the project lost his subsidy, and as an organisation the Royal Institute for the Tropics ended and the Tropenmuseum was merged with two other museums. This was duly mentioned at the time. I even blogged about it.

As a consequence of all this, my project was gone. The money was spent; the goods existed but were no longer available to a project. I am no longer involved in Batak and have no leads to revive it. I have no intention to, either.

Now, a long time after all this, I was hassled for a report. As far as I am aware, I have attempted multiple iterations of a report. It did not fit the mold, or whatever was wrong with it.

With more reporting you get less project and more irritation. I loathe the notion that more reporting will lead to anything positive. If anything, it makes sense to project-manage the reports and keep a finger on the pulse. But that is a personal affair, and very much NOT an administrative affair.

When I get involved in another project I will very much try to stay away from administrative bullshit, while I am very much available for personal contact. Thanks, GerardM
On 31 July 2014 23:50, Jessie Wild jwild@wikimedia.org wrote:
If people are interested in the background to Gerard's rant, the grant he is talking about is [1], and the incomplete report he has been "hassled" about is [2]. I don't want to hijack this thread to discuss this specific example, but I welcome discussion on that grant report's talk page, and encourage anyone inclined to take Gerard at his word to look into the report in question and draw their own conclusions.
A.
[1] https://meta.wikimedia.org/wiki/Grants:PEG/Gerard_Meijssen_and_Michael_Evers... [2] https://meta.wikimedia.org/wiki/Grants:PEG/Gerard_Meijssen_and_Michael_Evers...
On Fri, Aug 1, 2014 at 2:06 AM, Gerard Meijssen gerard.meijssen@gmail.com wrote:
Hoi, I have done a project and there were two parts to my project. There was the delivery of an input method and a font for a script that did not have any UNICODE font. At that time there was functionality for fonts. So it should have been a shoe in. The cost of the project was relatively large. This was because of the cost of producing a new font. In real world terms the font and input method were provided for a very low price..
Because of whatever internal issues, the font did not become available in MediaWiki. While waiting the partner for the project lost his subsidy and as an organisation the Royal Institute for the Tropics ended and the Tropenmuseum was merged with two other museums. This was duly mentioned at the time. I even blogged about it.
As a consequence of this all my project was gone. The money was spend, the goods were available but not available to a project. I am no longer involved in Batak and have no leads to revive it. I have no intention either.
Now a long time after all this I was hassled for a report. As far as I am aware I have attempted multiple iterations of a report. It did not fit the mold or whatever was wrong with it.
With more reporting you get less project and more irritation. I loathe the notion that more reporting will lead to anything positive. If anything it makes sense to project manage the reports, keep a finger on the pulse. But this is a personal affair and very much NOT an administrative affair.
When I am getting involved in another project I will very much try to stay away from administrative bullshit while I am very much available for personal contact. Thanks, GerardM
On 31 July 2014 23:50, Jessie Wild jwild@wikimedia.org wrote:
Thanks for listening to the presentation, Pine!
There will be a more comprehensive analysis posted on Meta, but in the meantime to answer your questions:
- I'm aware that Program Evaluation is examining the outcomes of
conferences this year, and Jamie and I have discussed this in at least
two
places on Meta. I'm curious about if and how you plan to measure the
online
impact of conferences; not just what people and groups say they will do
in
post-survey conferences, but what they actually do online in verifiable ways in the subsequent 3-12 months.
Jaime and I and the others on the Grantmaking team are working together on this, and experimenting with some different ways of evaluating the work in the few months following the conferences. One way to do this in a small experiment, for example, is to run a cohort of users who received Wikimania Scholarships through Wikimetrics at different increments throughout the year following. This is something I have been curious to do for a long time, but never had the tool to do it on an aggregate level!
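The cohort idea can be sketched in a few lines. This is a hypothetical illustration only: the usernames, dates, and edit lists are made up, and it does not call the real Wikimetrics service; it just shows how aggregate activity at fixed increments after an event might be tallied.

```python
from datetime import date, timedelta

# Hypothetical data: edit dates per scholarship recipient
# (invented for illustration, not real Wikimetrics output).
conference_end = date(2014, 8, 10)
edits_by_user = {
    "UserA": [date(2014, 9, 1), date(2014, 11, 15), date(2015, 3, 2)],
    "UserB": [date(2014, 8, 20)],
    "UserC": [],
}

# Increments (days after the conference) at which to measure the cohort.
increments = [90, 180, 365]

def cohort_activity(edits_by_user, start, increments):
    """For each increment, count users with at least one edit
    between the event and that many days afterwards."""
    results = {}
    for days in increments:
        cutoff = start + timedelta(days=days)
        active = sum(
            1 for edits in edits_by_user.values()
            if any(start < d <= cutoff for d in edits)
        )
        results[days] = active
    return results

print(cohort_activity(edits_by_user, conference_end, increments))
```

With the made-up data above, two of the three scholarship recipients edited within each window, so the tally is the same at every increment; with real cohort data, the interesting signal is whether activity decays between the 3-, 6-, and 12-month marks.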
- You said in your presentation that there is no direct correlation between grant size and measurable online impact. From the slides at around the 1:13-1:15 minute marks, it looks to me like the correlation is negative, meaning that smaller grants produced disproportionately more impact. I can say that within IEG this occurred partly because we had some highly motivated and generous grantees who volunteered a considerable amount of time to work with modest amounts of money, and I don't think we should expect that level of generosity from all grantees, but I think that grantmaking committees may want (A) to take into account the level of motivation of grantees, (B) to consider breaking large block grants into discrete smaller projects with individual reporting requirements, and (C) for larger grants where there seem to be a lot of problems with reporting and a disappointing level of cost-effectiveness, to be more assertive about tying funding to demonstrated results and reliable, standardized reporting with assistance from WMF. What do you think?
Well, there are definite outliers, and the slides aggregate by program type rather than by size. So, for example, several of the IEG grants were much bigger than the majority of PEG grants. So - not exactly a negative correlation (at least, we can't definitively say that).
I absolutely agree with your (C) suggestion, and your (B) suggestion is very interesting too - we haven't discussed that one. It may be worth considering if there are larger project-based grants. For the annual plan grants, we have this in terms of quarterly reports (and midpoint reports for IEG), so we do try to do interventions with grantees if it looks like they are off-track. As for (A), based on what we saw through our evaluation of IEG[1], motivation is definitely important, but the key difference for outlier performance was from those grantees that had *specific target audiences* identified, so they knew exactly who they wanted to be working with and how to reach those people. So, I would want committees to take into account grants with a specific target audience or specific target topic area (for quality improvements, for example; we saw this for successful outreach in PEG grants[2]). More explicitly on motivation, while it is difficult to measure for new grantees, you can see a lot about someone's motivation and creativity based on their past reports if they are a returning grantee. I would definitely encourage our committees to look back on past reports from returning grantees!
- Jessie
[1] https://meta.wikimedia.org/wiki/Grants:IEG/Learning/Round_1_2013/Impact
[2] https://meta.wikimedia.org/wiki/Grants:PEG/Learning/2013-14
--
Jessie Wild Sneller, Grantmaking Learning & Evaluation, Wikimedia Foundation
Imagine a world in which every single human being can freely share in the sum of all knowledge. Help us make it a reality! Donate to Wikimedia: https://donate.wikimedia.org/
_______________________________________________
Wikimedia-l mailing list, guidelines at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines
Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe
On 1 August 2014 17:01, Asaf Bartov abartov@wikimedia.org wrote:
If people are interested in the background to Gerard's rant, the grant he is talking about is [1], and the incomplete report he has been "hassled" about is [2]. I don't want to hijack this thread to discuss this specific example, but I welcome discussion on that grant report's talk page, and encourage anyone inclined to take Gerard at his word to look into the report in question and draw their own conclusions.
Wow. After reading this, I don't think I will go near it.
Fae
2014-07-31 23:39 GMT+03:00 Pine W wiki.pine@gmail.com:
WMF Metrics and Activities Meeting
Hi,
Are the slides available anywhere? I'm especially interested in the heat map. Is there an interactive version online (or at least the data behind it)?
Thanks, Strainu
Slides from all the presentations are available here: https://meta.wikimedia.org/wiki/WMF_Metrics_and_activities_meetings/2014-08
Dan
On 1 August 2014 12:11, Strainu strainu10@gmail.com wrote:
On Fri, Aug 1, 2014 at 12:13 PM, Dan Garry dgarry@wikimedia.org wrote:
Slides from all the presentations are available here: https://meta.wikimedia.org/wiki/WMF_Metrics_and_activities_meetings/2014-08
The grantmaking slides seem to be limited to WMF employees though.
Jessie,
Can you make sure that your slides from yesterday are shared publicly so people can take a look at them? Right now they seem to be shared only to WMF employees.
Thanks!
Dan
On 1 August 2014 14:45, Gergo Tisza gtisza@wikimedia.org wrote:
In general, using Google to store Wikimedia slide decks is a bad idea as that's essentially temporary (and restricted-access) storage - it's much better to upload a copy to Commons so they are properly archived (hopefully indefinitely!) and available to all...
Thanks, Mike
On 1 Aug 2014, at 22:48, Dan Garry dgarry@wikimedia.org wrote:
I put the slides on commons immediately after the presentation: https://commons.wikimedia.org/wiki/File:Grantmaking_Impact_Assessment,_2013-...
As for the cost-benefit question: YES ABSOLUTELY we need all the costs involved! This is one of the major gaps we saw in reporting: we weren't able to capture the full costs of projects, including volunteer time or in-kind donations, because those were not reported back to us. We are asking for this data in the upcoming program evaluation review, which will have open data collection in September. Please report :) We're also working on a way within the grantmaking structures to gather that information without putting too much of an increased reporting burden on grantees.
And to Pine and all others who will be in London at Wikimania: please do stop by the Grantmaking booth in the community village! We'll have people there throughout the days, and it would be awesome to take advantage of the opportunity to engage in this conversation in person as much as possible (of course, online is an ongoing option).
On Sat, Aug 2, 2014 at 3:19 PM, Michael Peel email@mikepeel.net wrote:
Hoi, I will be there :) Thanks, GerardM
On 4 August 2014 15:56, Jessie Wild jwild@wikimedia.org wrote:
Hi Jessie, nice to hear this from you! How will central costs be considered, e.g. those of the WMF's grantmaking department?
rupert
On 04.08.2014 15:56, "Jessie Wild" jwild@wikimedia.org wrote: