[teampractices] FYI: Discovery time spent on maintenance

Antoine Musso hashar+wmf at free.fr
Fri Oct 30 23:14:06 UTC 2015


On 30/10/2015 20:39, Kevin Smith wrote:
> There is an initiative within the WMF to figure out how much time/effort
> teams spend on "new functionality" vs. "maintenance". As a pilot
> project, I have been tracking that in our Discovery Cirrus project[1]
> for a couple months.
> 
> As shown on this graph[2], we have been spending somewhere between 25%
> and 50% of our time on "maintenance". Note that this should not be
> considered at all scientific. For starters, there are several glaring
> issues with this graph:
> 
>   * Because we are not doing point estimation, this graph is based on
>     task counts, not actual effort.
>   * Data around Oct 1 is missing/funky due to the offsite.
>   * The bars are pure percentages, so 50% of 2 tasks completed would
>     look the same as 50% of 40 tasks completed. That 100% bar, in
>     particular, is misleading because I believe it is based on a single
>     task being resolved that week.
>   * The counts are based on my snap decision for each task, whether to
>     add the #worktype-new-functionality or the #worktype-maintenance tag.
> 
> Still, it's a higher fraction than I would have guessed.
> 
> Is it worth my time (or someone else's) to continue to track this data?
> 
> 
> [1]
> https://phabricator.wikimedia.org/tag/search-and-discovery-cirrus-sprint/
> [2] http://phlogiston.wmflabs.org/discir_maint_count_frac.png

Hello,

For Release Engineering, Greg Grossmeier came up with a spreadsheet
whose columns are:

- our key projects (features)
- maintenance
- nonsense

and whose rows are the team members.

We each fill in a percentage for each column (totalling 100% per
person), and at the bottom we get the sum of the team's overall time
per project. Something like:

         | Scap3 | CI  | Maint. | Nonsense  |
---------------------------------------------
Antoine  |   0%  | 50% |  40%   |   10%     |
John Doe |  90%  |  5% |   0%   |    0%     |
---------------------------------------------
Total:   |  90%  | 55% |  40%   |   10%     | <-- max 200%
Average: |  45%  | 27% |  20%   |    5%     | <-- average
---------------------------------------------

We do that at the beginning of our weekly meeting, and it only takes a
dozen seconds or so.
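
If you ever want the Total / Average rows outside of the spreadsheet,
the arithmetic is just a per-column sum and mean. A quick sketch
(Python, with the same made-up numbers as the table above):

# Quick sketch of the Total / Average rows, same made-up numbers as above.
reports = {
    "Antoine":  {"Scap3": 0,  "CI": 50, "Maint.": 40, "Nonsense": 10},
    "John Doe": {"Scap3": 90, "CI": 5,  "Maint.": 0,  "Nonsense": 0},
}
columns = ["Scap3", "CI", "Maint.", "Nonsense"]

for col in columns:
    total = sum(person[col] for person in reports.values())
    average = total / float(len(reports))
    # e.g. "CI: total 55%, average 27.5%"
    print("%s: total %d%%, average %.1f%%" % (col, total, average))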

The main advantages are:

* easy to get the data
* fast to fill in, and actually fun since the spreadsheet is shared and
shows the activity of others
* time-based (rather than task-count-based)

Cons:

* inaccurate, but as you said, it will stay that way unless we keep
track of what we do every 15 minutes...
* biased by human perception / not backed by hard data
* a weekend has passed since the work, so memories are already fuzzy

I still think it provides useful value. After all, if a team perceives
it is spending a lot of time on maintenance, that would explain why
members bitch about not being able to produce features.

Conversely, if you get a ton of outages and issues filed but none of
the team members report doing maintenance, that would help refocus the
team as well.

Organization-wide, I believe the inaccuracies offset each other, and
the aggregate probably ends up being reasonably accurate.
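
As a toy illustration (made-up numbers, not real data): if every team
truly spent 40% of its time on maintenance but each one mis-reported
that share by up to 15 points either way, the average over 30 teams
would still land very close to 40%. In Python:

import random

# Toy illustration with made-up numbers: 30 teams all truly spend 40% of
# their time on maintenance, but each mis-reports by up to +/- 15 points.
random.seed(0)
true_share = 40.0
reports = [true_share + random.uniform(-15, 15) for _ in range(30)]
print(min(reports), max(reports))   # individual reports are all over the place
print(sum(reports) / len(reports))  # the org-wide average stays close to 40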


At a previous workplace, we entered a rough estimate of the time spent
whenever we commented on a task. In the end we could roughly estimate
how much time was spent and, given the category/tags, aggregate it.
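
Aggregating those per-comment estimates is essentially a sum grouped by
tag; a rough sketch (Python, with invented entries, not the actual
tooling we had):

from collections import defaultdict

# Invented entries: (worktype tag, rough hours) logged with each task comment.
entries = [
    ("worktype-maintenance", 1.5),
    ("worktype-new-functionality", 3.0),
    ("worktype-maintenance", 0.5),
]

hours_by_tag = defaultdict(float)
for tag, hours in entries:
    hours_by_tag[tag] += hours

total = sum(hours_by_tag.values())
for tag, hours in sorted(hours_by_tag.items()):
    print("%s: %.1fh (%.0f%% of tracked time)" % (tag, hours, 100 * hours / total))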


-- 
Antoine "hashar" Musso



