Following up on the earlier thread by Rob [1], Rob and I kicked around
the question of which metrics/targets for code review we want to
surface on an ongoing basis. We're not going to invest in a huge
dashboard project right now, but we'll try to get at least some of the
key metrics generated and visualized automatically. Help is
appreciated, starting with identifying the metrics we should look at.
Here's what we came up with, by priority:
1) Most important: Time series graph of # of open changesets
Target: Number of open changesets should not exceed 200.
Optional breakdown:
- mediawiki/core
- mediawiki/extensions
- WMF-deployed extensions
- specific repos
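For concreteness, something like the following Python sketch could
produce the raw counts (with the per-project breakdown) from the output
of `gerrit query --format=JSON status:open` over SSH. Caveat: the field
names (`project`, the trailing stats record carrying a `type` field) are
assumptions based on the Gerrit ssh query JSON schema, so treat this as
a sketch rather than working tooling:

```python
import json
from collections import Counter

def count_open_changes(json_lines):
    """Count open changesets per project from `gerrit query
    --format=JSON` output: one JSON object per line, plus a trailing
    stats record (identified here by its "type" field) that we skip."""
    totals = Counter()
    for line in json_lines:
        change = json.loads(line)
        if "type" in change:  # summary/stats record, not a change
            continue
        totals[change.get("project", "unknown")] += 1
    return totals
```

Summing the counter gives the overall number to plot against the
200-changeset target; the per-project keys give the optional breakdown.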
2) Important: Aging trends.
- Time series graph of # open changesets older than a, b, c days
(to indicate troubling aging trends, e.g. a=3, b=5, c=7)
- Target: There should be 0 changes that haven't been looked at
at all for more than 7 days.
- Including only: Changes which have not received a -1 review, -1
verification, or -2
- Optional breakdown as above
- Rationale: We're looking for tendencies of complete neglect of
submissions here, which is why we exclude changes that already
received a -1 or -2.
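The aging buckets could be computed along these lines, given
`gerrit query --format=JSON --all-approvals` output. Again the field
names (`lastUpdated` in epoch seconds, `patchSets` each with an
`approvals` list of {"type", "value", ...} records) are assumptions
from the ssh query schema, not verified against our Gerrit version:

```python
import json

DAY = 86400  # seconds

def neglected_changes(json_lines, now, thresholds=(3, 5, 7)):
    """Bucket open changes by age (days since last update), excluding
    any change that already received a negative vote (-1 review,
    -1 verification, or -2): those have been looked at, so they don't
    count as neglected."""
    buckets = {t: 0 for t in thresholds}
    for line in json_lines:
        change = json.loads(line)
        if "type" in change:  # trailing stats record
            continue
        approvals = [a for ps in change.get("patchSets", [])
                       for a in ps.get("approvals", [])]
        if any(int(a["value"]) < 0 for a in approvals):
            continue  # got (negative) attention; not "neglected"
        age_days = (now - change["lastUpdated"]) / DAY
        for t in thresholds:
            if age_days > t:
                buckets[t] += 1
    return buckets
```

The a=3/b=5/c=7 thresholds from above are the defaults; the 7-day
bucket is the one that should stay at 0 per the target.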
3) Possibly useful:
- Per-reviewer or reviewee(?) statistics regarding merge activity,
number of -1s, neglected code, etc.
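If we do go there, a per-reviewer tally could fall out of the same
`--all-approvals` data; this sketch (same assumed schema as above, plus
an assumed `by.username` field on each approval) just counts votes per
(reviewer, value) pair:

```python
import json
from collections import Counter

def reviewer_activity(json_lines):
    """Tally review votes per reviewer, e.g. how many -1s or +2s each
    person hands out, from `gerrit query --format=JSON --all-approvals`
    output. Keys are (username, vote value) pairs."""
    votes = Counter()
    for line in json_lines:
        change = json.loads(line)
        if "type" in change:  # trailing stats record
            continue
        for ps in change.get("patchSets", []):
            for a in ps.get("approvals", []):
                who = a.get("by", {}).get("username", "unknown")
                votes[(who, int(a["value"]))] += 1
    return votes
```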
Any obvious thinking errors in the above / do the targets make sense /
should we look at other metrics or approaches?
Erik
[1]
http://lists.wikimedia.org/pipermail/wikitech-l/2012-April/059940.html
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge:
https://wikimediafoundation.org/wiki/Donate