While perhaps not relevant to the original enquiry ...
For Wikipedia, do we need to know (or care) about the "death" of:
* a WikiProject
* a Portal
* a category
* an article
While there are articles on historical topics which, once written, arguably need no further updating, there are many articles, categories, portals and projects which do, because they cover topics that are current in the real world. As a simple example, for a town we report population and temperatures; for an electorate, there are elections and newly elected members; sports results, etc. If readers visit articles with out-of-date information, they may see less value in the article, and therefore in Wikipedia as a whole.

Have we ever thought of having some system of tagging articles of this nature (as a whole, by section, or by infobox field) that indicates when the information should be considered out of date? E.g. Australia has a census every 5 years (the last being 2016), and it takes about a year for the data to be released to the public (so in 2017 we had first access to the 2016 data). We might therefore think it desirable that all Australian places with census data be updated by 2018, and we certainly would not think it acceptable for any to still show 2006 (or earlier) census data, except as historical information. Yet of course there are many such out-of-date Australian articles, and we probably have no easy way to know which ones. (Before anyone rushes to tell me about Wikidata solutions, I would point out that the average Australian Wikipedia editor neither knows nor cares about Wikidata, and our attempt to add 2016 census data from Wikidata more or less collapsed from lack of community support.) I'm thinking here about solutions that Wikipedians might understand, such as templates with a tracking category that is activated when the article misses an update deadline based on some template parameter in the article.
Of course, on Wikipedia, many articles have the illusion of being actively maintained, in the sense that edits occur, thanks to vandalism and its reverts, endless re-categorisation, automated changes of a trivial nature (e.g. dash length), the Internet Archive Bot and other bots, copyedits, etc. As someone who works through her watchlist diligently, I am seeing increasing activity on articles (my daily watchlist seems to be growing faster than the number of entries on it), which suggests we are more active; but when I look at the edits, relatively few are updates to the information content. So activity should not be equated with information currency.

Note that, as anyone who deals with visible metrics soon learns, people game them, and our edit counts are a classic example. I sometimes wonder what would happen if we suppressed that information, or, better still, counted something we value more than "number of times clicked Save". What if we counted only the number of citations added (or counted it in addition to the plain old edit count)? Would that drive behavioural change away from less information-productive activities and towards more information-productive ones?
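As a rough illustration of the "count citations, not saves" idea, here is a sketch (my own crude proxy, not an existing tool) that credits a revision only if it added a `<ref>` tag:

```python
import re

# Hypothetical sketch: score a revision as "information-productive" only if
# it added citations, rather than counting every click of Save equally.

REF_TAG = re.compile(r"<ref[ >]")  # matches <ref> and <ref name=...> openings

def citations_added(old_wikitext: str, new_wikitext: str) -> int:
    """Net number of <ref> tags added by a revision (a crude proxy)."""
    return len(REF_TAG.findall(new_wikitext)) - len(REF_TAG.findall(old_wikitext))

def score_revision(old_wikitext: str, new_wikitext: str) -> int:
    """Count the revision towards an editor's tally only if it cited something."""
    return 1 if citations_added(old_wikitext, new_wikitext) > 0 else 0
```

A dash-length bot run or a revert would score zero under such a metric, while adding a sourced census update would score.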
But if we can have some measure of information activity/inactivity for an article, then I presume we can aggregate it across any natural grouping of articles (e.g. category trees, Portals, WikiProjects) to discover where we are stagnating. Humans can then decide whether that topic space is one that can stagnate (because it is historic) or one that must be updated periodically to remain useful, and whether the right frequency of updates seems to be occurring, either macroscopically or (ideally) microscopically around particular time-sensitive factoids.
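The aggregation step is straightforward once a per-article measure exists; a sketch (the grouping and threshold are placeholders for whatever humans decide):

```python
from collections import defaultdict

# Hypothetical sketch: roll per-article "information edit" counts up into
# groupings (categories, Portals, WikiProjects) to surface stagnating areas.

def stagnating_groups(article_groups, info_edits, threshold=1):
    """Return groups whose total information-edits fall below a threshold.

    article_groups: {article: [group, ...]} membership mapping
    info_edits:     {article: count of information-content edits this period}
    """
    totals = defaultdict(int)
    for article, groups in article_groups.items():
        for group in groups:
            totals[group] += info_edits.get(article, 0)
    return {group for group, total in totals.items() if total < threshold}
```

Whether a flagged group is genuinely neglected or simply historic would remain a human judgement, as above.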
Can we measure "information growth"?
Kerry