On Tue, 28 Aug 2012 12:56:50 -0700, Krinkle <krinklemail(a)gmail.com> wrote:
Although it's for a different use case, I find myself facing a related
problem with my bot on Commons:
https://commons.wikimedia.org/wiki/Commons:Auto-protected_files/wikipedia/zh
Those pages get updated periodically whenever a Commons image starts or
stops being used on a Main Page.
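For illustration, that kind of usage can be detected through the
GlobalUsage extension's API. A minimal, untested sketch in Python; the
file name and the main-page titles are made-up examples, not what my bot
actually uses:

import requests

API = "https://commons.wikimedia.org/w/api.php"

def main_page_usages(filename):
    # Ask GlobalUsage for every page (on any wiki) embedding this file.
    data = requests.get(API, params={
        "action": "query", "format": "json", "formatversion": "2",
        "prop": "globalusage", "titles": filename,
        "guprop": "url", "gulimit": "max",
    }).json()
    page = data["query"]["pages"][0]
    # Illustrative heuristic only: main-page titles differ per wiki
    # (e.g. "Wikipedia:Hauptseite" on dewiki), so a real bot keeps a map.
    main_pages = {"Main Page", "Wikipedia:Hauptseite", "Wikipedia:首页"}
    return [use for use in page.get("globalusage", [])
            if use["title"] in main_pages]

print(main_page_usages("File:Example.jpg"))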
Aside from the fact that this in particular could be a native
feature[1], this page will get a lot of revisions.
Also, as an aside: these old revisions are quite useless. Even if
someone made an edit in between, the bot would overwrite it. But I don't
care so much about the wasted space, since it is a relatively small
waste.
The problem comes in when these pages need maintenance. I find myself
periodically checking up on my bot-generated pages to make sure I move
them to /Archive_#, delete those, and start clean. Once the revision
count reaches 2,500, a page can no longer be deleted because of the
limit we implemented.
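That periodic check is easy to script against the action API. A rough,
untested sketch: the 2,500 threshold comes from this thread, while the
safety margin is an arbitrary assumption:

import requests

API = "https://commons.wikimedia.org/w/api.php"
DELETE_LIMIT = 2500   # the revision-count limit mentioned above
MARGIN = 100          # arbitrary safety margin before archiving

def count_revisions(title, session):
    # Page through prop=revisions and count the entries.
    count = 0
    params = {
        "action": "query", "format": "json", "formatversion": "2",
        "prop": "revisions", "titles": title,
        "rvprop": "ids", "rvlimit": "max",
    }
    while True:
        data = session.get(API, params=params).json()
        count += len(data["query"]["pages"][0].get("revisions", []))
        if "continue" not in data:
            return count
        params.update(data["continue"])

session = requests.Session()
title = "Commons:Auto-protected_files/wikipedia/zh"
if count_revisions(title, session) > DELETE_LIMIT - MARGIN:
    print(title, "is close to the delete limit; time to archive")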
We still need to actually fix that issue.
We've got an RFC on the topic:
https://www.mediawiki.org/wiki/Requests_for_comment/Page_deletion
The idea of grouping deleted revisions together also sounds really
nice: when you delete, recreate, and then delete a page, the revisions
from the separate deletions would be kept in separate groups rather
than lumped into one big pile.
So to keep these pages movable and usable, I always work around the
limit by moving the page to a subpage (without leaving a redirect)
before it reaches the limit, deleting it there, and starting a new page
at the old name.
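In API terms that workaround looks roughly like this. A sketch, not my
actual bot code: it assumes a session already logged in with the move,
suppressredirect, and delete rights, and the archive name is
illustrative:

import requests

API = "https://commons.wikimedia.org/w/api.php"
session = requests.Session()  # assumed logged in (action=login or OAuth)

def csrf_token(session):
    data = session.get(API, params={
        "action": "query", "meta": "tokens", "type": "csrf",
        "format": "json", "formatversion": "2",
    }).json()
    return data["query"]["tokens"]["csrftoken"]

title = "Commons:Auto-protected_files/wikipedia/zh"
archive = title + "/Archive_1"  # illustrative archive name
token = csrf_token(session)

# Move to the archive subpage without leaving a redirect behind.
session.post(API, data={
    "action": "move", "from": title, "to": archive,
    "noredirect": "1", "reason": "Archiving bot page before revision limit",
    "token": token, "format": "json",
})

# Delete the archived copy; the bot recreates `title` on its next run.
session.post(API, data={
    "action": "delete", "title": archive,
    "reason": "Routine cleanup of a bot-generated page",
    "token": token, "format": "json",
})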
It would be nice if these kinds of bot-generated pages could avoid
that. And if in doing so we save revision space, that's nice too.
Other examples:
* https://commons.wikimedia.org/wiki/Commons:Database_reports (and subpages)
* https://commons.wikimedia.org/wiki/Commons:Auto-protected_files (and subpages)
-- Krinkle
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]