Would it make sense to identify pages with "too many" revisions and have them archived and protected by the individual wikis? I just wonder what happens if, for example, private data that should be deleted is published on such a page. Nobody would be able to delete the revision without slowing down the whole server. How many pages are there that are that large?
On 17/01/2008, Brion Vibber brion@wikimedia.org wrote:
A couple times a year (such as about an hour ago) somebody does something like trying to delete the Wikipedia:Sandbox on en.wikipedia.org, which reaaalllly bogs down the server due to the large number of revisions.
While there are warnings about this, I'm hacking in some limits which will restrict such deletions to keep the system from falling over accidentally.
At the moment I've set the limit at 5000 revisions (as $wgDeleteRevisionsLimit). The error message right now is generic and there's no override group with 'bigdelete' privilege live, but it should be prettified soon.
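For anyone curious what this looks like on their own install, the LocalSettings.php lines would presumably be something like the sketch below; the 5000 figure and the 'bigdelete' right are from the message above, while the 'steward' group name is only a placeholder:

    // Refuse normal deletion of pages with more than this many revisions.
    $wgDeleteRevisionsLimit = 5000;
    // Hypothetical override once a group is granted the 'bigdelete' right;
    // 'steward' is just an example group name, nothing is live yet.
    $wgGroupPermissions['steward']['bigdelete'] = true;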
(Note -- the revision count is done with an index estimate currently, so could overestimate on some pages possibly.)
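(My guess at roughly what that check amounts to; estimateRowCount() and the exact query are assumptions about the implementation, not the actual code:)

    // Sketch only: estimate the revision count from the index rather than
    // doing an exact COUNT(*), which is why it can overestimate.
    $dbr = wfGetDB( DB_SLAVE );
    $revCount = $dbr->estimateRowCount( 'revision', '*',
        array( 'rev_page' => $title->getArticleID() ), __METHOD__ );
    if ( $revCount > $wgDeleteRevisionsLimit && !$wgUser->isAllowed( 'bigdelete' ) ) {
        // refuse the deletion with the (currently generic) error message
    }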
-- brion