On 23/8/22 21:29, Martin Domdey wrote:
> Hi,
> please tell me: what is the reasoning behind the restriction that
> prevents a normal admin from deleting pages with more than 5000
> revisions?
The limit was announced in 2008 at WP:VPT archive 16: https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)/Archive_16#Deletion_restrictions_for_pages_with_long_histories. IIRC the main problem was replication lag: deleting a page moved all of its revisions in one huge write, and the replicas could fall far behind while applying it.
Later, the queries were broken up into batches with a wait for replication between them. This meant that deleting a large article was merely slow (tens of seconds) and prone to failure; it no longer immediately broke the whole site. The bigdelete right was created and granted to some groups, but deleting articles with many revisions often still required a server-side maintenance script, since a normal web request would time out and the database writes would roll back.
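To make that pattern concrete, here is a minimal sketch of the batch-and-wait approach, assuming a MySQL-style primary connection and some callable that polls replica lag. It is illustrative only: the real code archives revisions rather than deleting them outright, and uses MediaWiki's own database layer rather than PDO.

    <?php
    // Illustrative sketch of the batch-and-wait pattern, not
    // MediaWiki's actual implementation.
    function batchDeleteRevisions(
        PDO $primary, callable $waitForReplicas, int $pageId, int $batchSize = 100
    ): void {
        do {
            // Delete at most $batchSize revisions per statement, so no
            // single write is large enough to stall the replicas.
            $deleted = $primary->exec( sprintf(
                'DELETE FROM revision WHERE rev_page = %d LIMIT %d',
                $pageId, $batchSize
            ) );

            // Block until the replicas have caught up before continuing.
            $waitForReplicas();
        } while ( $deleted === $batchSize );
    }

Each batch is its own transaction, which is exactly why a mid-run failure (e.g. a web timeout) could leave the deletion half done.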
In 2018, deleting pages with many revisions became asynchronous, deferred via the job queue (T198176: https://phabricator.wikimedia.org/T198176), which made it feasible to delete these pages via the web.
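Schematically, the deferral means the web request only enqueues a job and returns at once, and a job runner later performs the slow, batched deletion with no request timeout hanging over it. The class below is my own illustration (IIRC the job that actually landed is DeletePageJob), and the constructor and push() details are from memory, so treat them as approximate:

    <?php
    use MediaWiki\MediaWikiServices;

    // Illustrative job that performs the batched deletion outside of
    // any web request; not the real DeletePageJob.
    class DeleteBigPageJob extends Job {
        public function __construct( Title $title, array $params ) {
            parent::__construct( 'deleteBigPage', $title, $params );
        }

        public function run() {
            // Run the batched deletion here, waiting for replication
            // between batches, free of the web request's time limit.
            return true;
        }
    }

    // In the web request: enqueue the job for the page being deleted
    // ($title, $reason as in the surrounding request) and respond
    // immediately.
    MediaWikiServices::getInstance()->getJobQueueGroup()->push(
        new DeleteBigPageJob( $title, [ 'reason' => $reason ] )
    );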
I don't think there has been any discussion since then of the value of $wgDeleteRevisionsLimit or of which groups are granted the bigdelete right.
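For reference, both knobs are ordinary site configuration, so a third-party wiki can tune them in LocalSettings.php. The values below just mirror the Wikimedia setup as I understand it; MediaWiki's out-of-the-box default leaves the limit disabled:

    // Pages with more revisions than this need the 'bigdelete' right
    // to delete (0 disables the check; Wikimedia wikis use 5000).
    $wgDeleteRevisionsLimit = 5000;

    // Grant 'bigdelete' to an extra group, e.g. stewards.
    $wgGroupPermissions['steward']['bigdelete'] = true;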
-- Tim Starling