Sebastian Moleski wrote:
On Wed, Jul 23, 2008 at 5:54 PM, Tim Starling <tstarling@wikimedia.org> wrote:
This feature was inspired by a meeting with White Cat at Wikimania, seeing the terrible conditions his bots are forced to work under, shoulder to shoulder in 15 tiled command prompt windows, fixing double redirects all day and all night, working their poor little fingers to the bone. It was very sad.
Comments would be appreciated.
In general, I think this is a very handy feature which relieves a lot of work that is currently done either manually or by bot. There are two minor issues at the moment though:
- When a user moves a page to the wrong target and moves it again to
fix that, the redirect fixer won't notice and will still try to fix double redirects according to the first move. This actually happens quite often, just from a typo or the wrong letter case in part of the target.
Are you sure? It doesn't even record the destination of the move in the job queue; it just follows the redirect chain (using the master) before each edit, to make sure the job still needs to be done.
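
Roughly, the job logic is something like this (a simplified sketch in Python rather than the actual MediaWiki PHP; the helper names are made up for illustration):

# Simplified sketch, not the real code: the queued job records only which
# redirect to re-check, and the destination is re-resolved from the master
# each time the job runs.

MAX_CHAIN = 10  # hypothetical limit to guard against redirect loops

def resolve_final_target(title, get_redirect_target):
    """Follow the redirect chain from `title`; return the final non-redirect
    target, or None if the chain loops or gets too long."""
    seen = {title}
    current = title
    for _ in range(MAX_CHAIN):
        nxt = get_redirect_target(current)  # read from the master DB
        if nxt is None:
            return current                  # reached a non-redirect page
        if nxt in seen:
            return None                     # loop detected, give up
        seen.add(nxt)
        current = nxt
    return None

def run_fix_job(redirect_page, get_redirect_target, save_redirect):
    """Executed from the job queue: re-check before editing."""
    current_target = get_redirect_target(redirect_page)
    if current_target is None:
        return                              # no longer a redirect, nothing to do
    final = resolve_final_target(current_target, get_redirect_target)
    if final is not None and final != current_target:
        save_redirect(redirect_page, final) # retarget to the end of the chain

So a job created by an earlier, incorrect move doesn't carry that move's destination with it; whatever the chain resolves to at edit time is what gets written.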
- Once a page is moved that shouldn't have been, there's no way to stop the
system from updating all the double redirects. This opens up a new form of vandalism that we have no way of stopping while it's occurring, because the job queue is not editable (or viewable, in fact). The pseudo-user "redirect fixer" does not pay attention to blocks, so blocking it has no effect (tested yesterday on dewiki). It would also be good if local admins could kill individual jobs in the job queue, especially if future features make use of the job queue for automatic edits.
If you just move the page back, the creation of incorrect redirects will instantly stop, and new jobs will be queued up to fix the incorrect redirects. That's how it's meant to work anyway.
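
In sketch form (same made-up helper names as above), the move handler just queues one re-check job per redirect pointing at the old title, and the run-time re-resolution is what makes stale jobs harmless:

# Simplified sketch, not the real code: on a page move, queue a fix-up job
# for every redirect that pointed at the old title. Because each job
# re-resolves the chain when it runs (see run_fix_job above), moving the
# page back makes any already-queued jobs come out as no-ops or as fixes
# toward the restored title.
def on_page_move(old_title, get_redirects_pointing_at, queue_job):
    for redirect_page in get_redirects_pointing_at(old_title):
        queue_job({"type": "fixDoubleRedirects", "page": redirect_page})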
-- Tim Starling