Another dev commented that giving sysops access to the job queue reduces
the simplicity it was meant to have.
Perhaps the double redirect fixer should verify that everything is still
in order before it applies a fix. At first thought, something as simple
as making sure that the page it is told to redirect to is not itself a
redirect should suffice.
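In Python-flavoured pseudocode, such a check might look like this (a
minimal sketch only; MediaWiki itself is PHP, and the toy 'redirects'
mapping stands in for the real page tables):

    # Toy model: 'redirects' maps a redirect page's title to its target.
    def fix_double_redirect(redirects, page, new_target):
        if new_target in redirects:
            # The proposed target is itself a redirect; retargeting now
            # would just create another double redirect (or a loop).
            return False
        redirects[page] = new_target  # safe to apply the fix
        return True

With redirects = {"A": "B", "B": "C"}, retargeting "A" to "C" would go
through, while retargeting it to "B" would be refused because "B" still
redirects onward.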
~Daniel Friesen(Dantman, Nadir-Seen-Fire) of:
-The Nadir-Point Group
--Games-G.P.S.
On Wed, Jul 23, 2008 at 5:54 PM, Tim Starling
<tstarling(a)wikimedia.org> wrote:
This feature was inspired by a meeting with White
Cat at Wikimania, seeing
the terrible conditions his bots are forced to work under, shoulder to
shoulder in 15 tiled command prompt windows, fixing double redirects all
day and all night, working their poor little fingers to the bone. It was
very sad.
Comments would be appreciated.
In general, I think this is a very handy feature that relieves a lot of
work currently done either manually or by bot. There are two minor
issues at the moment, though:
* When a user moves a page to the wrong target and then moves it again
to fix that, the redirect fixer won't notice and will still fix double
redirects according to the first move. This actually happens quite
often, simply through a typo or the wrong lettercase in part of the
target (see the sketch after this list).
* Once a page is moved that shouldn't have been, there's no way to stop
the system from updating all the double redirects. This opens up a new
form of vandalism that we have no way of stopping while it is
occurring, because the job queue is neither editable nor even viewable.
The redirect fixer pseudo-user pays no attention to blocks, so blocking
it has no effect (tested yesterday on dewiki). It would also be good if
local admins could kill individual jobs in the job queue, especially if
future features make use of the job queue for automatic edits.
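Both points suggest re-checking wiki state at execution time instead of
trusting what the job recorded, and treating a block of the pseudo-user
as an emergency stop. A hedged sketch, reusing the toy model from above
(fixer_blocked and the job fields are illustrative names, not actual
MediaWiki code):

    # A job records the move (old -> new) that queued it; 'fixer_blocked'
    # says whether a local admin has blocked the pseudo-user account.
    def run_fix_job(redirects, job, fixer_blocked):
        old, new = job["old"], job["new"]
        if fixer_blocked:
            # Honour the block as an emergency stop: drop the job so
            # admins can halt a bad run even while the queue is hidden.
            return "skipped: pseudo-user blocked"
        if redirects.get(old) != new or new in redirects:
            # The move was reverted or redone since the job was queued,
            # so applying it would restore the wrong target: drop it.
            return "skipped: stale job"
        for page, target in list(redirects.items()):
            if target == old:
                redirects[page] = new  # retarget each double redirect
        return "done"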
Sebastian