Hi,

Would it be possible to improve the post-move page text to actually show the double redirects, rather than just asking the mover to check for them? In particular, could it show *just* the double redirects, and not all the other links? It's not very obvious, particularly to the novice, which of the following links are "bad":
Rank of hands (poker) (redirect page)
* Draw poker
* Poker
* Poker/Hands (redirect page)
  o Talk:Poker
* Bug (poker)
* Ace-to-six low (redirect page)
  o Draw poker
  o Stud poker
  o Five-card stud
* Deuce-to-seven low (redirect page)
  o Draw poker
  o Kansas City
  o List of poker terms
  o Deuce
  o Jennifer Harman
  o Dewey Tomko
  o Billy Baxter (poker player)
A trimmed-down display like the following would be much more helpful:
* Poker/Hands (linked 1 time)
* Ace-to-six low (linked 3 times)
* Deuce-to-seven low (linked 7 times)
Or if we really want to go the whole hog, could the move page just fix the double redirects automatically? Or maybe at least add them to some list that a bot is constantly monitoring?
Steve
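For illustration, here is a rough sketch of how such a list could be generated from outside MediaWiki, using the standard query API (action=query with list=backlinks and blfilterredir=redirects). The wiki URL and the old title below are placeholders, and this is only a sketch of the query involved, not what the move page itself runs:

import requests

API = "https://en.wikipedia.org/w/api.php"

def redirects_to(title):
    """Return the titles of all redirect pages pointing at `title`."""
    params = {
        "action": "query",
        "format": "json",
        "list": "backlinks",
        "bltitle": title,
        "blfilterredir": "redirects",  # only list backlinks that are redirects
        "bllimit": "max",
    }
    data = requests.get(API, params=params).json()
    return [bl["title"] for bl in data["query"]["backlinks"]]

# After a page is moved, every redirect still aimed at the old title has
# just become a double redirect, so this is exactly the list to show:
for title in redirects_to("Old title (placeholder)"):
    print(title)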
On 12/28/06, Steve Bennett stevagewp@gmail.com wrote:
Or if we really want to go the whole hog, could the move page just fix the double redirects automatically?
Is there any reason we don't do this?
Or maybe at least add them to some list that a bot is constantly monitoring?
Like Special:DoubleRedirects? :) (Granted, that's cached, at least right now.)
Simetrical wrote:
On 12/28/06, Steve Bennett stevagewp@gmail.com wrote:
Or if we really want to go the whole hog, could the move page just fix the double redirects automatically?
Is there any reason we don't do this?
There are cases where you don't really want this, perhaps because you intend to start a new article at the original title and want (most of) the existing redirects to point there.
Still, I'd support this as an optional feature, for example as a checkbox on the page move form (maybe even checked by default).
On 29/12/06, Ilmari Karonen nospam@vyznev.net wrote:
There are cases where you don't really want this, perhaps because you intend to start a new article at the original title and want (most of) the existing redirects to point there.
Still, I'd support this as an optional feature, for example as a checkbox on the page move form (maybe even checked by default).
Could use the job queue for it, although I'm pretty sure there was some previous discussion either on this list or the bug tracker regarding the issue, where the consensus was to let the user do the cleanup.
Rob Church
On 12/29/06, Rob Church robchur@gmail.com wrote:
Could use the job queue for it, although I'm pretty sure there was some previous discussion either on this list or the bug tracker regarding the issue, where the consensus was to let the user do the cleanup.
Why the job queue? Even if you have several dozen redirects, it should surely still be fast enough to do immediately.
Simetrical wrote:
On 12/29/06, Rob Church robchur@gmail.com wrote:
Could use the job queue for it, although I'm pretty sure there was some previous discussion either on this list or the bug tracker regarding the issue, where the consensus was to let the user do the cleanup.
Why the job queue? Even if you have several dozen redirects, it should surely still be fast enough to do immediately.
I'm sure we have a page with hundreds of redirects pointing to it somewhere on enwiki. But I suppose the reasonable solution is to do the fixing immediately if the number of redirects is reasonable (say, less than 200) and otherwise either dump it into the job queue or just flat out refuse to do it.
On 29/12/06, Ilmari Karonen nospam@vyznev.net wrote:
Simetrical wrote:
On 12/29/06, Rob Church robchur@gmail.com wrote:
Could use the job queue for it, although I'm pretty sure there was some previous discussion either on this list or the bug tracker regarding the issue, where the consensus was to let the user do the cleanup.
Why the job queue? Even if you have several dozen redirects, it should surely still be fast enough to do immediately.
I'm sure we have a page with hundreds of redirects pointing to it somewhere on enwiki.
Many. I've fixed them after page moves. Gah...
But I suppose the reasonable solution is to do the fixing immediately if the number of redirects is reasonable (say, less than 200) and otherwise either dump it into the job queue or just flat out refuse to do it.
hah!
Having done hundreds of redirects, I would strongly urge *not* to do it automatically. Semi-auto with a bot, maybe, but in the "hundreds" cases every one needs a human review IME.
- d.
On 12/29/06, Rob Church robchur@gmail.com wrote:
Could use the job queue for it, although I'm pretty sure there was some previous discussion either on this list or the bug tracker regarding the issue, where the consensus was to let the user do the cleanup.
This user refuses to do the cleanup. Hope this helps.
Steve
On 30/12/06, Steve Bennett stevagewp@gmail.com wrote:
This user refuses to do the cleanup. Hope this helps.
Superb attitude.
The problem we'd face if, for instance, 200 or even 2000 redirects needed to be corrected in an autonomous, atomic fashion, is that the software has to do the same as the user - open the page for editing, make the changes without disrupting other content on the page (e.g. categories, or cute little templates), save the page, handle edit conflicts, rinse and repeat.
It might be much faster on the server side, but it's still a time-consuming operation, which could well time out for large sets - leaving us with a worse mess to clean up. Then there's the problem that it might be desirable to have some of the redirects left intact, etc.
I'm not against some kind of automated assistance for such operations, but it will have to work in a robust fashion, taking into consideration that we don't want to make the situation worse.
Rob Church
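As a rough illustration of the per-page work described above, here is a minimal bot-style sketch against the modern write API (action=edit plus a CSRF token from meta=tokens). It assumes a requests session that is already logged in, rewrites only the #REDIRECT line so that categories and templates elsewhere on the page are untouched, and leaves out edit-conflict handling (basetimestamp) for brevity; the wiki URL is a placeholder.

import re
import requests

API = "https://example.org/w/api.php"  # placeholder wiki
session = requests.Session()           # assumed to be logged in already

def retarget_redirect(title, new_target, summary="Fixing double redirect"):
    # 1. Fetch the current wikitext of the redirect page.
    page = session.get(API, params={
        "action": "query", "format": "json", "prop": "revisions",
        "rvprop": "content", "rvslots": "main", "titles": title,
    }).json()
    rev = next(iter(page["query"]["pages"].values()))["revisions"][0]
    text = rev["slots"]["main"]["*"]

    # 2. Rewrite just the redirect target, preserving everything else
    #    on the page (categories, templates, and so on).
    new_text = re.sub(r"#REDIRECT\s*\[\[[^\]]+\]\]",
                      f"#REDIRECT [[{new_target}]]", text,
                      count=1, flags=re.IGNORECASE)

    # 3. Save, using a CSRF token from meta=tokens.
    token = session.get(API, params={
        "action": "query", "format": "json", "meta": "tokens",
    }).json()["query"]["tokens"]["csrftoken"]
    session.post(API, data={
        "action": "edit", "format": "json", "title": title,
        "text": new_text, "summary": summary, "token": token,
    })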
On 12/31/06, Rob Church robchur@gmail.com wrote:
On 30/12/06, Steve Bennett stevagewp@gmail.com wrote:
This user refuses to do the cleanup. Hope this helps.
Superb attitude.
With Wikipedia, and wikis in general, one of the basic principles is that if some improvement can be broken down into several independent units of work, then it's fine to carry out one of those units and not the others. There's no shame in adding {{stub}} to an article and leaving the job of recategorising it as {{brazil-music-stub}} to someone else. I improved a page by renaming it. Someone else - preferably a bot - can do the mind-numbingly tedious job of updating double redirects.
The problem we'd face if, for instance, 200 or even 2000 redirects needed to be corrected in an autonomous, atomic fashion, is that the software has to do the same as the user - open the page for editing, make the changes without disrupting other content on the page (e.g. categories, or cute little templates), save the page, handle edit conflicts, rinse and repeat.
However long it takes the server to do it, it takes a lot longer for the user, and pisses them off a hell of a lot more. That's why we even have computers, remember. Your paragraph might serve as an argument against renaming heavily redirected-to pages. It might serve as an argument for some improvement to MediaWiki. But how does it argue against getting a bot to update double redirects instead of a human?
It might be much faster on the server side, but it's still a time-consuming operation, which could well time out for large sets - leaving us with a worse mess to clean up. Then there's the problem that it might be desirable to have some of the redirects left intact, etc.
There are times when we want to leave double redirects intact? So that we're on the same page, we have redirects:
A->B
Page B gets renamed to C, creating a redirect at B:
A->B->C
Some page x that used to link to A can still link to A, even after the double-redirect is fixed:
x->A->C
Why would you want "some of the redirects left intact", i.e., A still pointing to B?
I'm not against some kind of automated assistance for such operations, but it will have to work in a robust fashion, taking into consideration that we don't want to make the situation worse.
Worse than what? A bunch of double redirects?
Steve
Steve Bennett wrote:
On 12/31/06, Rob Church robchur@gmail.com wrote:
On 30/12/06, Steve Bennett stevagewp@gmail.com wrote:
This user refuses to do the cleanup. Hope this helps.
Superb attitude.
With Wikipedia, and wikis in general, one of the basic principles is that if some improvement can be broken down into several independent units of work, then it's fine to carry out one of those units and not the others. There's no shame in adding {{stub}} to an article and leaving the job of recategorising it as {{brazil-music-stub}} to someone else. I improved a page by renaming it. Someone else - preferably a bot - can do the mind-numbingly tedious job of updating double redirects.
True enough. The part one might disagree with is whether moving a page without fixing the redirects is in fact an improvement; undoubtedly it often is, but in many cases the improvement derived from, say, moving [[Polar bear]] to [[Polar Bear]] or [[Color]] to [[Colour]], or vice versa, is marginal at best, and, even if actually positive, probably still not enough to offset the cost of having broken redirects.
The rest of your post I do generally agree with. If something can be done well by bots, there's no need to have it done by humans. If something can be done well by MediaWiki itself, there's no need to have it done by bots. We're only discussing how best to implement it in MediaWiki -- and waiting for someone ("who, me?") to get off their ass and code it.
There are times when we want to leave double redirects intact? So that we're on the same page, we have redirects:
A->B
Page B gets renamed to C, creating a redirect at B:
A->B->C
Why would you want "some of the redirects left intact", ie, A still pointing to B?
Perhaps because you want to start a new article at B, and most of the redirects pointing to it should in fact continue to do so. (In such cases, C is often in fact "B (whatever)" or "B in whatever", or perhaps "B/archive_NN".)
On 12/29/06, Steve Bennett stevagewp@gmail.com wrote:
- Poker/Hands (linked 1 time)
- Ace-to-six low (linked 3 times)
- Deuce-to-seven low (linked 7 times)
Or if we really want to go the whole hog, could the move page just fix the double redirects automatically? Or maybe at least add them to some list that a bot is constantly monitoring?
On 12/29/06, Ilmari Karonen nospam@vyznev.net wrote:
Still, I'd support this as an optional feature, for example as a checkbox on the page move form (maybe even checked by default).
To Steve's proposal I would add a "Correct all double redirects" option, to prevent unwanted pages from being re-redirected automatically.
Like this:
- Poker/Hands (linked 1 time - [[correct]])
- Ace-to-six low (linked 3 times - [[correct]])
- Deuce-to-seven low (linked 7 times - [[correct]])
[[Correct double redirects for these pages]]
Plyd
Plyd wrote:
To Steve's proposal I would add a "Correct all double redirects" option, to prevent unwanted pages from being re-redirected automatically.
Like this:
- Poker/Hands (linked 1 time - [[correct]])
- Ace-to-six low (linked 3 times - [[correct]])
- Deuce-to-seven low (linked 7 times - [[correct]])
[[Correct double redirects for these pages]]
That actually sounds like the best solution so far. Maybe with a checkbox for each redirect, like on [[Special:Undelete]] or [[Special:Watchlist/edit]]?
"Ilmari Karonen" nospam@vyznev.net wrote in message news:4595A54D.2070803@vyznev.net...
Plyd wrote:
To Steve's proposal I would add a "Correct all double redirects" option, to prevent unwanted pages from being re-redirected automatically.
Like this:
- Poker/Hands (linked 1 time - [[correct]])
- Ace-to-six low (linked 3 times - [[correct]])
- Deuce-to-seven low (linked 7 times - [[correct]])
[[Correct double redirects for these pages]]
That actually sounds like the best solution so far. Maybe with a checkbox for each redirect, like on [[Special:Undelete]] or [[Special:Watchlist/edit]]?
Another solution is for MW to be a bit cleverer, and to follow double (triple/quadruple/whatever) redirects automatically, which would make this a non-issue.
To fix the speed issues involved in following these redirects when they are encountered, the actual target of the redirect could be cached in the DB (and this might be useful to do for other reasons too, e.g. http://bugzilla.wikimedia.org/show_bug.cgi?id=6934).
If an infinite loop is encountered, the whole loop should be logged to the DB in a 'recursive redirects' table, and shown on an associated redirect page to be fixed. If a user visits one of the pages in the loop then no redirects are followed and the requested page is shown, with a suitable error message allowing them to fix it there and then.
As far as I can see, from a user-interface point of view this is the ideal solution to this and several other problems (I've always wondered why the software doesn't handle double redirects properly). I don't know whether the reason for only allowing single redirects is technical or just historical; either way, I'm sure that if there are technical reasons they can be overcome with a bit of creativity.
- Mark Clements (HappyDog)
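The chain-following idea is simple enough to sketch. The snippet below is plain Python, with an in-memory dict standing in for whatever DB table maps redirects to their targets; it resolves a title through any number of hops, caches the final target for each page on the chain, and reports a loop instead of following it forever:

def resolve(title, redirects, cache):
    """Follow `title` through `redirects`; return (final_target, loop_or_None)."""
    if title in cache:
        return cache[title], None
    seen = []
    current = title
    while current in redirects:
        if current in seen:
            # Infinite loop: report the cycle (e.g. log it to a
            # 'recursive redirects' table) rather than following it.
            return current, seen[seen.index(current):]
        seen.append(current)
        current = redirects[current]
    for hop in seen:
        cache[hop] = current   # cache the resolved target for every hop
    return current, None

redirects = {"A": "B", "B": "C", "X": "Y", "Y": "X"}
cache = {}
print(resolve("A", redirects, cache))   # ('C', None)  -- A -> B -> C
print(resolve("X", redirects, cache))   # ('X', ['X', 'Y'])  -- loop detected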
"Mark Clements" gmane@kennel17.co.uk wrote in message news:enhgtv$1ro$1@sea.gmane.org... [SNIP]
If an infinite loop is encountered, the whole loop should be logged to the DB in a 'recursive redirects' table, and shown on an associated redirect page to be fixed.
[SNIP]
Oops - I meant "associated special page", not "associated redirect page".
- Mark Clements (HappyDog)