Feature Requests item #3528379, was opened at 2012-05-20 04:31
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=3528379&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: ToAruShiroiNeko ()
Assigned to: xqt (xqt)
Summary: redirect.py logging of problems that cannot be fixed
Initial Comment:
redirect.py needs to log the issues it is unable to fix, and why, on each wiki. There are several flavors of problems that appear on Special:DoubleRedirects:
1. Self redirects (redirects that point to themselves)
2. Redirect loops (redirects that go in circles)
3. Double redirects formed due to page protection.
4. Inter-wiki redirects (redirects that point to redirects in other wikis)
It would be a lot easier if I had a log of these pages that a user could post on the village pump, or perhaps the bot could do this monthly for a select number of wikis. The code already prints a warning on the console, but when you are running it on 700 wikis, as I am, that becomes a serious chore to follow.
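A minimal sketch of the kind of per-wiki logging requested here, as a
hypothetical helper outside redirect.py (the file names, problem labels and
message format are made up for illustration):

import logging
import os

def get_problem_logger(site_code, log_dir="logs"):
    """Return a logger that appends unfixable-redirect reports for one wiki
    to <log_dir>/<site_code>-redirect-problems.log."""
    os.makedirs(log_dir, exist_ok=True)
    logger = logging.getLogger("redirect-problems.%s" % site_code)
    if not logger.handlers:
        handler = logging.FileHandler(
            os.path.join(log_dir, "%s-redirect-problems.log" % site_code),
            encoding="utf-8")
        handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger

# Example: record the four problem flavors listed above for one wiki.
log = get_problem_logger("en")
log.info("SELF_REDIRECT  [[Foo]] points to itself")
log.info("REDIRECT_LOOP  [[Foo]] -> [[Bar]] -> [[Foo]]")
log.info("PROTECTED      [[Baz]] is a double redirect, but the page is protected")
log.info("INTERWIKI      [[Qux]] redirects to a redirect on another wiki")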
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2012-07-06 06:16
Message:
MediaWiki is (!) already able to distinguish double redirect problems; this
depends only on a config variable. This is not true for redirect loops, where
a solution is coming that supports bot users in fixing them. Do you have any
sample or thread where people need additional information about broken or
double redirects to fix them?
----------------------------------------------------------------------
Comment By: ToAruShiroiNeko ()
Date: 2012-07-05 13:40
Message:
Sure. This would be ideal for me as well. However my proposal to prevent
the creation of self redirects didn't even fly:
https://bugzilla.wikimedia.org/show_bug.cgi?id=34932
redirect.py wouldn't be needed if MediaWiki were able to distinguish double
redirect problems. I have been told that this is too costly, which is why I do
not expect a resolution on the MediaWiki end.
Local communities are willing to resolve them if they know what the problem
is and that bots cannot fix it. I want to periodically report to them.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-07-04 23:21
Message:
If there are any remaining items on the special page, this means the bot
cannot fix them. You may create separate logfiles for each language and post
them to the village pump or elsewhere; you may use pagefromfile.py to do the
posting. Anyway, I do not see the sense in it. If people do not process the
special page, what do you hope to achieve with yet another message to the
community? And in general, the implementation of such a message must take into
account that some pages were already tagged by a previous run or a previous
bot. I guess it is easier to open a Bugzilla bug to enable more descriptive
entries for the items (locked pages, redirect loops). And at last: feel free
to submit a patch - we will see.
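For the pagefromfile.py route mentioned above, a rough sketch of turning one
wiki's problem list into a file that script can post, assuming its default
{{-start-}}/{{-stop-}} delimiters and '''title''' header (the report title,
file name and command-line options are only examples):

# Entries the bot could not fix on one wiki, collected elsewhere.
problems = [
    "* [[Foo]] - redirect loop, needs an admin",
    "* [[Bar]] - protected double redirect, the bot cannot edit it",
]

report_title = "Wikipedia:Village pump/Unfixable redirects"

with open("report-en.txt", "w", encoding="utf-8") as f:
    f.write("{{-start-}}\n")
    f.write("'''%s'''\n" % report_title)
    f.write("\n".join(problems) + "\n")
    f.write("{{-stop-}}\n")

# Then something like: python pagefromfile.py -file:report-en.txt
# (the exact options may differ).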
----------------------------------------------------------------------
Comment By: ToAruShiroiNeko ()
Date: 2012-07-04 11:17
Message:
The special page does not tell the user what type of problem exists, and it
does not indicate that bots cannot fix them. I just want to be able to log the
problems the bot already displays on the console so that I can review them and
point out issues on the village pump. It takes me weeks of discussion and
nagging sysops to get them to fix the issues even on wikis like en.wikipedia.
Why are you opposing this?
Local communities do not want automatic speedy deletion tags. I do not want
to manually check all 700 wikis which are almost entirely in languages I
cannot even read.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-07-03 22:40
Message:
If there is nobody who can deal with these special pages, there is no reason
to post it again on the village pump. I guess a better way is to solve the
remaining problems where possible. This means, for redirect loops and self
links:
- check whether there is any possibility to point the redirect at a new target
page
- otherwise tag it for speedy deletion.
I have worked on that, the code is ready, and I did some test edits in the
past. I guess I'll commit it to the rewrite branch in autumn.
Example output from the current working copy:
>>> USB-on-the-go <<<
Links to: [[USB-on-the-go]].
Warning: Redirect target [[USB-on-the-go]] forms a redirect loop.
NOTE: Searching for USB-on-the-go
1 - ratio 0.692308 1 USB On-The-Go
20 - ratio 0.222222 90 Mobile operating system
18 - ratio 0.181818 99 Universal Serial Bus
10 - ratio 0.300000 33 USB 3.0
13 - ratio 0.285714 45 Live USB
17 - ratio 0.228571 74 USB Implementers Forum
10 - ratio 0.153846 65 Windows To Go
11 - ratio 0.357143 30 USB flash drive
19 - ratio 0.176471 107 Handheld game console
21 - ratio 0.157895 133 Features new to Windows 8
1 (1) USB On-The-Go
[[en:USB-on-the-go]] may lead to [[en:USB On-The-Go]]
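The candidate list above looks like a title-similarity ranking; a rough
re-creation with difflib.SequenceMatcher is sketched below (the actual scorer
in the working copy may differ, and the candidate titles are simply taken from
the log):

import difflib

broken = "USB-on-the-go"
candidates = ["USB On-The-Go", "Mobile operating system", "Universal Serial Bus",
              "USB 3.0", "Live USB", "USB Implementers Forum", "Windows To Go",
              "USB flash drive", "Handheld game console", "Features new to Windows 8"]

# Score every candidate title against the broken redirect and sort best first.
scored = sorted(((difflib.SequenceMatcher(None, broken, title).ratio(), title)
                 for title in candidates), reverse=True)

for ratio, title in scored:
    print("%.6f  %s" % (ratio, title))

# The best match would then be offered as the new redirect target:
# [[en:USB-on-the-go]] may lead to [[en:USB On-The-Go]]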
----------------------------------------------------------------------
Comment By: ToAruShiroiNeko ()
Date: 2012-07-02 06:47
Message:
When you are plowing through 700 wikis, even simple tasks become difficult.
A compiled report would tell me which wikis to notify, which pages need human
intervention, and what type of intervention is necessary. The report could be
language specific, so es.Wikipedia would get a report in Spanish, de.Wikipedia
would get a report in German, etc.
Protected redirects are a particular problem, as they look like something bots
can fix, but bots cannot edit protected pages. This is not one of your
examples, and Special:DoubleRedirects makes no distinction for this type of
problem.
Redirect loops may involve more than two pages. Among 200 entries, such a
thing can be difficult to spot.
Also, while how to deal with redirect loops is obvious to you and me, admins
in local communities are often more than uneasy about dealing with an issue
they are not familiar with.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-07-01 20:38
Message:
There are several bots that work on double redirects several times a day. The
remaining items might be redirect loops or self redirects, e.g.
Foo --> Foo --> Foo
is always a self redirect
Foo --> Bar --> Foo
Bar --> Foo --> Bar
are always redirect loops
Is that difficult?
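The distinction can be stated as a small chain-following check; a toy sketch
(not redirect.py's actual code, and the page titles are invented):

def classify(start, redirect_target):
    """Classify the redirect chain beginning at `start`, given a mapping
    from page title to redirect target."""
    seen = [start]
    page = start
    while page in redirect_target:
        page = redirect_target[page]
        if page == start and len(seen) == 1:
            return "self redirect"
        if page in seen:
            return "redirect loop"
        seen.append(page)
        if len(seen) > 2 and page not in redirect_target:
            return "double redirect (fixable: point %s at %s)" % (start, page)
    return "plain redirect" if len(seen) == 2 else "not a redirect"

targets = {"Foo": "Foo2", "Foo2": "Foo3",   # double redirect, a bot can fix it
           "Selfie": "Selfie",              # self redirect
           "A": "B", "B": "A"}              # redirect loop
for page in ("Foo", "Selfie", "A"):
    print(page, "->", classify(page, targets))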
----------------------------------------------------------------------
Comment By: ToAruShiroiNeko ()
Date: 2012-07-01 11:34
Message:
It isn't easy to distinguish them; they just look like redirects the bot can
fix. I want an option in the code to log that, so I can post it to the village
pump for the local community's attention.
It is very difficult for me to do that by hand on 700 wikis, most of which
don't even need my attention.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-07-01 06:31
Message:
They remain in Special:DoubleRedirects and must be fixed by hand or deleted
by admins. It is easy to distinguish between multiple redirects and redirect
loops, and there is no need to explain it elsewhere.
----------------------------------------------------------------------
Comment By: ToAruShiroiNeko ()
Date: 2012-06-30 17:20
Message:
How is keeping track of problems that bots are unable to fix a duplication of
Special:DoubleRedirects?
You asked me to file this bug request after I explained the problem I was
having to you.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-06-18 04:24
Message:
Rejected. I do not see any sense in duplicating the list from
Special:DoubleRedirects.
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-05-20 06:36
Message:
1.-3. are all listed on Special:DoubleRedirects; they remain there when the
redirect bot cannot solve the problem.
4. Interwiki redirects are normally fixed by interwiki bots, unless there is a
__STATICREDIRECT__ on the redirect page.
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=3528379&group_…
Bugs item #3521751, was opened at 2012-04-26 12:45
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3521751&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Christoffer (njardarlogar)
Assigned to: Nobody/Anonymous (nobody)
Summary: en page wrongly identified as disambig
Initial Comment:
Version.py:
Pywikipedia trun (r10154, 2012/04/25, 21:06:18)
Python 2.7.2 (default, Jun 12 2011, 14:24:46) [MSC v.1500 64 bit (AMD64)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
Command:
interwiki.py nn:Agnes -auto -ignore:fi:Agneta -ignore:sl:Neza
gives, after a while:
Getting 1 page from wikipedia:en...
NOTE: Ignoring link from non-disambiguation page [[nn:Agnes]] to disambiguation
[[en:Agnes (name)]]
I see no disambiguation: http://en.wikipedia.org/wiki/Agnes_(name)
----------------------------------------------------------------------
>Comment By: xqt (xqt)
Date: 2012-07-05 00:02
Message:
{{given name}} is interpreted as a disambiguation template, see
en:MediaWiki:DisambiguationsPage
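For reference, one way to reproduce that check outside the framework is to
intersect the templates used on the page with the template links listed on
MediaWiki:Disambiguationspage via the API; a diagnostic sketch (this is not
how interwiki.py itself does it):

import requests

API = "https://en.wikipedia.org/w/api.php"

def disambig_templates(session):
    """Template pages linked from MediaWiki:Disambiguationspage (namespace 10)."""
    data = session.get(API, params={
        "action": "query", "format": "json",
        "titles": "MediaWiki:Disambiguationspage",
        "prop": "links", "plnamespace": 10, "pllimit": "max"}).json()
    page = next(iter(data["query"]["pages"].values()))
    return {link["title"] for link in page.get("links", [])}

def templates_on(session, title):
    """Templates transcluded on the given page (namespace 10)."""
    data = session.get(API, params={
        "action": "query", "format": "json",
        "titles": title, "prop": "templates",
        "tlnamespace": 10, "tllimit": "max"}).json()
    page = next(iter(data["query"]["pages"].values()))
    return {tpl["title"] for tpl in page.get("templates", [])}

with requests.Session() as s:
    overlap = disambig_templates(s) & templates_on(s, "Agnes (name)")
    # A non-empty overlap (e.g. Template:Given name, if it is listed) is what
    # makes the page count as a disambiguation page.
    print(sorted(overlap))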
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3521751&group_…
Bugs item #3540201, was opened at 2012-07-04 07:41
Message generated for change (Comment added) made by malafaya
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3540201&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
Status: Open
Resolution: None
Priority: 6
Private: No
Submitted By: André Malafaya Baptista (malafaya)
Assigned to: Nobody/Anonymous (nobody)
Summary: Category work sequence not correct in es.wiktionary
Initial Comment:
When processing the whole lot of categories in es.wiktionary (-start:Category:!), I noticed it suddenly ends near letter E. After some poking around, I noticed it jumps right after Category:EN:V** to Category:Á**.
A test I made:
interwiki.py -lang:es -start:"Category:EN:V" -async -pt:1 -cleanup -auto -family:wiktionary
NOTE: Number of pages queued is 0, trying to add 60 more.
Getting 14 pages from wiktionary:es...
Sleeping for 4.3 seconds, 2012-07-04 15:33:01
Dump es (wiktionary) appended.
(I hit Ctrl-C to inspect which categories are being worked on)
The first thing to note is that it only retrieves 14 categories, even though es.wiktionary's Special:Categories shows that there are **hundreds** of categories after Category:EN:V.
When checking the dump, I have:
[[Categoría:África]]
[[Categoría:Álava]]
[[Categoría:Árabe]]
[[Categoría:Árabe-Español]]
[[Categoría:Árabe egipcio]]
[[Categoría:Árabe egipcio-Español]]
[[Categoría:Árboles]]
[[Categoría:Índice del inglés]]
[[Categoría:Índices ortográficos]]
[[Categoría:Ñeengatú]]
[[Categoría:Ñeengatú-Español]]
[[Categoría:Óptica]]
[[Categoría:ǃxóõ]]
[[Categoría:ǃxóõ-Español]]
As you can see, it jumped to the letter Á, but the list also contains categories starting with !, which would belong even earlier, at the very beginning of processing.
After those 14 categories are processed, the bot stops. The categories between Category:ENM:** and Category:Z* were not processed.
Pywikipedia trunk/pywikipedia/ (r10435, 2012/07/01, 11:47:26)
Python 2.7.2 (default, Jun 12 2011, 14:24:46) [MSC v.1500 64 bit (AMD64)]
config-settings:
use_api = True
use_api_login = True
unicode test: ok
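For comparison, the API's own category ordering after that point can be listed
with list=allcategories; a diagnostic sketch, independent of interwiki.py:

import requests

API = "https://es.wiktionary.org/w/api.php"

# Ask the API which category titles follow "EN:V" in its sort order.
resp = requests.get(API, params={
    "action": "query", "format": "json",
    "list": "allcategories",
    "acfrom": "EN:V",
    "aclimit": 50}).json()

for cat in resp["query"]["allcategories"]:
    print(cat["*"])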
----------------------------------------------------------------------
>Comment By: André Malafaya Baptista (malafaya)
Date: 2012-07-04 08:18
Message:
Possibly related to MediaWiki bug #38165 (filed by myself):
https://bugzilla.wikimedia.org/38165
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3540201&group_…
Bugs item #3539407, was opened at 2012-07-02 03:01
Message generated for change (Comment added) made by reza1615
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3539407&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: cosmetic changes
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: reza (reza1615)
Assigned to: Nobody/Anonymous (nobody)
Summary: cosmetic_changes bug on citation's number and punctuation
Initial Comment:
The fixArabicLetters class changes Latin numbers and punctuation (,) in citations to Persian numbers and punctuation (،). This is not correct; please change it so that numbers are not converted when the text around the number is Latin.
http://fa.wikipedia.org/w/index.php?title=%D8%A7%D8%B1%DB%8C%DA%A9_%D8%AA%D…
----------------------------------------------------------------------
>Comment By: reza (reza1615)
Date: 2012-07-04 05:10
Message:
Defining a regular pattern for dates or addresses in external URLs is not
simple. The best rule is: when a number is inside English or Latin text it
should stay a Latin number, and numbers inside Farsi text should be converted
to Farsi numbers.
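A rough sketch of that rule, converting digits only when no Latin letters
appear within a small window around the number (the window size and helper
name are made up; this is not the actual cosmetic_changes.py code):

# -*- coding: utf-8 -*-
import re

PERSIAN_DIGITS = {ord(str(i)): c for i, c in enumerate("۰۱۲۳۴۵۶۷۸۹")}
LATIN_LETTER = re.compile(r"[A-Za-z]")

def localize_digits(text, window=20):
    """Convert Latin digits to Persian digits unless the surrounding
    `window` characters contain Latin letters."""
    out = []
    for m in re.finditer(r"\d+|\D+", text):
        chunk = m.group()
        if chunk[0].isdigit():
            context = (text[max(0, m.start() - window):m.start()] +
                       text[m.end():m.end() + window])
            if LATIN_LETTER.search(context):
                out.append(chunk)                      # Latin context: keep as is
            else:
                out.append(chunk.translate(PERSIAN_DIGITS))
        else:
            out.append(chunk)
    return "".join(out)

print(localize_digits("(Retrieved June 12, 2011)"))    # digits stay Latin
print(localize_digits("در سال 1390 نوشته شد"))          # digits become Persian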
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2012-07-03 21:46
Message:
Is there any regularity for these citations e.g. "\(<en-fullmonthname>
\d{2}, \d{4}\)"?
----------------------------------------------------------------------
Comment By: reza (reza1615)
Date: 2012-07-02 03:13
Message:
On fa.wiki we have a gadget that works fine; it has a function (digits()) that
converts numbers correctly. Maybe it will be useful for solving this bug:
http://fa.wikipedia.org/wiki/%D9%85%D8%AF%DB%8C%D8%A7%D9%88%DB%8C%DA%A9%DB%…
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=3539407&group_…