Feature Requests item #1993062, was opened at 2008-06-13 16:47
Message generated for change (Comment added) made by btongminh
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=1993062&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: interwiki
Group: None
Status: Open
Priority: 7
Private: No
Submitted By: Melancholie (melancholie)
Assigned to: Nobody/Anonymous (nobody)
Summary: Use API module 'parse' for retrieving interwiki links
Initial Comment:
Currently, pages are retrieved in a batch by using Special:Export.
Although this is fast (only one request is made), the method carries a huge data overhead!
Why not use the API's 'parse' module? With it, only the interwiki links can be fetched, which reduces traffic (overhead) a lot!
See:
http://de.wikipedia.org/w/api.php?action=parse&format=xml&page=Test&prop=la…
Outputs could be downloaded in parallel to emulate a batch (faster).
----
At least make this method optional (config.py), so that data traffic can be reduced if wanted. The API is just more efficient.
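A minimal sketch of such a request in Python; the endpoint and page name are only illustrative, and the truncated parameter above is assumed to be prop=langlinks:

import json
import urllib.parse
import urllib.request

API = "https://de.wikipedia.org/w/api.php"  # illustrative endpoint

def fetch_langlinks(page):
    # Ask action=parse for nothing but the language (interwiki) links,
    # instead of exporting the whole page text via Special:Export.
    params = urllib.parse.urlencode({
        "action": "parse",
        "page": page,
        "prop": "langlinks",
        "format": "json",
    })
    with urllib.request.urlopen(API + "?" + params) as resp:
        data = json.load(resp)
    # Each entry carries the language code ("lang") and the target title ("*").
    return [(ll["lang"], ll["*"]) for ll in data["parse"].get("langlinks", [])]

print(fetch_langlinks("Test"))

Several such requests could then be issued in parallel to emulate a batch.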
----------------------------------------------------------------------
>Comment By: Bryan (btongminh)
Date: 2008-06-13 20:44
Message:
Logged In: YES
user_id=1806226
Originator: NO
What about backwards compatibility with non-Wikimedia wikis?
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-13 17:20
Message:
Logged In: YES
user_id=2089773
Originator: YES
To keep it from being misused to confuse bots, the yet-to-be-set-up MediaWiki
message could contain [[foreigncode:{{CURRENTTIMESTAMP}}]] (cache issue?).
(Sorry for spamming with this request ;-)
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-13 17:08
Message:
Logged In: YES
user_id=2089773
Originator: YES
Important note for getting pages' interwikis in a batch:
http://de.wikipedia.org/w/api.php?action=parse&text={{:Test}}{{:Bot}}{{:Hau…
Either the bot could then figure out which interwikis belong together, or
maybe a marker could be placed in between:
http://de.wikipedia.org/w/api.php?action=parse&text={{:Test}}{{MediaWiki:Iw…
[[MediaWiki:Iwmarker]] (or 'Llmarker'?) would have to be set up by the
MediaWiki developers with [[en:/de:Abuse-save-mark]] as content (but this
is potentially open to misuse).
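A sketch of how a bot could use such a marker (everything here is hypothetical: the marker page, its sentinel title, and the assumption that langlinks come back in document order):

import json
import urllib.parse
import urllib.request

API = "https://de.wikipedia.org/w/api.php"  # illustrative endpoint
MARKER_PAGE = "MediaWiki:Iwmarker"          # hypothetical marker page
SENTINEL = "Abuse-save-mark"                # hypothetical sentinel title

def batch_langlinks(pages):
    # Transclude every page, separated by the marker page, e.g.
    # {{:Test}}{{MediaWiki:Iwmarker}}{{:Bot}}
    text = ("{{%s}}" % MARKER_PAGE).join("{{:%s}}" % p for p in pages)
    params = urllib.parse.urlencode({
        "action": "parse",
        "text": text,
        "prop": "langlinks",
        "format": "json",
    })
    with urllib.request.urlopen(API + "?" + params) as resp:
        links = json.load(resp)["parse"].get("langlinks", [])
    # Split the flat result list wherever the sentinel link appears.
    groups, current = [], []
    for ll in links:
        if ll["*"] == SENTINEL:
            groups.append(current)
            current = []
        else:
            current.append((ll["lang"], ll["*"]))
    groups.append(current)
    return dict(zip(pages, groups))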
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-13 16:51
Message:
Logged In: YES
user_id=2089773
Originator: YES
Note: Maybe combine it with 'generator'.
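If this refers to the API's query generators: a single action=query request can already return langlinks for several titles at once. A sketch (endpoint and titles illustrative; query props such as langlinks also combine with generators like generator=allpages):

import json
import urllib.parse
import urllib.request

API = "https://de.wikipedia.org/w/api.php"  # illustrative endpoint

params = urllib.parse.urlencode({
    "action": "query",
    "titles": "Test|Bot",
    "prop": "langlinks",
    "lllimit": "500",
    "format": "json",
})
with urllib.request.urlopen(API + "?" + params) as resp:
    data = json.load(resp)
for page in data["query"]["pages"].values():
    print(page["title"],
          [(ll["lang"], ll["*"]) for ll in page.get("langlinks", [])])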
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=1993062&group_…
Patches item #1880140, was opened at 2008-01-26 03:48
Message generated for change (Comment added) made by cosoleto
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=1880140&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: AndreasJS (andreasjs)
Assigned to: Nobody/Anonymous (nobody)
Summary: Decode Esperanto titles
Initial Comment:
Titles have to be decoded from the x convention for links to work properly.
Here is the patch:
Index: wikipedia.py
===================================================================
--- wikipedia.py (revision 4939)
+++ wikipedia.py (working copy)
@@ -454,6 +454,8 @@
             pass
         if underscore:
             title = title.replace(' ', '_')
+        if self.site().lang == 'eo':
+            title = decodeEsperantoX(title)
         return title

     def titleWithoutNamespace(self, underscore=False):
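For reference, the x convention spells the Esperanto circumflex letters with a trailing 'x'. A minimal decoding sketch; this is not the actual decodeEsperantoX from wikipedia.py, which also handles doubled-x escape sequences:

# Minimal x-convention decoding sketch (doubled-x escapes omitted).
X_PAIRS = [
    ("cx", "ĉ"), ("gx", "ĝ"), ("hx", "ĥ"), ("jx", "ĵ"), ("sx", "ŝ"), ("ux", "ŭ"),
    ("Cx", "Ĉ"), ("Gx", "Ĝ"), ("Hx", "Ĥ"), ("Jx", "Ĵ"), ("Sx", "Ŝ"), ("Ux", "Ŭ"),
]

def decode_esperanto_x(title):
    # Replace each two-letter x-form with the corresponding accented letter.
    for xform, letter in X_PAIRS:
        title = title.replace(xform, letter)
    return title

print(decode_esperanto_x("Sercxu"))  # -> "Serĉu"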
----------------------------------------------------------------------
>Comment By: Francesco Cosoleto (cosoleto)
Date: 2008-06-13 12:19
Message:
Logged In: YES
user_id=181280
Originator: NO
Applied in r5305, undone in r5563.
See [ 1988771 ] Encoding issues with Esperanto
(https://sourceforge.net/tracker/index.php?func=detail&aid=1988771&group_id=…)
----------------------------------------------------------------------
Comment By: André Malafaya Baptista (malafaya)
Date: 2008-01-26 15:50
Message:
Logged In: YES
user_id=1037345
Originator: NO
This seems to be working fine using "redirect.py -lang:eo double".
----------------------------------------------------------------------
Comment By: AndreasJS (andreasjs)
Date: 2008-01-26 15:37
Message:
Logged In: YES
user_id=1738850
Originator: YES
File Added: wikipedia.diff
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603140&aid=1880140&group_…