Feature Requests item #1993062, was opened at 2008-06-13 16:47
Message generated for change (Comment added) made by xqt
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=1993062...
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request, not just the latest update.

Category: interwiki
Group: None
Status: Open
Priority: 1
Private: No
Submitted By: Melancholie (melancholie)
Assigned to: Nobody/Anonymous (nobody)
Summary: Use API module 'parse' for retrieving interwiki links
Initial Comment: Currently pages are retrieved in a batch by using Special:Export. Although this is fast (only one request is made), this method has a huge data overhead!
Why not use the API with its 'parse' module? With it, only the interwiki links need to be fetched, which reduces traffic (overhead) a lot!
See: http://de.wikipedia.org/w/api.php?action=parse&format=xml&page=Test&...
Outputs could be downloaded in parallel to emulate a batch (faster).
----
At least make this method optional (config.py) so that data traffic can be reduced, if wanted. The API is simply more efficient.
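For illustration, a minimal sketch (not existing pywikipedia code) of what fetching only the language links of one page through the 'parse' module could look like; the endpoint URL and parameter names follow the standard MediaWiki API, while the exact response keys ("*" vs. "title") may differ between MediaWiki versions:

    # Minimal sketch: fetch only the language links of one page via the API
    # 'parse' module instead of pulling the full page text from Special:Export.
    import json
    import urllib.parse
    import urllib.request

    API = "https://de.wikipedia.org/w/api.php"  # assumed endpoint

    def fetch_langlinks(page):
        params = urllib.parse.urlencode({
            "action": "parse",
            "page": page,
            "prop": "langlinks",   # request only the interwiki (language) links
            "format": "json",
        })
        # Wikimedia servers may reject requests without a User-Agent header.
        req = urllib.request.Request(API + "?" + params,
                                     headers={"User-Agent": "langlinks-sketch/0.1"})
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        # Each entry usually looks like {"lang": "en", "*": "Test"}.
        return [(ll["lang"], ll.get("*") or ll.get("title", ""))
                for ll in data.get("parse", {}).get("langlinks", [])]

    if __name__ == "__main__":
        print(fetch_langlinks("Test"))

Several such requests could then be run in parallel (e.g. with a thread pool) to approximate the batching that Special:Export provides, as suggested above.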
----------------------------------------------------------------------
Comment By: xqt (xqt)
Date: 2010-09-21 18:35
Message: The parse mode has been deactivated because it was overloading the squids. Nothing to be done here for now.
----------------------------------------------------------------------
Comment By: Multichill (multichill)
Date: 2008-11-13 12:46
Message: We are working on a rewrite. The rewrite uses the API as much as possible.
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-15 01:27
Message: Logged In: YES user_id=2089773 Originator: YES
See http://meta.wikimedia.org/wiki/Interwiki_bot_access_protocol concerning disambiguations and redirects:
http://de.wikipedia.org/w/api.php?action=parse&format=xml&text=%7B%7...
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-14 16:38
Message: Logged In: YES user_id=2089773 Originator: YES
Backwards compatibility?
That's no reason not to make the software more efficient where possible ;-) That's also why I wrote something about making it "optional". Because current MediaWiki wikis offer a much more efficient way of retrieving (only) certain content (langlinks, categories), there should be a way of using that advantage! It will reduce load (the bot owner's and the server's)...
----------------------------------------------------------------------
Comment By: Bryan (btongminh)
Date: 2008-06-13 20:44
Message: Logged In: YES user_id=1806226 Originator: NO
Backwards compatibility with non-Wikimedia wikis?
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-13 17:20
Message: Logged In: YES user_id=2089773 Originator: YES
To keep it from being misused to confuse bots, the yet-to-be-created MediaWiki message could contain [[foreigncode:{{CURRENTTIMESTAMP}}]] (cache issue?)
(sorry for spamming with this request ;-)
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-13 17:08
Message: Logged In: YES user_id=2089773 Originator: YES
Important note for getting pages' interwikis in a batch: http://de.wikipedia.org/w/api.php?action=parse&text=%7B%7B:Test%7D%7D%7B...
Either the bot could figure out which interwikis belong together, or
maybe a marker could be placed in between: http://de.wikipedia.org/w/api.php?action=parse&text=%7B%7B:Test%7D%7D%7B...
[[MediaWiki:Iwmarker]] (or 'Llmarker'?) would have to be set up by the MediaWiki developers with [[en:/de:Abuse-save-mark]] as content (but this could potentially be misused).
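To make the marker idea above concrete, here is a rough sketch (the marker name is invented for illustration, and it assumes the API returns langlinks in source order, which is unverified): several pages are transcluded into one 'parse' request, separated by a known marker language link, and the flat langlinks list is split on that marker afterwards.

    # Rough sketch of the batch idea: transclude several pages in one 'parse'
    # request, separated by a hypothetical marker language link, and split the
    # flat langlinks result on that marker.
    import json
    import urllib.parse
    import urllib.request

    API = "https://de.wikipedia.org/w/api.php"   # assumed endpoint
    MARKER = ("en", "Iw-batch-marker")            # hypothetical marker link

    def fetch_langlinks_batch(pages):
        # Builds "{{:PageA}} [[en:Iw-batch-marker]] {{:PageB}} ..." as the text to parse.
        marker_link = "[[%s:%s]]" % MARKER
        text = (" " + marker_link + " ").join("{{:" + p + "}}" for p in pages)
        params = urllib.parse.urlencode({
            "action": "parse",
            "text": text,
            "prop": "langlinks",
            "format": "json",
        })
        req = urllib.request.Request(API + "?" + params,
                                     headers={"User-Agent": "langlinks-sketch/0.1"})
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        links = [(ll["lang"], ll.get("*") or ll.get("title", ""))
                 for ll in data.get("parse", {}).get("langlinks", [])]
        # Split the flat list back into one chunk per requested page.
        # (A page that itself contains the marker link would break this split,
        # which is exactly the misuse concern mentioned above.)
        result = {}
        chunk, remaining = [], list(pages)
        for link in links:
            if link == MARKER:
                result[remaining.pop(0)] = chunk
                chunk = []
            else:
                chunk.append(link)
        if remaining:
            result[remaining.pop(0)] = chunk
        return result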
----------------------------------------------------------------------
Comment By: Melancholie (melancholie)
Date: 2008-06-13 16:51
Message: Logged In: YES user_id=2089773 Originator: YES
Note: Maybe combine it with 'generator'.
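For what it's worth, the 'generator' hint fits the query module rather than 'parse': with action=query, prop=langlinks can be combined with a generator (or a plain titles= list) to fetch the language links of many pages in one request. A sketch, again assuming the standard MediaWiki api.php endpoint and omitting continuation handling:

    # Sketch of the query-module variant: prop=langlinks for several titles at
    # once (a generator such as generator=allpages could replace titles=).
    import json
    import urllib.parse
    import urllib.request

    API = "https://de.wikipedia.org/w/api.php"  # assumed endpoint

    def langlinks_for_titles(titles):
        params = urllib.parse.urlencode({
            "action": "query",
            "titles": "|".join(titles),   # several titles per request
            "prop": "langlinks",
            "lllimit": "max",
            "format": "json",
        })
        req = urllib.request.Request(API + "?" + params,
                                     headers={"User-Agent": "langlinks-sketch/0.1"})
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        pages = data.get("query", {}).get("pages", {})
        return {p.get("title"): [(ll["lang"], ll.get("*") or ll.get("title", ""))
                                 for ll in p.get("langlinks", [])]
                for p in pages.values()}

    if __name__ == "__main__":
        print(langlinks_for_titles(["Test", "Berlin"]))

This delivers the same data as the 'parse' approach, but the query module batches titles natively, so no marker trick would be needed.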
----------------------------------------------------------------------
You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=603141&aid=1993062...
pywikipedia-bugs@lists.wikimedia.org