Bugs item #2619054, was opened at 2009-02-20 08:04
Message generated for change (Comment added) made by sf-robot
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2619054...
Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update.
Category: rewrite
Group: None
Status: Closed
Resolution: Fixed
Priority: 5
Private: No
Submitted By: NicDumZ Nicolas Dumazet (nicdumz)
Assigned to: Russell Blau (russblau)
Summary: clarify between limit, number, batch and step parameters
Initial Comment:
I ran into some strange behavior of replace.py -weblink: that I couldn't quite diagnose: some pages were not processed.
First of all, those detailed logs are a great gift. They are a bit messy to understand at first, but thanks to them I found the bug and fixed it in r6386 ( http://svn.wikimedia.org/viewvc/pywikipedia?view=rev&revision=6386 ).
I believe this parameter confusion is a bad habit we inherited from the old framework. (The only reason we have those bugs here is that we merged pagegenerators from trunk.) We need to agree on common, globally meaningful parameters for generators, and stick to them.
I personally think that -limit might be a bit confusing (is it an API limit, a limit enforced by the local application on a huge fetched set, etc.?), while -number seems a bit clearer. But that's a personal opinion =) What about -number for "the number of items to retrieve", and -step or -maxstep for the maximum number of items to retrieve at once? Actually, I don't feel strongly about the names; we just need to agree on something meaningful enough, and document it in the file headers.
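For illustration only (using the names proposed above, which don't exist yet): a call like "replace.py -fix:yu-tld -weblink:*.yu -number:1000 -step:250" would then mean "process at most 1000 pages in total, but never ask the API for more than 250 items per request".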
On a side note, replace.py -fix:yu-tld -weblink:*.yu is currently running on fr.wp. No issues spotted so far. =)
----------------------------------------------------------------------
Comment By: SourceForge Robot (sf-robot)
Date: 2010-01-21 02:20
Message: This Tracker item was closed automatically by the system. It was previously set to a Pending status, and the original submitter did not respond within 14 days (the time period specified by the administrator of this Tracker).
----------------------------------------------------------------------
Comment By: Russell Blau (russblau)
Date: 2010-01-06 18:59
Message: This was fixed a while back but I neglected to close the bug; please reopen if any continuing problems exist.
----------------------------------------------------------------------
Comment By: NicDumZ Nicolas Dumazet (nicdumz)
Date: 2009-02-22 04:50
Message: Well, I think one of the first steps here is to look at what is currently done in the old pagegenerators =) Here's a small summary of the "limits" enforced by our old pagegenerators.
Overall, internal naming consistency is quite low right now, not to mention the surprising things I found :s
For each generator, I've looked at both the pagegenerators function and its Site/Page/Image/Category counterpart: unless noted otherwise, the parameter naming of the two functions is consistent.
* shortpages, new(pages|images), unusedfiles, withoutinterwiki, uncategorized(images|categories|pages), unwatchedpages, ancientpages, deadendpages, longpages, search: these use "number" (meant as "batch"/"max") plus a boolean "repeat". Overall, you can get either "number" items or all of them (see the sketch after this list).
* random(page|redirect): good examples of inconsistency. They use number (batch/max) + repeat, but since Special:Random returns only one page at a time, the actual "batch" is always 1 (the behavior is "for _ in range(number): fetch one page"). And if repeat=True... those functions never stop, if I'm right. irrrk !!
* filelinks, imagelinks, interwiki: these scrape the article wikipage and yield everything in one step from the wikitext.
* categorymembers, subcategories: these scrape category pages. No parameter is available, since the UI doesn't let us customize the number of displayed links. They follow the (next) links on the category page and stop when all items have been retrieved.
* allpages, prefixindex, getReferences: no function parameters. They use config.special_page_limit as "batch/max", and all items are retrieved through repeated queries. If special_page_limit > 999, getReferences sets it back to 999. (?!)
* linksearch: the pagegenerators function has a "step=500" parameter, while the corresponding Site function uses "limit=500". Both are meant as "batch": all the links are retrieved through repeated queries.
* usercontribs: number=250, meant as "batch/max". All the contribs are retrieved through repeated queries. If number > 500, it is set back to 500.
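To make the number + repeat pattern concrete, here is a minimal sketch of how those old-style generators behave (the function and the fetch callback are illustrative only, not actual framework code):

    def old_style_generator(fetch_batch, number=100, repeat=False):
        # Illustrative only: ask the wiki for up to 'number' items per pass;
        # with repeat=True, keep issuing passes until one comes back empty.
        while True:
            batch = fetch_batch(number)
            for item in batch:
                yield item
            if not repeat or not batch:
                break

This is also where the random(page|redirect) hazard comes from: Special:Random always returns one page, so a pass is never empty and repeat=True never terminates.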
It seems that the most commonly used combination is number+repeat. But I really don't think that is the way to go, since you cannot accurately specify the total number of items you want to retrieve: it's either "number" items or all of them... I think a pair of integer parameters, "batch" + "total", could be more useful here (the names are illustrative).
On the other hand, users should be able to say "I want to retrieve all the items": looking into the code, I see that a "-1" convention is used now. If I understand things correctly, it is used in a "batch" context: if we call set_maximum_items(-1), in most cases the API uses its default xxlimit value. We could use such a convention for our "total" parameter too. Whether it is -1, None, or something else, I think that with such a policy we would cover all the use cases.
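As a rough sketch of what the proposed batch + total pair could look like (the name and the fetch callback are illustrative, not existing framework code; total=None or -1 stands for "all items"):

    def query_items(fetch_batch, batch=500, total=None):
        # 'batch' is the per-request limit passed to the API;
        # 'total' is the overall cap, with None/-1 meaning "no cap".
        retrieved = 0
        while total is None or total < 0 or retrieved < total:
            want = batch
            if total is not None and total >= 0:
                want = min(batch, total - retrieved)
            items = fetch_batch(want)
            if not items:
                break  # the server has nothing left
            for item in items:
                yield item
            retrieved += len(items)

With batch=500 and total=1200 this would issue requests of 500, 500 and 200 items; with total=None (or -1) it keeps querying until the server runs out.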
Given what I found, I really don't think that backwards compatibility should be a priority here. I would rather introduce a breaking change in the names, so that people don't expect the new limits to work "as in the old framework"... because in the old framework, the limit behaviors were not even internally consistent...
----------------------------------------------------------------------
Comment By: Russell Blau (russblau)
Date: 2009-02-20 15:00
Message: A good point. A query can have two different types of limits: the limit on the number of pages/links/whatever retrieved from the API in a single request (defaults to "max"), and the limit on the total number of items to be retrieved from a repeated query. We should do this in a way that is (a) internally consistent among all generators, and (b) as much as possible, backwards-compatible with the old pagegenerators module (but this is secondary to getting something that works).
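For concreteness, the two limits live in different places. Here is a minimal sketch of a repeated raw API query (standard list=allpages parameters and the modern "continue" mechanism; the 'total' handling is purely client-side and illustrative, not how the framework actually does it):

    import requests

    def allpages_titles(api_url, total=1200, per_request="max"):
        # 'aplimit' is the API-side per-request limit (here left at "max");
        # 'total' is only enforced on the client across repeated requests.
        params = {"action": "query", "list": "allpages",
                  "aplimit": per_request, "format": "json"}
        retrieved = 0
        while retrieved < total:
            data = requests.get(api_url, params=params).json()
            for page in data["query"]["allpages"]:
                yield page["title"]
                retrieved += 1
                if retrieved >= total:
                    return
            if "continue" not in data:
                return                       # nothing left server-side
            params.update(data["continue"])  # continuation for the next batch

For example, allpages_titles("https://en.wikipedia.org/w/api.php", total=1200) would stop after 1200 titles, fetching up to 500 per request for an anonymous client (the "max" an ordinary user gets).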
----------------------------------------------------------------------
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2619054...