Aklapper added a project: pywikibot-compat.
TASK DETAIL
https://phabricator.wikimedia.org/T57238
REPLY HANDLER ACTIONS
Reply to comment or attach files, or !close, !claim, !unsubscribe or !assign <username>.
To: Aklapper
Cc: pywikipedia-bugs, Legoktm, jayvdb
Aklapper added a project: pywikibot-compat.
TASK DETAIL
https://phabricator.wikimedia.org/T74667
To: Aklapper
Cc: pywikipedia-bugs, JAnD
Aklapper added a project: pywikibot-compat.
TASK DETAIL
https://phabricator.wikimedia.org/T61500
To: Aklapper
Cc: pywikipedia-bugs, jayvdb, Xqt
Aklapper added a project: pywikibot-compat.
TASK DETAIL
https://phabricator.wikimedia.org/T67976
To: Aklapper
Cc: pywikipedia-bugs, Ricordisamoa, Liuxinyu970226
jayvdb created this task.
jayvdb claimed this task.
jayvdb added subscribers: pywikipedia-bugs, jayvdb.
jayvdb added a project: pywikibot-core.
jayvdb changed Security from none to none.
TASK DESCRIPTION
QueryGenerator.__iter__ populates self.normalized with normalised-title data from the API, but this isn't used anywhere in the repo. It could be used by subclasses outside of the repo, and it probably _should_ be used by our own subclasses.
APISite.getredirtarget() performs its own info property query and uses the same normalized data; it would probably be inefficient to convert this to using QueryGenerator.
APISite.preloadpages already uses QueryGenerator, but uses self.sametitle instead of QueryGenerator.normalized to detect normalised titles.
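For reference, a minimal sketch of how a subclass could consume the normalisation data instead of re-checking titles with sametitle. The helper name build_normalized_map is hypothetical; only the {'from': ..., 'to': ...} entry shape comes from the MediaWiki API's query.normalized output.

```python
# Hypothetical sketch: turn the MediaWiki API's 'normalized' list into a
# lookup from normalised title back to the title originally requested.
# build_normalized_map is an illustrative name, not a pywikibot API.

def build_normalized_map(api_result):
    """Map each normalised title to the originally requested title.

    The API reports normalisations as entries of the form
    {'from': <requested title>, 'to': <normalised title>}.
    """
    return {entry['to']: entry['from']
            for entry in api_result.get('query', {}).get('normalized', [])}


# Example response fragment (shape per the MediaWiki Action API).
result = {
    'query': {
        'normalized': [
            {'from': 'main page', 'to': 'Main page'},
            {'from': 'foo_bar', 'to': 'Foo bar'},
        ],
        'pages': {},
    }
}

mapping = build_normalized_map(result)
# mapping['Main page'] == 'main page'
```

With such a map, preloadpages could match returned pages to requested titles directly rather than calling self.sametitle for each pair.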
TASK DETAIL
https://phabricator.wikimedia.org/T76144
To: jayvdb
Cc: pywikipedia-bugs, jayvdb
jayvdb created this task.
jayvdb claimed this task.
jayvdb added a subscriber: jayvdb.
jayvdb added a project: pywikibot-core.
jayvdb changed Security from none to none.
TASK DESCRIPTION
When set_maximum_items is set to -1, as is done to fetch revisions, or in the default None state, the argument given to set_query_increment is ignored.
This means it does not always act like the 'step' parameter commonly available in the site API, even though the APISite._generator 'step' argument is put into a set_query_increment call.
This affects the responsiveness of the generators, as well as network and memory utilisation. If the caller wants to process many properties for a large list of items, but prefers to do so in small batches, QueryGenerator will attempt to process all items at once (it will actually send more items than the API limit for titles, pageids, etc.) and will iterate over them all.
QueryGenerator, or a subclass, should send batches of the requested query increment only, and continue sending batches until the requested set of items is processed.
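A rough sketch of the intended control flow, assuming the pywikibot conventions described above (limit of -1 or None means unbounded; the increment caps each request's batch size). The function batched_query and its fetch callback are illustrative stand-ins, not pywikibot code.

```python
# Illustrative sketch only: a generator that honours a query increment
# ("step") even when the overall item limit is unbounded (None or -1),
# sending one batch per simulated API round trip.

def batched_query(titles, query_increment, limit=None, fetch=None):
    """Yield results in batches of at most `query_increment` titles.

    `fetch` stands in for a single API request; by default it simply
    echoes the batch so the batching logic can be demonstrated.
    """
    if fetch is None:
        fetch = lambda batch: list(batch)  # placeholder for an API call
    yielded = 0
    for start in range(0, len(titles), query_increment):
        batch = titles[start:start + query_increment]
        for item in fetch(batch):
            # -1 and None both mean "no limit", per the task description.
            if limit is not None and limit >= 0 and yielded >= limit:
                return
            yield item
            yielded += 1


items = list(batched_query(['Page%d' % i for i in range(10)],
                           query_increment=3))
# 10 items, fetched in batches of 3, 3, 3 and 1
```

The key point is that the batch size is decided by the increment alone; the limit only decides when iteration stops, so a limit of -1 no longer disables the increment.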
TASK DETAIL
https://phabricator.wikimedia.org/T76141
To: jayvdb
Cc: pywikipedia-bugs, jayvdb
Liuxinyu970226 added a subscriber: Liuxinyu970226.
TASK DETAIL
https://phabricator.wikimedia.org/T67976
To: Liuxinyu970226
Cc: pywikipedia-bugs, Ricordisamoa, Liuxinyu970226