jayvdb added a comment.
To me, 'step' suggests breaking a batch into non-overlapping subsets, which isn't strictly true if each 'step' is a new random sequence. That matters especially if each batch is guaranteed to contain only unique items, because then the server algorithm is slightly reducing the randomness whenever a duplicate appears and gets discarded.
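To make that concrete, here is a minimal sketch against the raw list=random API. The endpoint URL, the namespace filter, and the helper name random_titles are all illustrative: two independent batches of 10 can repeat a title across batches, while whether a single batch of 20 can repeat internally is exactly the open question.

```
import requests

# Illustrative endpoint; any MediaWiki API would behave the same way.
API = 'https://en.wikipedia.org/w/api.php'


def random_titles(limit):
    """Fetch one server-side batch of titles from list=random."""
    resp = requests.get(API, params={
        'action': 'query',
        'list': 'random',
        'rnnamespace': 0,
        'rnlimit': limit,
        'format': 'json',
    })
    return [page['title'] for page in resp.json()['query']['random']]


# Two independent batches: nothing prevents a title appearing in both.
two_batches = random_titles(10) + random_titles(10)
# One batch of the same total size, for comparison.
one_batch = random_titles(20)

print('repeats across two batches:', len(two_batches) - len(set(two_batches)))
print('repeats within one batch:', len(one_batch) - len(set(one_batch)))
```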
If we look at a very small wiki, the underlying generator doesn't repeat if the limit isn't reached. https://www.molnac.unisa.it/BioTools/mdcons/api.php?action=query&generat... https://www.molnac.unisa.it/BioTools/mdcons/index.php/Special:ListFiles
IMO, in site.randompages we are trying to expose the underlying MediaWiki API, and that module doesn't have continuation. A caller can't know whether limit 20 means two batches of 10 from the server algorithm or a single batch of 20 (which is unique?). The only way to have any chance of knowing is to obtain the API limit from paraminfo and use that (sketched below).
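A sketch of that paraminfo lookup, again via the raw API with an illustrative endpoint. Note the assumption that the server is a newer MediaWiki, which addresses the module as query+random; older releases used querymodules=random instead, and the response shape may differ there.

```
import requests

API = 'https://en.wikipedia.org/w/api.php'  # illustrative endpoint

data = requests.get(API, params={
    'action': 'paraminfo',
    'modules': 'query+random',
    'format': 'json',
}).json()

# The 'limit' parameter carries 'max' (normal users) and 'highmax'
# (users with the apihighlimits right, e.g. bots).
for param in data['paraminfo']['modules'][0]['parameters']:
    if param['name'] == 'limit':
        print('rnlimit max:', param.get('max'),
              'highmax:', param.get('highmax'))
```

Pywikibot already caches paraminfo in its ParamInfo class (pywikibot/data/api.py), so site code would presumably read the limit from there rather than issuing a fresh request.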
However, I don't know the underlying randomness algorithm well enough to speak with much authority about that, or about how many of our users want to 'see' the underlying randomness versus being happy with randomness that has slight oddities introduced by multiple disjoint batches.
Whatever we do, we need to update the docstring to explain what we are doing in case the caller cares.
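For example, the added note could look something like this; the signature shown is illustrative, not necessarily the current one in site.py.

```
def randompages(self, total=10, namespaces=None):
    """Yield pages in random order via the API's list=random module.

    .. note::
       list=random has no continuation. If ``total`` exceeds the
       server's per-request maximum (rnlimit), the generator issues
       several independent requests; the same page may therefore be
       yielded more than once, and the combined output is not a
       single uniform random sample from the server.
    """
```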
TASK DETAIL https://phabricator.wikimedia.org/T84944