Say your wiki is small, only 5000 pages, and you have apihighlimits.
With gaplimit=max&pllimit=max&pltitles=Foo, right now no continuation
is required because each page returns just zero or one link.
Your proposed change, forcing gaplimit=max to mean something lower
than the normal max, would require continuation to get all the pages.
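To make the scenario concrete, here is a minimal sketch of how that request could be built (no network call is made; the endpoint URL is hypothetical, and I am assuming the standard prop=links filter parameter `pltitles`):

```python
from urllib.parse import urlencode

# Hypothetical wiki endpoint; the parameters mirror the query discussed above.
params = {
    "action": "query",
    "generator": "allpages",   # gap* = generator=allpages parameters
    "gaplimit": "max",
    "prop": "links",
    "pllimit": "max",
    "pltitles": "Foo",         # only report links pointing to "Foo"
    "format": "json",
}
url = "https://example.org/w/api.php?" + urlencode(params)
print(url)
```

With apihighlimits, "max" resolves server-side to the high limit (5000), so on a 5000-page wiki one request can cover every generated page.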
You STILL have to do a plcontinue -- because without it you cannot tell
whether a <page> element is empty because the page has no links, or
because the server simply hasn't gotten to it yet! And with
gaplimit == pllimit == 5000, an average of one link or fewer per page
guarantees that you get all 5000 pages without any plcontinue, which is
the same as my proposal.
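The ambiguity can be shown with a toy simulation (this is a hypothetical mock of the batching behavior, not the real server code): the server returns every generated page in each response, but only fills in links up to a per-request limit, so an empty page entry is ambiguous until continuation is exhausted.

```python
# Fake dataset: page title -> its outgoing links.
PAGES = {
    "A": ["Foo"],
    "B": [],                # genuinely has no links
    "C": ["Foo", "Bar"],
    "D": ["Baz"],
}
ORDER = sorted(PAGES)
# Flat stream of (page, link) pairs; plcontinue is an offset into it.
STREAM = [(t, link) for t in ORDER for link in PAGES[t]]
PLLIMIT = 2                 # server emits at most this many links per request

def query(plcontinue=0):
    """One mock request: all pages appear in the result, but links are
    only populated up to PLLIMIT. Returns (result, next_plcontinue)."""
    result = {t: [] for t in ORDER}
    for title, link in STREAM[plcontinue:plcontinue + PLLIMIT]:
        result[title].append(link)
    nxt = plcontinue + PLLIMIT if plcontinue + PLLIMIT < len(STREAM) else None
    return result, nxt

first, _ = query(0)
# In the first batch, B and D both look empty -- a client that stops
# here cannot distinguish "no links" from "not reached yet".
print(first["B"], first["D"])   # -> [] []

# A correct client follows plcontinue and merges the batches:
merged = {t: [] for t in ORDER}
cont = 0
while cont is not None:
    result, cont = query(cont)
    for t, links in result.items():
        merged[t].extend(links)
# After continuation, only B is truly linkless; D has its link.
```

The first response alone makes B and D indistinguishable; only after following plcontinue to completion does the client learn that D links to "Baz" while B really has no links.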
As for lowering gaplimit, it was just an idea for the case where you get
MANY links per page: I suggested that the server could scale back if it
sees that many <page> elements would go unpopulated. But this doesn't
change anything -- you still have to decide whether you need many pages
or few pages!
Really, so far I have not heard of any reasonable bot or use case that
would need this at all.