In r46845 [1], the issue raised in bug 11430 [2] a year and a half ago was finally addressed: when the API was asked to produce huge amounts of data (for instance the content of 500 revisions at 280 KB each), it would run out of memory trying to store and process it. To prevent this, the amount of data the API can return is now limited. This means the behavior of requests that used to run out of memory has changed: they will return fewer results than the requested limit even though more results are available (they will still set query-continue correctly, though). For instance, the aforementioned request would return about 300 revisions and set a query-continue for the rest.
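The practical consequence for client code: never assume a response contains as many results as the limit you asked for; always check for query-continue and repeat the request with the continuation parameters merged in. A minimal sketch of that merge step (the response shape follows the classic query-continue format; the specific parameter values shown are hypothetical):

```python
def apply_query_continue(params, response):
    """Merge query-continue values from an API response into the request
    parameters for the next request. Returns True if more data remains."""
    cont = response.get("query-continue")
    if not cont:
        return False
    # Each continuing module contributes its own continuation parameters.
    for module_params in cont.values():
        params.update(module_params)
    return True

# Example: the API returned only part of the 500 requested revisions and
# set a continuation parameter (illustrative value, not real data).
params = {"action": "query", "prop": "revisions", "rvlimit": 500}
response = {"query-continue": {"revisions": {"rvstartid": 12345}}}
while apply_query_continue(params, response):
    # re-issue the request with the updated params, then update `response`
    break  # (network call omitted in this sketch)
```

A loop like this already handles the new behavior transparently, since truncated batches still set query-continue.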
Roan Kattouw (Catrope)
[1] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/46845
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=11430