print len(page.getVersionHistory(getAll=True, reverseOrder=True))
print len(page.fullVersionHistory(getAll=True, reverseOrder=True))
Where did I make a mistake?
My framework checkout is from 1 January; did something change since then concerning
Actually it would: each new file adds disk read time, plus file open and
file close time. Then, depending on the language, it may have to create a new
JSON parser, parse the file, and then destroy the parser. With 1-2
files it's not that big of a deal, but as it scales up the issue becomes
more and more of a bottleneck.
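A rough way to see this overhead is a sketch like the one below. It is illustrative only: the message counts, file names, and layout are made up, and it assumes each message file is a small standalone JSON document, as in the scenario described above.

```python
# Sketch: parse N messages from one combined JSON file vs. one file per message.
# Illustrates the per-file cost (open/close plus parser setup) described above.
import json
import os
import tempfile
import time

N = 1000  # made-up number of messages
messages = {"msg-%d" % i: "text %d" % i for i in range(N)}

with tempfile.TemporaryDirectory() as d:
    # Variant A: all messages in a single file.
    big = os.path.join(d, "all.json")
    with open(big, "w") as f:
        json.dump(messages, f)

    # Variant B: one small file per message.
    for key, text in messages.items():
        with open(os.path.join(d, key + ".json"), "w") as f:
            json.dump({key: text}, f)

    t0 = time.perf_counter()
    with open(big) as f:
        combined = json.load(f)  # one open, one parse
    one_file = time.perf_counter() - t0

    t0 = time.perf_counter()
    split = {}
    for key in messages:
        with open(os.path.join(d, key + ".json")) as f:
            split.update(json.load(f))  # N opens, N parses
    many_files = time.perf_counter() - t0

print(len(combined), len(split))      # same data either way
print(many_files > one_file)          # typically True: per-file overhead adds up
```

The data recovered is identical in both variants; only the number of open/parse cycles differs, which is exactly the scaling concern raised above.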
On Wed, Apr 2, 2014 at 1:29 PM, <bugzilla-daemon(a)wikimedia.org> wrote:
> --- Comment #2 from Niklas Laxström <niklas.laxstrom(a)gmail.com> ---
> It shouldn't matter too much whether N messages are in 50 or 1000 files
> up numbers) in terms of how long it takes to parse them.
> Pywikipedia-bugs mailing list
I admit that I've been very confused by the svn shutdown and the migration to
Gerrit; I see that difficult-to-understand discussions about gerrit+pywikipedia
problems and bugs keep popping up, so I feel a little bit less
stupid. The confusion has been compounded by the two parallel versions of
pywikipedia and by the renaming (WHY?) of its two main branches.
Could a banal, simple means (pip? something else?) please be implemented to keep
the pywikipedia routines aligned and running on Tool Labs, on Windows, and in
any other environment? A good, old KISS approach? Thanks.
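For what it's worth, the kind of KISS workflow being asked for might look like the sketch below. It is hypothetical: it assumes the framework were packaged and published to PyPI under a name such as `pywikibot`, which is an assumption about a possible future, not a description of the current state.

```shell
# Hypothetical: assumes a published PyPI package named "pywikibot".
# The same one-liner would then work on Tool Labs, Windows, or any
# environment where pip runs, keeping installs aligned everywhere.
pip install --upgrade pywikibot
```

That is the sort of single, environment-independent update path that pip-based distribution would provide.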