print len(page.getVersionHistory(getAll=True, reverseOrder=True))
print len(page.fullVersionHistory(getAll=True, reverseOrder=True))
3838
258
Where did I make a mistake?
My framework copy is from 1 January; has something changed since then
concerning fullVersionHistory?
--
Bináris
Actually it would matter: each new file adds disk-read time, plus file-open
and file-close time. Then, depending on the language, it may have to create
a new JSON parser, parse the file, and destroy the parser again. With 1-2
files it's not a big deal, but as it scales up, this becomes more and more
of a bottleneck.
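The per-file overhead described above can be measured with a small sketch: the same 1000 made-up messages are written once as a single JSON file and once split across 100 files, then each layout is timed while being parsed. The file names and message contents here are invented purely for illustration; the real i18n files may behave differently.

```python
import json
import os
import tempfile
import time

def time_parse(paths):
    """Parse every JSON file in paths; return (elapsed seconds, merged dict)."""
    start = time.perf_counter()
    data = {}
    for p in paths:
        with open(p) as f:
            data.update(json.load(f))
    return time.perf_counter() - start, data

# Build the same 1000 messages either as one file or as 100 files.
messages = {"msg-%d" % i: "text %d" % i for i in range(1000)}
tmp = tempfile.mkdtemp()

one = os.path.join(tmp, "all.json")
with open(one, "w") as f:
    json.dump(messages, f)

many = []
items = list(messages.items())
for i in range(100):
    p = os.path.join(tmp, "part-%d.json" % i)
    with open(p, "w") as f:
        json.dump(dict(items[i * 10:(i + 1) * 10]), f)
    many.append(p)

t_one, d_one = time_parse([one])
t_many, d_many = time_parse(many)
assert d_one == d_many  # identical content regardless of layout
print("1 file: %.4fs, 100 files: %.4fs" % (t_one, t_many))
```

On any given machine the many-files run pays 100 open/parse/close cycles instead of 1, which is the scaling concern raised above, even though the total number of messages is the same.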
On Wed, Apr 2, 2014 at 1:29 PM, <bugzilla-daemon(a)wikimedia.org> wrote:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=63327
>
> --- Comment #2 from Niklas Laxström <niklas.laxstrom(a)gmail.com> ---
> It shouldn't matter too much for parsing time whether N messages are in
> 50 or in 1000 files (made-up numbers).
>
> --
> You are receiving this mail because:
> You are the assignee for the bug.
> _______________________________________________
> Pywikipedia-bugs mailing list
> Pywikipedia-bugs(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/pywikipedia-bugs
>
I admit I've been very confused by the svn shutdown and the migration to
Gerrit; seeing difficult-to-understand discussions pop up about problems
and bugs with gerrit+pywikipedia makes me feel a little less stupid. The
confusion has been increased by the two parallel versions of pywikipedia
and by the renaming (why?) of its two main branches.
Could a plain, simple mechanism (pip? something else?) please be
implemented to keep the pywikipedia routines aligned and running on Tool
Labs, on Windows, and in any other environment? A good old KISS approach?
Thanks.
Alex brollo