On 03.11.2013, 0:05 jeph wrote:
> Hi,
> As part of the visualisation tool I'm building, I'm fetching the
> parsed revisions of an article. When the article is of a
> considerable size, e.g. the latest revisions of Barack Obama, it
> takes 10+ seconds. As the tool is interactive and shows the edits
> made to an article as an animation, the time taken by the server
> does not bode well. (The requests are read-only.)
> I'm currently not making parallel requests. What would be a
> reasonable degree of parallel requests? Are there other ways to get
> around this latency issue?
> https://meta.wikimedia.org/wiki/Grants:IEG/Replay_Edits talks about the tool and the project.
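For reference, fetching a single parsed old revision the way you describe is presumably a call along these lines (the oldid is a placeholder):

    https://en.wikipedia.org/w/api.php?action=parse&oldid=12345&prop=text&format=json

Each such call asks the server to render that revision's wikitext on the spot.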
Parsing is an extremely slow operation, so I don't think that making
parallel requests would be wise - please use dumps
<https://meta.wikimedia.org/wiki/Dumps>.
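A rough sketch of reading straight from a pages-meta-history dump instead of the live API (Python 3, standard library only; the file name and export namespace are placeholders, check the <mediawiki> tag of the dump you download):

    # Stream every revision of one page out of a history dump.
    import bz2
    import xml.etree.ElementTree as ET

    DUMP = "enwiki-latest-pages-meta-history1.xml.bz2"   # placeholder file name
    TITLE = "Barack Obama"
    NS = "{http://www.mediawiki.org/xml/export-0.8/}"    # varies by dump version

    def revisions(path, title):
        """Yield (revision id, timestamp, wikitext) for the given page title."""
        current_title = None
        with bz2.open(path, "rb") as source:
            for _, elem in ET.iterparse(source, events=("end",)):
                if elem.tag == NS + "title":
                    current_title = elem.text
                elif elem.tag == NS + "revision":
                    if current_title == title:
                        yield (elem.findtext(NS + "id"),
                               elem.findtext(NS + "timestamp"),
                               elem.findtext(NS + "text") or "")
                    elem.clear()                         # keep memory bounded
                elif elem.tag == NS + "page":
                    elem.clear()

    for rev_id, timestamp, wikitext in revisions(DUMP, TITLE):
        print(rev_id, timestamp, len(wikitext))

Note that a dump gives you the raw wikitext of each revision, so your tool would have to render it itself if the animation needs HTML.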
--
Best regards,
Max Semenik ([[User:MaxSem]])
_______________________________________________
Mediawiki-api mailing list
Mediawiki-api@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api