Hi,

This is a demo of the tool I'm building: https://googledrive.com/host/0B1hJO1N6piYFTTVZdW1mU2c0S28/visualise.html. It needs to be interactive and able to show the latest edits too, so I don't think dumps would serve the purpose. Is there any other alternative you could suggest? In the past I have tried JavaScript wikitext parsers without much success.
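For context, the live part of the workflow is the piece dumps can't cover: polling the API for a page's most recent revisions. A minimal sketch of that (my own, with the page title and parameter choices as examples) using the standard query API:

const API = "https://en.wikipedia.org/w/api.php";

async function latestRevisionIds(title: string, limit = 10): Promise<number[]> {
  // action=query&prop=revisions returns the newest revisions first.
  const params = new URLSearchParams({
    action: "query",
    prop: "revisions",
    titles: title,
    rvlimit: String(limit),
    rvprop: "ids|timestamp",
    format: "json",
    formatversion: "2",
  });
  const res = await fetch(`${API}?${params}`);
  const data = await res.json();
  // formatversion=2 returns pages as an array rather than a keyed object.
  return data.query.pages[0].revisions.map((r: { revid: number }) => r.revid);
}

latestRevisionIds("Barack Obama").then((ids) => console.log(ids));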
Thanks,
Jeph
On Sun, Nov 3, 2013 at 4:45 PM, Max Semenik <maxsem.wiki@gmail.com> wrote:
On 03.11.2013, 0:05 jeph wrote:
Hi,

As part of the visualisation tool I'm building, I'm fetching the parsed revisions of an article. When the article is of considerable size, e.g. the latest revisions of Barack Obama, a single request takes 10+ seconds. Since the tool is interactive and shows the edits made to an article as an animation, that server latency does not bode well. (The requests are read-only.)
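Each revision is fetched with something like the call below (a sketch; I'm assuming action=parse with oldid, which returns the rendered HTML of a single revision and is where the server-side parsing time goes):

async function parsedRevision(revid: number): Promise<string> {
  const params = new URLSearchParams({
    action: "parse",
    oldid: String(revid),
    prop: "text",           // only the rendered HTML is needed
    format: "json",
    formatversion: "2",
  });
  const res = await fetch(`https://en.wikipedia.org/w/api.php?${params}`);
  const data = await res.json();
  return data.parse.text;   // rendered HTML for that revision
}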
I'm currently not making parallel requests. What would be a reasonable degree of parallelism? Are there other ways to get around this latency issue?
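For example, a small worker pool like the one below could cap the number of in-flight requests (a sketch of my own, not an established recommendation; the limit of 2 in the usage note is an arbitrary example):

async function mapWithLimit<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  // Each worker repeatedly claims the next index; JS is single-threaded,
  // so the increment is safe without locking.
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker),
  );
  return results;
}

// e.g. fetch parsed revisions two at a time:
// const html = await mapWithLimit(revisionIds, 2, parsedRevision);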
https://meta.wikimedia.org/wiki/Grants:IEG/Replay_Edits talks about the
tool and the project.
Parsing is an extremely slow operation, so I don't think that making parallel requests would be wise. Please use dumps: https://meta.wikimedia.org/wiki/Dumps.
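As an illustration of the dump route (a minimal sketch, assuming Node.js with the sax npm package and a locally downloaded pages-meta-history dump; the file name is an example):

import * as fs from "fs";
import * as sax from "sax";

const stream = sax.createStream(true); // strict XML parsing
let inTitle = false;
let title = "";
let revisions = 0;

stream.on("opentag", (node) => {
  if (node.name === "title") { inTitle = true; title = ""; }
  if (node.name === "revision") revisions++;
});
stream.on("text", (text) => { if (inTitle) title += text; });
stream.on("closetag", (name) => {
  if (name === "title") inTitle = false;
  if (name === "page") {
    // One <page> element per article; report and reset the counter.
    console.log(`${title}: ${revisions} revisions`);
    revisions = 0;
  }
});

fs.createReadStream("enwiki-pages-meta-history.xml").pipe(stream);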
-- Best regards, Max Semenik ([[User:MaxSem]])