Hi,

As part of the visualisation tool I'm building, I'm fetching the parsed revisions of an article. When the article is of considerable size, e.g. the latest revisions of Barack Obama, a request takes 10+ seconds. As the tool is interactive and shows the edits made to an article as an animation, that kind of server latency does not bode well. (The requests are read-only.)

I'm currently not making parallel requests. What would be a reasonable degree of parallel requests? Are there other ways to get around this latency issue?

https://meta.wikimedia.org/wiki/Grants:IEG/Replay_Edits describes the tool and the project.

Thanks,
Jeph
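For reference, a minimal sketch in TypeScript (Node 18+ with global fetch) of the two API calls involved, listing revision IDs and fetching a revision's parsed HTML, plus a small helper that caps how many requests are in flight. The helper names and the concurrency cap are illustrative assumptions, not part of any existing tool.

    const API = "https://en.wikipedia.org/w/api.php";

    // List the IDs of the latest `limit` revisions of a page.
    async function fetchRevisionIds(title: string, limit: number): Promise<number[]> {
      const params = new URLSearchParams({
        action: "query",
        prop: "revisions",
        titles: title,
        rvprop: "ids",
        rvlimit: String(limit),
        format: "json",
      });
      const data = await (await fetch(`${API}?${params}`)).json();
      const page: any = Object.values(data.query.pages)[0];
      return page.revisions.map((r: any) => r.revid);
    }

    // Fetch the server-parsed HTML of one revision; this is the slow call.
    async function fetchParsedRevision(oldid: number): Promise<string> {
      const params = new URLSearchParams({
        action: "parse",
        oldid: String(oldid),
        prop: "text",
        format: "json",
      });
      const data = await (await fetch(`${API}?${params}`)).json();
      return data.parse.text["*"];
    }

    // Run `fn` over `items` with at most `limit` calls in flight at once.
    async function mapWithConcurrency<T, R>(
      items: T[],
      limit: number,
      fn: (item: T) => Promise<R>,
    ): Promise<R[]> {
      const results: R[] = new Array(items.length);
      let next = 0;
      const workers = Array.from({ length: limit }, async () => {
        while (next < items.length) {
          const i = next++;
          results[i] = await fn(items[i]);
        }
      });
      await Promise.all(workers);
      return results;
    }

A call like mapWithConcurrency(ids, 2, fetchParsedRevision) keeps at most two parse requests in flight, which is a conservative starting point for a shared API.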
On Sun, Nov 3, 2013 at 4:45 PM, Max Semenik <maxsem.wiki@gmail.com> wrote:
Parsing is an extremely slow operation, so I don't think that making parallel requests would be wise. Please use the dumps instead: https://meta.wikimedia.org/wiki/Dumps

-- Best regards, Max Semenik ([[User:MaxSem]])
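To illustrate the dumps suggestion, a sketch of streaming the revisions of one page out of a decompressed pages-meta-history XML dump without loading the file into memory. It assumes the npm sax package (an external dependency, installed with npm install sax; any streaming XML parser would do) and the element layout of the dump schema.

    import * as fs from "fs";
    import * as sax from "sax";

    // Stream revision timestamps and wikitext sizes for one page out of a
    // decompressed pages-meta-history XML dump.
    function streamRevisions(dumpPath: string, wantedTitle: string): void {
      const parser = sax.createStream(true); // strict mode, keeps tag names as-is
      const path: string[] = [];             // current element path in the XML tree
      let title = "";
      let timestamp = "";
      let text = "";

      parser.on("opentag", (node) => {
        path.push(node.name);
      });

      parser.on("text", (chunk) => {
        const here = path.join("/");
        if (here === "mediawiki/page/title") title += chunk;
        else if (here === "mediawiki/page/revision/timestamp") timestamp += chunk;
        else if (here === "mediawiki/page/revision/text") text += chunk;
      });

      parser.on("closetag", (name) => {
        if (name === "revision" && title === wantedTitle) {
          console.log(timestamp, `${text.length} bytes of wikitext`);
        }
        if (name === "revision") { timestamp = ""; text = ""; }
        if (name === "page") title = "";
        path.pop();
      });

      fs.createReadStream(dumpPath).pipe(parser);
    }

Note that dumps contain wikitext, not parsed HTML, so this still leaves the rendering step to be solved on the client side.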
On 04/11/13 18:22, jeph wrote:

Hi,

This is a demo of the tool I'm building: https://googledrive.com/host/0B1hJO1N6piYFTTVZdW1mU2c0S28/visualise.html. It needs to be interactive and able to show the latest edits too, so I don't think dumps would serve the purpose. Is there any other alternative you could suggest? In the past I have tried JavaScript wikitext parsers without much success.

Thanks,
Jeph
On Tue, Nov 5, 2013 at 12:14 AM, Platonides <platonides@gmail.com> wrote:

The right way would be to integrate that using Parsoid: https://www.mediawiki.org/wiki/Parsoid
On Sat, Nov 9, 2013 at 6:41 AM, jeph <jephpaul@gmail.com> wrote:

Hi,

Is there a wikitext-to-HTML parser in JS that's part of Parsoid that I can use?

Thanks,
Jeph
Hi Jeph,
Indeed, you can use Parsoid as just that: a tool that takes wikitext and outputs HTML. There's more info at https://www.mediawiki.org/wiki/Parsoid, and the Parsoid team is usually in #mediawiki-parsoid on freenode.
Cheers, Marc
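To make that concrete, a sketch of posting wikitext to a locally running Parsoid service. The port and endpoint path vary with the Parsoid version and its configuration, so treat the URL here as an assumption and check the Parsoid documentation linked above.

    // Send wikitext to a local Parsoid service and get an HTML document back.
    // The port (8000) and the v3 transform path are assumptions; confirm them
    // against your Parsoid version's documentation.
    async function wikitextToHtml(wikitext: string): Promise<string> {
      const res = await fetch(
        "http://localhost:8000/en.wikipedia.org/v3/transform/wikitext/to/html",
        {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ wikitext }),
        },
      );
      return res.text();
    }

Running Parsoid locally means each revision can be rendered without waiting on Wikimedia's parsers, which sidesteps the original latency problem.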
Mediawiki-api mailing list
Mediawiki-api@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api