Dirk Riehle wrote:
> Here's an interesting alternative implementation for
> MediaWiki/Wikipedia:
>
> * http://armstrongonsoftware.blogspot.com/2008/06/itching-my-programming-nerv…
> * http://video.google.com/videoplay?docid=6981137233069932108 (Wikipedia
>   discussion starts 30 min into the video)
>
> Basically, a p2p backend that claims order-of-magnitude performance gains
> for writing pages. They ignore the front caches etc. Done in Erlang (+ Java).
>
> I was trying to figure out whether this would really be feature parity,
> but couldn't fully see it.
>
> For the rendering, they use plog4u -- does someone know whether this has
> feature parity with MediaWiki (markup)? We used JAMWiki (a Java
> implementation of MediaWiki) only to see later that there was no
> ParserFunctions extension available. (Why is this an extension rather
> than a core part in the first place?)
If the only thing missing from JAMWiki was ParserFunctions, that would be
very impressive. ParserFunctions is simple. And indeed, there's a lot of
really impressive code in there, although it's easy to find edge cases
that don't work the same way.
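For anyone who hasn't used it: ParserFunctions adds a small set of conditional and arithmetic constructs to wikitext. A representative fragment (generic MediaWiki examples, not taken from JAMWiki) looks like:

    {{#if: {{{name|}}} | Hello, {{{name}}}! | Hello, stranger! }}
    {{#expr: 2 * 3 + 1 }}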
But I thought I'd better test its performance, before I got too excited
and started integrating it into MediaWiki. It turns out that it's full of
O(N^2) cases, which made my usual testing method using repeated text to
measure loop performance rather difficult.
For example, for the test text str_repeat("'''b''' ", 1000), JAMWiki
showed O(N^2) performance:
1000 iterations: 1148ms
2000 iterations: 3916ms
4000 iterations: 15320ms
For str_repeat("[http://a] ", 1000), it took so long that I gave up
waiting. MediaWiki does either of these things in linear time, on the
order of hundreds of microseconds per loop.
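The doubling method used above can be sketched as follows. This is a hypothetical harness, not JAMWiki's real API: ScalingProbe and its parse() stub are stand-ins (substitute the actual parser call to measure for real), and String.repeat needs Java 11+. If the time roughly quadruples when N doubles, the parser is O(N^2); if it roughly doubles, it's linear.

```java
// Minimal scaling probe: feed the parser repeated text at N, 2N, 4N
// and watch how the elapsed time grows between steps.
public class ScalingProbe {

    // Stand-in for the parser under test (hypothetical; a trivial O(N)
    // pass, NOT JAMWiki's real entry point).
    static String parse(String wikitext) {
        return wikitext.replace("'''", "<b>");
    }

    public static void main(String[] args) {
        for (int n = 1000; n <= 4000; n *= 2) {
            // Equivalent of PHP's str_repeat("'''b''' ", n)
            String input = "'''b''' ".repeat(n);
            long t0 = System.nanoTime();
            parse(input);
            long elapsedMs = (System.nanoTime() - t0) / 1_000_000;
            System.out.println(n + " iterations: " + elapsedMs + "ms");
        }
    }
}
```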
It's unfortunate that a modern parser generator for a supposedly fast
language like Java can't match hand-optimised PHP for speed. It's not like
we've set a high bar here.
-- Tim Starling