Has anyone ever published a performance evaluation of MediaWiki that you could point me to? I'd be curious to learn more about the 800 ms figure, etc. --Dirk
On 7/29/07, Simetrical <Simetrical+wikilist@gmail.com> wrote:
> On 7/29/07, Edward Z. Yang <edwardzyang@thewritingpot.com> wrote:
> > That being said, I imagine the biggest problem with a fully XHTML backend is making it performance-efficient (there are already tools out there that can validate HTML quite well, e.g. http://htmlpurifier.org). Parsing and instantiating DOMs is quite expensive.
> Um . . . you do realize that at last count it takes *800 ms* to parse a page of wikitext? This is not the right context for complaints about XML being complicated. ;)
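
As a rough illustration of the two costs being compared here (DOM instantiation versus the quoted 800 ms wikitext parse), a minimal timing sketch follows. It is an assumption-laden example, not a measurement from MediaWiki: the document size is made up, and Python's xml.dom.minidom stands in for whatever XML tooling a real backend would use.

    import time
    import xml.dom.minidom

    # Synthetic stand-in for a rendered page; size and content are assumptions.
    sample = "<html><body>" + "<p>Lorem ipsum dolor sit amet.</p>" * 5000 + "</body></html>"

    start = time.perf_counter()
    dom = xml.dom.minidom.parseString(sample)  # builds the full DOM tree in memory
    elapsed_ms = (time.perf_counter() - start) * 1000

    print("parsed %d characters into %d <p> nodes in %.1f ms"
          % (len(sample), len(dom.getElementsByTagName("p")), elapsed_ms))

The absolute number depends entirely on the machine and the document; the sketch only shows how such a figure could be measured and set next to the 800 ms wikitext parse for comparison.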