I've disabled the tokenizer in the 1.3 parser for now. The old doQuotes function seems to perform better.
Simplified benchmark results (ab -n100 -c2):
* 1.2: 0.84 req/s
* 1.3 with tokenizer: 0.43 req/s
* 1.3 with doQuotes: 0.74 req/s
There are mainly two issues that aren't done yet:
* timelines -> strip()?
* French space / number space formatting
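For the French spacing item, the usual typographic rule is a non-breaking space before "double" punctuation (: ; ! ?). A minimal sketch of what such a pass might look like; the function name and regex are my own illustration, not the actual MediaWiki code:

```python
import re

def french_spaces(text):
    """Sketch: replace a plain space before French 'double' punctuation
    (: ; ! ?) with a non-breaking space entity. Hypothetical helper,
    not the real parser implementation."""
    return re.sub(r' ([:;!?])', r'&nbsp;\1', text)

print(french_spaces('Voyez ceci : un exemple !'))
# Voyez ceci&nbsp;: un exemple&nbsp;!
```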
The {{foo{{bar}}}} issue is still outstanding; there are ideas on how to tackle it, but nothing is implemented yet.
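One possible way to handle nested template braces like {{foo{{bar}}}} is a stack-based scan that matches innermost pairs first. A rough sketch of the idea; the function and approach are my own, not the actual parser code:

```python
def find_template_spans(text):
    """Return (start, end) spans of {{...}} pairs, innermost first.

    Stack-based sketch for matching nested template braces such as
    {{foo{{bar}}}}; illustrative only, not the MediaWiki parser.
    """
    stack = []   # positions of unmatched '{{'
    spans = []
    i = 0
    while i < len(text) - 1:
        pair = text[i:i + 2]
        if pair == '{{':
            stack.append(i)
            i += 2
        elif pair == '}}' and stack:
            start = stack.pop()
            spans.append((start, i + 2))  # inner pairs close before outer ones
            i += 2
        else:
            i += 1
    return spans

# {{foo{{bar}}}}: the inner {{bar}} is matched before the outer pair
print(find_template_spans('{{foo{{bar}}}}'))
# [(5, 12), (0, 14)]
```

Expanding innermost spans first would let outer templates see the already-substituted result, which is one of the ideas floated for this case.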
Gabriel Wicke
wikitech-l@lists.wikimedia.org