On Wed, Jan 19, 2011 at 10:57 PM, Magnus Manske
<magnusmanske(a)googlemail.com> wrote:
> On Wed, Jan 19, 2011 at 8:25 PM, Platonides
> <Platonides(a)gmail.com> wrote:
>> Magnus Manske wrote:
>>> On my usual test article [[Paris]], the slowest section ("History")
>>> parses in ~5 sec (Firefox 3.6.13, MacBook Pro). Chrome 10 takes 2
>>> seconds. I believe these will already be acceptable to average users;
>>> optimisation should improve that further.
>>>
>>> Cheers,
>>> Magnus
>>
>> What about long tables?
>
> Worst-case scenario I could find:
>
> http://en.wikipedia.org/wiki/Table_of_nuclides_(sorted_by_half-life)#Nuclid…
>
> 4.7 sec in Chrome 10 on my iMac.
> 6.2 sec in Firefox 4 beta 9.
> 10.7 sec in Firefox 3.6.
>
> Could be worse, I guess...
Another update that might be of interest (if not, tell me :-)

I just went through my first round of code optimisation. Parsing speed
has improved considerably, especially for "older" browsers: Firefox
3.6 now parses [[Paris]] in 10 sec instead of 32 sec (YMMV).

WYSIFTW now also loads the wikitext and the image information from
the API in parallel, which reduces pre-parsing time.
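For illustration, the parallel loading step could be sketched like this. The function names and stubbed responses are assumptions, not WYSIFTW's actual code, and Promise.all is a modern convenience; the original would have fired two XMLHttpRequests and counted completions:

```javascript
// Sketch: start both API requests at once so the total pre-parse wait
// is the slower of the two requests, not their sum.
// The stub fetchers stand in for real calls to the MediaWiki API
// (e.g. api.php?action=query with prop=revisions / prop=imageinfo).
function fetchWikitext(title) {
  // A real implementation would request the page source from api.php.
  return Promise.resolve('== History ==\nwikitext of ' + title);
}

function fetchImageInfo(title) {
  // A real implementation would request image names, sizes and
  // thumbnail URLs for the page.
  return Promise.resolve({ page: title, images: [] });
}

// Kick off both requests in parallel; parsing starts once both arrive.
function loadForEditing(title) {
  return Promise.all([fetchWikitext(title), fetchImageInfo(title)])
    .then(function (results) {
      return { wikitext: results[0], imageInfo: results[1] };
    });
}
```

The point is only the overlap: with sequential requests the pre-parse delay is the sum of both round trips, whereas here it is the maximum of the two.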
For small and medium-sized articles, editing in WYSIFTW mode now often
loads (and parses) faster than the normal edit page loads (using
Chrome 10).

Cheers,
Magnus