On Tue, Jul 23, 2013 at 12:05 AM, Tim Starling <tstarling@wikimedia.org> wrote:
> I tried editing [[Argentina]] on my laptop just now, it took 45
> seconds of CPU time and 51 seconds of wall clock time before the
> percentage CPU usage began to drop. It's pretty slow.
Yes, that's why I said "performance on long pages can absolutely be
prohibitively poor", and I would qualify this 150K document as such.
:P About 30 seconds in Chrome on this system until I can start making
formatting changes, BTW.
For comparison, I'd also suggest copying the document into another
rich-text editing environment and observing its performance
characteristics. Google Docs, which is generally regarded as
state-of-the-art in this regard, took about 40 seconds (and threw a
"tab crashed" warning) to become responsive after I pasted in the
entire article. Once the document is active it is significantly more
responsive than VE, although still sluggish, and it also warns about
the article containing too many images.
Point being, it's a legitimately hard problem. And, to be fair, the
equivalent of performing document-level operations within wikitext
(loading the whole page and previewing your changes before saving)
isn't exactly lightning-fast. An AJAX live-preview on that page takes
about 12 seconds to generate.
> Is there any estimate as to how much development time it might take
> to improve performance by an order of magnitude or so, as seems to
> be required?
I'm not sure that goal is fully attainable, but I'd suggest folks from
the VE team weigh in with some of their thoughts on performance
strategies. As I understand it, one of the near-term improvements is
selective activation of the editing surface (in a manner that's
transparent to the user), which could significantly reduce the CPU and
memory footprint of operations that don't span the entire document.
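To illustrate the idea (this is only a toy sketch of lazy activation, not VE's actual architecture; the `Section`, `LazyDocument`, and `activate` names are made up for this example): the document is kept as a cheap model split into sections, and the expensive editable surface for a section is only built the first time an edit touches it, so a small edit never pays the whole-document activation cost.

```javascript
// Hypothetical sketch of selective (lazy) activation of an editing
// surface. Names and structure are illustrative, not VE's real API.

class Section {
  constructor(wikitext) {
    this.wikitext = wikitext; // cheap model kept for every section
    this.surface = null;      // expensive editable view, built on demand
  }
  activate() {
    if (this.surface === null) {
      // Stand-in for the costly step (building editable DOM, event
      // listeners, annotations, etc.).
      this.surface = { html: '<p>' + this.wikitext + '</p>' };
    }
    return this.surface;
  }
}

class LazyDocument {
  constructor(paragraphs) {
    this.sections = paragraphs.map((p) => new Section(p));
  }
  // Only the edited section pays the activation cost; all other
  // sections stay as cheap, inactive models.
  editSection(index, newText) {
    const surface = this.sections[index].activate();
    this.sections[index].wikitext = newText;
    surface.html = '<p>' + newText + '</p>';
  }
  activeCount() {
    return this.sections.filter((s) => s.surface !== null).length;
  }
}
```

Under this scheme, editing one paragraph of a thousand-paragraph page activates exactly one surface, which is where the CPU/memory savings would come from.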
Erik
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation