I don't know if anyone has suggested this before, but couldn't
the software be adapted to place more responsibility on the client?
I'm not saying this would be a trivial, instant solution, but as far as
I can tell it could make a huge difference in the long run.
Just for a start, imagine the possibilities of a JavaScript
wikicode parser. Preview pages could be generated on the fly without a
page reload. Pages could be sent to the client as raw wikicode (with
only message inclusions processed server-side) and parsed afterwards by
the client; they could even be switched to edit mode without sending a
new request. Implementing such a feature would surely reduce the load
on the servers.
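To give an idea of what I mean, a client-side parser for a tiny subset of wikicode (headings, bold, italics, internal links) might look something like the following. This is only a minimal sketch, not MediaWiki's actual grammar; the `parseWikicode` name, the regexes, and the HTML it emits are all my own assumptions:

```javascript
// Minimal, hypothetical wikicode-to-HTML sketch. Real wikicode is far
// richer (templates, tables, nested lists); this covers only a subset.
function parseWikicode(text) {
  return text.split('\n').map(function (line) {
    // "== Heading ==" -> <h2>Heading</h2>, level = number of '=' signs
    var heading = line.match(/^(={2,6})\s*(.*?)\s*\1$/);
    if (heading) {
      var level = heading[1].length;
      return '<h' + level + '>' + heading[2] + '</h' + level + '>';
    }
    var html = line
      // '''bold''' must be handled before ''italic''
      .replace(/'''(.+?)'''/g, '<b>$1</b>')
      .replace(/''(.+?)''/g, '<i>$1</i>')
      // [[Page|label]] and [[Page]] internal links
      .replace(/\[\[([^\]|]+)\|([^\]]+)\]\]/g, '<a href="/wiki/$1">$2</a>')
      .replace(/\[\[([^\]|]+)\]\]/g, '<a href="/wiki/$1">$1</a>');
    // Non-empty lines become paragraphs; blank lines stay blank
    return html === '' ? '' : '<p>' + html + '</p>';
  }).join('\n');
}
```

A real parser would need a proper tokenizer rather than regex passes, but even this much would be enough to experiment with client-side previews.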
I don't have much experience with JavaScript and the DOM, but I'll
begin doing some experiments this week in my free time. I have seen
XML parsers written in JS, so a wikicode parser shouldn't be that hard
to write, or should at least be possible.
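The no-reload preview part of the idea could be wired up roughly like this. Everything here is hypothetical: the element ids, the `renderPreview` name, and the pass-through parser used in the wiring are placeholders, not anything MediaWiki actually provides:

```javascript
// Pure part: build preview HTML from the edit box contents, given any
// wikicode-to-HTML function. Kept separate from the DOM so it can be
// tested anywhere.
function renderPreview(wikitext, parse) {
  return '<div class="preview">' + parse(wikitext) + '</div>';
}

// Browser-only wiring, guarded so the file also loads outside a browser:
// re-render the preview on each keystroke instead of a server round-trip.
if (typeof document !== 'undefined') {
  var box = document.getElementById('editbox');   // hypothetical textarea id
  var out = document.getElementById('preview');   // hypothetical output div
  box.onkeyup = function () {
    // Identity "parser" as a stand-in for a real client-side one
    out.innerHTML = renderPreview(box.value, function (t) { return t; });
  };
}
```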
I know the common objections to an idea like this: it depends on
JS-enabled browsers, and some JS interpreters might not behave the
same as others. But this should not be seen as a replacement for the
server-side processor, just an off-by-default alternative. If it were
there and proved useful in relieving the servers, I bet most caring
users would give it a shot.
Just my 2 cents. By the way, forgive my broken English.
-Pedro Fayolle
On Mon, 17 Jan 2005 10:48:01 -0800 (PST), Rich Holton
<rich_holton(a)yahoo.com> wrote:
We are now well into the fourth day since I originally expressed my
concern about the performance issues on en.wikipedia. While the
situation may have improved a bit, and some of the ugliest error
messages seem to be reduced, the situation is still far from being
resolved.
Attempting to do any useful work on Wikipedia has become futile.
Article load times have become erratic, and are frequently very bad to
intolerable. Were this any site other than Wikipedia, I would never
have known it was working at all, since I would fairly quickly have
written it off as "down".
I am committed to Wikipedia and its objectives. I will tolerate these
performance delays. But I believe that there must be many who will not
tolerate them, and will write off Wikipedia. Is this what we want?
While it appears that the developers are very busy attempting to solve
the technical issues, there seems to be little actual progress. How
long will we allow this situation to continue without attempting
solutions of a different sort? A good operational definition of
insanity is "continuing the same actions expecting different results."
If our developers lack time, resources, or skill needed to resolve the
issues, then let us immediately hire a temporary consultant to focus on
this problem and resolve it.
We owe our developers great thanks, appreciation, and respect. But a
big part of that lies in recognizing when we are expecting them to
accomplish what is impossible. Given their restrictions on time,
resources, and skill, are we asking them to do the impossible?
With respect, concern, and frustration
Rich Holton
(en.Wikipedia:User:Rholton)
_______________________________________________
foundation-l mailing list
foundation-l(a)wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/foundation-l