On Tue, Jan 3, 2012 at 11:36 AM, Roan Kattouw <roan.kattouw(a)gmail.com> wrote:
> On Tue, Jan 3, 2012 at 8:24 PM, Brion Vibber <brion(a)pobox.com> wrote:
>> I would rather have real, working client-side processing support;
> Me too, but I don't like the idea of Yet Another Miniparser that
> parses a narrow subset of wikitext. I'm in a meeting now and we've
> decided to just go ahead with the existing client-side parser while I
> experiment with moving stuff server-side.
>> among other things, for offline behavior it's nice not to have to rely on
>> preprocessing. (If the preprocessing is done during message loading, then
>> it's probably less of a deal for MediaWiki itself, which still relies on a
>> server to load up the messages in the first place.)
> Preprocessing would definitely be done in the message loading phase,
> which means the result is heavily cached in Varnish, and that we
> wouldn't have to, say, make an AJAX request back to the server for
> each invocation (that would be crazy).
+1, that should work pretty well.
>> However we also use client-side JS localization for our Android
>> application (soon to migrate to other platforms), which uses a
>> pruned-down mediawiki.js for its localization framework. Having
>> PLURAL work there would be a plus.
> Server-side processing doesn't preclude PLURAL from working. You could
> still feed messages with PLURAL to mw.msg(); the syntax used to
> specify the message would just be a bit weird: you'd essentially have
> to feed it a mini parse tree where the PLURAL invocation is a separate
> element. But preprocessing messages by hand like that is not hard.
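Right, consuming a pre-split message like that on the client is pretty trivial. A rough sketch of what I have in mind (the node shapes and function names here are purely illustrative, not actual mediawiki.js internals):

```javascript
// English-only plural rule for the sketch: singular for 1, plural otherwise.
function pluralForm( count, forms ) {
	return count === 1 ? forms[ 0 ] : ( forms[ 1 ] !== undefined ? forms[ 1 ] : forms[ 0 ] );
}

// Evaluate a mini parse tree: plain strings pass through, 'param' nodes
// substitute $N parameters, and 'plural' nodes pick a form by count.
function evalMessage( tree, params ) {
	return tree.map( function ( node ) {
		if ( typeof node === 'string' ) {
			return node;
		}
		if ( node.type === 'param' ) {
			return String( params[ node.index - 1 ] );
		}
		if ( node.type === 'plural' ) {
			return pluralForm( Number( params[ node.index - 1 ] ), node.forms );
		}
		return '';
	} ).join( '' );
}

// "Show $1 {{PLURAL:$1|comment|comments}}" preprocessed by hand:
var msg = [
	'Show ',
	{ type: 'param', index: 1 },
	' ',
	{ type: 'plural', index: 1, forms: [ 'comment', 'comments' ] }
];

evalMessage( msg, [ 1 ] ); // "Show 1 comment"
evalMessage( msg, [ 5 ] ); // "Show 5 comments"
```

The real plural rules are per-language, of course; the point is just that once PLURAL is a separate tree element, no wikitext parsing happens at mw.msg() time.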
Probably we can throw a preprocessor into the app's localization loader
stub, and that'll work for both cases. That should cover the cases I'm
worried about just fine.
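For simple messages the loader-stub preprocessor could be something like this (the regex and output node shape are illustrative only; a real one would need to handle nesting and escaping):

```javascript
// Hypothetical loader-stub preprocessor: splits a raw message string into
// plain-text chunks, $N placeholder nodes, and PLURAL nodes, so the
// client-side message formatter never has to parse wikitext itself.
function preprocessMessage( text ) {
	var tree = [];
	var re = /\{\{PLURAL:\$(\d+)\|([^}]*)\}\}|\$(\d+)/g;
	var last = 0;
	var m;
	while ( ( m = re.exec( text ) ) !== null ) {
		if ( m.index > last ) {
			tree.push( text.slice( last, m.index ) );
		}
		if ( m[ 1 ] !== undefined ) {
			// {{PLURAL:$N|form1|form2|...}}
			tree.push( { type: 'plural', index: Number( m[ 1 ] ), forms: m[ 2 ].split( '|' ) } );
		} else {
			// Bare $N placeholder
			tree.push( { type: 'param', index: Number( m[ 3 ] ) } );
		}
		last = re.lastIndex;
	}
	if ( last < text.length ) {
		tree.push( text.slice( last ) );
	}
	return tree;
}

preprocessMessage( 'Show $1 {{PLURAL:$1|comment|comments}}' );
// → [ 'Show ', { type: 'param', index: 1 }, ' ',
//     { type: 'plural', index: 1, forms: [ 'comment', 'comments' ] } ]
```

Run once at message-load time, this gives the same pre-split trees whether the messages come from the server or from the app's bundled message files.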
> I'll write another post to this thread where I outline my proposed
> approach. I was gonna Just Do It, but it probably can't be pulled off
> in time, so we're just gonna deploy the existing JS stuff to the site
> (next week, I think?).
whee!
-- brion