Tim Starling wrote:
Michael Dale wrote:
That is part of the idea of centrally hosting reusable client-side
components: we control the jQuery version and plugin set, so a new
version won't "come along" until it's been tested and integrated.
You can't host every client-side component in the world in a
subdirectory of the MediaWiki core. Not everyone has commit access to
it. Nobody can hope to properly test every MediaWiki extension.
Most extension developers write an extension for a particular site,
and distribute their code as-is for the benefit of other users. They
have no interest in integration with the core. If they find some
jQuery plugin on the web that defines an interface that conflicts with
MediaWiki, say jQuery.load() but with different parameters, they're
not going to be impressed when you tell them that to make it work with
MediaWiki, they need to rewrite the plugin and get it tested and
integrated.
Different modules should have separate namespaces. This is a key
property of large, maintainable systems of code.
Right... I agree the client-side code needs to be more modular and
deployable. If a given component is designed as a jQuery plugin, then I
think it makes sense to put it in the jQuery namespace ... otherwise you
won't be able to reference jQuery things in a predictable way.
Alternatively you ...
I agree that
the present system of parsing the top of the JavaScript
file on every script-loader generation request is unoptimized.
(The idea is that those script-loader generation calls happen
rarely, but even so the result should be cached at any number of
levels: checking the file-modification timestamp, writing out a
PHP or serialized file, or storing it in any of the other cache
layers we have available: memcache, the database, etc.)
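
A minimal sketch of that mtime-based caching idea, with hypothetical
function and path names (this is not the actual script-loader code):

<?php
// Reuse the serialized class map while the source .js is unchanged;
// only re-parse when the file-modification timestamp moves forward.
function getJsClassMap( $jsFile, $cacheFile ) {
	if ( is_file( $cacheFile ) && filemtime( $cacheFile ) >= filemtime( $jsFile ) ) {
		return unserialize( file_get_contents( $cacheFile ) );
	}
	// Cache miss: re-parse the JavaScript for class definitions
	// (parseJsClassDefinitions is a stand-in for the real parse step)
	// and write the result out so later requests skip the parse.
	$classMap = parseJsClassDefinitions( file_get_contents( $jsFile ) );
	file_put_contents( $cacheFile, serialize( $classMap ) );
	return $classMap;
}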
Actually it parses the whole of the JavaScript file, not the top, and
it does it on every request that invokes WebStart.php, not just on
mwScriptLoader.php requests. I'm talking about
jsAutoloadLocalClasses.php if that's not clear.
Ah right... previously I had it in PHP. I wanted to avoid listing it
twice, but obviously that's a pretty costly way to avoid the duplication.
It will make more sense to put the list in PHP if we start splitting
components into the extension folders and generating the path list
dynamically for a given feature set.
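
Something along these lines, say (variable and path names are
illustrative, not the current code):

<?php
// A PHP-side class-to-path registry, so the list never has to be
// re-parsed out of the JavaScript itself:
$wgJSAutoloadLocalClasses['wikibits'] = 'skins/common/wikibits.js';
// ...and an extension registering its own scripts from its folder:
$wgJSAutoloadLocalClasses['mvEmbed'] = 'extensions/MetavidWiki/skins/mv_embed.js';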
Have you looked at the profiling? On the Wikimedia app servers,
even the simplest MW request takes 23ms, and gen=js takes 46ms. A
static file like wikibits.js takes around 0.5ms. And that's with
APC. You say MW on small sites is OK; I think it's slow and
resource-intensive.
That's not to say I'm sold on the idea of a static file cache; it
brings its own problems, which I listed.
Yeah... but almost all script-loader requests will be cached. A hit
does not need to check the DB or anything; it's just a key-to-file
lookup (script-loader requests pass a request key, and either it's
there in the cache or it's not). That should be on par with the
simplest MW request, which is substantially shorter than the
round-trip time for fetching each script individually, not to mention
gzipping, which can't otherwise be easily enabled for third-party
installations.
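
In rough PHP, the hit path I'm describing is just a file check
(parameter and path names here are illustrative):

<?php
// The request key maps straight to a file on disk: no DB, no parsing.
$requestKey = md5( $_GET['class'] . '|' . $_GET['urid'] );
$cacheFile  = $wgScriptCacheDirectory . '/' . $requestKey . '.js';
if ( is_file( $cacheFile ) ) {
	header( 'Content-Type: text/javascript' );
	readfile( $cacheFile ); // hit: stream the pre-built combined file
	exit;
}
// miss: build the combined output, write $cacheFile, then serve it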
I don't think that comparison can be made so lightly. For the
server operator, CPU time is much more expensive than time spent
waiting for the network. And I'm not proposing that the client fetches
each script individually; I'm proposing that scripts be concatenated
and stored in a cache file which is then referenced directly in the HTML.
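
Concretely, the scheme is something like this (file names and layout
invented for illustration):

<?php
// Concatenate once at page-render time, write a file the web server
// can serve directly, and emit a plain <script> link, so cache hits
// never invoke PHP at all.
$files = array( 'wikibits.js', 'ajax.js', 'mwsuggest.js' );
$hash  = md5( implode( '|', array_map( 'filemtime', $files ) ) );
$cache = "cache/combined-$hash.js";
if ( !is_file( $cache ) ) {
	$buf = '';
	foreach ( $files as $f ) {
		$buf .= file_get_contents( $f ) . ";\n"; // ';' guards against scripts missing one
	}
	file_put_contents( $cache, $buf );
}
echo '<script type="text/javascript" src="/w/' . $cache . '"></script>';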
I understand. We could even check gzip support at page-output time
and point to the gzipped cached versions (analogous to making direct
links to the /script-cache folder in the present script-loader
setup).
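
For example (a sketch; the web server still has to be configured to
send Content-Encoding: gzip for the .gz variant, e.g. with Apache's
AddEncoding):

<?php
// Pick the pre-gzipped variant when the client advertises support;
// file names are invented for illustration.
$clientGzip = isset( $_SERVER['HTTP_ACCEPT_ENCODING'] )
	&& strpos( $_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip' ) !== false;
$src = $clientGzip
	? '/script-cache/combined-abc123.js.gz'
	: '/script-cache/combined-abc123.js';
echo '<script type="text/javascript" src="' . $src . '"></script>';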
My main question is: how will this work for dynamic groups of scripts,
requested after page load, that are dictated by user interaction or
client state? It's not as easy to set up static combined output files
to point to when you don't know ahead of time which set of scripts
will be requested.
$wgSquidMaxage is set to 31 days (2678400 seconds) for all wikis
except wikimediafoundation.org. It's necessary to have a very long
expiry time in order to fill the caches and achieve a high hit rate,
because Wikimedia's access pattern is very broad, with the "long tail"
dominating the request rate.
Okay... so to preserve a high cache-hit rate you could then have a
single static file that lists the current JS versions, served with a
low expiry, while the rest keeps a high expiry? Or maybe it's so cheap
to serve static files that it doesn't matter, and everything could just
be left with a low expiry?
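
Something like this is what I have in mind (a sketch of my reading of
the first option; names and numbers are invented):

<?php
// A tiny version manifest served with a short expiry, pointing at
// versioned script URLs that can keep the full 31-day expiry.
header( 'Cache-Control: s-maxage=300, max-age=300' ); // cheap to re-fetch
echo json_encode( array(
	'wikibits' => '/script-cache/wikibits-r55234.js', // long expiry
	'mv_embed' => '/script-cache/mv_embed-r55234.js',
) );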
--michael