Erm... never, EVER try to write a long e-mail on a Wii... Especially when your Wi-Fi occasionally disconnects, which ends up killing everything you've written.
Well onto the topic...
I've noticed that the number of extensions using JS libraries has increased recently. Notably Semantic MediaWiki/Semantic Forms, and SocialProfile. Additionally, I was contracted to create a new mp3-playing extension because all the current ones break the lines (the requester wants to let the music play inline, basically beside a normal link to an audio file, instead of needing a plugin on their computer, or a big player that takes up unneeded space)... So I found the mp3inline http://pjoe.net/software/mp3inline/ Wordpress plugin, and intend to adapt some of it into a MediaWiki extension which will automatically make audio links playable inline, with an icon located cleanly beside the link. Of course, the note on this topic is that the player uses Scriptaculous, which is yet another JS library that would be put into MW.
Various extensions use different methods of including the libraries they need; mostly they just require the user to find a way to put the library in. However, SocialProfile includes a YUI extension which can be used. That extension, though, is basically just a short bit of code that includes a single script: a minified copy of the basic required YUI code, plus an unminified version of the animation package (why they used the minified version for one half, and the full version of the other part, is beyond me)...
The biggest issue with any of these that I see... is load time. For all of them you need to add a bunch of script tags to the page for them to work, and suddenly you drastically increase the number of HTTP calls for stuff on your site.
Since things are growing, I was thinking it would be a good idea to add some stuff to core to allow extensions to make use of JS libraries in an intuitive way. I started an allinone.php extension a while ago (inspired by Wikia's allinone.js idea) and was thinking I should probably rewrite it and make something good for core. The idea is to have a single script in the page which contains all of the needed JS libraries... and even wikibits.js inside it... all of them minified to compact space... Of course, if you need to debug any errors or anything, simply reload the page with &allinone=0 and the system automatically includes the separate non-minified files in individual script tags for debugging. Perhaps even a no-allinone preference for those doing heavy debugging in areas where they have a post request they can't add &allinone=0 to.
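The &allinone=0 toggle described above could look something like this sketch (JavaScript for illustration; the query handling and file names are hypothetical):

```javascript
// Decide between one combined, minified script tag (the default) and
// individual unminified tags for debugging. Query parsing is simplified;
// "/allinone.js" and the file list are made-up names.
function headScripts(query, files) {
  const allinone = query.allinone !== '0';
  if (allinone) return ['<script src="/allinone.js"></script>'];
  return files.map(f => `<script src="/js/${f}"></script>`);
}

console.log(headScripts({}, ['wikibits.js', 'yahoo.js']));
console.log(headScripts({ allinone: '0' }, ['wikibits.js', 'yahoo.js']));
```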
Additionally, the system would have an understanding of the structure of a JS library. Basically, a sort of definition module would be created for each library that people may use (YUI, jQuery, Scriptaculous, Prototype, etc.) which would outline things like: the different parts of the system (the core file, plus individual parts like ui or other things only needed sometimes), the separation of full/minified files (perhaps a notion of debug like what YUI has), and files like YUI's utilities.js or yahoo-dom-event.js, which are minified versions of a grouping of various parts of the library.
And extensions would use calls like, say... making the thing handling this called "JSLibs" just for this example... JSLibs::addUse( 'YUI', 'animation' );, which would note that YUI's animation bit is required on the page. The system would then automatically know that the 'yahoo' bit is also needed; additionally, if various other bits like dom, event, etc. are needed, it would automatically use one of the combined files instead of the individual ones.
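A rough sketch of the dependency resolution such a JSLibs::addUse() would need, written here in JavaScript for illustration. The dependency map is an assumption modelled on YUI 2's module layout; in the real thing this data would live in each library's definition module.

```javascript
// Hypothetical per-library dependency map: module name -> modules it needs.
const yuiDeps = {
  yahoo: [],
  dom: ['yahoo'],
  event: ['yahoo'],
  animation: ['yahoo', 'dom', 'event'],
};

// Depth-first resolution: dependencies always come before their dependents,
// and nothing is emitted twice.
function resolve(modules) {
  const seen = new Set();
  const order = [];
  function visit(mod) {
    if (seen.has(mod)) return;
    seen.add(mod);
    for (const dep of yuiDeps[mod] || []) visit(dep);
    order.push(mod);
  }
  for (const m of modules) visit(m);
  return order;
}

// If the resolved set covers yahoo + dom + event, a real implementation
// could substitute the combined yahoo-dom-event.js file for the three
// individual ones, as suggested above.
console.log(resolve(['animation']));
```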
Of course, there is a little bit of per-use optimization that things using the libs need to do... Primarily because some things are needed at some times and not at others... But if you don't define good rules for when something should be included, then the number of varying allinone groups you have increases, and you end up with more stuff for the browser to cache and more requests to the server.
So basically:
* Skins... For the JS libraries that they require, they should include the libraries all the time when inside of that skin. (There'll be code to let skins define what they need inside of the skin definition.)
* Site scripts... When JS libraries are wanted for site scripting, they should be included using calls inside of LocalSettings.php, and included all the time.
* Extensions... It depends on what kind of extension:
** For low-use things inside articles, like perhaps a TagCloud which is likely only to be used on a few major pages, the libraries should only be included when needed (ie: the thing needing them is parsed into existence).
** For special-page stuff, and things meant only for edit pages and the like, the libraries should always be included while on those pages, but not in general while reading articles.
** For high-use things, like SMW's attributes, factboxes, and such... the libraries should be included 100% of the time. Of course, if you really want you can put in some exclusions for when on special pages... but these are used a high amount of the time, and can easily add up the number of variations.
If you don't understand what I mean... it occurs when multiple extensions of different types are used. For example, say we had a low-use tag cloud, and something like SMW which included dynamic stuff every time an attribute was used. If the tag cloud loaded only when needed, and SMW included only when an attribute was used, then we'd have these variations:
* One for when the tag cloud and SMW attributes are both used (main pages mostly)
* One for when the tag cloud isn't used, but SMW attributes are (most article pages)
* One for when the tag cloud is used, but SMW attributes are not (extremely rare case)
* And one for when neither the tag cloud nor SMW attributes are used (another rare case)
Those last two shouldn't exist... They only exist because one extension didn't define properly when its stuff should be included. If the example SMW had loaded its libraries 100% of the time on articles, because of its high use, then there would only be two variations: one with the tag cloud and one without.
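The variation explosion described above is just combinatorics: every independently-conditional include group doubles the number of distinct combined files a cache has to hold. A toy illustration (group counts taken from the tag cloud / SMW example):

```javascript
// Each independently-conditional include group can be present or absent,
// so n conditional groups produce 2^n possible combined-file variants.
function variantCount(conditionalGroups) {
  return 2 ** conditionalGroups;
}

console.log(variantCount(2)); // tag cloud AND SMW both conditional -> 4 variants
console.log(variantCount(1)); // SMW always loaded on articles -> only 2 variants
```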
Another issue is minification... Not everything comes with a minified counterpart... I was hoping to make this something which could be done automatically. However, I found out that most of the minification programs that seem to be good run in other languages like Java, rather than having PHP ports. So perhaps a toolserver service would be nice: one allowing extension authors to submit a page of code in a form and have it return a minified version using the best program for the job. That way people developing scripts for distribution can ship their extension with pre-minified code, rather than requiring the people using the extension to download something to minify the code on their own. ^_^ And yes, of course we'd have a minified version of wikibits.js... We include it 100% of the time; why waste bytes on the comments and whitespace? Especially when a non-minified/minified split allows us to put nice literate documentation inside the code, while still keeping the end result extremely compact.
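For illustration only, here is an extremely naive whitespace/comment stripper in JavaScript. Real minifiers (JSMin, YUI Compressor, Packer) parse the source properly and handle strings and regex literals, which this sketch does not, so don't use it on real code.

```javascript
// Naive "minifier": drop block comments, whole-line // comments, and
// leading/trailing whitespace. This breaks on // inside string literals;
// it exists only to show what minification removes.
function naiveMinify(src) {
  return src
    .replace(/\/\*[\s\S]*?\*\//g, '')   // strip block comments
    .replace(/^\s*\/\/.*$/gm, '')       // strip whole-line comments
    .split('\n')
    .map(line => line.trim())
    .filter(line => line.length > 0)
    .join('\n');
}

const code = `// add two numbers
function add( a, b ) {
    /* trivial */
    return a + b;
}`;

console.log(naiveMinify(code));
console.log(naiveMinify(code).length < code.length); // true
```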
On Tue, Mar 18, 2008 at 7:13 AM, DanTMan dan_the_man@telus.net wrote:
The idea is to have a single script in the page which contains all of the needed JS Libraries... And even wikibits.js inside it... All of them minified to compact space... Of course, if you need to debug any errors or anything, simply reload the page with &allinone=0 and the system automatically includes the separate non-minified files in individual script tags for debugging. Perhaps even a no-allinone preference for those doing heavy debugging in areas where they have a post request they can't add &allinone=0 to.
Maybe. Does anyone have any hard figures on how much good minification actually does, given that the files are being compressed? Even if you tout these extra URL options, which are a pain for developers, every user who reports a JS error will report it as being on line 1 of the file, etc. I'm not convinced it will be worth it if we gzip, especially if we use consistent tab and brace conventions (having "if( " all the time will compress better than sometimes "if( ", sometimes "if (", sometimes "if(", sometimes "if ( ").
That assumes compression, though, when we don't seem to be doing it. IE6 apparently has issues with it under some circumstances. http://support.microsoft.com/kb/321722/EN-US/ isn't an issue for wikibits.js and so on, since we rename that when we update it, but it might be an issue for generated CSS/JS. http://support.microsoft.com/default.aspx?scid=kb;en-us;Q312496 is solved in the latest SP for IE6, and anyway seems like it would break the HTML page too. At any rate it should be safe to compress for anything but IE6, unless there are reported problems with IE7 or non-IE browsers. I'm guessing IE6 only accounts for something like 20% or 30% of our traffic by now? This avenue should definitely be looked at before minification, IMO.
The extra HTTP hits are an issue, however, mainly because browsers will only send two HTTP requests at once to the same server. (Some people have gone so far as to host each couple of includes on a different domain.) Combining things into the same file where possible would certainly be a win.
If the example SMW had loaded it's libraries 100% of the time when on articles because of the high use of it... Then there would only be two variations, one for with tag cloud, and one for when it's not...
If a large chunk of included CSS/JS (henceforth, "includes") is needed rarely enough that we don't want to load it on the first view unless necessary, then it would almost certainly be better to include it as a second file in the case when it is needed, not maintain a different version of the combined file. This allows many fewer bytes to be transferred, assuming that the stock includes will often be cached already and the rarely-needed chunk is large.
In some cases there may be some includes that must *not* be loaded on some page views, particularly skin and user includes. Skin-specific includes should be bundled with wikibits.js, etc., with multiple versions maintained for each skin. User-specific includes that really are different from user to user probably need to be served as a separate file.
On Tue, Mar 18, 2008 at 10:01 AM, Simetrical Simetrical+wikilist@gmail.com wrote:
Maybe. Does anyone have any hard figures on how much good minification actually does, given that the files are being compressed? Even if you tout these extra URL options, where are a pain for developers, every user who reports a JS error will report it as being on line 1 of the file, etc. I'm not convinced it will be worth it if we gzip, especially if we use consistent tab and brace conventions (having "if( " all the time will compress better than sometimes "if( ", sometimes "if (", sometimes "if(", sometimes "if ( ").
I forgot to mention the bug for this: https://bugzilla.wikimedia.org/show_bug.cgi?id=12250
min = YUI Compressor
jsmin = JSMin
pak = Packer

                    normal    gzipped   reduction
wikibits.js         40.9 KB   12.1 KB   70.4%
wikibits.min.js     25.5 KB    7.1 KB   72.1%
  reduction         37.6%     41.2%     82.6%
wikibits.jsmin.js   26.0 KB    7.3 KB   71.9%
  reduction         36.4%     39.6%     82.2%
wikibits.pak.js     25.5 KB    7.2 KB   71.8%
  reduction         37.7%     40.5%     82.4%
I may have made a mistake on a few of the percentages (just rounding errors)... but minifying does shave at least a quarter off the file size. That saving would grow even more if we decided to document everything inside wikibits.js nicely, in a format we could run something like MW's doxygen through, to get some nice documentation on the JS inside of wikibits. Additionally, as you can see, on the gzipped files minification shaves off over 40% rather than just a quarter. And honestly, as noted, we can't send a gzipped file 100% of the time, so minification is good since we still get some reduction even when the browser does not support gzip. And when it does, we end up with less than 1/5 the size of the original file.
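A quick cross-check of the table's arithmetic (sizes taken from the table above; the small 0.1% disagreements are the rounding slips acknowledged here):

```javascript
// Sizes in KB from the wikibits.js measurements above.
const normal = 40.9, minified = 25.5;     // raw sizes
const normalGz = 12.1, minifiedGz = 7.1;  // gzipped sizes

// Percentage reduction going from size a to size b.
const pct = (a, b) => ((1 - b / a) * 100).toFixed(1) + '%';

console.log(pct(normal, minified));      // minification alone: ~37.7%
console.log(pct(normalGz, minifiedGz));  // minification on top of gzip: ~41.3%
console.log(pct(normal, minifiedGz));    // both combined, vs the raw file: 82.6%

// "less than 1/5 the size of the original file":
console.log(minifiedGz / normal < 0.2);  // true
```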
Do remember also that gzipping should improve even more when we group the files together. So file grouping + minification + gzip where possible will result in some astonishing file size differences. (Also, if PHP has the gzip module set up, I can probably get it to send gzip when supported... even if mod_gzip isn't on the webserver.)
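The "send gzip when supported" decision the PHP side would have to make boils down to content negotiation, sketched here in JavaScript. The IE6 exclusion follows the KB issues mentioned earlier in the thread; the header parsing is deliberately simplified.

```javascript
// Only gzip when the client advertises support in Accept-Encoding, and
// skip IE6, which has known compression bugs (see the KB links earlier).
// Real negotiation also has to honour q-values; this sketch ignores them.
function shouldGzip(acceptEncoding, userAgent) {
  const supportsGzip = /\bgzip\b/.test(acceptEncoding || '');
  const isIE6 = /MSIE 6\./.test(userAgent || '');
  return supportsGzip && !isIE6;
}

console.log(shouldGzip('gzip, deflate', 'Mozilla/5.0 Firefox/2.0'));          // true
console.log(shouldGzip('gzip', 'Mozilla/4.0 (compatible; MSIE 6.0)'));        // false
console.log(shouldGzip('', 'Mozilla/5.0 Firefox/2.0'));                       // false
```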
Hmmm... multiple includes rather than one file... Ok... but that's going to need a bit more complex logic than before to make sure that dependencies are included in the right order. Additionally, I'll have to set up some sort of notion of a dependency tree, where including one thing will make sure that other needed things are included, and that they are included inside the current scope, but not moved to the current scope if they are already included in a broader scope.
Additionally, it would probably be best to collapse a few more of those groups together. The final set, in order of load, should probably be:
- 100% (wikibits, site scripts [libraries wanted for site scripts, or things like sortable or collapse, not common.js], things actively used in articles [ie: SMW, but not TagCloud])
- skin stuff (any libraries that the skin uses to do dynamic stuff)
- per-page (any library, plus the actual scripts which pages use, with a dependency chain)
The important notes would be:
* Having the Articles set, which was previously included everywhere but on special pages, isn't such a good idea unless you are putting it all inside one file. It adds a side scope, and is the most likely place for a dependency issue to arise: either a script will be loaded before the stuff it needs, or the dependency logic will force things into a more general scope, causing multiple variations as a result of trying to prevent dependency load issues. So it would be best for anything used in articles (which make up most of the views anyway) to be loaded all the time.
* The notion of user includes is similar to article-only includes... It's a side scope, and is likely to cause issues with the dependency logic for library use cases. So normal user stuff may end up on its own... but it will be loaded after everything else, and outside the dependency logic, so it will not support conditionally loading certain JS libraries for user use. If a site wants to let its users make user scripts, it would be best to include the libraries those users should have inside the 100% group, so they are always there for use.
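A sketch of how the three load groups above might be assembled into script tags, in load order. The group contents and file names are purely illustrative, not a real configuration:

```javascript
// The three load groups from the list above, in the order they must load.
// Group membership and file names are made up for illustration.
const groups = [
  { name: 'always',  scripts: ['wikibits.min.js', 'site.min.js', 'smw.min.js'] },
  { name: 'skin',    scripts: ['monobook-dynamic.min.js'] },
  { name: 'perPage', scripts: ['tagcloud.min.js'] },
];

// Emit script tags for the groups active on this page view, preserving
// the fixed group order so dependencies always load first.
function scriptTags(activeGroups) {
  return groups
    .filter(g => activeGroups.includes(g.name))
    .flatMap(g => g.scripts)
    .map(src => `<script src="${src}"></script>`);
}

console.log(scriptTags(['always', 'skin']));
```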
~Daniel Friesen(Dantman) of: -The Gaiapedia (http://gaia.wikia.com) -Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG) -and Wiki-Tools.com (http://wiki-tools.com)
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
of the time, so having minification is good since we still get some reduction even when the browser does not support gzip. And when it does, we end up with less than 1/5 the size of the original file.
How does that compare to total bytes per page impression? My guess would be: negligible. One thumbnail on the page and the JS savings become marginal in comparison. Plus the JS should be cached by the browser (as opposed to new thumbnails on different pages). Bottom line, minification makes JS debugging a pain in the a.. for a comparatively low benefit.
On Wed, Mar 19, 2008 at 1:10 AM, DanTMan dan_the_man@telus.net wrote:
I may have made a mistake on a few of the percentages (just rounding errors)... But minifying does shave at least 1/4 the size off the file. Additionally, that would grow even more if we decided to document everything inside of wikibits.js nicely in a format we could throw something similar to MW's doxygen through, to get some nice documentation on the JS inside of wikibits. Additionally, as you can see, when gzipped rather than shaving 1/4 the file, we shave nearly 1/2 the file off. And honestly, as noted, we can't send a gzipped file 100% of the time, so having minification is good since we still get some reduction even when the browser does not support gzip. And when it does, we end up with less than 1/5 the size of the original file.
What it looks like to me is we save about 5 KB on a first page load. Say 10 KB if you count the CSS too. A first page load of the enwiki main page is probably around 80 KB right now, at a rough guess, if you count everything. So we're talking what, 15% savings?
Hmmm... multiple includes rather than one file... Ok... but that's going to need a bit more complex logic than before to make sure that dependencies are included in the right order.
You need to make sure they're concatenated in the right order anyway.
Ya, they need to be included in the right order. But the more complex logic is that we now need to deal with it across multiple scopes, and make sure each thing is loaded in the correct scope, not overridden in the lower scopes... Without the multiple scopes it's a simple task of adding things to load on calls, adding their dependencies to that list, and ordering them correctly.
Well, whether or not we minify wikibits.js, the system is going to support minification, because if it doesn't there will be load issues when you start including big libraries like YUI. Still, being able to serve minified versions of the core CSS and JS would be nice, at least as a configuration option. It may only be 15%, but don't forget that even that slowly adds up to many gigs of bandwidth on sites with high visitor traffic. Oh ya... and as for cache: http://yuiblog.com/blog/2007/01/04/performance-research-part-2/ "40-60% of Yahoo!'s users have an empty cache experience and ~20% of all page views are done with an empty cache. ... It says that even if your assets are optimized for maximum caching, there are a significant number of users that will /always/ have an empty cache." While that was measured at Yahoo!, the finding could easily apply elsewhere, especially with how many different people visit a wiki.
~Daniel Friesen(Dantman) of: -The Gaiapedia (http://gaia.wikia.com) -Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG) -and Wiki-Tools.com (http://wiki-tools.com)
On Wed, Mar 19, 2008 at 5:12 PM, DanTMan dan_the_man@telus.net wrote:
Well, whether or not we minify wikibits.js the system is going to support minification, because if it doesn't there will be load issues when you start including the big libraries like YUI.
You are not going to overload Wikimedia's bandwidth serving CSS and JS files. The question is more one of responsiveness.
Oh ya... and as for cache: http://yuiblog.com/blog/2007/01/04/performance-research-part-2/ "40-60% of Yahoo!'s users have an empty cache experience and ~20% of all page views are done with an empty cache. ... It says that even if your assets are optimized for maximum caching, there are a significant number of users that will /always/ have an empty cache." While that was targeted at Yahoo!, that fact could easily apply elsewhere. Especially with how many different people visit wiki.
The figure is somewhat lower for Wikipedia, if I remember correctly (more obsessive browsing, less one-shot search). It's still at least 20%, I think. I don't have the figures on hand.
On Dienstag, 18. März 2008, DanTMan wrote:
Erm... never, EVER! try to write a long e-mail on a Wii... Especially when occasionally your Wi-Fi disconnects which ends up killing everything you've written.
Never, ever write 4 screen pages of text if you need a quick answer ;-)
For the record: (you may know that already, I did not follow the thread; all apologies)
* SMW has code that uses several hooks to include JavaScript upon request (during parsing or page creation), such that requested JavaScript ends up in the header of wiki pages (page cache) as well as in the header of special pages (no cache, direct HTML output). The function is smwfRequireHeadItem() in our SMW_GlobalFunctions.php. We found this very handy, since you can reuse JavaScript-dependent code in various contexts without caring.
* SMW uses the Timeline scripts (quite large and slow, hence used only on-demand). These scripts have their own internal loading mechanism for including required JavaScript (i.e. the JavaScript itself does extra inclusions).
* We found that loading many small JavaScripts slows down page display, since many files need to be fetched. If too many extensions use their own JavaScripts, it might be good to have some MediaWiki-side service that merges requested files to ship them in one request. Even combining them manually into one file might help.
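The merge-them-into-one-request idea is essentially concatenation in dependency order; here is a minimal sketch (file names and contents are dummies). The `;` separator guards against files that don't end in a semicolon:

```javascript
// Concatenate the requested files, in the order given, into one response
// body. A leading comment marks where each file begins, and a lone ';'
// between files prevents breakage when a file lacks a trailing semicolon.
function combine(files, contents) {
  return files.map(f => `/* ${f} */\n` + contents[f]).join('\n;\n');
}

const contents = {
  'wikibits.js': 'function addOnloadHook(f){/*...*/}',
  'yahoo.js': 'var YAHOO = {};',
};

const out = combine(['wikibits.js', 'yahoo.js'], contents);
console.log(out);
```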
-- Markus