Gregory Maxwell wrote:
> On Sun, Oct 5, 2008 at 12:15 PM, howard chen <howachen(a)gmail.com> wrote:
>> Results are quite surprising: Grade F (47). Of course a lower mark does
>> not always mean bad, but there is some room for improvement, e.g.
>> [snip]
> Probably pointless. It's small enough already that the load time is
> going to be latency bound for any user not sitting inside a Wikimedia
> data center. On ones which are above the latency bound window (of
> roughly 8k), gzipping should get them back under it.
Minification can actually decrease sizes significantly even with
gzipping. Particularly for low-bandwidth and mobile use this could be a
serious plus.
The big downside of minification, of course, is that it makes it harder
to read and debug the code.
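As for the size claim itself, it's easy to check with a few lines of plain
PHP. A rough sketch (naive comment/whitespace stripping only, not a real JS
minifier, and the filename is just an example):

<?php
// Naive size check: gzip alone vs. minify-then-gzip.
// NOT a real JS minifier -- only strips block comments, leading
// indentation, and blank lines. Filename is just an example.
$js = file_get_contents('skins/common/wikibits.js');

$min = preg_replace('!/\*.*?\*/!s', '', $js);   // /* ... */ comments
$min = preg_replace('!^[ \t]+!m', '', $min);    // leading indentation
$min = preg_replace('!\n{2,}!', "\n", $min);    // blank lines

printf("raw:       %6d bytes\n", strlen($js));
printf("gzip:      %6d bytes\n", strlen(gzencode($js, 9)));
printf("min:       %6d bytes\n", strlen($min));
printf("min+gzip:  %6d bytes\n", strlen(gzencode($min, 9)));

The last two numbers are the interesting comparison: gzip alone versus
minify-then-gzip.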
> The page text is gzipped. CSS/JS are not. Many of the CSS/JS are
> small enough that gzipping would not be a significant win (see above)
> but I don't recall the reason that the CSS/JS are not. Is there a
> client compatibility issue here?
CSS/JS generated via MediaWiki are gzipped. Those loaded from raw files
are not, as the servers aren't currently configured to do that.
> Hm. There are expire headers on the skin-provided images, but not ones
> from upload. It does correctly respond with 304 Not Modified, but a
> not-modified is often as time consuming as sending the image. Firefox
> doesn't IMS these objects every time in any case.
The primary holdup for serious expires headers on file uploads is not
having unique per-version URLs. With a far-future expires header, things
get horribly confusing when a file has been replaced, but everyone still
sees the old cached version.
Anyway, these are all known issues.
Possible remedies for CSS/JS files:
* Configure Apache to compress them on the fly (probably easy)
* Pre-minify them and have Apache compress them on the fly (not very hard)
* Run them through MediaWiki to compress them (slightly harder)
* Run them through MediaWiki to compress them *and* minify them *and*
merge multiple files together to reduce number of requests (funk-ay! --
see the sketch after this list)
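The last option could be as simple as a standalone entry point along these
lines (the file list, cache lifetime, and headers are placeholders, not
actual MediaWiki code; the naive minification from the earlier sketch could
be slotted in before compression):

<?php
// Sketch of the merge + gzip serving step from the list above.
// File list, cache lifetime, and content type are placeholders only.
$files = array('wikibits.js', 'ajax.js', 'mwsuggest.js');

$out = '';
foreach ($files as $f) {
    // trailing ';' guards against a script missing its final semicolon
    $out .= file_get_contents($f) . "\n;\n";
}

header('Content-Type: text/javascript; charset=utf-8');
header('Content-Encoding: gzip');
header('Cache-Control: public, max-age=2592000');  // ~30 days
echo gzencode($out, 9);

Merging also cuts the number of round trips, which matters more than raw
bytes for the latency-bound case Gregory describes.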
Possible remedies for better caching of image URLs:
* Stick a version number on the URL in a query string (probably easy --
grab the timestamp from the image metadata and toss it on the url?
sketched below)
* Store files with unique filenames per version (harder since it
requires migrating files around, but something I'd love us to do)
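The query-string version might look roughly like this (paths and URL layout
are placeholders; in MediaWiki the timestamp would come from the image
metadata already loaded for the page rather than a filemtime() call):

<?php
// Sketch of the query-string versioning idea from the list above.
// Paths and URL layout are placeholders, not the real upload config.
function versionedImageUrl($url, $localPath) {
    // Cheap per-version token: the file's modification time.
    $ts = file_exists($localPath) ? filemtime($localPath) : 0;
    return $url . '?' . $ts;
}

echo versionedImageUrl('/images/a/a9/Example.jpg',
                       '/var/www/images/a/a9/Example.jpg');
// e.g. /images/a/a9/Example.jpg?1223208900

Since a reupload changes the timestamp, the URL changes too, so a far-future
expires header stops being a problem.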
-- brion