As mentioned already, I'm not sure localization is the best candidate for being held in JavaScript, but the other things mentioned (single request, minified and infinitely cached JS) are what I'm looking at for the overall MW infrastructure. So far this is a big performance problem for MW. The waterfall diagram I posted for the infinite image cache also shows the main issue I've been trying to attack for a while and might need more help with: too many JS and CSS requests are made to the server by MediaWiki: http://performance.webpagetest.org:8080/result/090218_132826127ab7f254499631... Only the 18th request is the first image to load, and as you can see, the JavaScript loads are blocking, meaning no parallel loading is happening.
I think it's worth investing resources into creating some process for better handling of this. Right now it's possible to cut this down by configuring MediaWiki not to use user scripts and stylesheets and by manually combining the JS and CSS files for the skin, the Ajax framework and everything needed by extensions. I did quite a lot of this for specific installations, but it seems it needs a more systematic approach.
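For example, something along these lines in LocalSettings.php (a sketch only; these are the user/site script switches as I understand them, so treat the exact variable names as an assumption for your MediaWiki version):

<?php
// LocalSettings.php sketch - assumes the standard MediaWiki configuration
// variables for per-user and per-site scripts; check your version.

// Drop the extra requests for User:Foo/monobook.js and User:Foo/monobook.css
// that are otherwise added to every page view for logged-in users.
$wgAllowUserJs  = false;
$wgAllowUserCss = false;

// If MediaWiki:Common.js / Common.css have been folded into the manually
// combined skin files, the per-site requests can be skipped as well.
$wgUseSiteJs  = false;
$wgUseSiteCss = false;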
The good news is that MW already has some wrappers for style and script insertion that extensions use to refer to external files. It's a little less fortunate with the script loading sequence (e.g. ideally scripts would load only after the rest of the page has loaded), but that might be a much bigger challenge.
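For reference, the kind of wrapper I mean is something like this rough sketch of an extension registering its files through OutputPage instead of printing raw <script>/<link> tags (hook and method names follow what I believe the current API looks like, so treat the exact names as an assumption):

<?php
// Sketch of an extension adding its assets through MediaWiki's wrappers.
// Because everything funnels through OutputPage, a combining/minifying
// step could later be added in a single place.
function efMyExtensionBeforePageDisplay( $out, $skin ) {
	global $wgScriptPath;
	// Hypothetical extension files, for illustration only.
	$out->addScriptFile( "$wgScriptPath/extensions/MyExtension/MyExtension.js" );
	$out->addExtensionStyle( "$wgScriptPath/extensions/MyExtension/MyExtension.css" );
	return true;
}
$wgHooks['BeforePageDisplay'][] = 'efMyExtensionBeforePageDisplay';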
It's also worth mentioning that reducing the amount of PHP that handles JavaScript and CSS is a good idea, as serving static resources is much easier than starting up a full-blown PHP engine, even with opcode and variable caches.
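One way to get there is to do the combining at deploy time and let the web server hand out the result as a plain static file. A rough sketch of such a build step (the source file list and output path are made up for illustration):

<?php
// build-assets.php - deploy-time sketch: concatenate the skin/extension
// scripts into one static, versioned file so no PHP runs at request time.
$sources = array(
	'skins/common/wikibits.js',
	'skins/common/ajax.js',
	'extensions/MyExtension/MyExtension.js',
);
$combined = '';
foreach ( $sources as $file ) {
	$combined .= "/* $file */\n" . file_get_contents( $file ) . "\n";
}
if ( !is_dir( 'static' ) ) {
	mkdir( 'static', 0777, true );
}
// Version the file name by content hash so browsers can cache it "forever".
$version = substr( md5( $combined ), 0, 8 );
file_put_contents( "static/combined-$version.js", $combined );
echo "Wrote static/combined-$version.js\n";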
I think there is a way to reduce the start-render delay as well as the overall loading time, and very likely to save some traffic, by attacking the front end, and I will be happy to participate more in this.
How do we go about doing this? Can it be tied into Usability project ( http://usability.wikimedia.org/)?
Thank you,
Sergey
Hoi, Sergey, you may also want to pay a visit to the folks at http://translatewiki.net. This is where the internationalisation and localisation effort for MediaWiki, its extensions and all the rest is concentrated. In the past translatewiki.net has been instrumental in bringing best practices to the internationalisation of MW extensions. It is likely that when best practices become apparent for JS and CSS they will again play this role. Thanks, GerardM
2009/2/25 Sergey Chernyshev sergey.chernyshev@gmail.com
[snip]
On Fri, Feb 20, 2009 at 8:07 PM, Gregory Maxwell gmaxwell@gmail.com wrote:
On Fri, Feb 20, 2009 at 5:51 PM, Brion Vibber brion@wikimedia.org wrote:
[snip]
On the other hand we don't want to delay those interactions; it's probably cheaper to load 15 messages in one chunk after showing the wizard rather than waiting until each tab click to load them 5 at a time.
But that can be up to the individual component how to arrange its loads...
Right. It's important to keep in mind that in most cases the user is *latency bound*. That is to say that the RTT between them and the datacenter is the primary determining factor in the load time, not how much data is sent.
Latency determines the connection time, and it also influences how quickly rwin can grow and get you out of slow-start. When you send more at once you'll also be sending more of it with a larger rwin.
So in terms of user experience you'll usually improve results by sending more data if doing so is able to save you a second request.
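To make the trade-off concrete, here is a rough back-of-envelope sketch (all numbers are illustrative assumptions, not measurements) of how classic slow-start penalizes separate fetches:

<?php
// Very rough illustration, not a simulator: how many round trips it takes
// to deliver $bytes over a fresh connection under classic slow-start,
// assuming an initial window of 3 MSS-sized segments (an assumption).
function estimateRoundTrips( $bytes, $mss = 1460, $initSegments = 3 ) {
	$window = $initSegments;
	$sent = 0;
	$rtts = 1; // connection setup
	while ( $sent * $mss < $bytes ) {
		$sent += $window;
		$window *= 2; // window roughly doubles each RTT in slow-start
		$rtts++;
	}
	return $rtts;
}

echo estimateRoundTrips( 15 * 1024 ) * 2, "\n"; // two separate 15 KB fetches: ~8 RTTs
echo estimateRoundTrips( 30 * 1024 ), "\n";     // one combined 30 KB fetch: ~5 RTTs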
Even ignoring the user's experience, connections aren't free. There is byte overhead in establishing a connection, byte overhead in lost compression from working with smaller objects, byte overhead in having more partially filled IP packets, CPU overhead from processing more connections, etc.
Obviously there is a line to be drawn: you wouldn't improve performance by sending the whole of Wikipedia on the first request. But you will most likely not be conserving *anything* by avoiding sending another kilobyte of compressed user interface text for an application the user has already invoked, even if only a few percent of users use the additional messages.
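Concretely, that extra kilobyte can be as simple as one JSON blob with every interface message the component needs, built in a single pass. A sketch (message keys are made up; wfMsg() is MediaWiki's message lookup, and this would run inside MediaWiki's request context):

<?php
// Sketch: return every message a wizard/component needs as one JSON blob,
// instead of a handful per tab click. Keys are hypothetical.
$messageKeys = array(
	'mywizard-step1-title', 'mywizard-step1-help',
	'mywizard-step2-title', 'mywizard-step2-help',
	// ...whatever the rest of the component's tabs need
);
$messages = array();
foreach ( $messageKeys as $key ) {
	$messages[$key] = wfMsg( $key );
}
header( 'Content-Type: application/json' );
echo json_encode( $messages );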
On 2/24/09 4:17 PM, Sergey Chernyshev wrote:
As mentioned already, I'm not sure localization is the best candidate for being held in JavaScript, but the other things mentioned (single request, minified and infinitely cached JS) are what I'm looking at for the overall MW infrastructure - so far this is a big performance problem for MW -
[snip]
Mmm, did you see the thread "Javascript localization, minify, gzip & cache forever", including proof-of-concept code which covers this ground?
How do we go about doing this? Can it be tied into Usability project ( http://usability.wikimedia.org/)?
That is far outside the scope of the usability project.
-- brion
On Tue, Feb 24, 2009 at 7:29 PM, Brion Vibber brion@wikimedia.org wrote:
On 2/24/09 4:17 PM, Sergey Chernyshev wrote:
As mentioned already, I'm not sure localization is the best candidate for being held in JavaScript, but the other things mentioned (single request, minified and infinitely cached JS) are what I'm looking at for the overall MW infrastructure - so far this is a big performance problem for MW
[snip]
Mmm, did you see the thread "Javascript localization, minify, gzip & cache forever", including proof-of-concept code which covers this ground?
Yes, of course - I checked it out, and that's why I quoted it in my original email. My brief review of it made me feel that it wasn't enough.
I just didn't want this to be only in the context of localization, as performance is more related to overall user experience than to multilingual support.
I'll try to summarize what might be the goals of front-end performance optimizations.
How do we go about doing this? Can it be tied into Usability project ( http://usability.wikimedia.org/)?
That is far outside the scope of the usability project.
Yes, I took a look at the project and it seems that UI performance wasn't deemed important in this Usability initiative. This is not necessarily good, but I can understand why it concentrates on navigation, colors and editing UI.
Sergey Chernyshev wrote:
Yes, of course - I checked it out, and that's why I quoted it in my original email. My brief review of it made me feel that it wasn't enough.
I just didn't want this to be only in the context of localization, as performance is more related to overall user experience than to multilingual support.
I'll try to summarize what might be the goals of front-end performance optimizations.
hmm ... We can use the same script grouping/serving engine on all JavaScript, whether the script has localized messages or not. In a reply to that thread I mentioned some details about the path towards MediaWiki-wide deployment: it involves adding wiki title support to the script server, grouping all the style requests, and modifying the skins or the header output script. Thanks for your summary of why front-end performance is important; this will help prioritize that patch :)
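To give a rough idea of the grouping (the parameter names and helper functions below are illustrative stand-ins, not the actual proof-of-concept interface), the script server boils down to something like:

<?php
// Illustrative sketch of the grouping idea, not the real script-server code.
// One request names several script "classes"; with wiki title support it can
// also name on-wiki JS pages, and everything comes back as one cacheable,
// minified response. mapClassToFile(), getWikiPageText() and minifyJs() are
// hypothetical helpers standing in for the real implementation.
$requested = explode( ',', isset( $_GET['class'] ) ? $_GET['class'] : '' );
$out = '';
foreach ( $requested as $name ) {
	if ( strpos( $name, 'WT:' ) === 0 ) {
		// "wiki title" support: a script stored as a wiki page
		$out .= getWikiPageText( substr( $name, 3 ) ) . "\n";
	} else {
		$out .= file_get_contents( mapClassToFile( $name ) ) . "\n";
	}
}
header( 'Content-Type: text/javascript; charset=utf-8' );
header( 'Cache-Control: public, max-age=31536000' );
echo minifyJs( $out );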
Yes, I took a look at the project and it seems that UI performance wasn't deemed important in this Usability initiative. This is not necessarily good, but I can understand why it concentrates on navigation, colors and editing UI.
The Usability initiative (like any software project) will have to take performance issues into consideration. Having this script server in place will benefit that effort as well.
--michael
On Tue, Feb 24, 2009 at 7:17 PM, Sergey Chernyshev sergey.chernyshev@gmail.com wrote:
How do we go about doing this? Can it be tied into Usability project ( http://usability.wikimedia.org/)?
It doesn't seem usability-related. If you have commit access, you could just start committing code (probably hidden behind disabled-by-default config variables to start with until it's tested and complete enough). If you don't have commit access, you could ask for it.
If you meant "could Wikimedia resources be allocated to this?", then Brion is the one to talk to.
On Tue, Feb 24, 2009 at 7:31 PM, Aryeh Gregor <Simetrical+wikilist@gmail.com> wrote:
On Tue, Feb 24, 2009 at 7:17 PM, Sergey Chernyshev sergey.chernyshev@gmail.com wrote:
How do we go about doing this? Can it be tied into Usability project ( http://usability.wikimedia.org/)?
It doesn't seem usability-related.
Actually it is very related to usability - performance is a very important factor in usability, and that is why Google, Yahoo and Amazon researched how it affects usage and came up with a few very interesting numbers that for them convert to hard cash:
I'm preparing a presentation about web performance for the New York Web Standards Meetup - it's not ready yet, but I came across a good presentation by Nicole Sullivan from Yahoo!, targeted at designers and UI experts: http://www.techpresentations.org/Design_Fast_Websites
I understand why http://usability.wikimedia.org/ doesn't have front-end performance as one of its goals, but I think it should at least be mentioned to all the people working on the redesign of such a major site.
If you have commit access, you could just start committing code (probably hidden behind disabled-by-default config variables to start with until it's tested and complete enough). If you don't have commit access, you could ask for it.
Please don't give me this attitude - "you need it, you do it".
I have commit access and am going to work on this, but my motivation with my small projects is nothing compared to Wikipedia's, and my setups are much smaller and more controllable. So I think I'll leave pitching for resources for this idea to those who need it, but I'll be happy to talk about it in person to Brion or anyone else who wants to do something about it.
BTW, are there any Wikipedia-related events in SF on the week of Web 2.0 conference (March 30 - April 3)? I'll be in SF for the conference and will be happy to come by.
Thank you,
Sergey
Forgot to mention the numbers (you can still see them in Nicole's presentation):
- +100ms → -1% sales (Amazon)
- +400ms → -5-9% abandonment (full-page traffic, Yahoo! Autos front page)
- +500ms → -20% searches (Google)
On Wed, Feb 25, 2009 at 11:03 AM, Sergey Chernyshev <sergey.chernyshev@gmail.com> wrote:
[snip]
On Wed, Feb 25, 2009 at 11:03 AM, Sergey Chernyshev sergey.chernyshev@gmail.com wrote:
Actually it is very related to usability - performance is a very important factor in usability, and that is why Google, Yahoo and Amazon researched how it affects usage and came up with a few very interesting numbers that for them convert to hard cash:
Well, it depends on how broadly you construe "usability". If you take "good usability" to mean "more use/sales", then it seems like virtually any improvement to the software could count as a usability improvement. I would normally interpret "good usability" to mean "users are able to easily figure out how to use the software properly". That seems to be how the usability grant interprets it. Latency is not relevant to that kind of usability. Which is not to say that reducing latency isn't an excellent idea; I just wouldn't classify it as usability.
Please don't give me this attitude - "you need it, you do it".
I wasn't sure if you were asking to contribute or asking for Wikimedia help. If you were asking how to contribute personally, my answer was perfectly appropriate. It seems that's not what you were asking, so the next answer ("ask Brion") is the appropriate one for you.