There is an issue with running a fast foreground JS thread that may send a large number of requests to the server. Heavy processing on the client side would take load off the server (where possible), but it can also push a different kind of load onto the server (in the example given, sending emails to users).
I have worked on an AJAX application that sent email from JavaScript, and it turned out that the server was denying the JS requests because they exceeded the allowed number of connections from a single host.
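One way to avoid tripping that per-host limit is to throttle on the client: cap the number of in-flight requests and queue the rest. A minimal sketch of such a limiter (the `createLimiter` name, the limit of 2, and the wrapped job function are all illustrative, not code from the application described above):

```javascript
// Cap the number of concurrently running promise-returning jobs so the
// server's per-host connection limit is never exceeded.
function createLimiter(maxConcurrent) {
  let active = 0;      // jobs currently running
  const waiting = [];  // jobs queued for a free slot

  const next = () => {
    if (active >= maxConcurrent || waiting.length === 0) return;
    active++;
    const { fn, resolve, reject } = waiting.shift();
    fn().then(resolve, reject).finally(() => {
      active--;
      next(); // a slot freed up; start the next queued job
    });
  };

  // Wrap any promise-returning function; it runs only when a slot is free.
  return (fn) =>
    new Promise((resolve, reject) => {
      waiting.push({ fn, resolve, reject });
      next();
    });
}
```

With `const limit = createLimiter(2)`, calling `limit(() => sendRequest(...))` for each recipient keeps at most two requests open at a time, however many the UI fires off.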
A better approach might be to start the task on the client side and save it in a queue on the server side, for another (server-side) process to take care of later in FIFO order.
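The pattern above can be sketched in a few lines: the request handler only enqueues the task (cheap, returns immediately), while a single background loop drains the queue strictly first-in, first-out. The `TaskQueue` name and the handler callback are illustrative, not Wikimedia code:

```javascript
// Minimal server-side FIFO queue: enqueue is cheap; a single background
// loop processes tasks one at a time in arrival order.
class TaskQueue {
  constructor(process) {
    this.items = [];        // pending tasks, oldest first
    this.process = process; // async handler invoked for each task
    this.draining = false;  // true while the background loop is running
  }

  // Called from the request handler: just record the task and return.
  enqueue(task) {
    this.items.push(task);
    this.drain();
  }

  // Background loop: take one task at a time, strictly FIFO.
  async drain() {
    if (this.draining) return; // only one drain loop at a time
    this.draining = true;
    while (this.items.length > 0) {
      const task = this.items.shift();
      await this.process(task);
    }
    this.draining = false;
  }
}
```

For the email case, the handler would be whatever actually sends a single mail, so the send rate is governed by the server-side loop rather than by however fast the client can fire requests.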
On Wed, Apr 22, 2009 at 12:18 PM, Brion Vibber brion@wikimedia.org wrote:
Perhaps... but note that the i/o for XMLHTTPRequest is asynchronous to begin with -- it's really only if you're doing heavy client-side _processing_ that you're likely to benefit from a background worker thread.
On 4/17/09 6:45 PM, Marco Schuster wrote:
You mean... stuff like bots written in JavaScript, using the XML API? I could also imagine sending mails via Special:Emailuser in the background to reach multiple recipients - that's a PITA if you want to send mails to multiple users.