Tim Starling wrote:
> Would it be possible to generate some profiling data for the live wiki?
> Say, turning on $wgProfiling for one in every thousand requests to
> wiki.phtml?
You only need to profile the slow requests. That's what I do on
susning.nu. It works for me.
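One way to combine both suggestions -- sample a small fraction of
requests, but keep output only for the slow ones -- might look roughly
like this (just a sketch; $wgProfiling is the variable quoted above,
while the threshold and the logging are made up):

<?php
// Profile roughly 1 request in 1000 (illustrative sampling only).
$wgProfiling = ( mt_rand( 1, 1000 ) == 1 );
$profileStart = microtime();

// ... normal request handling here ...

if ( $wgProfiling ) {
    // microtime() returns "usec sec" as a string in PHP 4.
    list( $usec0, $sec0 ) = explode( ' ', $profileStart );
    list( $usec1, $sec1 ) = explode( ' ', microtime() );
    $elapsed = ( $sec1 + $usec1 ) - ( $sec0 + $usec0 );
    // Only keep data for slow requests (2 s is an arbitrary cutoff).
    if ( $elapsed > 2.0 ) {
        error_log( "slow request: {$elapsed}s " . $_SERVER['REQUEST_URI'] );
    }
}
?>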
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se/
Jason - could you forward wikiquote.org and wikiquote.com to
http://quote.wikipedia.org and send Brion the access info to manage those
domains? We are going to be distributing a press release that mentions and
links to Wikiquote via http://wikiquote.org next week.
Brion - could you change the "Wikipedia Foundation" link at
http://wikimedia.org to point to http://wikimediafoundation.org and get rid
of the "Coming soon..." text?
And what the heck is up with http://www.nupedia.org/ ? Who is "THE TV
CLUBHOUSE"? Please, if Nupedia is dead can we de-link it from the
wikimedia.org page?
Thank you both! :-)
FYI: I'm leaving for a four day field study in the morning and won't be
responding to email until early Monday UTC.
-- Daniel Mayer (aka mav)
Hello,
Apologies in advance if this isn't "tech" enough for wikitech-l.
I have written a Perl script that parses
http://www.wikipedia.org/wiki/Wikipedia:Announcements and creates an RSS
file suitable for a news aggregator. This was purely to scratch an itch:
I'm not a hugely regular contributor, but I like to keep up to date on the
various announcements and found myself visiting that page a lot. Anyway,
the file is available at http://jeays.net/wikipedia/announcements.xml and
is updated 4 times a day (therefore adding only 4 page views a day to the
server). If you don't have a news aggregator or similar software, the feed
might look something like this:
http://jade.mcli.dist.maricopa.edu/feed/build.php?src=http%3A%2F%2Fjeays.net%2Fwikipedia%2Fannouncements.xml&chan=yes&num=0&desc=yes&date=yes&preview=preview+the+feed
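My script is in Perl and I haven't pasted it here; purely to illustrate
the idea, a stripped-down PHP version of the same scrape-and-regenerate
approach could look like this (the list-item parsing is an assumption
about how the Announcements page happens to be structured):

<?php
// Sketch only: fetch the wiki page and regenerate a small RSS 0.91 file.
// Assumes announcements appear as HTML list items, which may well change
// since the page is wiki-editable.
$url  = 'http://www.wikipedia.org/wiki/Wikipedia:Announcements';
$page = file_get_contents( $url );

preg_match_all( '|<li>(.*?)</li>|s', $page, $matches );

$items = '';
foreach ( array_slice( $matches[1], 0, 15 ) as $text ) {
    $text   = htmlspecialchars( trim( strip_tags( $text ) ) );
    $items .= "<item><title>$text</title><link>$url</link></item>\n";
}

$rss = "<?xml version=\"1.0\"?>\n<rss version=\"0.91\"><channel>\n"
     . "<title>Wikipedia Announcements</title>\n"
     . "<link>$url</link>\n"
     . "<description>Unofficial feed scraped from the wiki</description>\n"
     . $items . "</channel></rss>\n";

// Write the feed where the web server can serve it (run from cron,
// 4 times a day).
$fp = fopen( 'announcements.xml', 'w' );
fwrite( $fp, $rss );
fclose( $fp );
?>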
Of course, since it's a wiki-editable page, it could well "break" every so
often. I've tried to make it reasonably robust, but I'm not expecting this
feed to be used for making earth-shattering decisions and I will try to fix
it if I notice it breaks.
I'm open to ideas, suggestions, or different directions for this. I'm
perfectly willing to share what code I have, to update the RSS feed on the
Wikipedia server instead, or to explore other possibilities. In any case,
I have a reasonable amount of bandwidth available per month on my server,
so I could certainly handle a fair number of subscribers.
Regards,
Mark Jeays (Dze27 on Wikipedia)
The new CPUs are still waiting to be shipped, but the RAM upgrade (4gig)
for pliny is in, and Jason took the server down for a bit to get it
installed.
Unfortunately, whether the problem is software or hardware, it didn't
work. :( With the new memory in, mysqld would crash instead of
running... Log output from the last try is attached.
The old memory (2gig) is back in for now.
We don't currently have another machine that can hold the RAM to sit
and run a memory tester program, and more downtime isn't really
acceptable just now.
The kernel (2.4.20) is configured with CONFIG_HIGHMEM4G and
CONFIG_HIGHMEM, and everything's lovely at 2 gig.
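For reference, the relevant bits of the .config (reconstructed from the
description above, not copied off the machine; on 2.4 x86 kernels,
CONFIG_HIGHMEM4G covers up to 4 gig of RAM, and anything beyond that
would need CONFIG_HIGHMEM64G, i.e. PAE):

# Processor type and features
CONFIG_HIGHMEM4G=y
# CONFIG_HIGHMEM64G is not set
CONFIG_HIGHMEM=y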
-- brion vibber (brion @ pobox.com)
Hi,
I have put a new LanguageDe.php in CVS and added the hidden input for
the Google search, as done in Language.php.
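(For anyone curious, this is the kind of hidden input meant -- a sketch,
since the actual change lives in CVS, and the field values here are
assumptions. Google's site-restricted search form carries the target
domain in hidden fields:)

<input type="hidden" name="domains" value="de.wikipedia.org" />
<input type="hidden" name="sitesearch" value="de.wikipedia.org" />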
Greetings Smurf
--
Smurf
The human brain has no ECC and loses more data than silicon-based memory.
smurf(a)AdamAnt.mud.de
------------------------- Anthill inside! ---------------------------
This server really seems to have a small problem...
We may have to do more tests on this computer.
> -----Original Message-----
> From: Brion Vibber [mailto:brion@pobox.com]
> Sent: 24 September 2003 10:00
> To: maveric149(a)yahoo.com; Wikimedia developers
> Subject: Re: [Wikitech-l] wikimediafoundation.org/fundraising
>
>
> On Tue, 2003-09-23 at 23:21, Daniel Mayer wrote:
> > Could somebody with access to the wikimediafoundation.org domain copy
> > http://meta.wikipedia.org/upload/6/68/Fundraising.html onto that
> > website and have it display for users when they click on
> > http://wikimediafoundation.org/fundraising ?
>
> Done.
>
> > Of course, I will stop the presses if our upgrades won't be online by
> > that time. When is that going to get done?
>
> Jason has just installed a RAM upgrade on pliny, but this caused
> mysterious crashes so we've put the old RAM back until it can be tested
> more thoroughly.
>
> As for the CPU upgrades for larousse, Jimmy tells me there's some delay
> with the shipping of parts, so we're looking at "early next week". :(
>
> The server that had held the old Wikipedias and Nupedia and some other
> stuff is back online, and we may be able to set it up temporarily as
> another Wikipedia front-end web server to help with the load.
>
> -- brion vibber (brion @ pobox.com)
Brion wrote:
>> Could somebody with access to the wikimediafoundation.org domain copy
>> http://meta.wikipedia.org/upload/6/68/Fundraising.html onto that
>> website and have it display for users when they click on
>> http://wikimediafoundation.org/fundraising ?
>
>Done.
Sweet! Thanks Brion.
>Jason has just installed a RAM upgrade on pliny, but
>this caused mysterious crashes so we've put the old RAM
>back until it can be tested more thoroughly.
Too bad. :-(
>As for the CPU upgrades for larousse, Jimmy tells me
>there's some delay with the shipping of parts, so we're
>looking at "early next week". :(
More bad news. :-(
>The server that had held the old Wikipedias and Nupedia
>and some other stuff is back online, and we may be able
>to set it up temporarily as another Wikipedia front-end web
>server to help with the load.
Hm. If that is the case, then can the non-English Wikipedias start to
distribute their versions of the press release on Monday, or should we just
put everything on hold for a week to give everybody some breathing room?
-- Daniel Mayer (aka mav)
Hello.
I've been following the discussions on the problems with the web
servers. It seems that there are too many concurrent processes for the
limited hardware we are currently using. I believe I read numbers
somewhere of a steady load of 20+ processes in the run queue on one of
the servers. Of course there will be a lot of context switching going
on, with some overhead. But worse, it guarantees that every process gets
the worst possible total completion time.
Example: n processes arrive at the same time. If the necessary CPU time
for every process is t seconds, all processes will finish after roughly
t*n seconds in a time-sliced concurrent system.
If, on the other hand, the processes run completely serialized, one
after another, the first process finishes after t seconds, the second
one after 2*t seconds, and so on up to n*t seconds for the n-th process.
This gives an average completion time of t*(n+1)/2 seconds (roughly
t*n/2 for large n), with the same worst case of t*n.
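Concretely, with n = 4 and t = 1 second: time-sliced, all four processes
finish at about the 4-second mark, so the average completion time is 4
seconds; serialized, they finish at 1, 2, 3 and 4 seconds, an average of
2.5 seconds for exactly the same total work.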
In other words, I think we could see a significant performance gain by
reducing the number of concurrent processes. I admit I have no idea how
this works today, or whether such a limit is already in place. Nor do I
know whether this can easily be achieved with some Apache or PHP
directive (although I find it very likely); see the sketch below.
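(For illustration, these are the stock Apache prefork directives that cap
the number of simultaneous request-handling processes; the numbers are
made up, not tuned for the live servers:)

# httpd.conf
MaxClients       30
MinSpareServers   5
MaxSpareServers  10

Capping Apache processes limits everything, static files included,
whereas the semaphore approach below queues only the expensive page
generation.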
With this in mind, you can have a look at the attached code, which
should illustrate the concept. It can surely be optimized, and nobody
needs to point out what kind of atrocity it is to implement counting
semaphores using MySQL named locks. Nevertheless, it was quick to
implement and works quite well. Also, any overhead from implementation
inefficiencies is too small to be easily measured on my not-too-powerful
PC. If it has the predicted effect, any overhead would probably be
negligible compared to the gains.
Should anyone care to test this on a live wiki (I haven't got a clone
running, so it's untested in that respect), it's quite easy. Just copy
the two semaphore functions somewhere where they get included, and then
edit wiki.phtml so that it calls them before and after the main work, as
well as starting and flushing the output buffer. Like so:
ob_start();            // buffer output so nothing is sent while we wait
wait_for_semaphore();  // block until a rendering slot is free
// Page generation - wikicode parsing
release_semaphore();   // free the slot for the next request
ob_flush();            // now send the buffered page
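The attachment itself isn't reproduced here, but a minimal sketch of the
two functions, building counting semaphores on MySQL named locks as
described (the slot count, lock names and back-off interval are all
assumptions, and an open MySQL connection is assumed, as inside the
wiki):

<?php
$semaphore_slots = 8;      // max concurrent page renders (made-up value)
$semaphore_held  = false;  // index of the slot we hold, if any

function wait_for_semaphore() {
    global $semaphore_slots, $semaphore_held;
    while ( true ) {
        // Try each of the N named locks; GET_LOCK(name, 0) returns 1
        // if acquired immediately and 0 if somebody else holds it.
        for ( $i = 0; $i < $semaphore_slots; $i++ ) {
            $res = mysql_query( "SELECT GET_LOCK('wiki_sem_$i', 0)" );
            $row = mysql_fetch_row( $res );
            if ( $row[0] == 1 ) {
                $semaphore_held = $i;
                return;
            }
        }
        usleep( 100000 );  // all slots busy; back off 0.1 s and retry
    }
}

function release_semaphore() {
    global $semaphore_held;
    if ( $semaphore_held !== false ) {
        mysql_query( "SELECT RELEASE_LOCK('wiki_sem_$semaphore_held')" );
        $semaphore_held = false;
    }
}
?>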
Yours,
E23, Swedish Wikipedia.
Could somebody with access to the wikimediafoundation.org domain copy
http://meta.wikipedia.org/upload/6/68/Fundraising.html onto that website and
have it display for users when they click on
http://wikimediafoundation.org/fundraising ?
I'll try to put together a bare basic homepage for the foundation before I
leave for my four day Yosemite/Mammoth area trip on Thursday.
Both of these things are needed in order to start distributing the press
release on Monday.
Of course, I will stop the presses if our upgrades won't be online by that
time. When is that going to get done?
-- Daniel Mayer (aka mav)