I can barely stand to refer to the current wiki server as "the current
wiki server", and I don't even know how I'll refer to the new server.
I'd like to give the servers official names, just so I can refer to
them in a single word.
Any suggestions?
--
"Jason C. Richey" <jasonr(a)bomis.com>
At Nick's suggestion, I recompiled Apache with mod_mmap_static (and
upgraded PHP to 4.3 while I was at it). It tested well on Piclab, so
I installed it on the server and made the logo, icon, stylesheets,
and robots.txt memory mapped.
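For reference, the mod_mmap_static setup is just a list of MMapFile directives in httpd.conf; a minimal sketch (the file paths here are examples, not the server's actual layout):

```apache
# Apache 1.3: module compiled in (or loaded as a DSO)
# AddModule mod_mmap_static.c

# Map the small, frequently served static files into memory
MMapFile /usr/local/apache/htdocs/wiki.png
MMapFile /usr/local/apache/htdocs/favicon.ico
MMapFile /usr/local/apache/htdocs/wikistandard.css
MMapFile /usr/local/apache/htdocs/robots.txt
```

One caveat: the files are mapped at startup, so Apache needs a restart (not just replacing the file on disk) whenever one of them changes.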
--
Lee Daniel Crocker <lee(a)piclab.com> <http://www.piclab.com/lee/>
"All inventions or works of authorship original to me, herein and past,
are placed irrevocably in the public domain, and may be used or modified
for any purpose, without permission, attribution, or notification."--LDC
How about organizing a chat this week about the ongoing Wikipedia
performance crisis and how to solve it? Talking to people can provide
additional motivation for getting things done, and help us organize our
priorities. It might also reduce some frustration. If we do this, all the
relevant people should be present:
- Jimbo
- Jason
- Brion
- Lee
- Magnus
- ...
It might be best to meet on the weekend, so that work does not interfere.
My suggestion would be Saturday, 20:00 UTC.
What do you think?
Regards,
Erik
Jason wrote:
>I'll be able to make the trip to San Diego on Friday.
>The server should be ready to use (ssh-able for all
>people with accounts on the current server) on
>Saturday.
It might be more trouble than it is worth, but do you
want me to send you the AMD 500 Linux box I have, to
serve as the new mail server? If you can use it, I'm
more than happy to send it, but if another solution is
easier or makes more sense (like having Lee's ISP host
WikiTech-L), then I'm cool with that too.
I don't want to force hardware on anybody - all I want
is for the developers to be able to talk to each other
via a mailing list when the Wikipedia server is down.
--Daniel Mayer (aka mav)
I know there was some discussion about kernel upgrade of the current
wiki server. I'll be in front of the servers Friday evening, so I
could easily make such an upgrade.
I am familiar with compiling and installing kernels (I
use LILO, if that's okay with everyone). So, if we can
reach a consensus on the kernel version, I can manage it.
Alternatively, if one is already compiled and ready, I
can just install that.
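For the record, the usual sequence on a 2.4-era box with LILO looks roughly like this; a sketch only, with a placeholder version number and i386 paths, not a decision on which kernel we pick:

```shell
cd /usr/src/linux-2.4.20          # placeholder version
make menuconfig                   # or copy the old .config and run `make oldconfig`
make dep bzImage modules          # 2.4.x still needs the `make dep` step
make modules_install
cp arch/i386/boot/bzImage /boot/vmlinuz-2.4.20
# add an image= stanza for the new kernel to /etc/lilo.conf,
# keeping the old kernel's stanza as a fallback entry
lilo                              # re-run LILO so the new boot map is written
```

The fallback stanza matters most here: if the new kernel fails to boot, whoever is at the console can still select the old one.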
--
"Jason C. Richey" <jasonr(a)bomis.com>
Apologies if this is something obvious.
I tried to install the latest CVS phase3 code on RH Linux 7.2, MySQL
3.23.54, PHP 4.3.1. I got the following error when running "php
install.php":
Enter the root password here: ***
Creating database...
Fatal error: Call to a member function on a non-object in
/var/www/kovi/www/wiki/User.php on line 287
/var/www/kovi/www/wiki/User.php(287) : Fatal error - Call to a member
function on a non-object
My guess is that it comes from populatedata() in install.php, which calls:

$sql = "INSERT INTO user (user_name, user_password, user_rights) " .
    "VALUES ('WikiSysop','" . User::encryptPassword( $wgDBadminpassword ) .
    "','sysop'),('WikiDeveloper','" . User::encryptPassword( $wgDBadminpassword ) .
    "','sysop,developer')";
wfQuery( $sql, $fname );
User::encryptPassword() is:

function encryptPassword( $p )
{
    return $this->addSalt( md5( $p ) );
}

and User::addSalt() is:

function addSalt( $p )
{
    return md5( "wikipedia{$this->mId}-{$p}" );
}
Now, I know next to nothing about PHP, but the error seems to be saying
that encryptPassword() is being called as a static function (in the C++
sense) when it cannot be one: it calls addSalt(), which accesses object
data ($this->mId) and therefore needs an instance of the class to
operate on.
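If that diagnosis is right, one way to sketch a fix (just the idea, not a patch against phase3; the function names below are invented) is to make the salting logic callable without an instance by passing the user id in explicitly:

```php
<?php
// Hypothetical static-friendly variants of the two methods.
// addSaltFor() takes the user id as a parameter instead of reading
// $this->mId, so encryptPasswordFor() works before a User object exists.
function addSaltFor( $id, $p ) {
    return md5( "wikipedia{$id}-{$p}" );
}

function encryptPasswordFor( $id, $p ) {
    return addSaltFor( $id, md5( $p ) );
}

// install.php would then call encryptPasswordFor( 0, $wgDBadminpassword )
// (or whatever id the new row will get) instead of User::encryptPassword().
?>
```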
Is this a known problem or am I doing something wrong?
Thanks,
On the "every little bit helps" theory, I've offloaded wiki.png to a
different machine for now. At least that's going to free up some
httpd's to do something else.
If I recall correctly, both of the servers have serial ports
available. I don't know much (or anything) about serial terminals,
but I can install a null modem cable between the two. I'll look into
what is involved in setting up a serial terminal...
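From what I can tell, the Linux side of a serial console needs two pieces of configuration; a minimal sketch, assuming ttyS0 at 9600 baud (the device and speed are guesses, not measurements of our hardware):

```
# /etc/inittab: respawn a login prompt on the serial port
S0:2345:respawn:/sbin/agetty -L 9600 ttyS0 vt100

# /etc/lilo.conf (or kernel command line): send console output there too
append="console=ttyS0,9600 console=tty0"
```

After that, the other end of the null modem cable can connect with minicom (or `screen /dev/ttyS0 9600`), which would give us a console even when the network interface is swamped.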
--
"Jason C. Richey" <jasonr(a)bomis.com>
I forced the site into SuperMiserMode to deal with the overloading
this morning. Maybe it was in SuperMiserMode already, though.
# TEMPORARILY ADDED BY JIMBO 4/29/03
$wgSuperMiserMode = true;
$wgMiserMode = true;
in LocalSettings.php.
This is just an emergency thing, so I just edited the live file;
I have nothing useful for CVS!
Hi!
With all the talk going on, I wanted to share a few thoughts of my own.
I am more on the wiki user side, so keep that in mind (I do know PHP, so
I am not a complete dork).
1) How about a query cache (for edits and searches)?
http://www.vbulletin.com/ (not open source) does something like that for
search queries:
http://forum.doom9.org/search.php
Basically, you instantly see an HTML page telling you that the search is
in progress. This looks good to the user, and the page's reload interval
can probably be configured. That way you can target the best server load
(e.g. 4 searches/second), and searching will never use more than what
you specify.
I think http://www.phpbb.com/downloads.php (GPL'd) can do the same, but I
am not sure.
The thing is, when users see that the server is doing something, they
will not hit refresh every few seconds in the belief that it helps
(which it doesn't). User feedback is, and always should be, a top
priority.
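A very rough sketch of the caching half of that idea in PHP (all names are invented, with no connection to the actual codebase): keep a results cache keyed on the normalized query, so repeated identical searches never touch the database within the cache window:

```php
<?php
// Sketch: cache search results keyed by the normalized query string.
// $cache maps key => array( 'time' => unix timestamp, 'results' => ... ).
function cachedSearch( &$cache, $ttl, $query, $searchFn ) {
    $key = strtolower( trim( $query ) );
    if ( isset( $cache[$key] ) && time() - $cache[$key]['time'] < $ttl ) {
        return $cache[$key]['results'];   // hit: no database work at all
    }
    $results = $searchFn( $key );         // miss: run the real search
    $cache[$key] = array( 'time' => time(), 'results' => $results );
    return $results;
}
?>
```

The "search in progress" page would then just be a static HTML page with a meta refresh that polls until the entry appears in the cache, so the refresh interval, and hence the search rate, is whatever we configure.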
2) What is really slowing down Wikipedia?
As I see it, not everyone agrees on that. I am also guessing it's the
database, not the PHP code that does the rendering. Maybe someone can
put this together, like: searches (7/second avg., 30% server load),
edits (1/second, 5% load), Recentchanges (5/second, 25% load), etc.
Some real numbers would help focus on the problem.
So if I am guessing right and the SQL query for searches takes the most
time, building up a queue for searches as above might satisfy users (at
least they know something is happening) and it will reduce server load.
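Getting those per-type numbers shouldn't be hard; a sketch of the kind of tally I mean, run over Apache access-log request lines (the URL patterns are guesses at how the wiki's requests look, not verified against the real logs):

```php
<?php
// Sketch: count requests per type from access-log request lines.
function tallyRequests( $lines ) {
    $counts = array( 'search' => 0, 'edit' => 0,
                     'recentchanges' => 0, 'other' => 0 );
    foreach ( $lines as $line ) {
        if ( strpos( $line, 'action=edit' ) !== false ) {
            $counts['edit']++;
        } elseif ( strpos( $line, 'search=' ) !== false ) {
            $counts['search']++;
        } elseif ( strpos( $line, 'Special:Recentchanges' ) !== false ) {
            $counts['recentchanges']++;
        } else {
            $counts['other']++;
        }
    }
    return $counts;
}
?>
```

Divide each count by the log's time span to get requests/second, then pair that with CPU or query time per request type, and we'd finally know where the load actually goes instead of guessing.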
I hope this helps the discussion somewhat.
Cheers
Leo