[Mediawiki-l] Maintenance problems

Brion Vibber brion at pobox.com
Mon Nov 29 10:50:14 UTC 2004


On Nov 29, 2004, at 1:06 AM, Plamen Gradinarov wrote:
>> Can you confirm that the database username and password in
>> AdminSettings.php are correct?
>
> Thank you very much, Brion, you are very helpful! Of course, it was my
> fault: my wikiadmin credentials were in AdminSettings.php instead of the
> wikiDB ones. Now the real count has started, going up by 50 at a time,
> and looks set to take several hours (probably more than 36 at the
> present rate).

Building the link tables takes a long time as it requires parsing and 
rendering every page in the database to determine which links are used.
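
For reference, that rebuild is normally kicked off from the maintenance
directory, roughly like this (script names vary a bit between versions;
rebuildall.php is the 1.3-era catch-all, and it picks up the admin
credentials from AdminSettings.php):

  cd /path/to/mediawiki/maintenance
  # Re-parses every page to rebuild the link tables (and, depending on
  # the version, the text index as well); expect many hours on a full
  # en database.
  php rebuildall.php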

If you're just pulling the Wikipedia database dumps, you might consider 
importing the links table backups that we provide with the dumps. At 
http://download.wikimedia.org/archives/en/ you'll see 
en_links_table.sql.gz, en_brokenlinks_table.sql.gz, etc.
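
Importing them is just a matter of loading each dump into the database
the wiki uses, along these lines (the database name and user below are
placeholders for your own):

  wget http://download.wikimedia.org/archives/en/en_links_table.sql.gz
  wget http://download.wikimedia.org/archives/en/en_brokenlinks_table.sql.gz
  # The dumps are plain SQL, so piping them straight into the wiki's
  # database restores the tables.
  gunzip -c en_links_table.sql.gz | mysql -u wikiuser -p wikidb
  gunzip -c en_brokenlinks_table.sql.gz | mysql -u wikiuser -p wikidb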

> The workaround (which I'm afraid I took for a standalone replacement for
> the entire rebuild, whereas it should only be applied if the command-line
> solution works but still reports 0 articles) is described here:
> http://meta.wikimedia.org/wiki/Documentation:Administration
> starting with: "To rebuild the article count manually"

This explains how to set the article count; it's not related to 
rebuilding the link tables at all, but is a separate operation. This 
doesn't take _too_ long, but again, if you want to skip it you can pull 
e.g. en_site_stats_table.sql.gz.
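
If you do run the manual count, it boils down to a single statement
against site_stats. A rough sketch against the 2004-era cur/site_stats
schema (the subquery form needs MySQL 4.1+; on older servers run the
SELECT separately and plug the number into the UPDATE by hand):

  mysql -u wikiuser -p wikidb -e "
    UPDATE site_stats
       SET ss_good_articles =
           (SELECT COUNT(*) FROM cur
             WHERE cur_namespace = 0
               AND cur_is_redirect = 0
               AND cur_text LIKE '%[[%');"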

> BTW, how much time and space will the creation of a SearchIndex for the
> last dump of the English Wiki take?

The searchindex table for en.wikipedia.org and its fulltext index 
together are about 5.2GB on our server. Generating the search index 
should go much faster than building the link tables, as the entries are 
only lightly processed, but it may still take a few hours for a database 
that large, depending on your system's speed and load while it's 
running. We don't currently make public backups of the search 
index table due to the size and the relative ease of regenerating it.
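
If you do regenerate it yourself, the script for that is
rebuildtextindex.php (rebuildall.php also runs it as part of its full
pass; check your maintenance directory for the exact name in your
version):

  cd /path/to/mediawiki/maintenance
  # Repopulates the searchindex table and its MySQL fulltext index from
  # the page text; much faster than the link rebuild, but still plan on
  # a few hours for a full en database.
  php rebuildtextindex.php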

>  Now the CPU load is about 3.60 (Dual Xeon 2.8
> HT). It will probably be lower with an additional 1 GB of RAM... Are
> there any hardware requirements for running, say, a cur replica of the
> English Wikipedia?

Requirements will depend mainly on usage. You should probably install a 
PHP optimizer such as Turck MMCache (a very easy way to make any PHP 
apps run faster), and if load is high consider the other caching 
options (file cache mode, squid reverse proxy with squid mode turned 
on, memcached+parser cache).
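
As a starting point, the file cache is the easiest of those to switch
on. A rough sketch (the variable names are the 1.3/1.4-era ones from
DefaultSettings.php, so double-check them against your version; the
cache path is just an example):

  cd /path/to/mediawiki
  # The cache directory must exist and be writable by the web server.
  mkdir -p cache && chmod 777 cache
  # Append the settings to LocalSettings.php (or just edit the file):
  echo '$wgUseFileCache = true;' >> LocalSettings.php
  echo '$wgFileCacheDirectory = "$IP/cache";' >> LocalSettings.php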

-- brion vibber (brion @ pobox.com)

