Jason's got the new machine installed (thanks!). It's at 130.94.122.204, but it needs a name... Among the previous suggestions for machine namesakes were:
http://en.wikipedia.org/wiki/Denis_Diderot
http://en.wikipedia.org/wiki/Mortimer_Adler
Any preference for 'diderot' or 'adler' or something else?
I can import the last backup dump and test it tonight; if all seems well, any objection to moving the database over tomorrow night? (Early morning UTC on December 3 == evening US December 2.) The databases will need to be frozen/read-only for the duration of the copy, and I'm not sure how long that will take, so I don't want to spring it on everyone unannounced.
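For reference, the usual way to hold MySQL tables read-only during a copy is a global read lock; a sketch of the generic technique, not necessarily the exact procedure planned here:

```sql
-- Hold a global read lock while the data is copied.
-- Writes block until the lock is released; reads continue as normal.
FLUSH TABLES WITH READ LOCK;
-- ... perform the copy from another session while the lock is held ...
UNLOCK TABLES;
```

Note that for a file-level copy of InnoDB data files specifically, a clean server shutdown before copying is the safer route, since InnoDB background threads keep writing under a read lock.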
-- brion vibber (brion @ pobox.com)
I vote for diderot.
Brion Vibber wrote:
> Jason's got the new machine installed (thanks!). It's at 130.94.122.204, but it needs a name... Among the previous suggestions to name machines after were:
> http://en.wikipedia.org/wiki/Denis_Diderot
> http://en.wikipedia.org/wiki/Mortimer_Adler
> Any preference for 'diderot' or 'adler' or something else?
> I can import the last backup dump and test it tonight; if all seems well any objection to moving the database over tomorrow night? (Early morning UTC on December 3 == evening US December 2.) The databases will need to be frozen/read-only for the duration of copying, and I'm not sure how long this will take so I don't want to spring it on everyone unannounced.
> -- brion vibber (brion @ pobox.com)
> Wikitech-l mailing list Wikitech-l@Wikipedia.org http://mail.wikipedia.org/mailman/listinfo/wikitech-l
On Mon, 1 Dec 2003, Brion Vibber wrote:
> Jason's got the new machine installed (thanks!). It's at 130.94.122.204, but it needs a name... Among the previous suggestions to name machines after were:
> http://en.wikipedia.org/wiki/Denis_Diderot
> http://en.wikipedia.org/wiki/Mortimer_Adler
> Any preference for 'diderot' or 'adler' or something else?
I would favor a non-Western name, to underline the international character of Wikipedia. What about Yongle? (http://en.wikipedia.org/wiki/Yongle_Encyclopedia)
But the more important thing here is not to get delayed by these trivial considerations...
Andre Engels
On Tue, Dec 02, 2003 at 11:17:38AM +0100, Andre Engels wrote:
> I would feel for a non-western name, to underline the international character of Wikipedia. What about Yongle? (http://en.wikipedia.org/wiki/Yongle_Encyclopedia)
That's an excellent idea. I vote for it.
Importing the cur and old tables from the English Wikipedia's last backup dump took 2 hours and 45 minutes, at about 20-25% CPU usage. Most of the time was spent waiting on disk output, and most of the CPU that was used went to decompression.
Write speed was around 15k blocks/sec (1KB blocks -> 15MB/s?)... the InnoDB data store has grown to about 18GB, so a straight write of the data at that rate should have taken about 20 minutes. Clearly it's updating more than it needs to; some tweaking of the MySQL configuration should help streamline this, increasing the buffers and flushing the log less often. It's running on the defaults now... I just slapped in a MySQL 4.0.16 Max binary install for AMD64.
(I'm not sure if just copying the datastore files from pliny would work; I have the impression that the on-disk format is platform-dependent, and we've got a 32/64-bit difference here.) Note also that the network seems to be 100-megabit, so a live transfer may be limited by network speed, but that doesn't look like the biggest bottleneck right now.
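As a sanity check on the arithmetic above (back-of-envelope only; 1KB blocks and binary units assumed):

```python
# Back-of-envelope check of the write and network estimates above.
# Assumptions: 1 KB blocks, binary units (1 GB = 1024 MB).
datastore_mb = 18 * 1024           # ~18 GB InnoDB data store
write_mb_s = 15_000 / 1024         # ~15k 1KB blocks/sec ≈ 14.6 MB/s

write_minutes = datastore_mb / write_mb_s / 60
print(f"straight write at disk rate: {write_minutes:.0f} min")  # ~21 min

# Ideal-case 100-megabit network: 100 Mbit/s = 12.5 MB/s
net_minutes = datastore_mb / 12.5 / 60
print(f"copy over 100Mbit network:   {net_minutes:.0f} min")    # ~25 min
```

Both figures are well under the 2h45m the import actually took, which supports the point that the extra updating, rather than raw disk or network throughput, is the biggest bottleneck right now.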
If I can get PHP up and running I'll try rebuilding the links table and see how long that takes...
-- brion vibber (brion @ pobox.com)
On Dec 2, 2003, at 04:50, Brion Vibber wrote:
> If I can get PHP up and running I'll try rebuilding the links table and see how long that takes...
Okay, got that started:
...
6000 of 352641 articles scanned (58 articles and 1302 titles per second)
Title cache hits: 37%
...
IO is fairly light, and the processes seem CPU-bound; usage bounces between about 50/50 mysqld/php and more like 75/25. Between them they use only a single processor, though, since PHP waits for MySQL to process and return each entire query before continuing.
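Projecting from the reported scan rate (simple arithmetic on the numbers above, not a measurement):

```python
# Projected duration of the full links-table scan at the reported rate.
total_articles = 352_641
articles_per_sec = 58  # from the progress output above

minutes = total_articles / articles_per_sec / 60
print(f"projected full scan: {minutes:.0f} min")  # ~101 min, i.e. about 1h40m
```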
I've put in the stock my-huge.cnf with slight modifications: bumped the key cache and InnoDB buffer cache up to 1 gig each and the InnoDB log file to 128MB (good? bad?), and set the transaction log to force disk flushes at most once per second.
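Expressed as a my.cnf fragment, the changes described above would look roughly like this (a sketch of the settings as described, using the standard MySQL option names, not a copy of the actual deployed config):

```ini
# Sketch of the deltas on top of the stock my-huge.cnf, as described above.
[mysqld]
key_buffer_size                = 1024M   # key cache: 1 gig
innodb_buffer_pool_size        = 1024M   # InnoDB buffer cache: 1 gig
innodb_log_file_size           = 128M    # InnoDB log file
innodb_flush_log_at_trx_commit = 2       # flush the log ~once/sec, not per commit
```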
-- brion vibber (brion @ pobox.com)
On Dec 2, 2003, at 05:24, Brion Vibber wrote:
> On Dec 2, 2003, at 04:50, Brion Vibber wrote:
>> If I can get PHP up and running I'll try rebuilding the links table and see how long that takes...
> Okay, got that started:
> ...
> 6000 of 352641 articles scanned (58 articles and 1302 titles per second)
> Title cache hits: 37%
It did get through the index rebuild, but seems to have croaked rebuilding the search index fields (via rebuildall.php):
Rebuilding index fields for 352641 pages...
500
[snip]
231000
Fatal error: Call to a member function on a non-object in /home/brion/src/wiki/phase3/includes/SearchUpdate.php on line 15
D'oh!
Took an hour and 42 minutes to get to that point.
-- brion vibber (brion @ pobox.com)