Hello to this list!
I downloaded the enwiki, dewiki, frwiki, and itwiki dumps and imported them correctly into MediaWiki. The text table is the one that contains the actual Wikipedia pages (field old_text); it was created as MyISAM in order to add fulltext search capability. dewiki.text contains 1,248,933 rows and is more than 2.8 GB in size.
I issued this command at the mysql (v5.0.45) command line: ALTER TABLE dewiki.text ADD FULLTEXT (old_text);
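As far as I understand it, on MyISAM this ALTER copies the whole table to a temporary one and then builds the index, so the buffers below should be the ones that matter. This is only a sketch based on the 5.0 documentation; the values are guesses for a 512 MB machine, not settings I have verified:

    -- Sketch: raise the MyISAM index-build buffers before running the ALTER
    -- (values below are guesses for a machine with 512 MB RAM)
    SET GLOBAL myisam_sort_buffer_size = 134217728;     -- 128 MB sort buffer for index builds
    SET GLOBAL myisam_max_sort_file_size = 8589934592;  -- allow up to 8 GB of on-disk sort files
    ALTER TABLE dewiki.text ADD FULLTEXT (old_text);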
This command has now been running for approx. 20 hours (!) without showing any errors. Apparently the MySQL server is up and running. (Windows 2000 SP4; MySQL v5.0.45; Intel Pentium M 1.7 GHz, 512 MB RAM; mysqld-nt.exe is using 152,760 KB constantly.)
There is only 1 active thread: Command: Query; Time: 74059; State: copy to tmp table; Info: ALTER TABLE dewiki.text ADD FULLTEXT (old_text);
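For anyone who wants to check the same thing: I am watching the build with the statements below (standard MySQL commands, nothing specific to my setup), and my understanding is that the copy phase also needs roughly the table's size again in free disk space under the data directory / tmpdir:

    SHOW FULL PROCESSLIST;            -- shows the thread state ("copy to tmp table")
    SHOW VARIABLES LIKE 'myisam_%';   -- current MyISAM buffer settings
    SHOW VARIABLES LIKE 'ft_%';       -- fulltext settings (e.g. ft_min_word_len)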
If interested, see the MySQL variables I have set in this recent post to this list: http://article.gmane.org/gmane.science.linguistics.wikipedia.technical/32872...
I know this is not a Wikitech question in the strict sense, but I cannot find sufficient answers on optimizing MySQL for tables as large as Wikipedia's in the MySQL forums or elsewhere...
Any suggestions welcome!
Alex Hoelzel, http://www.meshine.info
Alexander Hölzel CEO EUTROPA AG
===========================
EUTROPA Aktiengesellschaft
Oelmüllerstrasse 9, D-82166 Gräfelfing
Tel 089 87130900, Fax 089 87130902
===========================