On Thu, Oct 6, 2011 at 12:54 PM, Fred Zimmerman <zimzaz.wfz(a)gmail.com> wrote:
Hi,
I am hoping that someone here can help me - I realize there is an
xmlwikidumps mailing list, but it is pretty low in volume and expertise
relative to this one. There is a lot of conflicting advice on the
mirroring Wikipedia page. I am setting up a local mirror of the English
Wikipedia pages-articles...xml dump and have been stopped by repeated
failures in MySQL configuration.
I have downloaded the .gz data from the dump and run it through mwdumper
to create an SQL file without problems, but things keep breaking down on
the way into MySQL. I have had a lot of agony with InnoDB log files, etc.,
and have learned how to make them bigger, but I'm still apparently missing
some pieces.
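For reference, the import pipeline described above can be run in one pass, piping mwdumper's SQL output straight into MySQL instead of writing an intermediate file (a sketch; the jar name, dump file name, user, and database name here are assumptions, not taken from the original post):

```sh
# Convert the XML dump to SQL INSERTs and stream them directly into MySQL.
# mwdumper reads gzipped dumps natively, so no separate gunzip step is needed.
java -jar mwdumper.jar --format=sql:1.5 pages-articles.xml.gz \
  | mysql -u wikiuser -p wikidb
```

Streaming avoids holding the multi-gigabyte intermediate SQL file on disk, which matters on a 200 GB instance.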
What fails, exactly? Do you get error messages of some kind? Without
knowing what's going wrong, there's little advice that can be given.
My target machine is an AWS instance: 32-bit Ubuntu, 1.7 GB RAM, 1 core,
200 GB of disk, dedicated to this project. I can make it bigger if
necessary.
Can someone take a look at this my.cnf file and tell me what I need to
change to get this to work? This is what the my.cnf file looks like:
The only requirement I can think of is making sure max_allowed_packet is
biggish so all the pages import; you appear to have set it to 128M, which
should be more than big enough.
Other settings, I would assume, should depend on your available memory,
workload, etc.
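For illustration, a my.cnf fragment covering the settings mentioned in this thread might look like the following. This is a hypothetical sketch sized for a dedicated ~1.7 GB import box; none of these values come from the original poster's file, and they should be tuned to the actual machine:

```ini
[mysqld]
# Large enough that the biggest page revisions fit in a single packet.
max_allowed_packet = 128M

# Bigger redo logs reduce failures and churn during a long bulk import.
# Note: on MySQL of this era, changing this requires a clean shutdown and
# removing the old ib_logfile* files before restarting.
innodb_log_file_size = 256M

# Give InnoDB most of the RAM on a machine used only for this import.
innodb_buffer_pool_size = 1G

# Faster bulk loading; trades crash safety for speed, so only use this
# while importing, not in production.
innodb_flush_log_at_trx_commit = 0
```

After the import finishes, the durability-related settings (especially innodb_flush_log_at_trx_commit) should be restored to safe defaults.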
-- brion