On Wed, May 30, 2012 at 12:49 AM, chepukha <ndkhanh824(a)gmail.com> wrote:
> I think there are files missing in the source code.
> directory src/org/apache/commons/compress/ is empty.
> I see mwdumper has not been under development for years.
Not quite right: it did have one commit ~3 months ago, but the last
one before that was more than two years back.
It looks like the original problem you mailed about and this new
problem were both caused by that latest commit. That commit removed
the bzip2 code, which is now the source of your latest error. Maybe
Oren (cc'd) can shed some light on why it was removed and how the
current code should now be compiled or run. The commit message just
says the code was removed, not why or what the new way to get it
is, and the README wasn't updated either.
In the meantime you could just try checking out the older version
(immediately before that last commit) and testing with that.
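Stepping back to the commit before the tip can be sketched like this, assuming a git checkout of mwdumper where the bzip2-removal commit is the most recent one on your branch:

```shell
# Sketch, assuming a git working copy of mwdumper where the
# bzip2-removal commit is the latest one on the current branch.
git log --oneline -5   # confirm which commit removed the bzip2 code
git checkout HEAD^     # HEAD^ = the parent of the latest commit
# build and test from here; checking out the branch name again
# returns you to the tip
```

If the removal turns out not to be the most recent commit, substitute its hash and check out `<hash>^` instead.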
> What do people use to load (wiki) xml data into database?
I don't know, maybe this is the only tool. Maybe someone else will chime in.
> I saw MySQLDumper has restore script that can read data from .gz file
> back into database but not sure if it does what I need. I'll take a look at
Please avoid using that script if you can. If you're using a halfway
decent host (or running your own server/working locally) then there
are much better ways to do an import. If you need help importing a
MySQL dump, tell us more about what you're trying to do (.gz is just
a compression format, not an indication of the dump's format or
structure, so we'd need more details), or come visit either
#mediawiki or #mysql (both on freenode) and ask there.
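For what it's worth, if the .gz does turn out to contain a plain SQL dump, the usual approach is to stream it straight into the mysql client; the database name and user below are placeholders, not anything from your setup:

```shell
# Stream a gzipped SQL dump directly into MySQL without
# decompressing it to disk first. 'username' and 'wikidb' are
# placeholders for your own account and database.
gunzip -c dump.sql.gz | mysql -u username -p wikidb
```

This only applies to SQL-format dumps; a wiki XML dump needs a tool like mwdumper to convert it first.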