Hello everybody,
This is my first thread on the list; I'm relatively new to Wikipedia/MediaWiki
(about one month).
I followed the documentation (RTFM, as usual), but every step seems to end in a
dead end.
My simple question: how do I correctly install a Wikipedia mirror from the dumps
into MediaWiki?
My goal:
Create an offline Wikipedia mirror server from the FR, EN, or PT dumps (one
language only).
I did some tests with wowiki, which is small enough for testing purposes, but I
ran into several problems:
Most of the tools mentioned in the documentation are 404 or outdated, for
example xml2sql and mwdumper.
The stock MediaWiki importer takes 20 minutes to import the 1.6 MB wowiki dump,
which would mean something like a decade for the FR one...
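Whatever the exact figure, a quick linear extrapolation shows why the stock importer is not workable. This is only a sketch: the FR dump size below is my own rough assumption, and real import time rarely scales perfectly linearly.

```python
# Back-of-envelope extrapolation of stock-importer time from my wowiki test.
# Assumes import time scales linearly with dump size; the FR dump size is a
# hypothetical guess, not a measured number.
wowiki_mb = 1.6          # measured wowiki dump size (MB)
wowiki_minutes = 20.0    # measured import time (minutes)
fr_dump_mb = 25 * 1024   # hypothetical uncompressed FR dump size (~25 GB)

minutes = fr_dump_mb / wowiki_mb * wowiki_minutes
print(f"~{minutes / (60 * 24):.0f} days")  # prints: ~222 days
```

Even under these optimistic assumptions it comes out in months, so a faster import path (SQL bulk load or similar) seems mandatory.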
I tried the latest MWDumper found on GitHub; it quickly generates a
reasonable-looking SQL file, but special characters in the page text are not
escaped, so literal newlines (\n) in the text cause SQL errors...
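If post-processing the generated SQL turns out to be the way forward, a minimal sketch of MySQL-style string escaping could look like this. This is my own hypothetical helper, not part of MWDumper:

```python
def mysql_escape(text: str) -> str:
    """Escape a string for use inside a single-quoted MySQL literal."""
    # Backslash must be escaped first, otherwise the escapes added
    # below would themselves get doubled.
    replacements = [
        ("\\", "\\\\"),
        ("'", "\\'"),
        ('"', '\\"'),
        ("\n", "\\n"),
        ("\r", "\\r"),
        ("\x00", "\\0"),
        ("\x1a", "\\Z"),
    ]
    for raw, escaped in replacements:
        text = text.replace(raw, escaped)
    return text

print(mysql_escape("line one\nit's fine"))  # prints: line one\nit\'s fine
```

Running every page-text value through something like this before it lands in an INSERT statement should avoid the broken-statement errors, though streaming the dump rather than loading it whole would matter at FR scale.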
Some columns referenced by the dumps are missing from the schema created by the
original MediaWiki installer, for example page_counter in the page table. Are
there extensions that need to be installed before importing dumps?
How can I find/get the media files (low-resolution images are enough)?
What should I do about interwiki media/links/pages?
The next step after that will be the update procedure, incremental or full...
The French Wikimedia team seems to be stumped by my questions...
I'm not averse to doing development work (Python, PHP, or Java EE), but I need
an entry point to start from if nobody is working on this...
Thank you for your work; I know you are quite busy with dump generation, but
our project is quite serious, many people want it, and we need quick answers to
assess feasibility.
Best regards,
Yoni