Dear all,
I am trying to set up a multi-language Wikipedia for offline use from XML dumps using MediaWiki. To do this, I have downloaded dumps from the Wikimedia Foundation and tried to set up a MySQL server following the instructions given here:
http://modzer0.cs.uaf.edu/~dev2c/wiki/How_to_mirror_Wikipedia
I managed to go all the way through these instructions, and the mwimport line
<dawiki-<date>.xml mwimport | mysql -f -u <admin name> -p <database name> ,
(where <database name> was wikidb)
finished without errors. But I can't find my (Danish) Wikipedia now!
From an instructions page similar to the one above, I would expect to find it at http://localhost/wikidb/,
but there seems to be nothing there (404 error), although I can see in phpMyAdmin that a database wikidb has been created, with 185,000 entries in it.
Thank you very much for your help!
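For reference, the row counts can also be checked from the mysql client directly, without phpMyAdmin. This is only a sketch: the admin user name is an assumption standing in for the <admin name> placeholder above, and it assumes the default MediaWiki table names.

```shell
# Sketch, assuming the "wikidb" database from the import command above
# and a MySQL user named "admin" (a placeholder; use your own).
# Counts the imported pages and revisions directly.
mysql -u admin -p wikidb -e "SELECT COUNT(*) FROM page; SELECT COUNT(*) FROM revision;"
```

If these counts are non-zero, the import itself worked and the 404 is a web-server question, not a database one.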
Best regards,
Axel
Axel Etzold wrote:
It's at the place where you installed the MediaWiki software. Following the instructions above, that would be http://localhost/w/
If you want it just for one box, you may be interested in a program I recently made for offline MediaWiki browsing (almost) directly from the dumps.
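A quick sanity check from the command line can confirm this. The following is a sketch assuming the web server layout from those instructions (MediaWiki under /w/); "Forside" is the Danish main page title, so adjust it if your wiki's main page is named differently.

```shell
# Sketch: print the HTTP status code for the expected wiki URL.
# 200 means MediaWiki is serving pages; 404 points at the web server
# configuration rather than at the database import.
curl -s -o /dev/null -w "%{http_code}\n" "http://localhost/w/index.php?title=Forside"
```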
-------- Original Message --------
Date: Sat, 22 Nov 2008 23:18:17 +0100 From: Keisial keisial@gmail.com To: mediawiki-l@lists.wikimedia.org Subject: Re: [Mediawiki-l] Offline wikipedia installation with mwimport question
MediaWiki-l mailing list MediaWiki-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Dear Keisial,
thank you very much for responding. I found it now and it's working well.
Could you help me with one additional question: in the online Wikipedia, you can enter a term, press the search button, and get a list of articles with relevance percentages.
How could I implement that for my local Wikipedia? I'd be glad to learn about your software as well.
Thank you very much!
Best regards,
Axel
Axel Etzold wrote:
Dear Keisial,
thank you very much for responding. I found it now and it's working well.
Could you help me with one additional question: in the online Wikipedia, you can enter a term, press the search button, and get a list of articles with relevance percentages.
How could I implement that for my local Wikipedia?
Wikipedia uses Lucene, but the simpler MySQL search is probably enough for your needs. The mwimport script doesn't seem to populate any of the secondary tables. To populate the search index, run maintenance/rebuildtextindex.php (note that it is a slow operation). To fill the other tables (categories, "what links here"...), run maintenance/refreshLinks.php.
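Concretely, those maintenance scripts are run with PHP from the MediaWiki installation directory. The /var/www/w path below is an assumption; substitute wherever you installed MediaWiki.

```shell
# Sketch, assuming MediaWiki is installed under /var/www/w.
cd /var/www/w
# Rebuild the searchindex table so the built-in MySQL search works.
# This walks every revision, so it can take hours on a full dump.
php maintenance/rebuildtextindex.php
# Rebuild the link tables (pagelinks, categorylinks, ...) so that
# categories and "what links here" work.
php maintenance/refreshLinks.php
```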
I'd be glad to learn about your software as well.
You give the Windows user the program; they open the icon and see the wiki. Under the hood, it's running a MediaWiki instance and reading the articles from compressed XML dumps.