I am trying to set up a multilanguage Wikipedia for offline use from XML dumps using
MediaWiki. To do this, I have downloaded dumps from the Wikimedia Foundation and tried
to set up a MySQL server following the instructions given here:
I managed to go all the way through these instructions, and the mwimport step,
mwimport < dawiki-<date>.xml | mysql -f -u <admin name> -p <database name>
(where <database name> was wikidb), finished without errors. But I can't find my
(Danish) Wikipedia now!
From a similar instructions page to the one above, I would expect to find it at
http://localhost/wikidb/ , but there is nothing there (404 error), although I can see
from phpMyAdmin that a database wikidb has been created, with 185,000 entries in it.
Thank you very much for your help !
The latest history file I could discover at
http://download.wikimedia.org/enwiki is from 20080312. If you happen
to know about the availability of more recent history files, could you
please send the URL?
Thanks a lot in advance for your advice and time.
I'm having a small problem with setting the value of a hidden form field
in the customUserCreateForm extension.
The alert gives the correct value, but somehow it doesn't get posted
after the user creates a new account.
The value of the hidden field has to be set when someone checks a checkbox.
<input type="checkbox" name="wpFacultyCheck" onchange="createUserToggle(this)" />
<input type="hidden" id="hiddenFacultyField" name="hiddenFacultyField" value="" />
Then in common.js I have:
alert(obj.value +" "+ document.getElementById('hiddenFacultyField').value)
I used "onchange" as "onclick" doesn't work in FF3.
Does anyone have a good solution for this?
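For reference, a minimal sketch of what the createUserToggle handler referenced above could look like (the "faculty" value written into the hidden field is an assumed placeholder, not taken from the extension):

```javascript
// Decide what the hidden field should hold for a given checkbox state.
// "faculty" is an assumed placeholder value; the real extension may
// store something else.
function hiddenValueFor(checked) {
  return checked ? "faculty" : "";
}

// Handler wired to the checkbox via onchange="createUserToggle(this)".
function createUserToggle(obj) {
  var hidden = document.getElementById('hiddenFacultyField');
  hidden.value = hiddenValueFor(obj.checked);
  alert(obj.value + " " + hidden.value);
}
```

If the value still fails to arrive server-side, it is worth confirming that the hidden input actually sits inside the <form> element being submitted: inputs outside the form are silently dropped from the POST.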
Hi to all,
I want to know how I can get the wikitext behind a plain Wikipedia HTML page.
For example, I search for the keyword "free" and get the Wikipedia HTML text of that
page. How can I get its wikitext and save it to my desktop for further use?
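One common approach is to request the page with action=raw, which makes MediaWiki return the wikitext instead of rendered HTML. A small sketch, assuming the standard /w/index.php entry point used by the Wikipedia sites:

```javascript
// Build the URL that returns raw wikitext for a given article title.
// Assumes the default MediaWiki index.php endpoint at /w/index.php,
// as used on en.wikipedia.org and most standard installs.
function rawWikitextUrl(host, title) {
  return "http://" + host + "/w/index.php?title=" +
         encodeURIComponent(title) + "&action=raw";
}

// Example: the URL for the English Wikipedia article "Free"
var url = rawWikitextUrl("en.wikipedia.org", "Free");
```

The resulting URL can then be fetched with any HTTP client (a browser, wget, or curl) and the response saved straight to a file.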
I have to move a MediaWiki install from an Ubuntu server (which was installed with the .deb package) to a Windows server. I saw the how-to, and it talks about copying all the MediaWiki files over, but I can't seem to find them in one place on the Ubuntu server. I am sure I am missing something simple.
I tried to find a paper on the differences in directory structure between Windows and Ubuntu but could not find anything.
If you could point me in the right direction, I would appreciate it.
Given the following text
where out1, in1, in2, out2 are tags defined as usual in
that order. In which order are these tags "expanded"?
out1 in1 in2 out2 (ie, as defined)?
out2 in2 out1 in1 (ie, as found)?
in2 out2 in1 out1?
I spent the last few weeks doing light research on the permissions situation
in MediaWiki, and I found the following:
a) MediaWiki is not made for permissions; however,
b) MediaWiki is starting to enter the corporate world, so extensions
implementing permissions are starting to appear, but
c) because MediaWiki isn't uniformly developed (e.g. not all reads go
through a single read class), this isn't working very well.
While I fully respect a), I see the reasons behind b), because our
company (50 people), looking for a straight,
almost-turn-key, out-of-the-box intranet solution, chose MediaWiki in
order to capitalize on the experience people have gained with Wikipedia.
Given the trend described in b), could somebody explain, in a calm,
non-partisan way, whether c) is true and whether there is an honest and stable mood
in this direction, or whether we should, generally speaking, forget it?
If the answer is positive, we are ready to contribute.
This used to work, but has broken recently, either through my moving
the install around or through upgrading to 1.13.0.
I try to do an upload and I get:
Could not rename file "/tmp/phpVxOSnK" to
I've ensured that 'images' and its subdirectories are all writeable by the server.
I've pinned this down to ./includes/filerepo/FSRepo.php line 338, but am
struggling to diagnose any further (my PHP is limited). The following edit
provides no clues:
$status->error( 'filerenameerror', $srcPath, '['.getcwd()."|$IP]".$dstPath );
as I get "[$IP|]public/f/fd/Framework-SLOC-OOCalc.png" in the target
filename. Shouldn't getcwd() work? Shouldn't $IP evaluate?
Existing images do come up OK, as far as I can see.
I also tried creating 'images/public', and symlinking images /to/ public,
but to no avail.
I am one of the developers of TurnKey Linux, an open source project that
aims to develop high-quality software appliances that are easy to use,
easy to deploy, and free. Our motto is "everything that can be easy,
should be easy!"
We recently released TurnKey MediaWiki, an easy-to-use, lightweight,
installable live CD of MediaWiki that can run on real hardware in
addition to most types of virtual machines. It is designed to provide
users with a pre-integrated, turn-key operating system environment that
is carefully built from the ground up with the minimum components needed
to run MediaWiki with maximum usability, efficiency, and security.
* Automatically updated daily with the latest security patches
* Mac OS X-themed web management interface
* Configuration and installation console (written from scratch in Python)
* Based on Ubuntu 8.04.1 Hardy LTS
I'm announcing it here so you guys can try it out, give us some feedback
and help us spread the word.
For further details see:
I noticed that the Googlebot does not use the exact article path from
the sitemap files, but uses Special:RecentchangesLinked instead to access
the articles. So instead of /www.mysite.com/wiki/My_Article/, Google
uses /www.mysite.com/wiki/Special:RecentchangesLinked/My_Article/. Why?
I ran into this because I disallowed /wiki/Special:RecentchangesLinked/
within robots.txt. This was to prevent the bots from indexing all