I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract its content and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution, but nothing turned up.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
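In case nothing ready-made surfaces, here is a minimal sketch of the regex approach in Python. It handles only headings, bold/italic, and internal links; a real converter would also need templates, tables, and nested markup. (The raw wikitext of a page can be fetched from a MediaWiki install with index.php?title=PageName&action=raw.)

```python
import re

def wikitext_to_latex(text):
    """Very rough wikitext -> LaTeX conversion: headings, bold, italic, links only."""
    # === Subheading === must be handled before == Heading ==
    text = re.sub(r"^===\s*(.*?)\s*===\s*$", r"\\subsection{\1}", text, flags=re.M)
    text = re.sub(r"^==\s*(.*?)\s*==\s*$", r"\\section{\1}", text, flags=re.M)
    # '''bold''' before ''italic'', since ''' contains ''
    text = re.sub(r"'''(.*?)'''", r"\\textbf{\1}", text)
    text = re.sub(r"''(.*?)''", r"\\emph{\1}", text)
    # [[Target|label]] and [[Target]] -> keep only the visible text
    text = re.sub(r"\[\[([^\]|]*)\|([^\]]*)\]\]", r"\2", text)
    text = re.sub(r"\[\[([^\]]*)\]\]", r"\1", text)
    return text

sample = "== Intro ==\n'''Bold''' and ''em'', see [[Page|here]]."
print(wikitext_to_latex(sample))
```

This is obviously lossy; it is meant as a starting point for a one-off documentation export, not a general-purpose converter.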
Sorry about bugging the list about it, but can anyone please explain
the reason for not enabling the Interlanguage extension?
See bug 15607 -
I believe that enabling it will be very beneficial for many projects
and many people have expressed their support for it. I am not saying
that there are no reasons not to enable it; maybe there is a good
reason, but I don't understand it. I also understand that there are
many other unsolved bugs, but this one seems to have a ready and
rather simple fix. I am only sending this to raise the issue. If you
know the answer, you
may comment at the bug page.
Thanks in advance.
Amir Elisha Aharoni
heb: http://haharoni.wordpress.com | eng: http://aharoni.wordpress.com
cat: http://aprenent.wordpress.com | rus: http://amire80.livejournal.com
"We're living in pieces,
I want to live in peace." - T. Moore
I've been putting placeholder images on a lot of articles on en:wp.
e.g. [[Image:Replace this image male.svg]], which goes to
[[Wikipedia:Fromowner]], which asks people to upload an image if they
have one. I know it's inspired people to add free content images to articles in
several cases. What I'm interested in is numbers. So what I'd need is
a list of edits where one of the SVGs that redirects to
[[Wikipedia:Fromowner]] is replaced with an image. (Checking which of
those are actually free images can come next.)
Is there a tolerably easy way to get this info from a dump? Any
Wikipedia statistics fans who think this'd be easy?
(If the placeholders do work, then it'd also be useful for convincing
some wikiprojects to encourage them. Not that there's ownership of
articles on en:wp, of *course* ...)
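One way to get at this from a pages-meta-history XML dump: stream it with iterparse, keep the previous revision's text per page, and flag revisions where a placeholder filename disappears while some image remains. A sketch, with the placeholder filenames and the export namespace URI both assumptions (the namespace varies by dump schema version, and the full set of placeholder SVGs would need to be collected first):

```python
import xml.etree.ElementTree as ET

# Assumed placeholder names -- the real list of Fromowner SVGs is longer.
PLACEHOLDERS = {"Replace this image male.svg", "Replace this image female.svg"}

def placeholder_replaced(old_text, new_text, placeholders=PLACEHOLDERS):
    """True if this edit removed a placeholder image yet still embeds some image."""
    removed = any(p in old_text and p not in new_text for p in placeholders)
    return removed and "[[Image:" in new_text

# Namespace URI is dump-version dependent; adjust to match your dump's header.
NS = "{http://www.mediawiki.org/xml/export-0.3/}"

def scan_dump(path, placeholders=PLACEHOLDERS):
    """Yield (page_title, rev_id) for revisions that replaced a placeholder."""
    title, prev_text = None, ""
    for _event, elem in ET.iterparse(path):
        tag = elem.tag.replace(NS, "")
        if tag == "title":
            title, prev_text = elem.text, ""
        elif tag == "revision":
            text = elem.findtext(NS + "text") or ""
            if placeholder_replaced(prev_text, text, placeholders):
                yield title, elem.findtext(NS + "id")
            prev_text = text
            elem.clear()  # keep memory bounded on multi-GB dumps
```

Checking which replacements are actually free images would then be a second pass over the flagged revisions.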
I have now set up a local wiki, and I want to import data downloaded
from the online Wikipedia into it. I tried to read the source code,
but I couldn't find the exact interface I was looking for.
So I want to ask a few questions:
When the save button is clicked after editing an article or adding a
new one, how is the data stored? Which function/class does it call?
Could you describe the data-storage process?
In what form are articles stored in the database?
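For what it's worth, a sketch of my understanding (from the 1.1x codebase, so treat it as a pointer rather than an authoritative answer): the save path goes roughly EditPage (form handling) -> Article::doEdit -> revision/text storage, and articles are stored as raw, unrendered wikitext. The page table holds the title and a pointer to the latest revision, revision holds metadata plus a pointer into the text table, and text holds the blob. A stand-in SQLite model of that layout (the real backend is usually MySQL):

```python
import sqlite3

# Minimal stand-in for the MediaWiki page/revision/text layout:
# page.page_latest -> revision.rev_id, revision.rev_text_id -> text.old_id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE page     (page_id INTEGER, page_title TEXT, page_latest INTEGER);
CREATE TABLE revision (rev_id INTEGER, rev_page INTEGER, rev_text_id INTEGER);
CREATE TABLE text     (old_id INTEGER, old_text TEXT);
INSERT INTO page VALUES (1, 'Sandbox', 10);
INSERT INTO revision VALUES (10, 1, 100);
""")
# Stored exactly as typed in the edit box -- raw wikitext, not rendered HTML.
conn.execute("INSERT INTO text VALUES (100, ?)", ("'''Hello''' world",))

row = conn.execute("""
    SELECT old_text FROM page
    JOIN revision ON rev_id = page_latest
    JOIN text     ON old_id = rev_text_id
    WHERE page_title = 'Sandbox'
""").fetchone()
print(row[0])  # -> '''Hello''' world
```

For bulk-importing downloaded Wikipedia content, the importDump.php maintenance script is probably a better entry point than calling the edit code directly.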
In the edit box, when I type [[John Doe]], I want some chance to
verify that I'm linking to the right article, whether it is a
disambiguation page, or by seeing the first sentence from that
article. I know I can "preview" my edit and click that link to
see the page (or ctrl-click to make it appear in a new tab), but
that method just seems sooo 2002.
Is there some tool, button or gadget that does this trick? Perhaps
some greasemonkey script?
What it would do: From where the cursor stands in the edit box,
search backwards for a "[[" and then forwards to the following "|"
or "]]", whichever comes first (this covers the case that the
cursor is inside the link brackets). Look up that article, show
the first paragraph or 150 characters in a pop-up. If I click a
link in the pop-up (a top link, or a disambig page), replace the
link in the edit box so it points to that article.
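I don't know of an existing gadget for this, but the cursor logic is easy to prototype. A sketch of the bracket search described above, in Python for readability (the actual gadget would be JavaScript in Greasemonkey, operating on the textarea's selectionStart); it assumes a flat string and no nested brackets:

```python
def link_target_at(text, cursor):
    """Return the [[link target]] the cursor sits inside, or None.

    Scan backwards from the cursor for '[[', then forwards for '|'
    or ']]', whichever comes first.
    """
    start = text.rfind("[[", 0, cursor)
    if start == -1:
        return None
    start += 2
    pipe = text.find("|", start)
    close = text.find("]]", start)
    if close == -1:
        return None
    if cursor > close + 2:  # cursor is past the brackets, not inside them
        return None
    end = pipe if (pipe != -1 and pipe < close) else close
    return text[start:end]
```

The returned title would then feed an API lookup to fetch the first sentence (and disambiguation status) for the pop-up.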
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Part of the Wikipedia Usability Initiative is a project to create a system
whereby template calls are hidden (minimized, really) for most users on the
edit page, and when users do want to edit template calls, they can do so via
a form instead of editing the template call directly. I'm involved with that
project; my previous experience with such matters was creating the Semantic
Forms extension, which is similar in its basic concept. I've put together a
page explaining the current thinking on how the system should work, here:
We're looking for feedback; any thoughts or questions are welcome, either on
that page's talk page or by responding to this email.
I just created https://bugzilla.wikimedia.org/show_bug.cgi?id=20768 ("Branch
1.16") and Brion was quick to respond that some issues with js2 and the
new-upload stuff need to be ironed out; valid concerns, of course.
I proposed to make bug 20768 a tracking bug, so that it can be made
visible which issues are, or could be, considered blockers for a 1.16
branch. Let the dependency tagging begin. Users of MediaWiki trunk are
encouraged to report each and every issue, so that what is known can
also be resolved.
I'm calling on all volunteer coders to keep an eye on this issue and
to help fix the issues that are mentioned there.