Hi everyone,
I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution, but nothing obvious turned up.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
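In case it helps as a starting point, here is a minimal sketch in Python of the kind of conversion involved (the function name and the handful of wikitext constructs handled are my own choices; a real converter would also need templates, links, tables, and so on):

```python
import re

def wikitext_to_latex(text):
    """Convert a few common wikitext constructs to LaTeX.
    Handles only headings, bold, and italics; real pages need much more."""
    # === Heading === -> \subsection{...} (must run before the == rule)
    text = re.sub(r"^===\s*(.*?)\s*===\s*$", r"\\subsection{\1}", text, flags=re.M)
    # == Heading == -> \section{...}
    text = re.sub(r"^==\s*(.*?)\s*==\s*$", r"\\section{\1}", text, flags=re.M)
    # '''bold''' -> \textbf{...} (must run before the italics rule)
    text = re.sub(r"'''(.+?)'''", r"\\textbf{\1}", text)
    # ''italic'' -> \textit{...}
    text = re.sub(r"''(.+?)''", r"\\textit{\1}", text)
    return text
```

Fetching the raw wikitext itself can be done with `action=raw` page requests or the API, and the output fed through a function like this before hand-tuning the LaTeX.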
Kind Regards,
Hugo Vincent,
Bluewater Systems.
Sorry to bug the list about this, but can anyone please explain
the reason for not enabling the Interlanguage extension?
See bug 15607 -
https://bugzilla.wikimedia.org/show_bug.cgi?id=15607
I believe that enabling it would be very beneficial for many projects,
and many people have expressed their support for it. I am not saying
that there is no reason not to enable it; maybe there is a good one,
but I don't understand it. I also understand that there are many other
unresolved bugs, but this one seems to have a ready and rather simple
solution.
I am only writing to raise the issue. If you know the answer, you
may comment on the bug page.
Thanks in advance.
--
Amir Elisha Aharoni
heb: http://haharoni.wordpress.com | eng: http://aharoni.wordpress.com
cat: http://aprenent.wordpress.com | rus: http://amire80.livejournal.com
"We're living in pieces,
I want to live in peace." - T. Moore
Added Jan Gerber ('j'). Jan is the developer of Firefogg, and will be
helping out with some of the open video player & sequencer work with
Michael Dale that Kaltura's sponsoring.
-- brion
I've been putting placeholder images on a lot of articles on en:wp.
e.g. [[Image:Replace this image male.svg]], which goes to
[[Wikipedia:Fromowner]], which asks people to upload an image if they
own one.
I know it's inspired people to add free content images to articles in
several cases. What I'm interested in is numbers. So what I'd need is
a list of edits where one of the SVGs that redirects to
[[Wikipedia:Fromowner]] is replaced with an image. (Checking which of
those are actually free images can come next.)
Is there a tolerably easy way to get this info from a dump? Any
Wikipedia statistics fans who think this'd be easy?
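To give a sense of what the per-edit check might look like, here is a sketch in Python (the placeholder list is illustrative only; the real set of SVGs would come from the incoming redirects to [[Wikipedia:Fromowner]], and this compares two revision texts rather than walking a whole dump):

```python
import re

# SVG placeholders that redirect to [[Wikipedia:Fromowner]].
# Illustrative subset; the real list would be pulled from "what links here".
PLACEHOLDERS = {
    "Replace this image male.svg",
    "Replace this image female.svg",
}

IMAGE_RE = re.compile(r"\[\[Image:([^|\]]+)", re.I)

def placeholder_replaced(old_text, new_text):
    """True if the edit removed a placeholder image and added a
    non-placeholder image in its stead."""
    old_images = set(IMAGE_RE.findall(old_text))
    new_images = set(IMAGE_RE.findall(new_text))
    removed = (old_images & PLACEHOLDERS) - new_images
    added = new_images - old_images - PLACEHOLDERS
    return bool(removed and added)
```

Run over consecutive revision pairs in a pages-history dump, this would yield the candidate edits; checking whether the added images are actually free content would be the manual next step, as noted.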
(If the placeholders do work, then it'd also be useful convincing some
wikiprojects to encourage the things. Not that there's ownership of
articles on en:wp, of *course* ...)
- d.
Hoi,
We have been testing the LocalisationUpdate extension for some time now,
and we consider it quite good at the moment. We have been testing it in a
test environment, and we would like to expand our testing to MediaWiki
wikis that do not run in English or any of the other languages that are
already completely localised. What we are looking for are wikis that are
or will be running MediaWiki 1.15 and would like the localisation of
their wiki to be updated as new localisations become available.
Obviously in order for this to work, there have to be people localising for
your language.
What we offer is help with the installation of the extension and support
with running it on your MediaWiki wiki. We are looking for
five wikis in five different languages. We can make this offer for wikis
where you are able to install new extensions and where you can add a
cron job.
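For reference, the cron job in question might look something like this (a sketch only; the schedule and the path to the extension's update script are assumptions about your installation):

```shell
# Hypothetical crontab entry: run LocalisationUpdate's update script nightly.
# Adjust the PHP binary and MediaWiki install path for your server.
0 3 * * * php /var/www/wiki/extensions/LocalisationUpdate/update.php
```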
Thanks,
GerardM
Hi,
I have just seen two independent instances where people said that they
sent posts to foundation-l in the last +/- 12 hours, which never got
posted on the list. The emails do not show up in the moderation queue
either (nor are these two subscribers, or the entire list, moderated).
Are there any technical problems with the mail(inglist) server that
you are aware of?
Michael
--
Michael Bimmler
mbimmler(a)gmail.com
Hi all,
It seems to me that there's been sterling work on the 'flagged revisions'
front - with the bulk of the credit due to User:Cenarium over on en, and the
various folk working away over there.
With that in mind, could I please encourage a dev's attention to:
https://bugzilla.wikimedia.org/show_bug.cgi?id=18244
Hopefully we can enable the extension as soon as possible :-)
best,
Peter,
PM.
Hi all,
We now have English Wikipedia fully migrated to new servers and a new
search backend. We cannot fully migrate other wikis until we resolve some
hardware
issues. In the meantime, here is the overview of new features now deployed
on en.wiki:
1) Did you mean... - we now have search suggestions. Care has been taken to
provide suggestions that are context-sensitive, i.e. on phrases, proper
names, etc..
2) fuzzy and wildcard queries - a word can be made fuzzy by adding ~ to its
end, e.g. the query sarah~ thompson~ will give all different spellings of,
and names similar to, sarah thompson. Wildcards can now be prefix or
suffix, e.g. *stan will give various countries in Central Asia.
3) prefix: - using this magic prefix, queries can be limited to pages
beginning with a certain prefix. E.g.
mwsuggest prefix:Wikipedia:Village Pump
will search all village pumps and their archives for mwsuggest. This
should be especially useful for archive searching in concert with inputbox
or searchbox.
4) intitle: - using this magic prefix, queries can be limited to titles only
5) generally improved quality of search results via usage of related
articles (based on co-occurrence of links), anchor text, text abstracts,
proximity within articles, sections, redirects, improved stemming and such
Cheers, Robert
As per the subject: does MediaWiki do browser version checking, and
does it support double-digit version numbers? Opera is about to hit
v10 in their user-agent strings, which currently breaks a few scripts
on other websites. Although they have put in a temporary stopgap
(reporting as v9.80), we should make sure MediaWiki detects v10
properly.
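For what it's worth, a sketch of version parsing that copes with the stopgap (function name mine; the UA shapes follow the format described in the article below, where Opera 10+ freezes the leading token at Opera/9.80 and reports the real version in a trailing Version/x.y token):

```python
import re

def opera_major_version(ua):
    """Extract Opera's real major version from a user-agent string,
    or None if the string is not from Opera.
    Opera >= 10 keeps "Opera/9.80" at the front and puts the true
    version in "Version/x.y", so that token must be checked first."""
    if "Opera" not in ua:
        return None
    m = re.search(r"Version/(\d+)\.", ua)
    if m:
        return int(m.group(1))
    m = re.search(r"Opera[/ ](\d+)\.", ua)
    return int(m.group(1)) if m else None
```

Any script that naively compares the first digit of "Opera/9.80" would misdetect Opera 10 as version 9, which is exactly the breakage described above.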
More information can be found in their Dev.Opera blog posting entitled
"Changes in Opera’s user agent string format".[1]
[1]. "Changes in Opera’s user agent string format"
<http://dev.opera.com/articles/view/opera-ua-string-changes/>
-Peachey