Hi everyone,
I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution, but nothing suitable turned up.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
Kind Regards,
Hugo Vincent,
Bluewater Systems.
Playing the role of the average dumb user, it seems to me that
en.wikipedia.org is one of the slower of the many websites I browse.
No matter what browser I use, it takes several seconds from the time I
click a link to the time the first bytes of the HTTP response start
flowing back to me.
Seems facebook is more zippy.
Maybe Mediawiki is not "optimized".
Hi all.
"Recent changes" shows bytes added/removed in green/red. But "View history"
only shows revision length in bytes, and "User contributions" shows no byte
counts at all.
I think it would be nice for both "View history"[1] and "User contributions" to
show bytes added/removed. This would make it easier to distinguish
small contributions from big ones: multiple-sentence additions versus
small typo fixes.
What do you think?
All the best,
-Jason
[1] You can already get bytes added/removed for history revisions using a
gadget. Just add the following line to your vector.js:
importScript('fr:MediaWiki:Gadget-HistoryNumDiff.js');
Hi!
I've read on the techblog that the new UI goes live in April. I have
some questions:
1) What version? Acai, babaco, citron?
2) How/where could a wiki customize the special character insert menu,
and the inserted strings? And the embed file (picture) button inserts
this: "[[Example.jpg]]", without any "File:" or "Image:"!
3) The search and replace button is available in Firefox, but does not
appear at all in Opera. Why?
4) Currently the new navigable TOC does not work in Firefox or Opera at
all (I've tried both).
Isn't this too early for live deployment?
Regards,
Akos Szabo (Glanthor Reviol)
If you install:
http://www.mediawiki.org/wiki/Extension:VariablesExtension#Installation
Then edit the main page to contain the following (between the '---'):
---
{{#vardefine:pi|3.14159265418}}
{{#expr:{{#var:pi}}+1}}
---
The main page, when rendered, should now show the number 4.14159265418
What I would like is something very similar called "CellsExtension"
which provides only the keyword "#cell" as in:
---
{{#expr:{{#cell:pi}}+1}}
---
However, it gets the value of "pi" from:
http://somedomain.org/mediawiki/index.php?title=Pi
Ideally, whenever a rendered MediaWiki page is cached, dependency
pointers would be created from all pages from which cells fetched values
during rendering (implying the evaluation of #expr's). That way, when
the MediaWiki source of one of those pages is edited, not only is its
own cached rendering deleted, but so are all cached renderings that
depend on it, directly or indirectly. The next time those pages are
accessed, they are rendered -- and cached -- again, freshly evaluating
the formulas in their #expr's (which, of course, will contain #cell
references such as {{#cell:pi}}).
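That dependency-tracking invalidation could be sketched in Python as a toy
model (purely illustrative; the class, page names, and structure here are
invented for the sketch and are not MediaWiki code):

```python
# Toy model of a render cache with dependency-based invalidation:
# when a page is edited, its cached rendering and every cached
# rendering that (transitively) read a cell from it are dropped.

class RenderCache:
    def __init__(self):
        self.rendered = {}      # page -> cached rendering
        self.dependents = {}    # page -> set of pages whose cache read from it

    def store(self, page, rendering, used_pages):
        """Cache a rendering and record which pages' cells it read."""
        self.rendered[page] = rendering
        for src in used_pages:
            self.dependents.setdefault(src, set()).add(page)

    def invalidate(self, page):
        """On edit: drop this page's cache and, recursively, all
        cached renderings that depend on it."""
        for dep in self.dependents.pop(page, set()):
            self.invalidate(dep)
        self.rendered.pop(page, None)

cache = RenderCache()
cache.store("Pi", "3.14159265418", used_pages=[])
cache.store("Area", "rendered from {{#cell:pi}}", used_pages=["Pi"])
cache.store("Report", "rendered from Area", used_pages=["Area"])
cache.invalidate("Pi")   # editing Pi also drops Area and Report
```

On the next access, the pages would be re-rendered and re-cached with
fresh cell values.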
I have been working on the ResourceLoader branch, where I've ended up
writing a CSSMin class which performs CSS minification, URI-remapping
and data-URI in-lining. It got me thinking that this class would be
pretty useful to non-MediaWiki projects too, but sadly we don't have a
history of sharing in this way...
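For concreteness, the data-URI in-lining step such a minifier performs can
be sketched roughly like this (an illustrative Python sketch, not the actual
CSSMin implementation, which is PHP; the in-memory file map stands in for
the filesystem):

```python
# Sketch of CSS data-URI in-lining: replace url(path) references with
# url(data:mime;base64,...) so small assets ship inside the stylesheet.
import base64
import re

def inline_data_uris(css, files):
    """files maps path -> (mime type, raw bytes)."""
    def repl(match):
        path = match.group(1)
        if path in files:
            mime, data = files[path]
            b64 = base64.b64encode(data).decode("ascii")
            return "url(data:%s;base64,%s)" % (mime, b64)
        return match.group(0)  # leave unknown paths untouched
    return re.sub(r"url\(([^)]+)\)", repl, css)

files = {"dot.gif": ("image/gif", b"GIF89a")}
print(inline_data_uris("a { background: url(dot.gif); }", files))
# prints: a { background: url(data:image/gif;base64,R0lGODlh); }
```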
* Software we've ported to PHP ourselves, like our native-PHP CDB
implementation or CSSJanus, is buried in our code-base and makes
use of a couple of trivial wf* global functions, making it
somewhat inaccessible to third-party users. That's unfortunate,
because third-party users are important! They use the code in
their own systems, make improvements, and potentially pass them
back to us. However, if we don't make these things more
general-purpose, the code will more likely be taken from our
repository, tweaked, and never passed back; and if we don't make
it more easily accessible, the code will never be found and we
won't be taking advantage of the entire PHP development
community. Sadness...
* Software we've borrowed from other projects, like JSMin, is also
buried within our MediaWiki-proprietary code, and while these
libraries can operate independently of MediaWiki, we need to make
it clear that they should be kept in sync with their original
sources, both upstream and downstream.
* Software we've created is often potentially useful to other
projects, but unfortunately tied to and buried within MediaWiki.
In some of these cases, the ties to MediaWiki are trivial and
could be either optional or removed entirely, and the component
could be factored out to a more general-purpose library, available
for re-use.
I don't have a very mature proposal for solving this completely, but as
a first step, it seems like we should have a libraries folder into which
we can move things that can function in a stand-alone manner. Initial
candidates appear to be libraries that already work that way, such as
JSMin.php, CSSJanus.php, and CSSMin.php (in the resourceloader branch
right now, but coming to trunk soon). Additional software could be moved
into this space after some un-tethering, such as Cdb/Cdb_PHP, DjVuImage,
etc.
Overall, I think it would be great if we could take a look at this and
other ways to better share our work with non-MediaWiki projects, and
give back to the open-source community.
I welcome your thoughts and input.
- Trevor
I would like to announce latextwowiki -- a LaTeX to MediaWiki
translator.
It uses tralics to transform LaTeX to XML. latextwowiki is written in
Python 3.1 and uses lxml. The code is under GPL v3 or later.
The pre-alpha version can be
downloaded from http://escher.fuw.edu.pl/git/latextwowiki.
Features:
* translates \ref{} to sections into proper mediawiki references in [[]]
* various complicated latex equations work correctly
* supports equation, figure, table references via Extension:CrossReference
* latex bibliography support via Extension:Cite.
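The overall approach -- walking the XML that tralics emits and producing
wiki markup -- can be sketched like this (a toy Python example using the
stdlib's ElementTree rather than lxml; the element names here are invented
for illustration and are not tralics's actual output schema):

```python
# Toy XML-to-wikitext walk: turn a tiny (made-up) XML document into
# MediaWiki markup, in the spirit of the tralics -> lxml pipeline.
import xml.etree.ElementTree as ET

def to_wiki(xml_text):
    root = ET.fromstring(xml_text)
    out = []
    for el in root:
        if el.tag == "section":
            out.append("== %s ==" % el.get("title"))   # wiki heading
        elif el.tag == "p":
            out.append(el.text or "")                  # plain paragraph
        elif el.tag == "ref":
            out.append("[[#%s]]" % el.get("target"))   # internal reference
    return "\n".join(out)

doc = '<doc><section title="Intro"/><p>See</p><ref target="Intro"/></doc>'
print(to_wiki(doc))
```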
Every comment, bug report, and hairy LaTeX test file will make me happy.
Asia Jędrzejewska-Szmek
Over the past couple of weeks, I've been working on getting the test
server at http://ci.tesla.usability.wikimedia.org/ up and running on a
regular basis.
To do this, I've had to scale back the static code analysis for now, so
the old style checks (which don't yet target the MW style, so may not
be too useful), as well as other, more complicated static code analysis,
may not be working when you look.
In the meantime, I've put the configuration of the server under version
control at
http://svn.wikimedia.org/svnroot/mediawiki/trunk/test-server/
Soon, I'll have the build server bootstrapping from the configuration in
SVN so that anyone can help maintain it if they have commit access to
that part of the tree.
Also, I've worked with MZMcBride so that the codurr bot in
#mediawiki announces whenever the tests (including the parser tests)
break, along with the possible culprits in terms of committers/commits.
Since not everyone's IRC nick is the same as their commit ID (for
example, “hexmode” is my IRC nick and “mah” is my commit ID), there is
an attempt to use the USERINFO files to translate from commit ID to IRC
nick. To take advantage of this, update your USERINFO file. See
http://svn.wikimedia.org/svnroot/mediawiki/USERINFO/mah for an example.
Finally, as a preview, this week I'm starting to integrate the Selenium
tests into the PHPUnit run.
Mark.
--
http://hexmode.com/
Embrace Ignorance. Just don't get too attached.