Hi everyone,
I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution, but nothing suitable turned up.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
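As a rough starting point, here is a minimal Python sketch of the kind of conversion involved. It handles only a few common wikitext constructs (headings, bold, italics, internal links); a real converter would need a proper wikitext parser, and these regexes are illustrative only:

```python
import re

def wikitext_to_latex(text):
    """Very rough wikitext -> LaTeX conversion; covers only a few constructs."""
    # Headings: === T === before == T == so the shorter pattern doesn't
    # swallow the longer one.
    text = re.sub(r"^===\s*(.*?)\s*===\s*$", r"\\subsection{\1}", text, flags=re.M)
    text = re.sub(r"^==\s*(.*?)\s*==\s*$", r"\\section{\1}", text, flags=re.M)
    # Bold '''x''' before italics ''x'' for the same reason.
    text = re.sub(r"'''(.*?)'''", r"\\textbf{\1}", text)
    text = re.sub(r"''(.*?)''", r"\\textit{\1}", text)
    # Internal links [[Page|label]] / [[Page]] -> keep the visible label.
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]+)\]\]", r"\1", text)
    return text

print(wikitext_to_latex("== Intro ==\nThis is '''bold''' and [[Main Page|a link]]."))
```

Fetching the wikitext itself can be done through the standard MediaWiki action API before feeding pages into something like this.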
Kind Regards,
Hugo Vincent,
Bluewater Systems.
This is a question about an infrastructural detail of ResourceLoader and how it interacts with Internet Explorer. (It's my first post to wikitech-l, so apologies if it's the wrong forum.)
Our MediaWiki 1.17.0 site recently installed a bunch of extensions that use ResourceLoader, such as Extension:WikiEditor. To our surprise, some of our site's unrelated CSS styles stopped working. This was happening only in Internet Explorer. After some detective work, we discovered the problem is Internet Explorer's limit of 31 stylesheets:
http://support.microsoft.com/kb/262161
With so many extensions calling $wgOut->addModule [PHP] and mw.loader.load [JavaScript], the limit of 31 stylesheets is exceeded quickly. I removed a few mw.loader.load calls - it didn't matter which ones - and the problem went away.
Obviously this is an IE problem, not MediaWiki's, but it's going to cause issues on MediaWiki sites. WikiEditor itself loads about 10 stylesheets, for example, taking the site ~30% of the way toward a CSS failure.
So my questions are:
1. Is there a workaround for sites like mine, with many stylesheets from separate extensions loaded by ResourceLoader?
2. Should ResourceLoader address this IE problem? Maybe start combining stylesheets (with @import) automatically?
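To illustrate option 2, here is a rough server-side sketch (not tied to ResourceLoader internals, names are mine) that packs many per-module stylesheets into a handful of combined files, so the page stays well under the 31-link limit:

```python
def bundle_stylesheets(sheets, max_links=31):
    """Group many CSS payloads into at most `max_links` combined sheets.

    `sheets` is a list of CSS strings (one per module); returns a list of
    concatenated strings, each intended to be served as a single <link>.
    """
    if not sheets:
        return []
    per_bundle = -(-len(sheets) // max_links)  # ceiling division
    return ["\n".join(sheets[i:i + per_bundle])
            for i in range(0, len(sheets), per_bundle)]

bundles = bundle_stylesheets([".m%d {color: red}" % i for i in range(100)])
# 100 module sheets packed 4-per-bundle -> 25 <link> tags instead of 100
```

Whether this should happen via concatenation or @import is a separate question, since IE counts @import-ed sheets toward the same limit.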
Thanks,
DanB
I've been informally mentoring André, Tiago, Diego, and César. They
are four students at Minho University who are currently working on a
project to improve DB2 database support in MediaWiki.
So far, they've:
- Fixed several outstanding issues with DB2 support involving
character encoding, Windows vs Linux, etc
- Added DB2 support to the new MediaWiki 1.17 Installer and Updater
- Put in the appropriate Updater sql patches to reflect database
schema changes since 1.14
MediaWiki already had some DB2 support, but it had been broken since
1.15 and was never complete. As a result of their work, it's now possible
to successfully install MediaWiki on DB2 out of the box and to use the
core wiki features.
I'll shortly commit their first patch using my SVN account (leonsp).
I've taken some care to look over the code and make sure it abides by
the MediaWiki code guidelines.
Regards,
Leons Petrazickis
http://lpetr.org/blog/
I have been in discussion with two institutions (the University of British
Columbia and a military hospital in the USA) that are interested in
collaborating with Wikipedia. One of the proposed ideas is to create a
program that would simultaneously upload images deemed suitable to both
Wikimedia Commons and the other Wiki in question. Are there any programmers
interested in taking on this project?
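For what it's worth, a rough sketch of what the dual-upload step could look like against the standard MediaWiki action API (action=upload). The partner endpoint, filename, and token here are placeholders, and login/token handling is left out:

```python
def build_upload_request(api_url, filename, token,
                         comment="Transferred via dual-upload tool"):
    """Return (url, form_fields) for a MediaWiki action=upload API call.

    The caller is expected to POST these fields plus the file contents
    (multipart) using a logged-in session; `token` is the CSRF edit token.
    """
    return api_url, {
        "action": "upload",
        "filename": filename,
        "comment": comment,
        "token": token,
        "format": "json",
    }

# One file, two targets: Wikimedia Commons plus the partner wiki.
targets = [
    "https://commons.wikimedia.org/w/api.php",
    "https://wiki.example.org/w/api.php",  # hypothetical partner endpoint
]
uploads = [build_upload_request(t, "Xray_example.jpg", "csrf-token-placeholder")
           for t in targets]
```

The real work would be in account handling, licensing checks, and deciding which images are "deemed suitable" for both sites.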
--
James Heilman
MD, CCFP-EM, Wikipedian
Bug 24207 requests switching the math rendering preference default from its
current setting (which usually produces a nice PNG and occasionally produces
some kinda ugly HTML) to the "always render PNG" setting.
I'd actually propose dropping the rendering options entirely...
* "HTML if simple" and "if possible" produce *horrible* ugly output that
nobody likes, so people use hacks to force PNG rendering. Why not just
render to PNG?
* "MathML" mode is even *MORE* limited than "HTML if simple", making it
entirely useless.
* nobody even knows what "Recommended for modern browsers" means, but it
seems to be somewhere in that "occasionally crappy HTML, usually PNG"
continuum.
So we're left with only two sane choices:
* Always render PNG
* Leave it as TeX (for text browsers)
Text browsers will show the alt text on the images, which is... the TeX
code. So even this isn't actually needed for its stated purpose. (Hi
Jidanni! :) lynx should show the TeX source when using the PNG mode.)
It's conceivable that a few folks really honestly prefer to see the latex
source in their graphical browsers (should at least do a quick stat check to
see if anybody uses it on purpose), but I wouldn't mind removing that
either.
Fancier rendering like MathJax etc should be considered as a separate thing
(and implemented a bit differently to avoid parser cache fragmentation!), so
don't let future mode concerns worry y'all. Any thoughts on whether this
makes sense to do for 1.18 or 1.19?
https://bugzilla.wikimedia.org/show_bug.cgi?id=24207#c9
-- brion
Hi!
I am starting this thread because Brion's revision r94541 [1] reverted
r94289 [0], stating "core schema change with no discussion".
Bugs 21860 [2] and 25312 [3] advocate for the inclusion of a hash
column (either md5 or sha1) in the revision table. The primary use
case of this column will be to assist detecting reverts. I don't think
that data integrity is the primary reason for adding this column. The
huge advantage of having such a column is that it will no longer be
necessary to analyze full dumps to detect reverts; instead, you can
look for reverts in the stub dump file by looking for identical hashes
within a single page. The fact that there is a theoretical chance of a
collision is not very important IMHO; it would just mean that in very
rare cases in our research we would flag an edit as reverted while
it's not. The two bug reports contain quite long discussions, and this
feature has also been discussed internally quite extensively, but oddly
enough the discussion hasn't happened yet on the mailing list.
So let's have a discussion!
[0] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/94289
[1] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/94541
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=21860
[3] https://bugzilla.wikimedia.org/show_bug.cgi?id=25312
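To make the stub-dump use case concrete: with a hash column, a revert shows up as a repeated hash within one page's history. A rough sketch of the detection logic (illustrative names, not actual dump-processing code):

```python
import hashlib

def flag_reverts(revisions):
    """Given one page's revisions in chronological order as (rev_id, text)
    pairs, return (rev_id, earlier_rev_id) pairs where the content hash
    matches an earlier revision, i.e. candidate reverts to a prior state."""
    seen = {}      # hash -> first rev_id with that content
    reverts = []
    for rev_id, text in revisions:
        h = hashlib.sha1(text.encode("utf-8")).hexdigest()
        if h in seen:
            reverts.append((rev_id, seen[h]))
        else:
            seen[h] = rev_id
    return reverts

history = [(1, "good"), (2, "vandalism"), (3, "good")]
print(flag_reverts(history))  # [(3, 1)]
```

With the column in the revision table, the hashes come straight from the stub dump and the `text` step disappears entirely.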
Best,
Diederik
I think we finally have a complete copy from December 2007 through
August 2011 of the pageview stats scrounged from various sources, now
available on our dumps server.
See http://dumps.wikimedia.org/other/pagecounts-raw/
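For anyone processing these files, a small sketch of a line parser, assuming the usual per-line format of these dumps ("project pagetitle count bytes", space-separated):

```python
def parse_pagecounts(lines):
    """Parse pagecounts-raw lines of the form 'project title count bytes'
    (assumed format; titles are underscore-/percent-encoded). Malformed
    lines are skipped."""
    for line in lines:
        parts = line.rstrip("\n").split(" ")
        if len(parts) != 4:
            continue
        project, title, count, size = parts
        yield project, title, int(count), int(size)

sample = ["en Main_Page 42 123456", "en.d lagrum 5 7000"]
print(list(parse_pagecounts(sample)))
```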
Ariel
Hi,
I think developer accounts on the Wikimedia SVN repository should be
easier to get. I say this because a consultant of ours at WikiWorks,
Ike Hecht, asked for a developer account last week and was rejected.
He recently created his first major MediaWiki extension, Ad Manager,
which I added to the repository a few weeks ago - you can see it here:
http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/AdManager/
When he requested access, this was the relevant part of the response
from Sumana:
"Right now, we are not approving your request for commit access. I'm
sorry. We'd like for you to get more practice writing code for
MediaWiki, submit patches for review via Bugzilla attachments, and ask
us for comments... Please come back and request access again in a few
months."
I don't know whether this is WMF policy now, or a personal decision
from Sumana, or a decision made by someone else, but in any case I
don't understand it. It seems to me that there are two valid reasons
for not simply allowing everyone to get a developer account: the
first, and major, reason is to prevent malicious users from
vandalizing or deleting code. The second is to prevent
well-intentioned but incompetent developers from checking in buggy
and/or badly-written code that requires lots of fixes and review time
by the reviewers. In both cases, the person's presence in SVN would
cause more harm than good.
Neither of those cases apply here - the Ad Manager code was
well-written, and it works. If you're curious, you can see for
yourself the kinds of fixes and changes that were made to the code
after it was checked in - all minor stuff, the only major thing being
that the extension originally included support for MediaWiki 1.15,
which people thought was unnecessary. Clearly a higher bar is being
applied here than what's spelled out in the mediawiki.org
documentation - which only says that "we don't have time to train
programmers from scratch":
http://www.mediawiki.org/wiki/Commit_access#Prerequisites
Note, by the way, that if there's a more stringent policy in place
now, it's not being applied consistently, because the students in this
year's Google Summer of Code got developer access after much less
proof of programming ability.
It seems to me that if someone writes an extension that basically
matches the MediaWiki guidelines, works, and does something useful,
they should pretty much be granted automatic access to an account,
because they will have proved that their presence will be a net
positive overall. Any thoughts on this?
And out of curiosity - is there a new policy in place?
-Yaron
--
WikiWorks · MediaWiki Consulting · http://wikiworks.com
In the search box, I get suggestions on the fly as I type, and I'm
often impressed by the good suggestions. However, right now at Wiktionary
I get suggestions that aren't the best ones for the given prefix.
For example, at en.wiktionary.org if I type "lagru" it doesn't
suggest "lagrum", but instead a bunch of inflected and derived
forms:
lagrumshänvisning
lagrums
lagrumshänvisnings
lagrummets
lagrummen
lagrummet
lagrumshänvisningars
lagrumshänvisningar
lagrumshänvisningarnas
lagrumshänvisningarna
Since these are Swedish entries in the English Wiktionary,
none of these pages get much traffic. Are the completion
suggestions based on traffic stats? In this case, link
count might be a better predictor for best suggestion,
since all derived forms link back to the basic form.
Not much traffic: 5 page views in 30 days,
http://stats.grok.se/en.d/latest/lagrum
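To make the idea concrete, a toy sketch of prefix completion ranked by incoming-link count rather than traffic (this is not how the actual suggester is implemented; names and data are illustrative):

```python
def suggest(prefix, titles, link_counts, limit=10):
    """Rank completion candidates for `prefix` by incoming-link count
    (link_counts: title -> number of pages linking to it), breaking
    ties alphabetically."""
    matches = [t for t in titles if t.startswith(prefix)]
    return sorted(matches, key=lambda t: (-link_counts.get(t, 0), t))[:limit]

titles = ["lagrum", "lagrummet", "lagrumshänvisning"]
links = {"lagrum": 12, "lagrummet": 1, "lagrumshänvisning": 0}
print(suggest("lagru", titles, links))  # ['lagrum', 'lagrummet', 'lagrumshänvisning']
```

Since every inflected form links back to the basic form, the basic form naturally floats to the top under this metric.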
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se