Hi everyone,
I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution, but nothing turned up.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
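Failing an existing tool, I may start from something like this rough
Python sketch (untested; the regex rules cover only trivial markup, and
anything real would need a proper wikitext parser):

import re
from urllib.parse import quote
from urllib.request import urlopen

BASE = "http://server.bluewatersys.com/w90n740/index.php"

def fetch_wikitext(title):
    # action=raw returns the page source as plain wikitext
    url = "%s?title=%s&action=raw" % (BASE, quote(title))
    with urlopen(url) as resp:
        return resp.read().decode("utf-8")

def wikitext_to_latex(text):
    # Longest markers first, so subsections don't match as sections
    text = re.sub(r"^=== *(.+?) *===$", r"\\subsection{\1}", text, flags=re.M)
    text = re.sub(r"^== *(.+?) *==$", r"\\section{\1}", text, flags=re.M)
    text = re.sub(r"'''(.+?)'''", r"\\textbf{\1}", text)  # bold before italic
    text = re.sub(r"''(.+?)''", r"\\emph{\1}", text)
    return text

print(wikitext_to_latex(fetch_wikitext("Main Page")))

Templates, tables and images would still need real handling, so pointers
to an existing converter are much preferred.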
Kind Regards,
Hugo Vincent,
Bluewater Systems.
Hi
I am planning to build an API sandbox for MediaWiki as part of GSoC 2011.
While going through the MediaWiki documentation, I found that many API
methods require POST rather than GET requests. I am planning a Flickr-like
API sandbox
<http://www.flickr.com/services/api/explore/?method=flickr.contacts.getList>
for MediaWiki.
In Flickr, the documentation of every method links to an API explorer
(sandbox) where the user can test different values for each parameter, and
the result is displayed in a div on the same page.
The MediaWiki sandbox will display the parameters available for a
particular method (as a drop-down where the values are known in advance).
The user can fill in the form with his own values. The sandbox will then
send a GET or POST request (an AJAX request, using jQuery or some other JS
library) as the method requires and display the result in a div on the
same page. The page will also display a URL for executing the same
request.
To make the sandbox really useful, it should ideally also generate PHP
code automatically (I don't know whether that is overambitious). For
example, for login and logout, the user could just supply a user ID and
password, and the PHP code he would need to write would be displayed
automatically.
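As a rough illustration of the two request styles the sandbox has to
generate (a sketch only, untested; the endpoint and credentials are
placeholders):

from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"  # example endpoint

# Read-only methods can be plain GET requests:
query = urlencode({"action": "query", "meta": "siteinfo", "format": "json"})
print(urlopen(API + "?" + query).read()[:300])

# State-changing methods such as login must be POSTed:
form = urlencode({"action": "login", "lgname": "Example",
                  "lgpassword": "secret", "format": "json"}).encode("ascii")
print(urlopen(API, data=form).read()[:300])

The sandbox would build exactly these requests from the form values and
show the equivalent URL (or POST body) to the user.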
I eagerly await your suggestions.
--
Salil
IRC : _Salil_
This is a fork of this thread:
http://www.gossamer-threads.com/lists/wiki/wikitech/228949?page=last
Is there a possibility that I could instead (easily) merge the handful
of individual wikis' content into one consolidated wiki and implement a
more enhanced search against it? I've no experience performing a
merge like this, so expert advice would be appreciated!
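In case it helps: my understanding (unverified) is that the usual route is
to export each source wiki as XML with maintenance/dumpBackup.php and pull
it into the target with maintenance/importDump.php, roughly like this
(paths are placeholders, and title collisions between the wikis would
still need manual attention):

import subprocess

SOURCE_WIKIS = ["/var/www/wiki1", "/var/www/wiki2"]  # assumed paths
TARGET_WIKI = "/var/www/consolidated"                # assumed path

for wiki in SOURCE_WIKIS:
    dump = wiki.rstrip("/").split("/")[-1] + ".xml"
    with open(dump, "wb") as out:
        # --full exports every revision, not just the latest
        subprocess.run(["php", wiki + "/maintenance/dumpBackup.php",
                        "--full"], stdout=out, check=True)
    subprocess.run(["php", TARGET_WIKI + "/maintenance/importDump.php",
                    dump], check=True)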
Thanks - Tod
Hi all!
Wikimedia Germany invites anyone interested in improving MediaWiki to come and
join us at our third developer meet-up. Like the last two years, it's going to be
awesome! Unlike the last two years, there will be more hacking and less talking
- it'll be a Hackathon, not a BarCamp.
We'll meet on May 13 to 15, in Berlin, on the 4th floor of the betahaus
coworking space <http://betahaus.de/>.
There will not be an entrance fee, but registration is mandatory and now open:
<http://de.amiando.com/hackathon2011>.
Registration will close on April 10. If you would like to attend, please
register in time!
More information can be found at
<http://www.mediawiki.org/wiki/Berlin_Hackathon_2011>.
The Berlin Hackathon 2011 is an opportunity for MediaWiki hackers to come
together, squash bugs and write crazy new features. Our main focus this time
around will probably be:
* Improving usability / accessibility
* Interactive Maps
* Fixing the parser
* WMF Ops (new data center, virtualization)
* Supporting the Wiki Loves Monuments image hunt
* Squashing bugs
If you have different ideas, please let us know:
<http://www.mediawiki.org/wiki/Berlin_Hackathon_2011#Topics>
The Hackathon will also be hosting the Language committee and the Wiki Loves
Monuments group. There is a limited number of seats reserved for these groups,
and if you belong to one of them, you should receive an invitation code soon.
If you have any doubts or questions, contact us at <hackathon(a)wikimedia.de>.
We’re excited to see you in Berlin, your Hackathon Team
Daniel Kinzler (Program Coordinator)
Nicole Ebber (Logistics)
Cornelius Kibelka (Assistant)
I think we should migrate MediaWiki to target HipHop [1] as its
primary high-performance platform. I think we should continue to
support Zend, for the benefit of small installations. But we should
additionally support HipHop, use it on Wikimedia, and optimise our
algorithms for it.
In cases where an algorithm optimised for HipHop would be excessively
slow when running under Zend, we can split the implementations by
subclassing.
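To illustrate the shape of that split (a Python analogy only; the class
names and the runtime check are invented, and the real code would of
course be PHP):

import platform

class Parser:
    @staticmethod
    def factory():
        # Stand-in for a Zend-vs-HipHop check in the real codebase
        if platform.python_implementation() == "PyPy":
            return CompiledOptimisedParser()
        return InterpreterFriendlyParser()

class InterpreterFriendlyParser(Parser):
    def parse(self, text):
        # Lean on built-in primitives the interpreter runs natively
        return text.split()

class CompiledOptimisedParser(Parser):
    def parse(self, text):
        # An explicit loop that a compiler can optimise well
        return [w for w in text.split(" ") if w]

print(Parser.factory().parse("hello hiphop world"))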
I was skeptical about HipHop at first, since the road is littered with
the bodies of dead PHP compilers. But it looks like Facebook is pretty
well committed to this one, and they have the resources to maintain
it. I waited and watched for a while, but I think the time has come to
make a decision on this.
Facebook now write their PHP code to target HipHop exclusively, so by
trying to write code that works on both platforms, we'll be in new
territory, to some degree. Maybe that's scary, but I think it can work.
Who's with me?
-- Tim Starling
[1] https://github.com/facebook/hiphop-php/wiki/
Hello!
What is the current consensus on HTML5?
Are we going to support it fully and use as many of its features as we
can, or are we going to keep using JavaScript alternatives?
I would assume we would fall back to JavaScript when the client does not
support HTML5, though that may mean writing things twice.
Also, if we are indeed going to use HTML5, are we going to use XHTML5?
I would like to work on integrating MediaWiki with HTML5 if at all
possible; I am just unsure which parts of the new standard we want to
use.
TIA - Joseph Roberts
In MediaWiki there are two preferences: "Disable AJAX suggestions" and
"Enable enhanced search suggestions (Vector skin only)".
They are problematic for several reasons:
1. Most average users don't know what AJAX is. It should simply be
called "search suggestions".
2. The first option says "Disable", but the second one says "Enable".
It's confusing - both should say "Enable". "Disable AJAX suggestions"
should become "Enable search suggestions", and the value of the
checkbox must be reversed for all users.
3. Better yet, the two options should be merged into one. "Disable
AJAX suggestions" doesn't seem to do anything useful in Vector, no
matter whether it's checked or not, and "Enable enhanced search
suggestions (Vector skin only)", as its name implies, is relevant only
to Vector. Separating them may have been useful in the early days of
Vector, but now, a year after the rollout, Vector is rather stable and
the separation no longer seems useful.
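For point 2, the migration itself is just a value flip. A rough sketch of
the idea (the preference keys here are illustrative, not the actual ones
MediaWiki uses):

def migrate_preferences(old_prefs):
    # Rename the inverted "disable" flag to an "enable" flag,
    # negating the stored value so users keep their effective setting.
    new_prefs = dict(old_prefs)
    disabled = new_prefs.pop("disablesuggest", False)
    new_prefs["enablesuggest"] = not disabled
    return new_prefs

assert migrate_preferences({"disablesuggest": True}) == {"enablesuggest": False}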
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
"We're living in pieces,
I want to live in peace." - T. Moore
On Mon, Mar 28, 2011 at 7:20 AM, Aryeh Gregor
<Simetrical+wikilist(a)gmail.com> wrote:
> If, as Tim says, Wikimedia developers were un-assigned from code
> review after the 1.17 deployment, *that* is the problem that needs to
> be fixed. We need a managerial decision that all relatively
> experienced developers employed by Wikimedia need to set aside their
> other work to do as much code review as necessary to keep current. If
> commits are not, as a general rule, consistently reviewed within two
> or three days, the system is broken. I don't know why this isn't
> clear to everyone yet.
Hi Aryeh,
You say that as though this were obvious and uncontroversial. We've been
dancing around this issue precisely because it is not.
Right now, we have a system whereby junior developers get to commit whatever
they want, whenever they want. Under the system you outline, the only
remedy we have for falling behind is to throw more senior developer
time at the problem, no matter how ill-advised or low-priority the
junior developers' changes are. Taken to an extreme, this means
that junior developers maintain complete control over the direction of
MediaWiki, with the senior developers there purely in a subservient role of
approving/rejecting code as it comes in.
What comes of this system should be obvious: senior developer burnout. If
the only reward we offer for becoming an experienced developer is less
interesting work with less power over day-to-day work, we're not going to
attract and retain people in senior positions.
To be clear, none of the developers in WMF's General Engineering group have
been pulled off of code review. However, not all of the WMF's senior staff
are part of GenEng.
Rob