Hi, I'd like to present a new RFC for your consideration:
https://www.mediawiki.org/wiki/Requests_for_comment/Minifier
It is about how we can shave 10-15% off the size of JavaScript
delivered to users.
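(For anyone who has not seen minification before, here is a
deliberately naive sketch in Python of the kind of transformation
involved -- stripping comments and collapsing whitespace. This is
purely illustrative, not the RFC's actual algorithm; a real minifier
tokenizes the source so it can handle string and regex literals, which
this toy would mangle.)

  import re

  def naive_minify(js):
      # Toy illustration only: strip /* */ and // comments, then
      # collapse indentation and runs of spaces. Breaks on string or
      # regex literals that contain comment markers.
      js = re.sub(r'/\*.*?\*/', '', js, flags=re.S)  # block comments
      js = re.sub(r'//[^\n]*', '', js)               # line comments
      js = re.sub(r'\n\s*', '\n', js)                # indentation, blank lines
      js = re.sub(r'[ \t]+', ' ', js)                # runs of spaces/tabs
      return js.strip()

  source = "/* setup */\nvar x = 'hello';  // say hi\nconsole.log( x );\n"
  print(naive_minify(source))  # prints the two statements, comments gone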
Your comments are highly welcome! :)
--
Best regards,
Max Semenik ([[User:MaxSem]])
Hi,
Thanks to the hard work of a lot of different people, PHP CodeSniffer is
now voting on all MediaWiki core patchsets! It checks for basic code
style issues automatically so new contributors (and experienced ones!)
can fix basic issues without a human needing to point them out.
I added some brief instructions to [1] on how to run it locally, or you
can read the Jenkins output.
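(Rough sketch of the local run, untested -- the canonical instructions
are at [1]. From a git checkout of core, assuming the ruleset ships
with the Composer dev dependencies, something like:

  composer install
  vendor/bin/phpcs -p -s

where -p prints progress and -s shows which sniff produced each
warning. If phpcs does not pick up a ruleset automatically, pass one
explicitly with --standard.)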
There are still a few code style rules that are disabled; [2] is
tracking the work to fix those issues.
Please file any bugs or feature requests in the MediaWiki-CodeSniffer[3]
project on Phabricator.
[1] https://www.mediawiki.org/wiki/Continuous_integration/PHP_CodeSniffer
[2] https://phabricator.wikimedia.org/T102609
[3] https://phabricator.wikimedia.org/tag/mediawiki-codesniffer/
-- Legoktm
Consensus was reached in the first discussion regarding the intro,
"Principles", and "Unacceptable behavior" sections. See
https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Dr…
.
However, while that was being discussed, several of us made changes to
these sections. In the future we need to prevent this, to keep the
discussion from becoming endless.
Thus, in the future I'll send out two separate announcements, one for
"last call to work on these sections" and one for "consensus discussion
for these sections".
However, this time, that wasn't done. So I want to give people a chance
to weigh in on whether we should accept the changes that were made
during the first discussion.
Thus, there is a new discussion at
https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Dr…
.
This will only last a week. I expect to close it October 6th.
Thanks,
Matt Flaschen
Jack Phoenix and I were considering doing a tech talk about how to make
a MediaWiki skin. Would there be any interest in this? What would you
folks want to see from such a talk?
-I
If you are planning on proposing a session for the Wikimedia Developer
Summit, remember that the deadline for new proposals is *October 2*, in a
couple of days.
For this milestone we are only requesting draft tasks created in
Phabricator and associated with #Wikimedia-Developer-Summit-2016. The
following deadlines are:
1. By *6 Nov 2015* all Summit proposals must have active discussions and
a Summit plan documented in the description. Proposals not reaching this
critical mass can continue on their own path outside the Summit.
2. By *4 Dec 2015* all the accepted proposals will be published in the
program. Strong candidates might be scheduled earlier.
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit_2016#Submissions_…
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello,
I was looking at some statistics on school students' (< 17 years)
participation from my state in open source programs like Google Code-in,
and it is ~0. The government here has initiated a project to distribute
Raspberry Pis to school students [1], and it would be great to have them
set up a MediaWiki development environment on the Pi so that they can
contribute.
The Pis have 1 GB of RAM, and I got a Docker container of Ubuntu (ARM)
running smoothly. There are a few blockers to installing MW-Vagrant or
the LXC container, which are:
1. <bd808> The puppet config for mw-vagrant needs a 64-bit Ubuntu 14.04
container to run inside
2. <bd808> mw-vagrant has a lot of bells and whistles that make it
really want a lot of ram and CPU
3. <bd808> hhvm is too ram hungry
and more. The other option would be to set up a LAMP stack, which would
need to be automated (scripts needed). I wanted to know if this porting
would be feasible and worth the development hours, and specifically,
whether someone is interested.
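To make the LAMP option more concrete, below is a rough, untested
sketch in Python of what such a provisioning script might automate.
The package names (apache2, mysql-server, php5, php5-mysql) assume a
Debian/Raspbian-style image with apt-get, and the tarball URL is
illustrative -- check mediawiki.org for the current release.

  #!/usr/bin/env python3
  import subprocess

  PACKAGES = ["apache2", "mysql-server", "php5", "php5-mysql", "php5-intl"]
  # Illustrative URL; substitute the current stable release.
  TARBALL = "https://releases.wikimedia.org/mediawiki/1.25/mediawiki-1.25.2.tar.gz"

  def run(cmd):
      # Echo each command before running it, and fail fast on errors.
      print("+", " ".join(cmd))
      subprocess.check_call(cmd)

  run(["apt-get", "update"])
  run(["apt-get", "install", "-y"] + PACKAGES)
  run(["wget", "-O", "/tmp/mediawiki.tar.gz", TARBALL])
  run(["tar", "-xzf", "/tmp/mediawiki.tar.gz",
       "-C", "/var/www/html", "--strip-components=1"])
  # Database creation and LocalSettings.php are then finished through
  # MediaWiki's web installer at http://<pi-address>/.

Run as root on the Pi; everything after that happens in the browser.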
[1]
http://gadgets.ndtv.com/others/news/kerala-launches-learn-to-code-pilot-wil…
Thanks,
Tony Thomas <http://blog.tttwrites.in/>
ThinkFOSS <http://www.thinkfoss.com>
*"where there is a wifi, there is a way"*
Hello,
I need some help. I have to classify the wikilinks in a Wikipedia article based on their relative position in the article (ideally on the rendered page). For each wikilink I would like to have something like its position in the text (ascending within each section), whether it is in an infobox, and whether it is in a navbox. I need this classification for a specific revision of every article in the English Wikipedia in the main (zero) namespace.

I tried to do it by parsing the wikitext, but there are some problems with expanding the templates. For example, if a template is embedded with parameters and/or with conditions, it is difficult to know what exactly is rendered. I tried some parsers from https://www.mediawiki.org/wiki/Alternative_parsers that claim to handle templates, but they did not work out, mainly due to the same problems I had when parsing the wikitext myself.

Now I am considering parsing the HTML of a Wikipedia article. I also tried the MediaWiki API (https://www.mediawiki.org/wiki/API:Parsing_wikitext) to retrieve the HTML of an article and parse it myself, but the API is very slow for previous revisions of an article, and it would take me forever. My question has two parts:
1. What is the fastest way to get the HTML of an article for a specific revision, or what is the best tool to set up a local copy of Wikipedia (currently I am experimenting with XOWA and WikiTaxi)?
2. Is somebody aware of an HTML Wikipedia parser that could provide e.g. the position of a link, or a classification of links regarding their position in the text (in each section), whether a link is in an infobox, and whether it is in a navbox?
If you think there is a better way to classify the links by position than parsing the HTML of an article, please let me know.
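For concreteness, the kind of pass I have in mind looks roughly like
this (a Python sketch using the requests and BeautifulSoup libraries).
The fetch step uses action=parse with oldid, which is the slow part I
mentioned; the classify step works on HTML from any source, e.g. a
local copy. Note that "infobox" and "navbox" are class names used by
English Wikipedia's templates, not anything guaranteed by MediaWiki,
so this is a heuristic:

  import requests
  from bs4 import BeautifulSoup

  API = "https://en.wikipedia.org/w/api.php"

  def fetch_html(revid):
      # Rendered HTML of one specific revision via action=parse.
      r = requests.get(API, params={"action": "parse", "oldid": revid,
                                    "prop": "text", "format": "json"})
      r.raise_for_status()
      return r.json()["parse"]["text"]["*"]

  def classify_links(html):
      # Yield (position, href, in_infobox, in_navbox) per wikilink.
      # Position here is global; restarting the counter at each h2
      # would give the per-section ordering.
      soup = BeautifulSoup(html, "html.parser")
      for pos, a in enumerate(soup.select('a[href^="/wiki/"]')):
          in_infobox = a.find_parent("table", class_="infobox") is not None
          in_navbox = a.find_parent(class_="navbox") is not None
          yield pos, a["href"], in_infobox, in_navbox

  for link in classify_links(fetch_html(12345)):  # hypothetical revid
      print(link)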
Cheers, Dimi
GESIS - Leibniz Institute for the Social Sciences
GESIS Cologne
da|ra - Registration Agency for Social and Economic Data
Unter Sachsenhausen 6-8
D- 50667 Cologne
Tel: +49 221 47694 512