Consensus was reached in the first discussion regarding the intro,
"Principles", and "Unacceptable behavior" sections. See
https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Dr…
.
However, while that was being discussed, several of us made changes to
these sections. In the future, we need to avoid this, so that discussions
don't drag on endlessly.
Thus, in the future I'll send out two separate announcements, one for
"last call to work on these sections" and one for "consensus discussion
for these sections".
However, this time, that wasn't done. So I want to give people a chance
to weigh in on whether we should accept the changes that were made
during the first discussion.
Thus, there is a new discussion at
https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Dr…
.
This discussion will last only a week; I expect to close it on October 6th.
Thanks,
Matt Flaschen
Jack Phoenix and I were considering doing a tech talk about how to make
a MediaWiki skin. Would there be any interest in this? What would you
folks want to see from such a talk?
-I
If you are planning on proposing a session for the Wikimedia Developer
Summit, remember that the deadline for new proposals is *October 2*, in a
couple of days.
For this milestone we are only requesting draft tasks created in
Phabricator and associated with #Wikimedia-Developer-Summit-2016. The
following deadlines apply:
1. By *6 Nov 2015* all Summit proposals must have active discussions and
a Summit plan documented in the description. Proposals not reaching this
critical mass can continue on their own path outside the Summit.
2. By *4 Dec 2015* all the accepted proposals will be published in the
program. Strong candidates might be scheduled earlier.
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit_2016#Submissions_…
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello,
I was looking at statistics on the participation of school students
(< 17 years) from my state in open-source programs like Google Code-in,
and it is ~0. The government here has initiated a project to distribute
Raspberry Pis to school students[1], and it would be great to have them
set up a MediaWiki development environment on the Pi so that they can
contribute.
The Pis have 1 GB of RAM, and I got a Docker container of Ubuntu (ARM)
running smoothly. There are a few blockers to installing MW-Vagrant or
the LXC container:
1. <bd808> The puppet config for mw-vagrant needs a 64-bit Ubuntu 14.04
container to run inside
2. <bd808> mw-vagrant has a lot of bells and whistles that make it
really want a lot of ram and CPU
3. <bd808> hhvm is too ram hungry
and a lot more. The other option would be to set up a LAMP stack, which
would need to be automated (scripts needed). I wanted to know if this
porting would be feasible and worth the development hours, and
specifically, whether someone is interested.
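To make that concrete, here is a rough, untested Python sketch of what
such an automation script could look like. The package names (php5 and
friends) and the tarball URL are assumptions that would need checking
against an actual Raspbian image:

    #!/usr/bin/env python3
    # Rough outline: install a LAMP stack and unpack MediaWiki on a
    # Debian-based Pi image. Package names and the tarball URL are
    # assumptions; run as root (or via sudo).
    import subprocess
    import tarfile
    import urllib.request

    MW_TARBALL = ("https://releases.wikimedia.org/mediawiki/"
                  "1.25/mediawiki-1.25.2.tar.gz")  # assumed release
    WEB_ROOT = "/var/www/html"

    def run(cmd):
        # Run a command, raising if it fails.
        subprocess.check_call(cmd)

    def install_lamp():
        # Apache, MySQL, and PHP from the distro repositories.
        run(["apt-get", "update"])
        run(["apt-get", "install", "-y", "apache2", "mysql-server",
             "php5", "php5-mysql", "libapache2-mod-php5"])

    def install_mediawiki():
        # Download and unpack the MediaWiki tarball into the web root.
        path, _ = urllib.request.urlretrieve(MW_TARBALL)
        with tarfile.open(path) as tar:
            tar.extractall(WEB_ROOT)

    if __name__ == "__main__":
        install_lamp()
        install_mediawiki()
        print("Finish setup in the browser at http://<pi>/mediawiki-1.25.2/")

From there, MediaWiki's normal web installer can generate
LocalSettings.php, so the script only has to get the stack in place.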
[1]
http://gadgets.ndtv.com/others/news/kerala-launches-learn-to-code-pilot-wil…
Thanks,
Tony Thomas <http://blog.tttwrites.in/>
ThinkFOSS <http://www.thinkfoss.com>
*"where there is a wifi, there is a way"*
Hello,
I need some help. I have to classify the wikilinks in a Wikipedia article based on their relative position in the article (ideally on the rendered page). For each wikilink I would like to have something like its position in the text (ascending within each section), whether it is in an infobox, and whether it is in a navbox. I need this classification for a specific revision of every article in the English Wikipedia in the main (zero) namespace.
I tried to do this by parsing the wikitext, but there are some problems with expanding the templates. For example, if a template is embedded with parameters and/or conditions, it is difficult to know what exactly is rendered. I tried some parsers from https://www.mediawiki.org/wiki/Alternative_parsers that claim to handle templates, but they did not work out, mainly due to the same problems I had parsing the wikitext myself.
Now I am considering parsing the HTML of a Wikipedia article. I also tried the MediaWiki API (https://www.mediawiki.org/wiki/API:Parsing_wikitext) to retrieve the HTML of an article and parse it myself, but the API is very slow for previous revisions of an article, and it would take me forever. My question has two parts:
1. What is the fastest way to get the HTML of an article for a specific revision, or what is the best tool to set up a local copy of Wikipedia (currently I am experimenting with XOWA and WikiTaxi)?
2. Is somebody aware of an HTML Wikipedia parser that could provide e.g. the position of a link, or a classification of links regarding their position in the text (within each section), whether a link is in an infobox, and whether it is in a navbox?
If you think there is a better way to get a classification of the links by position than parsing the HTML of an article, please let me know.
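To make part 2 concrete, here is a minimal Python sketch of the kind of
classification I mean, assuming the requests and beautifulsoup4 packages
and that infoboxes and navboxes carry the conventional "infobox"/"navbox"
CSS classes on the English Wikipedia (which may not hold for every
template):

    # Minimal sketch: fetch the rendered HTML of one revision through
    # the action=parse API and classify each wikilink by document
    # position and by whether an ancestor carries the "infobox" or
    # "navbox" class. These class names are an assumption and may miss
    # some templates.
    import requests
    from bs4 import BeautifulSoup

    API = "https://en.wikipedia.org/w/api.php"

    def revision_html(rev_id):
        # action=parse with oldid returns the HTML of that revision.
        r = requests.get(API, params={
            "action": "parse",
            "oldid": rev_id,
            "prop": "text",
            "format": "json",
        })
        return r.json()["parse"]["text"]["*"]

    def classify_links(html):
        soup = BeautifulSoup(html, "html.parser")
        result = []
        for pos, a in enumerate(soup.select('a[href^="/wiki/"]')):
            result.append({
                "position": pos,  # document order on the rendered page
                "target": a["href"],
                "infobox": a.find_parent(class_="infobox") is not None,
                "navbox": a.find_parent(class_="navbox") is not None,
            })
        return result

    # Example with a placeholder revision ID:
    for link in classify_links(revision_html(123456789))[:10]:
        print(link)

Doing this for every article over the API would of course be far too
slow, which is why part 1 matters: ideally the same classify_links()
would run over HTML produced from a local dump or a local MediaWiki
instance instead.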
Cheers Dimi
GESIS - Leibniz Institute for the Social Sciences
GESIS Cologne
da|ra - Registration Agency for Social and Economic Data
Unter Sachsenhausen 6-8
D- 50667 Cologne
Tel: +49 221 47694 512
Hi everyone,
We have our regularly scheduled RfC review meeting on IRC coming up
tomorrow. At this meeting, we plan to discuss the following RfCs:
* T112553: Integrate the Virtual Rest Service (VRS) into core
* T90914: Provide semantic wiki-configurable styles for media display
We anticipate the first one will be relatively short; we may then run
out of time before feeling "done" with the second one.
Phab event link #1: https://phabricator.wikimedia.org/E68
We're also hoping to pull together an extra discussion later in the
day to discuss "T30085: Allow user login with email address in
addition to username". That depends on devunt's availability, though,
so that may be postponed.
Phab event link #2: https://phabricator.wikimedia.org/E74
Hope to see y'all there!
Rob
Good afternoon,
Today I went ahead and branched REL1_26 for MediaWiki, all extensions
and skins, and the vendor repository. They were not branched from the
current tip of master, but rather from a point about a week back.
The starting point for the core branch was 0652366. It had a timestamp
of Tue 22 Sep 2015 18:24:35 UTC. All extensions and skins were branched
as of this timestamp. Vendor was branched at the same point as well.
Primary development of 1.26 has come to a close; master is now on the
1.27 alpha cycle. It's time to wrap up anything that needs to land in the
release branch, backport as necessary, and tidy up the loose ends
before we release. A reminder that our release date is currently scheduled
for the 25th of November. So test your extensions. Fix a blocker[0].
The installer always needs a good workout before we release, too.
If you need help backporting things, have a question about whether a
particular change can/should be backported, or anything else about this
process, please ask.
-Chad
[0] https://phabricator.wikimedia.org/maniphest/query/jcSXdUecbcLp/#R