Hello,
An hour or so ago, I changed the way we run PHPUnit tests for
extensions.
* we now bring in the mediawiki/vendor repository
* mediawiki/core is now using the branch of the proposed patchset. So if
you propose a patch for your extension against REL1_23, it will be
tested with core@REL1_23. Previously we always used master.
The job names have been changed:
- 'mwext-{extension}-testextensions-master'
+ 'mwext-{extension}-testextension'
Some jobs are still updating as I write this; that should be completed
soon. If I see any failures in the meantime, I will retrigger the jobs.
Something I have yet to figure out is how to have patches voted +2
tested sequentially, to ensure the queue of proposed changes works well
together. That should be dealt with this evening or, at worst, tomorrow.
If anything is suspicious or failing, please file bugs against
Wikimedia > "Continuous integration".
More details / documentation will follow as time allows (see bug 1).
--
Antoine "hashar" Musso
The installation fails with an ORA-00947 ("not enough values") error on
an INSERT statement. In the /maintenance/oracle/tables.sql code, I see
this:
CREATE SEQUENCE page_page_id_seq;
CREATE TABLE &mw_prefix.page (
  page_id NUMBER NOT NULL,
  page_namespace NUMBER DEFAULT 0 NOT NULL,
  page_title VARCHAR2(255) NOT NULL,
  page_restrictions VARCHAR2(255),
  page_counter NUMBER DEFAULT 0 NOT NULL,
  page_is_redirect CHAR(1) DEFAULT '0' NOT NULL,
  page_is_new CHAR(1) DEFAULT '0' NOT NULL,
  page_random NUMBER(15,14) NOT NULL,
  page_touched TIMESTAMP(6) WITH TIME ZONE,
  page_links_updated TIMESTAMP(6) WITH TIME ZONE,
  page_latest NUMBER DEFAULT 0 NOT NULL, -- FK?
  page_len NUMBER DEFAULT 0 NOT NULL,
  page_content_model VARCHAR2(32),
  page_lang VARCHAR2(35) DEFAULT NULL
);
ALTER TABLE &mw_prefix.page ADD CONSTRAINT &mw_prefix.page_pk PRIMARY KEY (page_id);
CREATE UNIQUE INDEX &mw_prefix.page_u01 ON &mw_prefix.page (page_namespace, page_title);
CREATE INDEX &mw_prefix.page_i01 ON &mw_prefix.page (page_random);
CREATE INDEX &mw_prefix.page_i02 ON &mw_prefix.page (page_len);
CREATE INDEX &mw_prefix.page_i03 ON &mw_prefix.page (page_is_redirect, page_namespace, page_len);
-- Create a dummy page to satisfy fk constraints, especially with revisions
INSERT INTO &mw_prefix.page
VALUES (0, 0, ' ', NULL, 0, 0, 0, 0, current_timestamp, NULL, 0, 0, NULL);
I see 14 columns defined, but only 13 values being passed. How can
this ever work?
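For what it's worth, here is a minimal sketch of what I'd expect the fix
to look like, assuming a NULL page_lang is acceptable for the dummy row
(naming the columns explicitly would also guard against this kind of
mismatch):

-- Hypothetical fix: supply all 14 values, NULL for the new page_lang column
INSERT INTO &mw_prefix.page
VALUES (0, 0, ' ', NULL, 0, 0, 0, 0, current_timestamp, NULL, 0, 0, NULL, NULL);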
I've already opened bug https://bugzilla.wikimedia.org/show_bug.cgi?id=71022
I'm not an Oracle person, so I'm not sure if this is normal or not.
Thanks
Bill
Dear wikimedians,
The Free and Open Source Software Outreach Program for Women offers paid
internships to developers and other technical profiles working on projects
together with free software organizations. Wikimedia is participating
again, and we welcome candidates.
https://www.mediawiki.org/wiki/FOSS_Outreach_Program_for_Women/Round_9
This call is open to Wikimedia volunteers (editors, developers...) and also
to people who would be contributing to our projects for the first time. In
past editions we have seen that candidates who come through a direct
recommendation have a good chance of success. We also know that many
good potential candidates are reluctant to step forward, but they will if
someone (like you) encourages them to apply, or to contact us with any
questions.
You can make a difference. If you know women with a software development
or open source background or interest, and full-time availability between
December and March, please forward them this invitation. Thank you!
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello,
For various custom extensions, I've been doing something like this with
GET:
$.get( mw.util.wikiScript(), {
    format: 'json',
    action: 'ajax',
    rs: 'MyExt::MyFunction',
    rsargs: [ param1, param2 ]
}, function ( data ) {
    // console.log( data );
} );
MyExt::MyFunction is in $wgAjaxExportList[].
However, since I now have too much data for a URL (HTTP 414 error), I'd
prefer to use POST. So far I have not found a proper way to build the
request.
Any idea or advice?
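The closest I've come is simply switching $.get() to $.post(), which takes
the same arguments, so the parameters move into the request body and the
URL length limit no longer applies. I'm not sure, though, whether the
action=ajax dispatcher accepts POST for this handler:

$.post( mw.util.wikiScript(), {
    format: 'json',
    action: 'ajax',
    rs: 'MyExt::MyFunction',
    rsargs: [ param1, param2 ]
}, function ( data ) {
    // Same response handling as with the GET version.
} );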
Thanks!
--
Toni Hermoso Pulido
http://www.cau.cat
http://www.similis.cc
Dear all,
We're building a Firefox addon to perceptually match images in Commons
against images found elsewhere, so that people can see that they come
from Commons even if they appear on other web sites.
https://moqups.com/jonaso/lopej41Z has a quick mockup.
On https://commons.wikimedia.org/wiki/Commons:Bots/Requests/CommonsHasher
we've requested the apihighlimits right (after discussion on commons-l
starting here: https://lists.wikimedia.org/pipermail/commons-l/2014-September/007325.html)
in order to be able to retrieve more than 50 records at once from the
API.
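For context, the kind of batched request this concerns looks roughly like
the sketch below (the titles are hypothetical; as I understand the limits,
apihighlimits raises the per-request title cap from 50 to 500):

// Fetch SHA-1 hashes for a batch of Commons files in a single request.
var titles = [ 'File:Example_one.jpg', 'File:Example_two.jpg' ];
$.getJSON( 'https://commons.wikimedia.org/w/api.php', {
    format: 'json',
    action: 'query',
    prop: 'imageinfo',
    iiprop: 'sha1',
    titles: titles.join( '|' )
}, function ( data ) {
    // data.query.pages maps page IDs to their imageinfo records.
} );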
According to EugeneZelenko, who tried to grant this right, it could not
be granted through the normal interface. The question, then: is
apihighlimits included in the bot flag, or how else can the apihighlimits
right be granted?
Sincerely,
--
Jonas Öberg, Founder & Shuttleworth Foundation Fellow
Commons Machinery | jonas(a)commonsmachinery.se
E-mail is the fastest way to my attention
Forwarding. Please take this opportunity to ask questions!
Pine
---------- Forwarded message ----------
From: "Siko Bouterse" <sbouterse(a)wikimedia.org>
Date: Sep 18, 2014 3:30 PM
Subject: [Wikimedia-l] Upcoming help sessions for drafting Individual
Engagement Grant proposals
To: <wikimedia-l(a)lists.wikimedia.org>
Cc:
Hi all,
As we announced earlier this month[1], September is the month to apply for
an Individual Engagement Grant[2]. The deadline to submit a proposal is
September 30.
To help you turn your ideas and projects into successful proposals, we’re
hosting a few IdeaLab Proposal Clinics in Hangouts and IRC this month.[3]
The first one took place on September 16. Newcomers had a chance to talk to
current and past grantees, as well as WMF Grantmaking staff, to workshop
ideas and strengthen their proposals.
If you have an idea you would like to submit, but feel unsure about how to
draft or finalize your proposal, join a session! There are three events
left before the deadline:
* IRC office hours in #wikimedia-office - Sept 23, 1600 UTC (Tuesday)
* Hangout - Sept 25, 1700 UTC (Thursday) [4]
* Hangout - Sept 28, 1700 UTC (Sunday) [5]
Join us next week to discuss your ideas, and bring any questions you have
about IEG!
Cheers,
Siko
[1]
https://lists.wikimedia.org/pipermail/wikimedia-l/2014-September/074239.html
[2] https://meta.wikimedia.org/wiki/Grants:IEG
[3] https://meta.wikimedia.org/wiki/Grants:IdeaLab/Events#Upcoming_events
[4] IdeaLab Clinic II. Join via Hangout:
https://plus.google.com/events/cvk8hivoih04ifc6pp3sl0se6s8?hl
[5] IdeaLab Clinic III. Join via Hangout:
https://plus.google.com/events/c82527nlv8jhkv17j963gs9gp6s?hl
--
Siko Bouterse
Head of Individual Grants
Wikimedia Foundation, Inc.
sbouterse(a)wikimedia.org
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Donate <https://donate.wikimedia.org> or click the
"edit" button today, and help us make it a reality!
Every once in a while I come across an idea for a service which would, for
one reason or another, need to know what releases of MediaWiki exist and
which ones are obsolete.
As far as I know, we don't have any sort of API or machine readable
metadata at all declaring this information for services to use.
I'd like to see a change to the release workflow made that would support
this use case.
Some time ago I thought we should do this with an API, but now I think
we can probably cover it by publishing a simple JSON file at one of
these URLs:
https://www.mediawiki.org/releases.json
https://releases.wikimedia.org/mediawiki/releases.json
The file will need to be updated every time a new release is made or an
old one becomes obsolete.
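As a rough illustration, a consumer might use it like the sketch below
(the field names are just a strawman, not a proposed schema):

// Strawman shape for releases.json (illustrative only):
// { "releases": [
//     { "version": "1.23.4", "branch": "REL1_23", "status": "supported" },
//     { "version": "1.19.19", "branch": "REL1_19", "status": "obsolete" }
// ] }
$.getJSON( 'https://www.mediawiki.org/releases.json', function ( data ) {
    var supported = data.releases.filter( function ( release ) {
        return release.status === 'supported';
    } );
    // A service could now flag wikis running any release not listed
    // as supported.
} );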
What does everyone, and the release managers in particular, think?
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
In September 2013, HTTPS was enabled by default for logged-in users. To my
knowledge, it is planned to also enable it by default for logged-out users
in October 2014. I have asked some questions here; please provide the
missing information:
https://meta.wikimedia.org/wiki/Talk:HTTPS#Preference
Hi,
There are currently three pending patches to change category collation
configuration for various Wikimedia wikis:
https://gerrit.wikimedia.org/r/140580
https://gerrit.wikimedia.org/r/147922
https://gerrit.wikimedia.org/r/154213
(This generally makes it so that accented letters like 'é' or 'ą' are not
sorted at the end of lists, after 'z', and fixes the ordering for languages
with interesting alphabets; see
https://www.mediawiki.org/wiki/Manual:$wgCategoryCollation.)
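For reference, the change in each of these patches is roughly of the form
below; the 'uca-pl' value is only an example:

$wgCategoryCollation = 'uca-pl'; // a UCA-based collation, here tailored to Polish sorting rules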
I have previously been told that these are on hold because of the WMF's
Trusty migration and the need to rebuild all existing collation data. (No
one has told me how long that might take, but by my estimate it can't
possibly take more than two or three weeks, and I know for a fact that it
can be done with no downtime and minimal problems.)
That was about two months ago.
Is anything known about the current status of the collation changes, and
of the Trusty migration in general? Is there any public venue that I could
follow to know what is going on, or at least a private one I could be cc'd
on if I asked pretty-please? Is there any ETA, even a vague one, for when
things might get done?
I would like to update the bugs that are stuck with this information.
--
Matma Rex