Hi
On Tue, Jun 17, 2014 at 12:14 AM, Quim Gil <qgil(a)wikimedia.org> wrote:
>
> If you want to be a Wikimedia delegate at the Google Summer of Code
> Reunion, apply before the end of June at
> https://www.mediawiki.org/wiki/Talk:Mentorship_programs/Possible_mentors
>
Only Siebrand has signed up at the wiki page so far. No other GSoC mentors
interested?
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
This email is addressed to shell account holders on translatewiki.net and to
wikitech-l, so that you are informed.
Today at 08:10 UTC Niklas noticed that the translatewiki.net server had
been compromised. We saw some suspicious files in /tmp and a few processes
that didn't belong:
elastic+ 22862 0.0 0.0 2684 2388 ? S 04:53 0:00
/tmp/freeBSD /tmp/freeBSD 1
elastic+ 31575 0.0 0.0 2684 2388 ? S 06:38 0:00
/tmp/freeBSD /tmp/freeBSD 1
elastic+ 31580 16.7 0.0 90816 724 ? Ssl 06:38 16:26
[.Linux_time_y_2]
We gathered data and looked at our recent traffic statistics. We drew the
following conclusions:
- Only the Elasticsearch account had been compromised. The intruder did not
gain access to other accounts.
- The attack was possible because the Elasticsearch process was bound to
all interfaces instead of only the localhost interface, and because dynamic
scripting was enabled, which CirrusSearch requires (CVE-2014-3120).
- A virtual machine was started, and given the traffic that was generated
(about 1TB in the past 4 days), we think this was a DDoS drone. The process
reported to an IP address in China.
- A server reinstall is the right thing to do (better safe than sorry).
The compromised server was taken off-line around 10:00 UTC today.
Actions taken:
- Bind Elasticsearch only to localhost from now on (see the configuration
sketch after these lists): https://gerrit.wikimedia.org/r/#/c/145262/
- Reinstall the server
Actions to be taken:
- Configure a firewall that only allows expected traffic to enter and exit
the translatewiki.net server, so that something like the virtual machine the
intruder added could not have communicated with the outside world.
- As a precaution, shell account holders should change any secret that they
have used on the translatewiki.net server in the past 7 days.
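As an illustration of the "bind Elasticsearch only to localhost" action
above, here is a minimal configuration sketch using standard Elasticsearch
settings. This is not the content of the linked Gerrit patch, and the exact
settings used on translatewiki.net are an assumption:

    # elasticsearch.yml (sketch): bind HTTP and transport only to localhost,
    # so the service is no longer reachable from other hosts.
    network.host: 127.0.0.1
    # Dynamic scripting stays enabled because CirrusSearch requires it;
    # binding to localhost prevents remote exploitation of CVE-2014-3120.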
We are thankful to the people in the MediaWiki security IRC channel and to
Henri Salo for helping us gather data on the attack and decide how to
proceed.
We have re-installed the translatewiki.net server, and are currently
re-importing the databases. We expect to be back online in a few hours.
Once we come back online, we'll still have to rebuild some non-critical
metadata stores, such as populating the search database.
Cheers!
Siebrand
We're in the home stretch with replacing the old search engine. I'd like to
thank our many thousands of beta testers who opted in and provided their
feedback throughout this process.
You might have noticed we've been rolling it out as the primary search
engine on progressively more wikis. We're down to 11 of the largest wikis
based on search traffic and we've put together a timeline for these:
commons - July 7
eswiki - July 9
nlwiki - July 14
plwiki - July 16
ruwiki - July 28
svwiki - July 30
zhwiki - August 13
dewiki - August 18
frwiki - August 20
jawiki - August 25
enwiki - August 27
It's basically 2 per week starting Monday, with a weird break for Wikimania.
Unless we find any last-minute blockers, I think we'll be done by the end of
August :)
If you find anything that's still a problem in Cirrus at all, even if it's
minor and not a blocker, please let us know in Bugzilla.
-Chad
PS: Please feel free to forward this to appropriate other venues
Hi, we are upgrading jQuery Cookie from an early alpha version of 1.1 to 1.2.
Please start upgrading your code to be compatible with jQuery Cookie 1.2.
There is just one deprecation to note: $.cookie('foo', null) is now
deprecated; replace it with $.removeCookie('foo') for deleting a cookie. We
are upgrading to version 1.4.1 one step at a time, because it is a major
change and removes a lot of things. Please also follow the changelog at
https://github.com/carhartl/jquery-cookie/blob/master/CHANGELOG.md so that we
can upgrade to 1.4.1.
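For illustration, a minimal before/after sketch of the deprecation (the
cookie name 'foo' comes from the example above; this assumes jQuery and the
jquery.cookie 1.2 plugin are already loaded):

    // Deprecated in jQuery Cookie 1.2: deleting a cookie by setting it to null
    $.cookie('foo', null);

    // Use the dedicated removal helper instead
    $.removeCookie('foo');

    // Reading and writing cookies is unchanged
    $.cookie('foo', 'bar', { expires: 7, path: '/' });
    var value = $.cookie('foo'); // 'bar'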
Just wanted to send out an update on the progress we made around MW-Vagrant
improvements at the Zürich Hackathon. Our primary goal was to make some key
production services available in MW-Vagrant in order to make local
development/testing easier/more reliable. We made some excellent headway,
focussing on a few key services: SSL, Varnish, CentralAuth/Multiwiki.
SSL:
I spent a majority of my time focussing on this and received a lot of
support/help from Ori. There is now an 'https' role in mw-vagrant which,
when enabled, will allow you to access your devwiki on port 4430 (forwarded
to 443 in Vagrant). There is one outstanding patchset which will make it
possible to use $wgSecureLogin in MW-Vagrant:
https://gerrit.wikimedia.org/r/#/c/132799/
Varnish:
This is proving to be much more difficult than anticipated; however, some
progress was made and work is ongoing, spearheaded by Andrew Otto. The plan
is to set up Varnish VCLs for mw-vagrant similar to what is set up for the
text varnishes in production, with a frontend and backend instance running in
Vagrant. Andrew is in the midst of refactoring the production varnish
module to make it usable in Vagrant.
CentralAuth/Multiwiki:
Bryan Davis, Chris Steipp, and Reedy spent a lot of time hacking on this,
and we now have support for multiwiki/CentralAuth in Vagrant! There is
still some cleanup work being done for the role to remove kludge/hacks/etc
(see https://gerrit.wikimedia.org/r/#/c/132691/).
Also of significant note, Matt Flaschen created an mw-vagrant ISO which can
be put on USB thumb drives, making it possible to set up mw-vagrant
without a network connection. There is still some work to be done here to
create a one-click installer and to update the documentation. Matt got
this done before the hackathon, and we brought a bunch of USB sticks imaged
with the ISO, which was instrumental in getting a bunch of folks new to
mw-vagrant up and running at the hackathon. This was particularly useful
during Bryan Davis's vagrant bootcamp sessions.
I believe Katie Filbert from Wikidata did some mw-vagrant work at the
hackathon as well, although I'm not clear on the current status. Katie, can
you let us know where things were at with what you were working on?
All in all it felt like a very fruitful hack session, and we're closer than
ever to having a ready-to-go developer instance that mimics our production
environment. Big thanks to everyone involved in making our work successful.
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
Hi everyone!
Does anyone know of a tool that can help upload a lot of files and
create a page for every file with a given description? I'd say that it
should be a maintenance script, since for some reason the API upload works
pretty slowly. I saw the UploadLocal extension, but it's too manual and it
doesn't work well when the number of files to upload is very large.
Cheers,
-----
Yury Katkov
Hello,
A quick reminder that the Language Engineering office hour is happening in
a few hours (1700 UTC) on #wikimedia-office. Please see below for the
original announcement, including local time and agenda.
Thanks
Runa
Monthly IRC Office Hour:
==================
# Date: July 09, 2014 (Wednesday)
# Time: 1700 UTC/1000PDT (Check local time:
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20140709T1700)
# IRC channel: #wikimedia-office
# Agenda:
1. Content Translation project updates
2. Q & A (Questions can be sent to me ahead of the event)
---------- Forwarded message ----------
From: Runa Bhattacharjee <rbhattacharjee(a)wikimedia.org>
Date: Tue, Jul 8, 2014 at 1:41 PM
Subject: Language Engineering IRC Office Hour on July 9, 2014 (Wednesday)
at 1700 UTC
To: MediaWiki internationalisation <mediawiki-i18n(a)lists.wikimedia.org>,
Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>, Wikimedia
developers <wikitech-l(a)lists.wikimedia.org>,
wikitech-ambassadors(a)lists.wikimedia.org
[x-posted]
Hello,
The Wikimedia Language Engineering team will be hosting the next
monthly IRC office hour on Wednesday, July 09 2014 at 1700 UTC on
#wikimedia-office.
In this office hour we will be discussing our recent activities
around the Content Translation project[1] and taking questions.
Please see below for event details and local time. See you at the office
hour.
Thanks
Runa
[1] https://www.mediawiki.org/wiki/Content_translation
Monthly IRC Office Hour:
==================
# Date: July 09, 2014 (Wednesday)
# Time: 1700 UTC/1000PDT (Check local time:
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20140709T1700)
# IRC channel: #wikimedia-office
# Agenda:
1. Content Translation project updates
2. Q & A (Questions can be sent to me ahead of the event)
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation
Hello,
Ori and I attempted to run phpunit tests with all extensions deployed on
the wmf cluster. With the help of Sam Reed we fixed most of the issues,
but there is one nasty one remaining.
Parser test expectations cannot be met when one mixes extensions
together. For example, the Cite tests have some images/thumbnails; when
one installs the MultimediaViewer extension, those tests fail because the
generated output adds data-file-width="1941" data-file-height="220" to
the <img> elements.
That is due to MultimediaViewer registering the ThumbnailBeforeProduceHTML
parser hook, and that is legit.
I am looking for ideas on how to properly handle extensions that alter
output and thus break other extensions' parser tests.
A lame idea would be to have the Cite parser tests unregister any
HTML-altering hooks that are not registered by Cite. We could maybe
come up with another repository that has a different set of
parser tests suitable for multiple extensions.
There are a few more extensions with a similar issue, such as
ProofreadPage inserting class="prp-pagequality-1 in title links.
Thoughts?
== References ==
A bug for MultimediaViewer / Cite has some more details:
https://bugzilla.wikimedia.org/show_bug.cgi?id=67302
We have a tracking bug to get all extensions pass unit tests together:
https://bugzilla.wikimedia.org/show_bug.cgi?id=67216
An experimental run:
console:
https://integration.wikimedia.org/ci/job/mediawiki-core-extensions-integrat…
test report, which highlights other issues:
https://integration.wikimedia.org/ci/job/mediawiki-core-extensions-integrat…
--
Antoine "hashar" Musso