We will be pushing the new unified certificate this afternoon at 2 PM PDT.
This new certificate will include all our primary top-level domains, as
well as the mobile subdomains.
Unfortunately, projects that sit on a sub-subdomain (example:
arbcom.de.wikimedia.org) will have to redo their redirection rules and
the like, as this change will break them again (as those users saw yesterday).
The serial/fingerprint of the new certificate is:
07:24:ee:a9:7c:55:f2:57:5e:28:8b:a4:cc:f2:0e:8e
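For anyone who wants to verify they are getting the new certificate once it is pushed, openssl can print a certificate's fingerprint in the same colon-separated form. This is just a sketch: the CN and file paths are stand-ins, and it uses a throwaway self-signed certificate, with the live-server variant shown in a comment.

```shell
# Stand-in certificate, since the real one can't be baked into an example.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=example.invalid" \
  -keyout /tmp/example.key -out /tmp/example.crt 2>/dev/null

# Print its MD5 fingerprint (same format as the one quoted above).
openssl x509 -noout -fingerprint -md5 -in /tmp/example.crt

# Against a live server, something like:
#   echo | openssl s_client -connect en.wikipedia.org:443 2>/dev/null |
#     openssl x509 -noout -fingerprint -md5
```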
A quick grep through the pdns files shows me the following projects will be
affected by this change (and have to redo their redirections/whatnot):
arbcom.de.wikipedia.org
arbcom.en.wikipedia.org
arbcom.fi.wikipedia.org
arbcom.nl.wikipedia.org
noboard.chapters.wikimedia.org
These are all the ones I found, but the list may not be exhaustive.
This change will fix certificate issues that have been around for a
while now, as listed in:
https://bugzilla.wikimedia.org/show_bug.cgi?id=34788
If there are any questions or concerns, feel free to reply back to this
thread.
Thanks,
--
Rob Halsell
Operations Engineer
Wikimedia Foundation, Inc.
E-Mail: rhalsell(a)wikimedia.org
Hi, there is a meetup tomorrow Thursday in San Francisco + video streaming:
Lua meets Wikipedia
http://www.meetup.com/Wikipedia-Engineering-Meetup/events/106078042/
The talks will start at about 6:00 PM Pacific (1:00 AM UTC):
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20130315T0100
Video will be streamed and archived on YouTube. The URLs will be
available on the meetup page as soon as we have them.
We are co-organizing this meetup with the Bay Area Lua Developers
meetup, which has the first slot with a demo-based minimal crash course.
SF(Rob Lanphier && Aaron Schulz) + remote(Tim Starling && Brad Jorsch)
will continue with a session specific to Lua support in Wikipedia and
MediaWiki in general.
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello,
I'm in the process of re-working mediawiki-vagrant, which is a set of scripts for provisioning a virtual machine for MediaWiki development. I'm struggling to identify the best way of fetching mediawiki/core.
An ideal solution would have the following attributes:
- Fast.
- Includes .git metadata, to facilitate contribution of patches.
- Viable on slow network connections.
- Does not require a Gerrit account (to help newcomers get started quickly).
What I tried:
- A shallow (--depth=0) git-clone over HTTPS took around half an hour and required transferring 272MB, with 200MB taken up by .git/objects/pack.
- The nightlies on integration.mediawiki.org are small (18MB) and easy to retrieve, but the most recent one is from December, and they don't contain any .git metadata.
- The snapshots Krinkle maintains on the toolserver are both small and up-to-date, but they too do not contain any .git metadata.
- The snapshot link on http://www.mediawiki.org/wiki/Download (https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=snapshot;h=ref…) just didn't work. It hangs for a while and then spits out HTML.
- Getting a snapshot from GitHub would probably work, but I am loath to depend on it.
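A minimal sketch of the shallow-clone route, which keeps .git metadata while transferring only the most recent commit. It is demonstrated here on a toy local repository (cloning the real thing in an example would be slow); for mediawiki/core the remote would be Gerrit's anonymous HTTPS URL, and the exact transfer size will vary.

```shell
set -e
# Toy "upstream" with two commits, standing in for mediawiki/core.
src=$(mktemp -d)
git -C "$src" init -q
git -C "$src" -c user.email=a@b -c user.name=t commit -q --allow-empty -m one
git -C "$src" -c user.email=a@b -c user.name=t commit -q --allow-empty -m two

# file:// (rather than a bare path) is needed for --depth to apply locally.
git clone -q --depth 1 "file://$src" "$src-clone"

git -C "$src-clone" rev-list --count HEAD   # prints 1: history is shallow
# Full history can be fetched later (recent git: git fetch --unshallow).
```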
Does anyone have any suggestions?
--
Ori Livneh
Hello,
The Wikimedia Language Engineering team [1] invites everyone to the
team’s monthly office hour on March 13, 2013. The team has lots of
exciting updates about its projects, programs, and events since the
last office hour in November 2012. Some of this has already been
shared in our recent blog posts. Event details and the general
agenda are below.
See you all at the IRC office hour!
Thanks
Runa
Event Details:
Date: 2013-03-13
Time: 1700 UTC
IRC channel: #wikimedia-office on irc.freenode.net
Agenda:
# Introductions
# MLEB Release[2]
# Translate UX[3] - Updates
# Updates about participation in various community events
# Follow up from earlier office hours:
Language Team (new) plans
Testing Event plans
# Q/A - We shall be taking questions during the session. Questions can
also be sent to runa at wikimedia dot org before the event and can be
addressed during the office-hour.
[1] http://wikimediafoundation.org/wiki/Language_Engineering_team
[2] http://www.mediawiki.org/wiki/MediaWiki_Language_Extension_Bundle
[3] http://www.mediawiki.org/wiki/Extension:Translate
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation
A few people on rationalwiki.org have been muttering about doing a
customised Vector skin.
The trouble with Vector is that, as I understand it, it's an odd
melange of extension and skin, with functionality that belongs in one
ending up in the other, in both directions.
I also understand that there are plans to refactor it to be sensibly
organised with the right functionality in the right place. (Though I
have no idea if there is actually anyone assigned to such a task.)
Is this the case? If not, what is? What's the present and future of Vector?
- d.
This category is sorted strangely:
http://sv.wikipedia.org/wiki/Kategori:Svenska_kokboksf%C3%B6rfattare
A, B, E, G, H, L, B, M, N, F, J, R, Þ, S, W, Z, Å
Not only does B appear twice and F come after N, but
the article sorted under "Þ" has
{{STANDARDSORTERING:Tunberger, Pernilla}}
where "STANDARDSORTERING" is Swedish for DEFAULTSORT,
and there is no "Þ" (Icelandic Thorn) in sight.
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Hello,
I'm trying to detect, from within a parser function extension, whether
the current process was started from a maintenance script.
What would be the best or most recommended way to detect it?
Thanks!
--
Toni Hermoso Pulido
http://www.cau.cat
Hello all,
------
tl;dr: There's a difference between how the WMF Platform team and
Feature Teams commit to master. Why? How can we unify and/or make
testing from development branches easier for everyone?
------
Two weeks ago Ops and Platform both had all staff local in SF for a week
for a slew of meetings. I took that opportunity to make even more
meetings with people/teams regarding our dev and deployment process.
(I promise I'll be nicer in the future.)
You can see the fruits of that labor here:
https://wikitech.wikimedia.org/wiki/Deployments/Features_Process/General_Fe…
(More info linked from:
https://wikitech.wikimedia.org/wiki/Deployments/Features_Process )
One of the things that became apparent is that the Features Teams (e.g.
E2, E3, Visual Editor) tend to have separate branches that they develop
on and only merge to master closer to their deploy windows. There is
some variation among them, of course, but the general idea stands.
This contrasts with the way the WMF Platform team develops: we
predominantly commit to master as we develop, with any new feature set
to disabled in a config until it is ready. The Features Teams also use
the "disabled in a config until ready" bit, of course.
What is the reasoning behind this development/deployment distinction on
the Features Teams? Mostly it is testing, from what I could tell. Many
teams have the development branch running on a test
instance that they, well, test against. Then, as things stabilize they
merge to master via Gerrit.
(Feature Teams members, please correct me if I'm wrong anywhere above,
generally: specifics may differ, of course, but is the overall notion
accurate?)
How do we encourage a more unified development, testing, and deployment
process across all WMF teams and community members (sometimes they are
one and the same)?
This issue should also be looked at with betalabs in mind: how can we
make betalabs the best test environment it can be? What code
should be running there and how often should it be updated? What tests
should be automatically run against it? These questions are almost out
of scope of this thread, but a solution to this thread should also have
an eye towards betalabs.
Thanks,
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
While trying to add some more information to
https://www.mediawiki.org/wiki/Manual:Code, I came across a slightly
peculiar issue regarding the entry points for MediaWiki:
Right now, among all the entry points that I know of (those are listed in
Manual:Code), only mw-config/index.php doesn't sit in the root folder.
Furthermore, it's related to the installer at includes/installer/, but that
is not clear at all from the code organization, specifically the directory
names (and the lack of documentation both in the file and on mediawiki.org).
I have two questions, then:
1) should all entry points be in the root directory of the wiki, for
consistency?
2) should the name "mw-config" be changed to something that more clearly
indicates its relationship with the installer?
Note that this isn't merely nitpicking: a consistent structure and
intuitive names for files and directories play an important role in the
self-documenting nature of the code, and make the learning curve
smoother for new developers (e.g. yours truly :-)).
Also, I used Tim Starling's suggestion on IRC to make sure the list of
entry point scripts listed in Manual:Code was complete:
git grep -l /includes/WebStart.php
I am not sure that exhausts the list, however, since thumb_handler.php
doesn't show up on its results. Any pointers regarding potential entry
points currently omitted from that list are most welcome.
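A toy illustration of why a single-pattern grep misses indirect entry points such as thumb_handler.php, which (if I read it right) only reaches WebStart.php through thumb.php. The file contents below are simplified stand-ins, not the real includes:

```shell
set -e
repo=$(mktemp -d)
git -C "$repo" init -q

# index.php includes WebStart.php directly; thumb_handler.php reaches
# it only indirectly, via thumb.php.
echo "require './includes/WebStart.php';" > "$repo/index.php"
echo "require './includes/WebStart.php';" > "$repo/thumb.php"
echo "require './thumb.php';"             > "$repo/thumb_handler.php"
git -C "$repo" add .
git -C "$repo" -c user.email=a@b -c user.name=t commit -qm init

git -C "$repo" grep -l "WebStart.php"   # index.php, thumb.php -- misses thumb_handler.php
git -C "$repo" grep -l "/thumb.php"     # thumb_handler.php: a second pass is needed
```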
--Waldir
ps - while investigating this subject I came across some inconsistencies
in the way the entry points include the WebStart.php file, including an
incorrect use (according to Tim) of require_once() instead of require(). I
submitted a change to Gerrit harmonizing them; if this interests you,
please review the commit: https://gerrit.wikimedia.org/r/#/c/49208/