tl;dr:
VMs created on or after September 8th will no longer get .eqiad.wmflabs
domains, and will be found only under .eqiad1.wikimedia.cloud
The whole story:
Currently Cloud VPS VMs stand astride two worlds: wmflabs and
wikimedia.cloud. Here's the status quo:
- New VMs get three different DNS entries:
hostname.project.eqiad1.wikimedia.cloud, hostname.project.eqiad.wmflabs,
and hostname.eqiad.wmflabs [0]
- Reverse DNS lookups return hostnames under eqiad1.wikimedia.cloud
- VMs themselves believe (e.g. via hostname -f) that they're still under
eqiad.wmflabs
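That split is easy to see from inside a VM. As a minimal sketch (the
'myhost' and 'myproject' names below are hypothetical placeholders):

    import socket

    # Forward lookup: the old and new names all resolve to the same address.
    addr = socket.gethostbyname('myhost.myproject.eqiad1.wikimedia.cloud')

    # Reverse lookup returns the eqiad1.wikimedia.cloud name...
    print(socket.gethostbyaddr(addr)[0])

    # ...while the VM's own idea of its FQDN (what hostname -f reports)
    # is still under eqiad.wmflabs.
    print(socket.getfqdn())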
That hybrid system has done a good job maintaining backwards
compatibility, but it's a bit of a mess. In the interest of simplifying,
standardizing, and eliminating ever more uses of the term 'labs', we're
going to start phasing out the wmflabs domain name. Beginning on
September 8th, new VMs will no longer receive any naming associated with
.wmflabs [1].
- New VMs will get one DNS entry: hostname.project.eqiad1.wikimedia.cloud
- New VMs will continue to have a reverse (PTR) DNS entry that refers to
the .wikimedia.cloud name
- New VMs will be assigned an internal hostname under .wikimedia.cloud
In order to avoid breaking existing systems, these changes will NOT be
applied retroactively to existing VMs. Old DNS entries will live on
until the VM is deleted and should be largely harmless. If, however,
you find yourself rewriting code in order to deal with VMs under both
domains (due to the change in hostname -f behavior), don't worry --
adjusting an old VM to identify as part of .wikimedia.cloud only
requires a simple change to /etc/hosts. I'll be available to make that
change for any project that chooses consistency over
backwards-compatibility.
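For reference, that edit is usually a single line in /etc/hosts; in this
illustrative example 'myhost' and 'myproject' are placeholders:

    # Before: the first (canonical) name is under the old domain
    127.0.1.1 myhost.myproject.eqiad.wmflabs myhost

    # After: listing the .wikimedia.cloud FQDN first makes
    # hostname -f report the new name
    127.0.1.1 myhost.myproject.eqiad1.wikimedia.cloud myhost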
[0]
https://phabricator.wikimedia.org/phame/post/view/191/new_names_for_everyone
[1] https://phabricator.wikimedia.org/T260614
_______________________________________________
A change was just now made to the shared proxy system for Toolforge
which makes the proxy respond with default content for /favicon.ico
and /robots.txt when a tool's webservice returns a 404 Not Found
response for these files.
The default /favicon.ico is the same as
<https://tools-static.wmflabs.org/toolforge/favicons/favicon.ico>.
The default robots.txt denies access to all compliant web crawlers. We
decided that this "fail closed" approach would be safer than a "fail
open" one telling all crawlers to crawl all tools. Any tool that does
wish to be indexed by search engines and other crawlers can serve its own
/robots.txt content. Please see <https://www.robotstxt.org/> for more
information on /robots.txt in general.
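For illustration, a robots.txt that denies all compliant crawlers (the
"fail closed" behavior described above) conventionally looks like:

    User-agent: *
    Disallow: /

while a tool that wants to be crawled everywhere can serve the opposite,
with an empty Disallow value:

    User-agent: *
    Disallow: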
These changes fix a regression [0] in functionality caused by the
toolforge.org migration and the introduction of the 2020 Kubernetes
ingress layer. Previously the /robots.txt and /favicon.ico from the
"admin" tool were served for all tools due to the use of a shared
hostname.
[0]: https://phabricator.wikimedia.org/T251628
Bryan, on behalf of the Toolforge admin team
--
Bryan Davis Technical Engagement Wikimedia Foundation
Principal Software Engineer Boise, ID USA
[[m:User:BDavis_(WMF)]] irc: bd808
_______________________________________________
TL;DR:
* HTTP -> HTTPS redirection is live (finally!)
* Currently allowing a "POST loophole"
* "POST loophole" will be closed on 2021-02-01
Today we merged a small change [0] to the front proxy used by Cloud
VPS projects [1]. This change brings automatic HTTP -> HTTPS
redirection to the "domain proxy" service and a
Strict-Transport-Security header with a 1 day duration.
The current configuration is conservative. We will only redirect GET
and HEAD requests to HTTPS to avoid triggering bugs in the handling of
redirects during POST requests. This "POST loophole" is the same
process that we followed when converting the production wiki farm and
Toolforge to HTTPS.
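The decision logic amounts to something like the following sketch in
Python (not the actual proxy configuration from [0]; the function name
and return convention are invented for illustration):

    def hsts_and_redirect(method, is_https):
        """Return (extra_headers, action) for an incoming request."""
        if is_https:
            # HSTS with a 1 day duration (86400 seconds); browsers only
            # honor this header when it arrives over HTTPS.
            return {"Strict-Transport-Security": "max-age=86400"}, None
        if method in ("GET", "HEAD"):
            # Safe verbs are redirected to HTTPS right away.
            return {}, "redirect-to-https"
        # The "POST loophole": all other verbs pass through over plain
        # HTTP until 2021-02-01.
        return {}, None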
When we announced similar changes for Toolforge in 2019 [2] we forgot
to set a timeline for closing the POST loophole. This time we are
wiser! We will close the POST loophole and make all HTTP requests,
regardless of the verb used, redirect to HTTPS on 2021-02-01. This
six-month transition period should give us all a chance to find and
update URLs to use HTTPS and to fix any dependent software that might
break when a redirect is sent for a POST request.
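One failure mode to test for: many HTTP client libraries turn a
redirected POST into a GET and silently drop the request body. A quick
way to check, using Python's requests library (the URL here is a
hypothetical example):

    import requests

    # If this URL starts redirecting, resp.history will be non-empty and
    # the POST body may have been lost along the way. The durable fix is
    # to use the https:// URL directly.
    resp = requests.post("http://mytool.wmflabs.org/api", data={"x": 1})
    if resp.history:
        print("redirected:", [r.status_code for r in resp.history])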
If you find issues in your projects resulting from this change, please
do let us know. The tracking task for this change is T120486 [3]. We
also provide support in the #wikimedia-cloud channel on Freenode and
via the cloud@lists.wikimedia.org mailing list [4].
[0]: https://gerrit.wikimedia.org/r/c/operations/puppet/+/620122/
[1]: https://wikitech.wikimedia.org/wiki/Help:Using_a_web_proxy_to_reach_Cloud_V…
[2]: https://phabricator.wikimedia.org/phame/post/view/132/migrating_tools.wmfla…
[3]: https://phabricator.wikimedia.org/T120486
[4]: https://lists.wikimedia.org/mailman/listinfo/cloud
Bryan, on behalf of the Cloud VPS admin team
--
Bryan Davis Technical Engagement Wikimedia Foundation
Principal Software Engineer Boise, ID USA
[[m:User:BDavis_(WMF)]] irc: bd808
_______________________________________________
Hello,
I have a Cloud VPS project called srwiki-dev. I have set up a cron job
which automatically updates MediaWiki (core, extensions, skins) as well
as Composer/npm dependencies. It runs each Thursday. But I don't know
why GitHub sent me the mail below.
Best regards,
Zoran Dori
Volunteer on Wikimedia Foundation's projects
E: zorandori4444@gmail.com
W: kizule.tk
I: iamkizule <https://instagram.com/iamkizule>
---------- Forwarded message ---------
From: GitHub <noreply@github.com>
Date: Thu, 13 Aug 2020 at 22:00
Subject: [GitHub API] Deprecation notice for authentication via URL query
parameters
To: Zoran Dori <zorandori4444@gmail.com>
Hi @zoranzoki21,
On August 13th, 2020 at 20:00 (UTC) your personal access token (Composer on
discordwiki 2020-03-25 2307) using Composer/1.8.4 (Linux; 4.19.0-8-amd64;
PHP 7.3.19) was used as part of a query parameter to access an endpoint
through the GitHub API:
https://api.github.com/repositories/14212481/zipball/d1e4107e1c4f4ca0139a4a…
Please use the Authorization HTTP header instead, as using the
`access_token` query parameter is deprecated. If this token is being used
by an app you don't have control over, be aware that it may stop working as
a result of this deprecation.
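In practice the fix is to move the token out of the URL and into the
Authorization header. A minimal sketch with Python's requests library
("XXXX" stands in for a real token):

    import requests

    TOKEN = "XXXX"  # placeholder personal access token

    # Deprecated: token sent as a query parameter.
    requests.get("https://api.github.com/user",
                 params={"access_token": TOKEN})

    # Recommended: token sent in the Authorization header.
    requests.get("https://api.github.com/user",
                 headers={"Authorization": "token " + TOKEN})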
Depending on your API usage, we'll be sending you this email reminder on a
monthly basis for each token and User-Agent used in API calls made on your
behalf.
Just one URL accessed with each token and User-Agent combination will
be listed in the email reminder, not all of them.
Visit
https://developer.github.com/changes/2020-02-10-deprecating-auth-through-qu…
for more information about suggested workarounds and removal dates.
Thanks,
The GitHub Team
Hi,
I would like to sync the 100k most popular articles to my local machine.
Maybe I was blind, but I could not find suitable documentation about this.
Maybe the list of the 100k most popular articles could be the start.
If I had this list, I could download the articles with 100k HTTP calls.
But before I do this, I would kindly ask you about the correct way to do
it. I don't want you to think I am doing a DoS attack :-)
If there is a better place for this question, please tell me!
Regards,
Thomas Güttler
--
Thomas Guettler http://www.thomas-guettler.de/
I am looking for feedback: https://github.com/guettli/programming-guidelines
Hello everyone,
We recently installed a tutorial for the Wikidata Query Service (WDQS)
<https://wdqs-tutorial.toolforge.org/> on Toolforge.
This is the first time Wikimedia Israel has developed an instructional
tool in English for the use of the whole Wikimedia community and beyond
- as opposed to our other tools, which were aimed at the Hebrew- and
Arabic-speaking communities. As such, it is also the first time we are
hosting a tool on Toolforge.
All our other tools (mostly WordPress sites, like the one on Toolforge)
are hosted in Israel at a commercial hosting service, and maintenance
(WordPress updates and security) is done by a commercial company.
However, I am not sure this company would provide maintenance for the
WDQS tutorial, given that access to Toolforge requires a Wikimedia
developer account, a Wikitech account, and SSH access - not the kind of
things commercial services are keen to get into...
I was wondering if any of you have some ideas on what we could do to
ensure the WDQS tutorial is kept up-to-date and secure.
I believe maintenance would take 2-3 hours per month - this is not a
huge undertaking, but at our organization we do not have anyone in-house
who is savvy enough to do it.
Do you know, or can you suggest, someone who knows how to maintain a
WordPress site AND is familiar with Toolforge? Are you this someone?!?
Or maybe you know someone who might know someone?
Any input regarding this issue would be much appreciated.
Cheers and stay safe,
Dr. Keren Shatzman
Academic & Projects Coordinator