Hello everyone,
The Committee has finished selecting its new members. The candidates for
the new committee are (in alphabetical order):
- Amir Sarabadani
- Egbe Eugene Agbor (Eugene233)
- Greg Grossmeier
- Jayprakash12345
- Kamila Součková
Auxiliary members will be (alphabetically):
- Effie Mouzeli
- Joris Darlington Quarshie
- Martin Urbanec
- MusikAnimal
- SD0001
You can read more information about the members at [1]. List of changes
from the last term:
- Greg and Kamila will be joining the main committee.
- Nuria is leaving the main committee.
- MusikAnimal is moving to the aux committee.
- Tony is leaving the aux committee.
This is not the final structure. According to the CoC [2], the current
committee publishes the new members and calls for public feedback for *six
weeks*; after that, it may change the structure based on the feedback
received.
Please let the committee know (via techconduct(a)wikimedia.org) by *08
October 2024* if you have any concerns regarding the members or the
structure. After that, the new committee will take effect and will serve
for one year.
[1]
https://www.mediawiki.org/wiki/Code_of_Conduct/Committee/Members/Candidates
[2]
https://www.mediawiki.org/wiki/Code_of_Conduct/Committee#Selection_of_new_m…
--
Amir (he/him)
Following several requests from users over the past eight years [0],
we are finally enabling access to ToolsDB's "public" databases (the
ones with a name ending with "_p") [1] from both Quarry [2] and
Superset [3].
The data stored in those databases have always been accessible to
every Toolforge user, but after this change they will become more
broadly accessible, as Quarry can be accessed by anyone with a
Wikimedia account, and saved queries in Quarry can be shared with
public links that require no login at all.
== This change is planned to go live on Monday, July 1st. ==
If you have any concerns or questions related to this change, please
leave a comment in the Phabricator task [0] or one of its subtasks.
Thanks to everyone for your patience and for keeping the task alive
over the years!
[0] https://phabricator.wikimedia.org/T151158
[1] https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database#Privileges_on_t…
[2] https://meta.wikimedia.org/wiki/Research:Quarry
[3] https://superset.wmcloud.org/
--
Francesco Negri (he/him) -- IRC: dhinus
Site Reliability Engineer, Cloud Services team
Wikimedia Foundation
_______________________________________________
Cloud-announce mailing list -- cloud-announce(a)lists.wikimedia.org
List information: https://lists.wikimedia.org/postorius/lists/cloud-announce.lists.wikimedia.…
Just now one of our hypervisors crashed and left several VMs shut down
for around 45 minutes. I have rebooted the host and service has been
restored.
The affected VMs were:
web01
tools-k8s-worker-nfs-55
tools-k8s-worker-nfs-24
toolsbeta-test-k8s-worker-nfs-2
opensearch2
traffic-acmechief01
section-ranker
integration-agent-docker-1047
xtools-prod08
xtools-dev06
statanalyser
bullseye
deployment-ms-fe04
deployment-ms-be07
signwriting-swis-2022
signwriting-swserver-2022
mwparserfromhtml
video-nfs-1
striker-docker-01
twl-nfs-1
teyora
suggestbot-02
edit-types
soweego
quarry-dbbackup-01
pki-db
Followup on this incident is at https://phabricator.wikimedia.org/T373740
Hello everyone,
Hope this message finds you well. I wanted to share some updates regarding
the Language & Internationalization newsletter, the upcoming community
meeting, and the Language Support track at Wikimania.
*Language and Internationalization Newsletter*
The fourth edition of the Language & Internationalization newsletter
(August 2024) is available at this link: <
https://www.mediawiki.org/wiki/Wikimedia_Language_and_Product_Localization/…
>.
This newsletter provides updates from the April–June 2024 quarter on new
feature development, improvements in various language-related technical
projects and support efforts, details about community meetings, and
contribution ideas for getting involved in projects.
To stay updated, you can subscribe to the newsletter on its wiki page: <
https://www.mediawiki.org/wiki/Wikimedia_Language_and_Product_Localization/…
>.
*Language Community Meeting*
The next language community meeting is scheduled in a few weeks—on August
30th at 15:00 UTC. If you're interested in joining, you can sign up on this
wiki page: <
https://www.mediawiki.org/wiki/Wikimedia_Language_and_Product_Localization/…
>.
This participant-driven meeting will focus on sharing language-specific
updates related to various projects, discussing technical issues related to
language wikis, and working together to find possible solutions. For
example, in the last meeting, topics included the Language Converter, the
state of language research, updates on the Incubator conversations, and
technical challenges around external links not working with special
characters on Bengali sites.
Do you have ideas for topics, technical updates to share, or challenges
to discuss? Please add agenda items to the document here: <
https://etherpad.wikimedia.org/p/language-community-meeting-aug-2024>.
*Language Support Track at Wikimania*
Attending the Wikimania Hackathon in person next week? There will be a
dedicated Language Support track focusing on addressing small,
language-specific technical requests across various Wikimedia projects to
support new and existing language communities. Join us to collaborate with
interested volunteers on technical tasks, organize technical sessions, and
run a support help desk. There will also be a session for remote attendees
on using Translatewiki.net for translating Wikimedia interface messages! <
https://wikimania.wikimedia.org/wiki/2024:Program/Hackathon#Language_Support>
Cheers,
Srishti
*Srishti Sethi*
Senior Developer Advocate
Wikimedia Foundation <https://wikimediafoundation.org/>
Cross-posting this announcement from the wikitech-l mailing list. As I
understand it, this policy change is intended to be clarifying rather
than introducing new limits or hurdles for the existing API consumer
community.
Please keep discussion on-wiki, as folks will not be monitoring the
cloud mailing lists for feedback.
Bryan
---------- Forwarded message ---------
From: Luca Martinelli [Sannita@WMF] <sannita(a)wikimedia.org>
Date: Thu, Aug 29, 2024 at 6:21 AM
Subject: [Wikitech-l] New draft text of WMF legal policy regarding use of APIs
To: <wikitech-l(a)lists.wikimedia.org>, <wikimedia-l(a)lists.wikimedia.org>
Hello all,
We have published a new draft text of WMF legal policy discussing the
use of its APIs. The full page is available on Meta.[1]
We are currently looking for feedback on this draft text, and you're
welcome to express your considerations on the text's talk page.[2]
This new document would be published at "API usage guidelines" on the
Foundation wiki, unless community comments suggest the text should be
appended to an existing page (for example, the User-Agent policy
page)[3] or a different wiki.
This page is available for community comment for at least two weeks, or
until new comments, questions, and suggestions have concluded.
Thank you in advance for your consideration!
[1] https://meta.wikimedia.org/wiki/API_Policy_Update_2024
[2] https://meta.wikimedia.org/wiki/Talk:API_Policy_Update_2024
[3] https://foundation.wikimedia.org/wiki/Special:MyLanguage/Policy:User-Agent_…
--
Luca Martinelli [Sannita] (he/him)
Movement Communications Specialist
_______________________________________________
Wikitech-l mailing list -- wikitech-l(a)lists.wikimedia.org
To unsubscribe send an email to wikitech-l-leave(a)lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
--
Bryan Davis Wikimedia Foundation
Principal Software Engineer Boise, ID USA
[[m:User:BDavis_(WMF)]] irc: bd808
Hi,
Is it allowed to build a webpack app in the Toolforge environment?
I'm asking because I tried to build an Angular app (old ng) for Wiki
Loves Monuments and failed. I found these instructions:
https://wikitech.wikimedia.org/wiki/Help:Toolforge/Node.js#Running_npm_with…
Node 18 installed OK and running `npm i` worked, but my build failed.
My steps:
```bash
become zabytki
# install
webservice --backend=kubernetes node18 shell
# clone (or pull)
mkdir ~/testrepo
cd ~/testrepo
git clone https://github.com/Eccenux/wlm-zabytki-mapa.git ./
# build
npm i
npm run build-prod
```
And that last part fails at 92%:
> 92% sealing asset processing TerserPlugin*Killed*
The build-prod script is defined as:
`webpack --progress --mode production --env=prod`
The webpack script is here:
https://github.com/Eccenux/wlm-zabytki-mapa/blob/master/webpack.config.js
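One workaround I'm considering (just a sketch, assuming the *Killed* comes
from the container hitting its memory limit, and that terser-webpack-plugin
spawns one minifier worker per CPU by default, which multiplies peak memory)
is forcing Terser to run single-threaded in webpack.config.js:

```javascript
// Hypothetical webpack.config.js fragment (assumes webpack 5 with
// terser-webpack-plugin installed). parallel: 1 trades build speed
// for a much lower peak memory footprint during minification.
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  mode: 'production',
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        parallel: 1, // single worker instead of one per CPU core
      }),
    ],
  },
};
```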
I'd be grateful for any pointers. For now I'm building locally and
deploying the built assets to GitHub (so that I can skip the build step
on Toolforge), but setting up a build environment while I'm on vacation
is a complication I'd like to avoid :)
Cheers,
Maciej Nux.
TL;DR:
* Nominations are open for the Toolforge standards committee at
<https://wikitech.wikimedia.org/wiki/Help_talk:Toolforge/Toolforge_standards…>
The Toolforge standards committee [0] (née Tool Labs standards
committee) was established in 2017 following a membership nomination
process in December 2016.[1] There have been no formal membership
changes in the subsequent 7.5 years. As one might imagine, a number of
the highly active Toolforge members who formed the original committee
have since drifted away from the project as their personal lives and
interests have changed. Those who are still active within the
movement seem to have drifted away from this particular functionary
role.
I feel both grateful to those who have served for so long and
responsible for not helping them transition away more gracefully prior
to now. It is long past time for us to thank Eranroz, Huji, Ladsgroup,
Matanya, Quiddity, and Zhuyifei1999 for their efforts and find a new
set of folks to carry on the work of the committee.
The committee's primary duties are to oversee the Right to fork
policy[2] and Abandoned tool policy[3]. The committee is also
encouraged to implement additional programs designed to promote and
improve the use of Toolforge for the benefit of the Wikimedia
movement.
Nominations are being collected on Wikitech.[4] Any Toolforge project
member with a Developer account not blocked in any Wikimedia technical
space is eligible for nomination. Self-nominations are acceptable.
Nominations by a third party should be counter-signed by the nominee
to indicate they would accept the responsibility and be willing to
sign the Volunteer NDA.[5] Nominations will remain open until at least
2024-08-26.
[0]: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Toolforge_standards_comm…
[1]: https://wikitech.wikimedia.org/wiki/Help_talk:Toolforge/Toolforge_standards…
[2]: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Right_to_fork_policy
[3]: https://wikitech.wikimedia.org/wiki/Help:Toolforge/Abandoned_tool_policy
[4]: https://wikitech.wikimedia.org/wiki/Help_talk:Toolforge/Toolforge_standards…
[5]: https://wikitech.wikimedia.org/wiki/Volunteer_NDA
--
Bryan Davis Wikimedia Foundation
Principal Software Engineer Boise, ID USA
[[m:User:BDavis_(WMF)]] irc: bd808
Hello Cloud list admins,
Can you please consider increasing digest_size_threshold for this list (cloud)? I've received 7 digest emails today so far, containing 2-3 emails each, and I'd prefer to receive 1 digest email per day with 20+ emails inside it. Fewer emails = better for folks who want daily digests. This will not affect folks who are configured to receive individual emails.
Thanks for your consideration,
Novem Linguae
-----Original Message-----
From: cloud-request(a)lists.wikimedia.org <cloud-request(a)lists.wikimedia.org>
Sent: Sunday, August 18, 2024 1:09 PM
To: cloud(a)lists.wikimedia.org
Subject: Cloud Digest, Vol 152, Issue 18
Send Cloud mailing list submissions to
cloud(a)lists.wikimedia.org
To subscribe or unsubscribe, please visit
https://lists.wikimedia.org/postorius/lists/cloud.lists.wikimedia.org/
You can reach the person managing the list at
cloud-owner(a)lists.wikimedia.org
When replying, please edit your Subject line so it is more specific than "Re: Contents of Cloud digest..."
Today's Topics:
1. Re: javascript tooling? (Travis Briggs)
----------------------------------------------------------------------
Message: 1
Date: Sun, 18 Aug 2024 13:08:44 -0700
From: Travis Briggs <audiodude(a)gmail.com>
Subject: [Cloud] Re: javascript tooling?
To: Wikimedia Cloud Services general discussion and support
<cloud(a)lists.wikimedia.org>
Message-ID:
<CAMPYpA5vo__8OvLosq6NCdjW4jWkRoSejrYAVPZJvZGk+ML-bw(a)mail.gmail.com>
Content-Type: multipart/alternative;
boundary="0000000000000d6ba4061ffac3a6"
Okay that makes a lot of sense then. I bet for some endpoints you want to include OAuth credentials to act on behalf of a user.
-Travis
On Sun, Aug 18, 2024 at 1:01 PM Aoyan Sarkar <sarkaraoyan(a)gmail.com> wrote:
> The browser does not allow you to pass in credentials (via cookies or
> Authenticafion header) when origin=*.
>
> From MDN:
>
> > For requests without credentials, the literal value "*" can be specified
> > as a wildcard; the value tells browsers to allow requesting code from
> > any origin to access the resource. Attempting to use the wildcard with
> > credentials results in an error
> > <https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS/Errors/CORSNotSupportingCredentials>.
>
> - Aoyan Sarkar
>
>
> On Sun, Aug 18, 2024 at 3:58 PM Travis Briggs <audiodude(a)gmail.com> wrote:
>
>> CORS is unrelated to authentication. It has nothing to do with what
>> cookies you do or do not have. While a website could look at cookies
>> when deciding whether to send the Access-Control-Allow-Origin header,
>> that would be unusual.
>>
>> origin=* should be all you ever need, because otherwise you're just
>> telling the MW server to allow whatever you tell it. So it's just as
>> good for the MW server to tell the browser that it allows everything.
>> You don't gain any security by asking the MW server to only allow
>> your specific origin for that request.
>>
>> -Travis
>>
>> On Sun, Aug 18, 2024 at 12:52 PM Roy Smith <roy(a)panix.com> wrote:
>>
>>> OK, I'm reading along at
>>> https://www.mediawiki.org/wiki/API:Cross-site_requests and this is
>>> starting to make sense. I see that origin=* forces anonymous mode.
>>> This is enough to get me started. At some point I'll certainly need
>>> to be authenticated, but I'll tackle that when I get to it.
>>>
>>>
>>> On Aug 18, 2024, at 2:37 PM, Sportzpikachu via Cloud <
>>> cloud(a)lists.wikimedia.org> wrote:
>>>
>>> Access-Control-Allow-Origin (and other related headers) is standard.
>>> `origin=*` is specific to the Action API, which requests MW to add
>>> the ACAO header.
>>>
>>> `origin=localhost:5173` IIRC makes MW check the origin against a
>>> whitelist of sites that can use credentials, but origin=* is special
>>> in that MW allows CORS for anonymous requests (ie no cookies sent).
>>>
>>> On 19 Aug 2024, at 02:32, Roy Smith <roy(a)panix.com> wrote:
>>>
>>> OK, that worked, thanks. Surprisingly origin=localhost:5173 doesn't
>>> work, but I can live with that.
>>>
>>> Is this a standard part of the CORS protocol, or something specific
>>> to the Action API?
>>>
>>>
>>> On Aug 18, 2024, at 1:45 PM, Siddharth VP <siddharthvp(a)gmail.com> wrote:
>>>
>>> Add origin=* in the API request query params. This tells the API to
>>> include Access-Control-Allow-Origin: * in the response headers.
>>> Don't put
>>> mode: no-cors.
>>>
>>> On Sun, 18 Aug 2024 at 22:29, Roy Smith <roy(a)panix.com> wrote:
>>> So, after beating my head against this for a couple of days, I've
>>> come to the conclusion that I just don't understand how CORS works.
>>>
>>> If I get the following URL:
>>> https://en.wikipedia.org/w/api.php?action=query&format=json&meta=userinfo&formatversion=2&uiprop=rights
>>>
>>> from a browser, I get what I expect:
>>>
>>> {
>>> "batchcomplete": true,
>>> "query": {
>>> "userinfo": {
>>> "id": 130326,
>>> "name": "RoySmith",
>>> "rights": [
>>> etc
>>> }
>>>
>>>
>>>
>>> On the command-line with curl, likewise something that makes sense:
>>>
>>> {
>>> "batchcomplete" : true,
>>> "query" : {
>>> "userinfo" : {
>>> "anon" : true,
>>> "id" : 0,
>>> etc
>>> }
>>>
>>>
>>>
>>> But when I do the same fetch from inside a VUE app:
>>>
>>> <template>
>>> <main>
>>> <button @click="getUserInfo">UserInfo!</button>
>>> </main>
>>> </template>
>>>
>>> <script>
>>> export default {
>>> methods: {
>>> getUserInfo() {
>>> fetch('https://en.wikipedia.org/w/api.php?action=query&format=json&meta=userinfo&formatversion=2&uiprop=rights',
>>> { mode: 'no-cors'})
>>> .then(function (response) {
>>> response
>>> })
>>> }
>>> }
>>> }
>>> </script>
>>>
>>>
>>>
>>> I get a 200 response
>>>
>>>
>>>
>>> Request URL: https://en.wikipedia.org/w/api.php?action=query&format=json&meta=userinfo&formatversion=2&uiprop=rights
>>> Request Method: GET
>>> Status Code: 200 OK
>>> Remote Address: (elided)
>>> Referrer Policy: strict-origin-when-cross-origin
>>>
>>>
>>>
>>> Request headers:
>>>
>>>
>>>
>>> :authority: en.wikipedia.org
>>> :method: GET
>>> :path: /w/api.php?action=query&format=json&meta=userinfo&formatversion=2&uiprop=rights
>>> :scheme: https
>>> accept: */*
>>> accept-encoding: gzip, deflate, br, zstd
>>> accept-language: en-US,en;q=0.9
>>> dnt: 1
>>> priority: u=1, i
>>> referer: http://localhost:5173/
>>> sec-ch-ua: "Not)A;Brand";v="99", "Google Chrome";v="127",
>>> "Chromium";v="127"
>>> sec-ch-ua-mobile: ?0
>>> sec-ch-ua-platform: "macOS"
>>> sec-fetch-dest: empty
>>> sec-fetch-mode: no-cors
>>> sec-fetch-site: cross-site
>>> user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)
>>> AppleWebKit/537.36 (KHTML, like Gecko) Chrome/127.0.0.0
>>> Safari/537.36
>>>
>>>
>>>
>>> Response headers:
>>>
>>>
>>>
>>> accept-ranges: bytes
>>> age: 2
>>> cache-control: private, must-revalidate, max-age=0
>>> content-disposition: inline; filename=api-result.json
>>> content-encoding: gzip
>>> content-length: 207
>>> content-type: application/json; charset=utf-8
>>> date: Sun, 18 Aug 2024 13:45:15 GMT
>>> nel: { "report_to": "wm_nel", "max_age": 604800, "failure_fraction":
>>> 0.05, "success_fraction": 0.0}
>>> report-to: { "group": "wm_nel", "max_age": 604800, "endpoints": [{
>>> "url": "
>>> https://intake-logging.wikimedia.org/v1/events?stream=w3c.reportingapi.netw…"
>>> }] }
>>> server: mw-api-ext.eqiad.main-5744d5b77-fzvxz
>>> server-timing: cache;desc="pass", host;desc="cp1102"
>>> set-cookie:
>>> WMF-Last-Access=18-Aug-2024;Path=/;HttpOnly;secure;Expires=Thu, 19
>>> Sep 2024
>>> 12:00:00 GMT
>>> set-cookie: WMF-Last-Access-Global=18-Aug-2024;Path=/;Domain=.
>>> wikipedia.org;HttpOnly;secure;Expires=Thu, 19 Sep 2024 12:00:00 GMT
>>> set-cookie: GeoIP=(elided)
>>> set-cookie:
>>> NetworkProbeLimit=0.001;Path=/;Secure;SameSite=Lax;Max-Age=3600
>>> strict-transport-security: max-age=106384710; includeSubDomains;
>>> preload
>>> vary: Accept-Encoding
>>> x-cache: cp1102 miss, cp1102 pass
>>> x-cache-status: pass
>>> x-client-ip: (elided)
>>> x-content-type-options: nosniff
>>> x-frame-options: DENY
>>>
>>>
>>>
>>> but the response body is empty. Chrome just says "Failed to load
>>> response data: No data found for resource with given identifier". I
>>> assume this is just Chrome's way of saying "Your brain is not big
>>> enough to understand how CORS works". Can anybody help me increase my brain capacity?
>>>
>>>
>>>
>>> On Aug 15, 2024, at 2:36 PM, Roy Smith <roy(a)panix.com> wrote:
>>>
>>> Thank you to everybody who offered advice. I've set up a Vue/Vite
>>> environment on my laptop and started working my way through the tutorials.
>>> Some stuff makes a lot of sense; other stuff not so much yet, but
>>> I'm working on cleaning out some old neural storage areas to make
>>> room for new knowledge.
>>>
>>> I think what would help me at this point was being able to look at
>>> the source for some well-written Vue apps written to work in the
>>> wikipedia environment. If folks could point me to some examples, I
>>> would appreciate it. Are there higher-level wrappers around the
>>> Action API, like pywikibot for Python, or do you just do raw fetch calls on the API endpoints?
>>>
>>>
>>> On Aug 8, 2024, at 1:14 PM, Travis Briggs <audiodude(a)gmail.com> wrote:
>>>
>>> Vue.js is definitely a good option. I already had a lot of
>>> JavaScript experience, but I learned Vue at someone's recommendation
>>> for a wikimedia project and it was a great experience.
>>>
>>> One quick tip that might help you: in the "old world" you might use
>>> jQuery or something to do AJAX requests (XHR). However, in modern
>>> browsers, the built-in `fetch` function is more than adequate for almost everything.
>>>
>>> Also, I would highly recommend using create-vue to bootstrap your
>>> project, because it sets up all the complicated JavaScript "compilation"
>>> steps for you, and gives you commands so that you can just do "npm
>>> run build" and get a static site in a single directory.
>>>
>>> Good luck!
>>> -Travis
>>>
>>> On Thu, Aug 8, 2024 at 8:36 AM Kimmo Virtanen
>>> <kimmo.virtanen(a)gmail.com>
>>> wrote:
>>> Hi,
>>>
>>> Vue.js is afaik current choice.
>>> - https://www.mediawiki.org/wiki/Vue.js
>>>
>>> -- Kimmo
>>>
>>> On Thu, Aug 8, 2024 at 6:34 PM Roy Smith <roy(a)panix.com> wrote:
>>> I'm about to embark on building a client-side javascript tool
>>> intended to help with enwiki's [[WP:DYK]] process. JS is not my
>>> strength (and what I do know about tooling is quite outdated) so I'm
>>> looking for advice on what's in common use in the WMF environment
>>> these days. If I'm going to learn some new tools, I figure I might
>>> as well learn what folks here are using. If only because it'll make
>>> it easier for me to mooch on other people for help :-)
>>>
>>> As far as testing goes, I used to use JUnit. I gather that's pretty
>>> old-hat by now. What are you-all using?
>>>
>>> And for app frameworks: Angular? React? I hear Vue might be the
>>> new hotness? I'm leaning more towards "easy to learn" vs "most powerful".
>>> _______________________________________________
>>> Cloud mailing list -- cloud(a)lists.wikimedia.org
>>> List information:
>>> https://lists.wikimedia.org/postorius/lists/cloud.lists.wikimedia.org/
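Pulling together the advice from this thread, a minimal sketch of the
working request (assuming Node 18+ or a modern browser, where `fetch` is
global): pass `origin=*` in the query string and keep the default CORS
mode, rather than `mode: 'no-cors'`, which returns an opaque response
with an unreadable body.

```javascript
// Build the Action API URL with origin=*, which asks MediaWiki to send
// Access-Control-Allow-Origin: * so the anonymous response is readable
// from any origin.
const params = new URLSearchParams({
  action: 'query',
  format: 'json',
  meta: 'userinfo',
  formatversion: '2',
  uiprop: 'rights',
  origin: '*', // anonymous CORS: no cookies are sent with the request
});
const url = `https://en.wikipedia.org/w/api.php?${params}`;

async function getUserInfo() {
  // Default fetch mode is 'cors'; do NOT pass mode: 'no-cors' here.
  const response = await fetch(url);
  return response.json();
}
```

Because `origin=*` forces anonymous mode, the result should report
`"anon": true`; authenticated cross-site requests need the allow-listed
`origin=` form discussed above, or OAuth.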
I will be upgrading the cloud-vps openstack install on Thursday,
beginning around 16:00 UTC. Here's what to expect:
- Intermittent Horizon and API downtime (maybe an hour or two total)
- Inability to schedule new VMs (also for an hour or two)
Toolforge users will be unaffected by this outage. Existing, running
services and VMs on cloud-vps should also be unaffected.
This is the upgrade to version 2024.1 'caracal', tracked at
https://phabricator.wikimedia.org/T369044
-Andrew + the WMCS team