I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.
tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)
(Is there a place on mw.org where this tidbit could be documented?)
------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200
Hi Bartosz
In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.
The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.
Cheers
Brian
--
Matma Rex
We just released a new version of Research:FAQ on Meta [1], significantly
expanded and updated, to make our processes at WMF more transparent and to
meet an explicit FDC request to clarify the role and responsibilities of
individual teams involved in research across the organization.
The previous version – written from the perspective of the (now inactive)
Research:Committee, and mostly obsolete since the release of WMF's open
access policy [2] – can still be found here [3].
Comments and bold edits to the new version of the document are welcome. For
any questions or concerns, you can drop me a line or ping my username on-wiki.
Thanks,
Dario
[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953
*Dario Taraborelli *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
Hello all,
The Technical Decision-Making Forum Retrospective team
<https://www.mediawiki.org/wiki/Technical_decision_making> invites you to
complete a survey about Wikimedia's technical decision-making processes.
While there will be more ways to participate, this is the first and most
important step in our data collection. It aims to gather information about
your experience, thoughts, and needs regarding the process of making
technical decisions across the Wikimedia technical spaces.
This survey will be used for gathering information about the process and
the needs around technical decision-making that touches our production
systems.
You can find the survey link here:
https://wikimediafoundation.limesurvey.net/885471?lang=en
Who should take this survey?
People who do technical work that relies on software maintained by the
Wikimedia Foundation (WMF) or affiliates. If you contribute code to
MediaWiki or extensions used by Wikimedia, or you maintain gadgets or tools
that rely on WMF infrastructure, this survey is for you.
What is the deadline?
*August 7th, 2023*
What will the Retrospective team do with the information?
The retrospective team will synthesize the collected data and publish an
anonymized analysis that will help leadership make decisions about the
future of the process.
We will collect anonymized information that we will analyze in two main
ways:
- Sentiments based on demographic information: these will tell us whether
there are different needs and desires from different groups of people.
- General needs and perceptions about decision-making in our technical
spaces: this will help us understand what kind of decisions happen in
the spaces, who is involved, and how to adjust our processes accordingly.
Is the survey the only way to participate?
The survey is the most important way for us to gather information because
it helps us gather input in a structured manner. But it will not be the
only way you can share your thoughts with us - we will have more
information soon about upcoming listening sessions where you can talk with
us live. In the meantime, you are always welcome to leave feedback on the
talk page:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Where can I see more information?
There are several places where you can find more information about the
Technical Decision-Making Process Retrospective:
- The original announcement about the retrospective from Tajh Taylor:
https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…
- The Technical Decision-Making Process general information page:
https://www.mediawiki.org/wiki/Technical_decision_making
- The Technical Decision-Making Process Retrospective on MediaWiki:
https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision…
- Phabricator ticket: https://phabricator.wikimedia.org/T333235
How to contact the retrospective core team:
- Write to the core team mailing list: tdf-retro-2023(a)lists.wikimedia.org
- The Technical Decision-Making Process Retrospective talk page on
MediaWiki:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Thank you,
Moriel, on behalf of the TDMP Retro Core Group
Core group:
- Moriel Schottlender (chair)
- Daniel Kinzler
- Chris Danis
- Kosta Harlan
- Temilola Adeleye
--
Moriel Schottlender (she/her <https://pronoun.is/she>)
Principal Software Engineer
Wikimedia Foundation https://wikimediafoundation.org/
Yesterday there was a conversation about code review on irc and among other
things, how sometimes patches can get "stuck".
I had an idea for a way to improve things. I'm not sure if it is a good
idea, but there's only one way to find out.
So without further ado, announcing the Code Review Patch Board:
https://www.mediawiki.org/wiki/Code_review/patch_board
In short - each person is allowed to list one of their patches on the board
that they would really like to see reviewed. You can only list one patch at
a time, and it should be a patch that you have been unable to get reviewed
for at least a week through normal means. See the page for the full
list of guidelines.
I encourage people to give it a try. Add a patch you wrote that you cannot
get a review for. Or if you have +2 rights, try giving some love to these
underloved patches.
I would also love to hear feedback on the general idea as well as the
current guidelines.
To repeat, the url is:
https://www.mediawiki.org/wiki/Code_review/patch_board
Thanks,
bawolff
Hi everybody,
TL;DR We would like users of ORES models to migrate to our new open source
ML infrastructure, Lift Wing, within the next five months. We are available
to help you do that, from advice to making code commits. It is important to
note: All ML models currently accessible on ORES are also currently
accessible on Lift Wing.
As part of the Machine Learning Modernization Project (
https://www.mediawiki.org/wiki/Machine_Learning/Modernization), the Machine
Learning team has deployed Wikimedia's new machine learning inference
infrastructure, called Lift Wing (
https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing). Lift Wing
brings many new features, such as support for GPU-based models, open
source LLM hosting, auto-scaling, stability, and the ability to host a
larger number of models.
With the creation of Lift Wing, the team is turning its attention to
deprecating the current machine learning infrastructure, ORES. ORES served
us well over the years and was a successful project, but it predates
radical changes in technology like Docker, Kubernetes and, more recently,
MLOps. The servers that run ORES are at the end of their planned lifespan,
so to save costs we are going to shut them down in early 2024.
We have outlined a deprecation path on Wikitech (
https://wikitech.wikimedia.org/wiki/ORES); please read the page if you are
a maintainer of a tool or code that uses the ORES endpoint (
https://ores.wikimedia.org/). If you have any doubts or if you need
assistance in migrating to Lift Wing, feel free to contact the ML team via:
- Email: ml(a)wikimedia.org
- Phabricator: #Machine-Learning-Team tag
- IRC (Libera): #wikimedia-ml
The Machine Learning team is available to help projects migrate, from
offering advice to making code commits. We want to make this as easy as
possible for folks.
High Level timeline:
*By September 30th, 2023:* Infrastructure powering the ORES API endpoint
will be migrated from ORES to Lift Wing. For users, the API endpoint will
remain the same, and most users won't notice any change; only the backend
services powering the endpoint will change.
Details: We'd like to add a DNS CNAME that points ores.wikimedia.org to
ores-legacy.wikimedia.org, a new endpoint that offers an almost complete
replacement of the ORES API, calling Lift Wing behind the scenes. In an
ideal world we'd migrate all tools to Lift Wing before decommissioning the
infrastructure behind ores.wikimedia.org, but that turned out to be really
challenging, so to avoid disrupting users we chose to implement a
transition layer/API.
To summarize: if you don't have time to migrate to Lift Wing before
September, your code/tool should keep working just fine on
ores-legacy.wikimedia.org, and you won't have to change a line of your code
thanks to the DNS CNAME. The ores-legacy endpoint is not a 100% replacement
for ORES (we removed some very old and unused features), so we highly
recommend at least testing the new endpoint with your use case to avoid
surprises when we make the switch. If you find anything weird, please
report it to us using the aforementioned channels.
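For tool maintainers, the shape of the change can be sketched in a few lines. This is an illustrative sketch only: the ORES v3 URL shape and the Lift Wing `:predict` endpoint follow the Wikitech documentation linked above, but please verify the exact hosts and model names there before relying on them.

```python
# Sketch: the same score lookup against the legacy ORES API shape
# versus Lift Wing's per-model endpoints. URL shapes are taken from
# the Wikitech docs linked above; treat them as illustrative.

def ores_url(wiki: str, model: str, rev_id: int) -> str:
    # Legacy ORES-style call; after the CNAME switch, requests to
    # this host are served by ores-legacy.wikimedia.org.
    return f"https://ores.wikimedia.org/v3/scores/{wiki}/{rev_id}/{model}"

def lift_wing_request(wiki: str, model: str, rev_id: int):
    # Lift Wing exposes one endpoint per model server and takes the
    # revision ID in a JSON POST body instead of the URL path.
    url = (
        "https://api.wikimedia.org/service/lw/inference/v1/models/"
        f"{wiki}-{model}:predict"
    )
    return url, {"rev_id": rev_id}

if __name__ == "__main__":
    print(ores_url("enwiki", "damaging", 123456))
    print(lift_wing_request("enwiki", "damaging", 123456))
```

The point of the sketch: migrating is mostly a matter of changing where the request goes and how the revision ID is passed, not of changing what comes back.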
*September to January:* We will be reaching out to every user of ORES we
can identify and working with them to make the migration process as easy as
possible.
*By January 2024:* If all goes well, we would like zero traffic on the
ORES API endpoint so that we can turn off the ores-legacy API.
If you want more information about Lift Wing, please check
https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing
Thanks in advance for the patience and the help!
Regards,
The Machine Learning Team
Hello everyone,
TL;DR: Wikimedia is participating in the Outreachy Round 27 internship
program <https://www.mediawiki.org/wiki/Outreachy/Round_27> [1].
Outreachy's goal is to support people from groups underrepresented in the
technology industry. Interns will work remotely with mentors from our
community. We are seeking mentors to propose projects that Outreachy
interns can work on during their internship. If you have some ideas for coding
or non-coding (design, documentation, translation, outreach, research)
projects, share them by Sept. 29, 2023 at 4 pm UTC as a subtask of
this parent task: <https://phabricator.wikimedia.org/T343871> [2]
Program Timeline
As a mentor, you engage potential candidates in the application period
between October–November (winter round) and help them make small
contributions to your project. You work more closely with the accepted
candidates during the internship period between December–March (winter
round).
Important dates are:
- Aug. 22, 2023 at 4pm UTC - Live Q&A for Outreachy mentors
<https://www.youtube.com/@outreachyinternships>
- September 29, 2023 at 4pm UTC - Project submission deadline
<https://www.outreachy.org/communities/cfp/wikimedia/>
Guidelines for Crafting Project Proposals
* Follow this task description template when you propose a project in
Phabricator: <https://phabricator.wikimedia.org/tag/outreach-programs-projects>
[3]. You can also use this workboard to pick an idea if you don't have one
already. Add the #Outreachy (Round 27) tag.
* A project should require ~15 days for an experienced developer and ~3
months for a newcomer to complete.
* Each project should have at least two mentors, including one with a
technical background.
* Ideally, the project has no tight deadlines, a moderate learning curve,
and fewer dependencies on Wikimedia's core infrastructure. Projects
addressing the needs of a language community are most welcome.
Learn more about the roles and responsibilities of mentors on
MediaWiki.org: <https://www.mediawiki.org/wiki/Outreachy/Mentors> [4][5]
Cheers,
Onyinye & Sheila (Wikimedia Org Admins for Outreachy Round 27)
[1] https://www.mediawiki.org/wiki/Outreachy/Round_27
[2] https://phabricator.wikimedia.org/T343871
[3] https://phabricator.wikimedia.org/tag/outreach-programs-projects/
[4] https://www.mediawiki.org/wiki/Outreachy/Mentors
[5] https://www.outreachy.org/mentor/mentor-faq
Hi everybody,
In https://phabricator.wikimedia.org/T342116 the Machine Learning team
announces its intention to deprecate the mediawiki.revision-score stream.
For external users, the stream is consumable via the
https://stream.wikimedia.org API and it currently has very few users.
Our idea is to create smaller streams, one for each model type, instead of
having a big aggregator. For example, revision 123456 for enwiki ends up
with several scores from various models in the current revision-score
stream, that is convenient but very hard to manage and maintain for us
(since it is not clear if users are interested in all the data or only a
subset of it). The revision-score stream is also very tightly coupled with
the ORES' architecture, which we are trying to deprecate. In the future we
plan to have smaller streams, in which every revision will get associated
with a single score, from a specific model server:
mediawiki.revision-score-goodfaith
mediawiki.revision-score-damaging
...
...
[ and also new models that will be deployed. ]
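For consumers, the practical difference is the event shape: today one revision-score event bundles scores from several models, while the planned per-model streams would carry a single score each. A minimal sketch of picking one model's score out of a current-style event; the nested field layout (`scores` keyed by model name, with a `probability` map) is my reading of the mediawiki/revision-score schema, so verify it against the stream's actual schema before depending on it.

```python
# Sketch: extract one model's score block from a revision-score
# style event. With the planned per-model streams, each event would
# carry just one such entry instead of several.

def extract_score(event, model):
    # Return the named model's score block, or None if this event
    # carries no score from that model.
    return event.get("scores", {}).get(model)

# Hypothetical event, shaped like the aggregated stream as
# described above (field names are an assumption, not a spec).
sample_event = {
    "database": "enwiki",
    "rev_id": 123456,
    "scores": {
        "goodfaith": {"probability": {"true": 0.97, "false": 0.03}},
        "damaging": {"probability": {"true": 0.04, "false": 0.96}},
    },
}

print(extract_score(sample_event, "goodfaith"))
```

A consumer of the current stream that only cares about one model would do this filtering itself; the per-model streams move that selection to the subscription instead.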
To avoid creating unnecessary streams, we'll create only the ones that WMF
teams and the community need and ask for over the next months. If you have
any requirements, please follow up with us:
- Email: ml(a)wikimedia.org
- Phabricator: #Machine-Learning-Team tag
- IRC (Libera): #wikimedia-ml
If you are a user of the MediaWiki revision-score stream, please follow up
on the task above explaining your use case; we'll do our best to find a
good solution for you!
Thanks in advance,
Regards,
Luca
Dear fellow developers! If you don't work on gadgets or Wikimedia code,
feel free to ignore this email!
For some time we've had the Stable interface policy, which has been super
helpful for backend development. I would love us to have an equivalent for
frontend code.
For the past 3 years we have been building one with feedback and
suggestions from gadget developers, WMF staff and Wikimedia volunteers. The
current draft can be found at:
https://www.mediawiki.org/wiki/User:Jdlrobson/Stable_interface_policy/front…
I would like to make this policy official so that we can get the benefits
of having a document and continue to evolve it in a more official capacity.
If anyone wants to veto this, I'd like to hear from you on the talk page or
by a reply to this email (either privately or publicly). When making a veto
please make that explicit and include the text you find problematic and
details about why.
If there is no active veto after one month, this policy will be made
official and moved to
https://www.mediawiki.org/wiki/Stable_interface_policy/frontend.
Thanks in advance for all your help with this important matter!
Jon Robson
PS. This note has also been sent to tech news.
Hi everybody!
A couple of weeks ago we started switching the backend used by the ORES
extension, which is used by the Recent Changes filters.
After fixing the issue that affected some wikis (
https://phabricator.wikimedia.org/T343308), the Machine Learning team
started rolling out the change again.
We have already deployed the changes to *fiwiki* and *itwiki* and will
continue with the rest of the wikis starting from next week.
This change is using Lift Wing (
https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing) to get
revision scores instead of ORES, and is a necessary step in the process of
deprecating ORES (for more info please refer to the "ORES to Lift Wing
Migration
<https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>"
email).
The purpose of this change is to use *exactly* the same models which are
also deployed on Lift Wing so there shouldn't be anything different for our
users.
If however you see anything out of the ordinary, feel free to contact
the Machine Learning team:
IRC libera: #wikimedia-ml
Phabricator: Machine-Learning-team tag
Thank you!
Ilias (on behalf of the Machine Learning team)
Hello,
I'd like to inform everyone of what we consider a big change (for the
better) in the Datacenter Switchover process. The full rationale, planning
and implementation are documented at
https://wikitech.wikimedia.org/wiki/Switch_Datacenter/Recurring,_Equinox-ba…
and it includes a TL;DR that I am pasting below for everyone's convenience:
Site Reliability Engineering will, starting September 2023, run a data
center Switchover every 6 months, in the week of the solar Equinox
<https://en.wikipedia.org/wiki/Equinox>, namely the *work weeks containing
March 21st and September 21st*. If you are interested in learning more about
Switchovers and why we perform them, or already know what they are and want
to learn more about how this proposal would impact your workflows or the
Wikimedia Movement, please read on.
We hope that making the Switchover dates and duration predictable will
allow the teams involved in and/or utilizing a Switchover, as well as the
entire movement, to reap the benefits we anticipate and document in the
page linked above.
Regards,
--
Alexandros Kosiaris
Principal Site Reliability Engineer
Wikimedia Foundation