I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.
tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)
(Is there a place on mw.org to put this tidbit?)
------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200
Hi Bartosz
In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.
The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.
Cheers
Brian
--
Matma Rex
We just released a new version of Research:FAQ on Meta [1], significantly
expanded and updated, to make our processes at WMF more transparent and to
meet an explicit FDC request to clarify the role and responsibilities of
individual teams involved in research across the organization.
The previous version – written from the perspective of the (now inactive)
Research:Committee, and mostly obsolete since the release of WMF's open
access policy [2] – can still be found here [3].
Comments and bold edits to the new version of the document are welcome. For
any question or concern, you can drop me a line or ping my username on-wiki.
Thanks,
Dario
[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953
*Dario Taraborelli *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
Hello everyone,
Wikimedia is gearing up to apply as a mentoring organization for Google
Summer of Code 2024 <
https://www.mediawiki.org/wiki/Google_Summer_of_Code/2024>[1] and Outreachy
Round 28 <https://www.mediawiki.org/wiki/Outreachy/Round_28> [2].
Currently, we're crafting a list of exciting project ideas for the
application. If you have any suggestions for projects, whether coding or
non-coding (design, documentation, translation, outreach, research), please
share them by February 5th via this Phabricator task: <
https://phabricator.wikimedia.org/T354734> [3]. Note that for non-coding
projects eligible for Outreachy, slots are limited and will be allocated to
mentors on a first-come, first-served basis.
Timeline
In your role as a mentor, your involvement spans the application period for
both programs, taking place from March to April. During this time, you'll
guide candidates in making small contributions to your project and address
any project-related queries they may have. As the application period
concludes, you'll further intensify your collaboration with accepted
candidates throughout the coding period, which extends from May to August.
Your support and guidance are crucial to their success in the program.
Guidelines for Crafting Project Proposals:
- Follow this task description template when you propose a project in
Phabricator: <
https://phabricator.wikimedia.org/tag/outreach-programs-projects> [4].
You can also use this workboard to pick an idea if you don't have one
already. Add the #Google-Summer-of-Code (2024) or #Outreachy (Round 28) tag.
- A project should require an experienced developer ~15 days and a newcomer
~3 months to complete.
- Each project should have at least two mentors, including one with a
technical background.
- Ideally, the project has no tight deadlines, a moderate learning curve,
and fewer dependencies on Wikimedia's core infrastructure. Projects
addressing the needs of a language community are most welcome.
*Learn more about the roles and responsibilities of mentors for both
programs:*
- Outreachy: <https://www.mediawiki.org/wiki/Outreachy/Mentors> [5]
- Google Summer of Code: <
https://www.mediawiki.org/wiki/Google_Summer_of_Code/Mentors> [6]
Thank you,
Links:
[1] https://www.mediawiki.org/wiki/Google_Summer_of_Code/2024
[2] https://www.mediawiki.org/wiki/Outreachy/Round_28
[3] https://phabricator.wikimedia.org/T354734
[4] https://phabricator.wikimedia.org/tag/outreach-programs-projects
[5] https://www.mediawiki.org/wiki/Outreachy/Mentors
[6] https://www.mediawiki.org/wiki/Google_Summer_of_Code/Mentors
--
*Onyinyechi Onifade *
Technical Community Program Manager
Wikimedia Foundation <https://wikimediafoundation.org/>
Hello all,
The Technical Decision-Making Forum Retrospective team
<https://www.mediawiki.org/wiki/Technical_decision_making> invites you to
complete a survey about Wikimedia's technical decision-making processes.
While there will be more ways to participate, this is the first and most
important step in our data collection. It aims to gather information about
your experience, thoughts, and needs regarding the process of making
technical decisions across the Wikimedia technical spaces.
This survey will be used for gathering information about the process and
the needs around technical decision-making that touches our production
systems.
You can find the survey link here:
https://wikimediafoundation.limesurvey.net/885471?lang=en
Who should take this survey?
People who do technical work that relies on software maintained by the
Wikimedia Foundation (WMF) or affiliates. If you contribute code to
MediaWiki or extensions used by Wikimedia, or you maintain gadgets or tools
that rely on WMF infrastructure, this survey is for you.
What is the deadline?
*August 7th, 2023*
What will the Retrospective team do with the information?
The retrospective team will synthesize the collected data and publish an
anonymized analysis that will help leadership make decisions about the
future of the process.
We will collect anonymized information that we will analyze in two main
ways:
- Sentiments based on demographic information: these will tell us whether
there are different needs and desires from different groups of people.
- General needs and perceptions about decision-making in our technical
spaces: this will help us understand what kind of decisions happen in
the spaces, who is involved, and how to adjust our processes accordingly.
Is the survey the only way to participate?
The survey is the most important way for us to gather information because
it helps us gather input in a structured manner. But it will not be the
only way you can share your thoughts with us - we will have more
information soon about upcoming listening sessions where you can talk with
us live. In the meantime, you are always welcome to leave feedback on the
talk page:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Where can I see more information?
There are several places where you can find more information about the
Technical Decision-Making Process Retrospective:
- The original announcement about the retrospective from Tajh Taylor:
https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…
- The Technical Decision-Making Process general information page:
https://www.mediawiki.org/wiki/Technical_decision_making
- The Technical Decision-Making Process Retrospective on MediaWiki:
https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision…
- Phabricator ticket: https://phabricator.wikimedia.org/T333235
How to contact the retrospective core team:
- Write to the core team mailing list: tdf-retro-2023(a)lists.wikimedia.org
- The Technical Decision-Making Process Retrospective on MediaWiki talk
page:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Thank you,
Moriel, on behalf of the TDMP Retro Core Group
Core group:
- Moriel Schottlender (chair)
- Daniel Kinzler
- Chris Danis
- Kosta Harlan
- Temilola Adeleye
--
Moriel Schottlender (she/her <https://pronoun.is/she>)
Principal Software Engineer
Wikimedia Foundation https://wikimediafoundation.org/
(If you don’t work with the pagelinks table, feel free to ignore this message.)
Hello,
Here is an update and reminder on the previous announcement
<https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>
regarding normalization of links tables that was sent around a year ago.
As part of that work, the pl_namespace and pl_title columns of the
pagelinks table will soon be dropped, and you will need to use pl_target_id
to join with the linktarget table instead. This is essentially identical to
the templatelinks normalization that happened a year ago.
Currently, MediaWiki writes new rows to both schemas of pagelinks on all
wikis except English Wikipedia and Wikimedia Commons (we will start
writing on these two wikis next week). We have started to backfill the data
in the new schema, but it will take weeks to finish on the largest wikis.
If you query this table directly, or your tools do, you will need to
update them accordingly. I will send a reminder before dropping the old
columns once the data has been fully backfilled.
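To illustrate the change, here is a minimal sketch using SQLite and toy data. The column names (pl_from, pl_target_id, lt_id, lt_namespace, lt_title) follow the MediaWiki schema, but the tables are heavily simplified and the sample values are invented:

```python
import sqlite3

# Toy tables mirroring only the relevant MediaWiki columns.
# The production schema has more columns, indexes, and binary titles.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE linktarget (lt_id INTEGER PRIMARY KEY,
                         lt_namespace INTEGER, lt_title TEXT);
CREATE TABLE pagelinks (pl_from INTEGER, pl_target_id INTEGER);
INSERT INTO linktarget VALUES (1, 0, 'Foo'), (2, 10, 'Bar');
INSERT INTO pagelinks VALUES (100, 1), (101, 1), (102, 2);
""")

# Old-style query (breaks once pl_namespace/pl_title are dropped):
#   SELECT pl_from FROM pagelinks
#   WHERE pl_namespace = 0 AND pl_title = 'Foo';
#
# New-style query: join pagelinks to linktarget via pl_target_id.
rows = conn.execute("""
    SELECT pl_from
    FROM pagelinks
    JOIN linktarget ON pl_target_id = lt_id
    WHERE lt_namespace = 0 AND lt_title = 'Foo'
    ORDER BY pl_from
""").fetchall()
print([r[0] for r in rows])  # -> [100, 101]
</imports>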
You can keep track of the general long-term work in T300222
<https://phabricator.wikimedia.org/T300222> and the specific work for
pagelinks in T299947 <https://phabricator.wikimedia.org/T299947>. You can
also read more on the reasoning in T222224
<https://phabricator.wikimedia.org/T222224> or the previous announcement
<https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>
.
Thank you,
--
*Amir Sarabadani (he/him)*
Staff Database Architect
Wikimedia Foundation <https://wikimediafoundation.org/>
Hello,
We are now only three weeks away from the Wikimedia Wishathon! Exciting
news - User:Lucas Werkmeister has signed up to host a piano concert during
a social hour 🎉
Join us and contribute to the development of community wishes between March
15th and 17th! Participate in discussion sessions and work on user scripts,
gadgets, extensions, tools and more!
The full event schedule is available here: <
https://meta.wikimedia.org/wiki/Event:WishathonMarch2024>.
Explore the event wiki for project ideas and keep an eye out for
non-technical tasks (documentation and design-related) that will soon be
added to the Wishathon workboard: <
https://phabricator.wikimedia.org/project/view/5906/>. Project breakouts
will also be added to the schedule, where you can participate in wish
development or explore innovative solutions as a user, developer, or
designer.
We are seeking volunteers to assist with a wide range of activities such as
monitoring discussion channels during hacking hours, answering technical
queries, and helping with session note-taking. Check out the Help desk
schedule and add yourself to a slot where you are available and interested
in providing assistance: <
https://meta.wikimedia.org/wiki/Event:WishathonMarch2024/Help_desk>.
If you have any questions about the Wishathon, reach out via Telegram: <
https://t.me/wmhack>.
Cheers,
Srishti
On behalf of the Wishathon organizing committee
*Srishti Sethi*
Senior Developer Advocate
Wikimedia Foundation <https://wikimediafoundation.org/>
Hi everyone,
tl;dr The tools we use to document Wikimedia JavaScript code are changing.
In the short term, you can read the complete MediaWiki core JavaScript docs
using the 1.41 version[0] while we migrate to the new system[1]. If you use
JavaScript documentation on doc.wikimedia.org, please share your feedback
on wiki[2].
Wikimedia JavaScript codebases are switching from using JSDuck[3] to
JSDoc[4] for documentation. Started in 2016, this migration is necessary
because JSDuck is currently unmaintained and does not support the ES6
standard[5]. Several Wikimedia JavaScript codebases, including Vector and
GlobalWatchlist, already use JSDoc, while several others, such as
VisualEditor and MediaWiki core, still use JSDuck.
The migration project consists of two parts: changing the codebases to
support JSDoc and improving the usability of the JSDoc WMF theme. For more
information, see phab:T138401[6].
== Migrating MediaWiki core to JSDoc ==
We are migrating MediaWiki core to JSDoc incrementally. While the migration
is in progress, the master branch docs will be incomplete, containing only
those modules that have been migrated. To read the old JSDuck docs, see the
MediaWiki 1.41 docs[0].
To help with migration, choose a module from the list in phab:T352308[7],
and follow the guide on phab:T138401[6] to translate the tags from JSDuck
to JSDoc.
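As a rough illustration of what such a translation looks like (using a hypothetical function; the authoritative tag mapping is in phab:T138401[6]), one common change is that JSDuck's @return tag becomes JSDoc's @returns:

```javascript
// Before (JSDuck):
// /**
//  * Capitalize the first letter of a string.
//  * @param {string} str Input string
//  * @return {string} Capitalized string
//  */

// After (JSDoc) - note @return has become @returns:
/**
 * Capitalize the first letter of a string.
 * @param {string} str Input string
 * @returns {string} Capitalized string
 */
function ucFirst( str ) {
	return str.charAt( 0 ).toUpperCase() + str.slice( 1 );
}
```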
== Migrating other codebases ==
You can find a list of codebases that use JSDuck on phab:T138401[6].
(Please add any that are missing.) To help migrate a codebase that uses
JSDuck, follow the instructions to set up JSDoc[8], and use the guide in
phab:T138401[6] to translate the tags from JSDuck to JSDoc.
== Improving the JSDoc WMF theme ==
One of the biggest differences between JSDuck and JSDoc is the HTML
interface for reading the docs. The WMF theme for JSDoc is not as
full-featured as the JSDuck theme, but to support this migration, the
Wikimedia Foundation Web, Design Systems, and Technical Documentation teams
are working to prioritize and complete a set of improvements to the JSDoc
theme, with the goal of releasing version 1 of jsdoc-wmf-theme in 2024.
If you use JavaScript documentation on doc.wikimedia.org, please leave a
comment on the JSDoc WMF theme talk page[2] and let us know how you use the
docs and which features of the theme are the most important to you.
Thank you for reading!
Alex, Kamil, Jon, Roan, and Anne
[0]: https://doc.wikimedia.org/mediawiki-core/REL1_41/js/
[1]: https://doc.wikimedia.org/mediawiki-core/master/js/
[2]: https://www.mediawiki.org/wiki/Talk:JSDoc_WMF_theme
[3]: https://github.com/senchalabs/jsduck
[4]: https://en.wikipedia.org/wiki/JSDoc
[5] https://en.wikipedia.org/wiki/ECMAScript
[6]: https://phabricator.wikimedia.org/T138401
[7]: https://phabricator.wikimedia.org/T352308
[8]: https://www.mediawiki.org/wiki/JSDoc
--
Alex Paskulin
Technical Writer
Wikimedia Foundation
Hi All, welcome to the monthly MediaWiki Insights email!
Like last time
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/January_2…>,
we’re starting this email by celebrating volunteer contributors who got
their first patch merged in MW core, WMF deployed extensions or services
over the past month:
A big thanks to GergesShamon and Agent Isai for their contributions!
Welcome :-)
Many thanks also to the reviewers - volunteers and staff - of patches by
new contributors. Your support is invaluable for helping people contribute
to MediaWiki effectively and make it a fun first experience!
This edition of the monthly MW Insights email focuses on our
multi-year initiatives: Migration of MediaWiki from bare metal to Kubernetes
<https://wikitech.wikimedia.org/wiki/MediaWiki_On_Kubernetes> in Wikimedia
Production, Parser Unification
<https://www.mediawiki.org/wiki/Parsoid/Parser_Unification> and RESTBase
deprecation <https://phabricator.wikimedia.org/project/profile/6289/>.
These initiatives involve many people and projects, require orchestrated
efforts and coordination, and will help us in many different ways. Another
thing that these initiatives have in common is that we’re getting closer to
being able to benefit from these efforts :-).
Project Snapshots: Milestones reached on 3 multi-year projects
The MediaWiki on Kubernetes migration is 75% complete: all internal
traffic and all scheduled jobs have been migrated, and the platform has
reached the 50% global traffic milestone!
Often shortened to mw-on-k8s, MediaWiki on Kubernetes is a multi-year
effort to move all of the MediaWiki deployments running on WMF production
infrastructure to the new WikiKube platform.
The migration is well underway, and a new milestone was recently reached:
the WikiKube platform now serves 50% of end user requests
<https://phabricator.wikimedia.org/T290536>. At this halfway mark, Service
Operations is working with Release Engineering on changes to monitoring
tools to surface any issues during deployments before ramping up further.
Additionally, almost the entirety of what is called internal traffic
<https://phabricator.wikimedia.org/T333120>, that is, traffic generated by
applications running in the infrastructure and reaching out to MediaWiki
for various purposes, has been migrated to WikiKube. A special mention
goes to MediaWiki jobs, fully migrated just a week ago
<https://phabricator.wikimedia.org/T349796>. Feel free to follow the
umbrella task <https://phabricator.wikimedia.org/T290536> and/or
MediaWiki_On_Kubernetes
on Wikitech <https://wikitech.wikimedia.org/wiki/MediaWiki_On_Kubernetes>.
This migration will unlock the ability to deploy multiple versions of the
code simultaneously. It will also enhance the platform's capabilities for
building isolated, dockerized environments for coding, testing and even
production debugging.
MediaWiki on Kubernetes will allow us to deprecate and eventually remove a
lot of our in-house developed code. Another benefit is that we will be able
to react better to sudden traffic spikes, such as newsworthy events, as
scaling up and down becomes a matter of a configuration change. This also
enables efficient placement of workloads, packing them in a more
environmentally friendly way and increasing hardware utilization.
It takes a village to get this far: A big thanks to the Service Operations
team (Clément Goubert, Giuseppe Lavagetto, Alexandros Kosiaris, Kamila
Součková, Hugh Nowlan, Effie Mouzeli, Reuven Lazarus, Jannis Mayboom and
Kavitha Appakayala) for their leadership on this project, the Release
Engineering team (specifically: Dan Duvall, Jeena Huneidi, Tyler Cipriani
and Ahmon Dancy) for their work on Blubber
<https://wikitech.wikimedia.org/wiki/Blubber>, the Deployment Pipeline
<https://wikitech.wikimedia.org/wiki/Deployment_pipeline> and Scap
<https://wikitech.wikimedia.org/wiki/Scap>, Dom Walden from Quality and
Test Engineering for crafting and executing a plan to test the first
deployment of MediaWiki on Kubernetes, and everyone else who has
contributed in one way or another! <3
More milestones:
We’re seeing the first light at the end of the tunnel of another
multi-year initiative: *parser unification*
<https://www.mediawiki.org/wiki/Parsoid/Parser_Unification>. One week ago,
Parsoid Read Views was rolled out to the first wikis: Parsoid is now the
default read views renderer on the Foundation’s Office Wiki and Wikitech
DiscussionTools. This early experimentation allows us to find issues in a
limited space, which will help us evaluate the readiness of the feature
and increase our confidence for future rollouts
<http://mediawiki.org/wiki/Parsoid/Parser_Unification/Confidence_Framework>.
A huge thanks to Subbu Sastry, Mateus Santos, CScott, Isabelle
Hubert-Pallatin, Arlo Breault, Shannon Bailey, Yiannis Giannelos and Sérgio
Lopes for making this work! Many thanks also to Daniel Kinzler: His work on
the RESTBase deprecation directly helped us get to this milestone :-)
*RESTBase deprecation*: We have been continuously working on decoupling
services from RESTBase, aiming for the modernisation and sustainability of
Wikimedia products in our services platform. The MediaWiki Interfaces team
finished reimplementing the Reading Lists endpoints in the MW REST API
<https://phabricator.wikimedia.org/T348491> and is now confirming with
affected callers <https://phabricator.wikimedia.org/T357478> that the new
endpoints meet their needs before rerouting calls
<https://phabricator.wikimedia.org/T348493> and retiring old code
<https://phabricator.wikimedia.org/T348494>. The overall effort
<https://phabricator.wikimedia.org/T336693> will not only move us forward
on RESTBase retirement, but also reduce the total amount of code we have to
maintain. Many thanks to Bill Pirkle, Atieno Njira, Wendy Quarshie, and
Daniel Kinzler for making this work! We also fully turned off Parsoid cache
storage in RESTBase - clients will get output directly from MediaWiki and
the cache will be handled by ParserCache. Next, we will re-route clients
directly to MediaWiki and fully remove Parsoid from RESTBase (T344944
<https://phabricator.wikimedia.org/T344944>). The Page Content Service
(PCS) will also handle its own cache
<https://phabricator.wikimedia.org/T348995> and we are ready to test the
new capabilities in staging soon. Many thanks to Yiannis Giannelos and the
Content Transform team and to Daniel Kinzler for his efforts and support on
decoupling Parsoid from RESTBase!
All of these multi-year initiatives help us increase sustainability and
maintainability of the platform, streamline engineering and developer
workflows, and unlock the path for new and improved platform capabilities
and product opportunities.
Outlook: Knowledge Platform in the annual plan 2024/2025
The Wikimedia Foundation has recently published the draft objectives by the
Product & Technology department
<https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2024-2025/…>
for the next annual plan on Meta, alongside an introduction by Selena
Deckelmann and a few questions that we’re exploring
<https://meta.wikimedia.org/wiki/Talk:Wikimedia_Foundation_Annual_Plan/2024-…>.
Your input on these questions is very welcome!
The draft objectives include “Knowledge Platform I” - centered around
MediaWiki platform evolution and “Knowledge Platform II” - centered around
developer/engineering services and workflows. The objectives show only the
high-level direction for next year. The draft “key results” (currently work
in progress) will give a better idea of what areas of work we’re thinking
about. We’ll publish these in March and share the link and an invitation
for feedback with this list again.
Thanks all for reading,
Birgit
--
Birgit Müller (she/her)
Director of Product, MediaWiki and Developer Experiences
Wikimedia Foundation <https://wikimediafoundation.org/>
Hi all,
We’re excited to announce Wikimedia’s participation in Google Season of
Docs! 🎉 For the first time since 2020, Wikimedia is applying for Google
Season of Docs[1]: a program that provides grants to technical writers to
work on documentation for open-source projects.
This year, we’ll be selecting one Wikimedia documentation project and, if
accepted by Google, hiring one technical writer to complete the project.
For details about the program and how to apply, see the wiki page[2].
Here's how you can get involved:
- [Until March 22] Propose a project, share feedback on a project, or
volunteer to help support a project[3].
- [Until April 28] Apply as a technical writer. We highly encourage
Wikimedians to apply! Even if you’re not a professional technical writer,
we value experience working on wikis.
Our goal is to use this program to improve MediaWiki.org and other
Wikimedia technical documentation, supplementing the work done by
volunteers and staff. We’re hoping to use this year’s edition as a model to
expand the program to multiple projects in future years.
Let's make this a successful season of documentation together! 📚
Best,
- Alex and Onyinyechi
[1] - https://developers.google.com/season-of-docs
[2] - https://www.mediawiki.org/wiki/Season_of_Docs/2024
[3] -
https://www.mediawiki.org/wiki/Season_of_Docs/2024#Volunteering_to_support_…
--
Alex Paskulin
Technical Writer
Wikimedia Foundation
This week's 1.42.0-wmf.20 version of MediaWiki is blocked[0], and
1.42.0-wmf.19 may require a Monday rollback, which is a situation we
hope to avoid.
We can't proceed until the following issue is resolved:
* T336504 - Transcluding Special:Prefixindex can force the default skin
- https://phabricator.wikimedia.org/T336504
For context, see the Village pump discussion linked here:
* https://phabricator.wikimedia.org/T336504#9572477
Once this issue is resolved, the train can proceed normally for the week.
Thank you for any help!
-- Your designated sacrificial train victims
[0]. https://phabricator.wikimedia.org/T354438