I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.
tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)
(Is there a place on mw.org to put this tidbit?)
------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200
Hi Bartosz
In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.
The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.
Cheers
Brian
--
Matma Rex
We just released a new version of Research:FAQ on Meta [1], significantly
expanded and updated, to make our processes at WMF more transparent and to
meet an explicit FDC request to clarify the role and responsibilities of
individual teams involved in research across the organization.
The previous version – written from the perspective of the (now inactive)
Research:Committee, and mostly obsolete since the release of WMF's open
access policy [2] – can still be found here [3].
Comments and bold edits to the new version of the document are welcome. For
any questions or concerns, you can drop me a line or ping my username on-wiki.
Thanks,
Dario
[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953
*Dario Taraborelli *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
Hello all,
The Technical Decision-Making Forum Retrospective team
<https://www.mediawiki.org/wiki/Technical_decision_making> invites you to
complete a survey about Wikimedia's technical decision-making processes.
While there will be more ways to participate, this is the first and most
important step in our data collection. It aims to gather information about
your experience, thoughts, and needs regarding the process of making
technical decisions across the Wikimedia technical spaces.
This survey will be used for gathering information about the process and
the needs around technical decision-making that touches our production
systems.
You can find the survey link here:
https://wikimediafoundation.limesurvey.net/885471?lang=en
Who should take this survey?
People who do technical work that relies on software maintained by the
Wikimedia Foundation (WMF) or affiliates. If you contribute code to
MediaWiki or extensions used by Wikimedia, or you maintain gadgets or tools
that rely on WMF infrastructure, this survey is for you.
What is the deadline?
*August 7th, 2023*
What will the Retrospective team do with the information?
The retrospective team will synthesize the collected data and publish an
anonymized analysis that will help leadership make decisions about the
future of the process.
We will collect anonymized information that we will analyze in two main
ways:
- Sentiments based on demographic information: these will tell us whether
there are different needs and desires from different groups of people.
- General needs and perceptions about decision-making in our technical
spaces: this will help us understand what kind of decisions happen in
the spaces, who is involved, and how to adjust our processes accordingly.
Is the survey the only way to participate?
The survey is the most important way for us to gather information because
it helps us gather input in a structured manner. But it will not be the
only way you can share your thoughts with us - we will have more
information soon about upcoming listening sessions where you can talk with
us live. In the meantime, you are always welcome to leave feedback on the
talk page:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Where can I see more information?
There are several places where you can find more information about the
Technical Decision-Making Process Retrospective:
- The original announcement about the retrospective from Tajh Taylor:
https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…
- The Technical Decision-Making Process general information page:
https://www.mediawiki.org/wiki/Technical_decision_making
- The Technical Decision-Making Process Retrospective on MediaWiki:
https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision…
- Phabricator ticket: https://phabricator.wikimedia.org/T333235
How to contact the retrospective core team:
- Write to the core team mailing list: tdf-retro-2023(a)lists.wikimedia.org
- Leave a message on the Technical Decision-Making Process Retrospective
talk page on MediaWiki:
https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Dec…
Thank you,
Moriel, on behalf of the TDMP Retro Core Group
Core group:
- Moriel Schottlender (chair)
- Daniel Kinzler
- Chris Danis
- Kosta Harlan
- Temilola Adeleye
--
Moriel Schottlender (she/her <https://pronoun.is/she>)
Principal Software Engineer
Wikimedia Foundation https://wikimediafoundation.org/
(If you don’t work with the pagelinks table, feel free to ignore this message.)
Hello,
Here is an update and reminder on the previous announcement
<https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>
regarding normalization of links tables that was sent around a year ago.
As part of that work, the pl_namespace and pl_title columns of the
pagelinks table will soon be dropped, and you will need to use pl_target_id
joined against the linktarget table instead. This is essentially identical
to the templatelinks normalization that happened a year ago.
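To make the change concrete, here is a minimal sketch of the old and new
query shapes, using an in-memory SQLite database with a simplified, made-up
schema and example rows (the real tables have more columns; check the actual
schema before updating your tools):

```python
import sqlite3

# Simplified, made-up schema for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE linktarget (lt_id INTEGER PRIMARY KEY, lt_namespace INTEGER, lt_title TEXT)")
cur.execute("CREATE TABLE pagelinks (pl_from INTEGER, pl_target_id INTEGER)")
cur.execute("INSERT INTO linktarget VALUES (1, 0, 'Main_Page')")
cur.execute("INSERT INTO pagelinks VALUES (42, 1)")

# Old style (stops working once pl_namespace/pl_title are dropped):
#   SELECT pl_from FROM pagelinks
#   WHERE pl_namespace = 0 AND pl_title = 'Main_Page';

# New style: resolve the link target through the linktarget table.
cur.execute("""
    SELECT pl_from
    FROM pagelinks
    JOIN linktarget ON pl_target_id = lt_id
    WHERE lt_namespace = 0 AND lt_title = 'Main_Page'
""")
rows = cur.fetchall()
print(rows)  # [(42,)]
```

For most queries, the only change is swapping the pl_namespace/pl_title
filter for the join on linktarget; the rest of the query stays the same.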
Currently, MediaWiki writes to both schemas of pagelinks for new rows
on all wikis except English Wikipedia and Wikimedia Commons (we will start
writing on these two wikis next week). We have started to backfill the data
in the new schema, but it will take weeks to finish on large wikis.
If you or your tools query this table directly, you will need to update
them accordingly. I will send a reminder before dropping the old columns
once the data has been fully backfilled.
You can keep track of the general long-term work in T300222
<https://phabricator.wikimedia.org/T300222> and the specific work for
pagelinks in T299947 <https://phabricator.wikimedia.org/T299947>. You can
also read more on the reasoning in T222224
<https://phabricator.wikimedia.org/T222224> or the previous announcement
<https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>
.
Thank you,
--
*Amir Sarabadani (he/him)*
Staff Database Architect
Wikimedia Foundation <https://wikimediafoundation.org/>
Hi everyone,
tl;dr The tools we use to document Wikimedia JavaScript code are changing.
In the short term, you can read the complete MediaWiki core JavaScript docs
using the 1.41 version[0] while we migrate to the new system[1]. If you use
JavaScript documentation on doc.wikimedia.org, please share your feedback
on wiki[2].
Wikimedia JavaScript codebases are switching from JSDuck[3] to JSDoc[4]
for documentation. This migration, started in 2016, is necessary because
JSDuck is unmaintained and does not support the ES6 standard[5]. Several
Wikimedia JavaScript codebases, including Vector and GlobalWatchlist,
already use JSDoc, while others, such as VisualEditor and MediaWiki core,
still use JSDuck.
The migration project consists of two parts: changing the codebases to
support JSDoc and improving the usability of the JSDoc WMF theme. For more
information, see phab:T138401[6].
== Migrating MediaWiki core to JSDoc ==
We are migrating MediaWiki core to JSDoc incrementally. While the migration
is in progress, the master branch docs will be incomplete, containing only
those modules that have been migrated. To read the old JSDuck docs, see the
MediaWiki 1.41 docs[0].
To help with migration, choose a module from the list in phab:T352308[7],
and follow the guide on phab:T138401[6] to translate the tags from JSDuck
to JSDoc.
== Migrating other codebases ==
You can find a list of codebases that use JSDuck on phab:T138401[6].
(Please add any that are missing.) To help migrate a codebase that uses
JSDuck, follow the instructions to set up JSDoc[8], and use the guide in
phab:T138401[6] to translate the tags from JSDuck to JSDoc.
== Improving the JSDoc WMF theme ==
One of the biggest differences between JSDuck and JSDoc is the HTML
interface for reading the docs. The WMF theme for JSDoc is not as
full-featured as the JSDuck theme, but to support this migration, the
Wikimedia Foundation Web, Design Systems, and Technical Documentation teams
are working to prioritize and complete a set of improvements to the JSDoc
theme, with the goal of releasing version 1 of jsdoc-wmf-theme in 2024.
If you use JavaScript documentation on doc.wikimedia.org, please leave a
comment on the JSDoc WMF theme talk page[2] and let us know how you use the
docs and which features of the theme are the most important to you.
Thank you for reading!
Alex, Kamil, Jon, Roan, and Anne
[0]: https://doc.wikimedia.org/mediawiki-core/REL1_41/js/
[1]: https://doc.wikimedia.org/mediawiki-core/master/js/
[2]: https://www.mediawiki.org/wiki/Talk:JSDoc_WMF_theme
[3]: https://github.com/senchalabs/jsduck
[4]: https://en.wikipedia.org/wiki/JSDoc
[5]: https://en.wikipedia.org/wiki/ECMAScript
[6]: https://phabricator.wikimedia.org/T138401
[7]: https://phabricator.wikimedia.org/T352308
[8]: https://www.mediawiki.org/wiki/JSDoc
--
Alex Paskulin
Technical Writer
Wikimedia Foundation
Hey Everyone,
We hope this message finds you in great spirits, excited for the upcoming
Hackathon! 🚀
Time is ticking, and we don't want you to miss out on this incredible
opportunity to be part of a collaborative and innovative event.
To-Do:
1. *Register:* Secure your spot by completing the registration form [Hackathon
Registration Link <https://pretix.eu/wikimedia/wmhackathon2024/>]. Remember,
the registration portal will remain open until we reach our venue's
capacity of approximately 220 participants.
2. *Scholarship Application:* We are committed to nurturing a diverse and
inclusive community. As part of this commitment, we offer scholarships that
cover travel and accommodation expenses for a selected group of technical
contributors.
[To apply: Scholarship Application Link
<https://pretix.eu/wikimedia/wmhackathon2024/>]. Remember, the deadline for
scholarship applications is *January 5th, 2024*.
If you have any questions or need assistance, our team is here to help.
Feel free to reach out to hackathon(a)wikimedia.org for support.
Best regards,
--
*Onyinyechi Onifade *
Technical Community Program Manager
Wikimedia Foundation <https://wikimediafoundation.org/>
Hello!
Please take the December 2023 Developer Satisfaction Survey!
Link: <https://wikimediafoundation.limesurvey.net/484133>
https://wikimediafoundation.limesurvey.net/796964
The survey is open until Fri, 5 January 2024—four weeks from today.
____
This survey is for members of the Wikimedia developer community and covers
the following topics:
- Wikimedia Cloud Services
- Development and testing environments
- Phabricator
- Code review
- Continuous Integration
- Deployment
- Code quality
- Technical documentation
- Developer research needs
Please take the survey if you play a role in developing software for the
Wikimedia ecosystem and would like to share your opinions on any of the
topics listed above.
We’re soliciting your feedback to:
- measure developer satisfaction, and
- determine where to invest resources in the future.
We will anonymize, explore, and report the data we gather on mediawiki.org.
View previous years' survey results:
https://www.mediawiki.org/wiki/Developer_Satisfaction_Survey
Privacy statement: This survey will be conducted via a third-party service,
which may subject it to additional terms. For more information on privacy
and data-handling, see the privacy statement.
<https://foundation.wikimedia.org/wiki/Legal:December_2023_Developer_Satisfa…>
Thank you!
Tyler Cipriani (he/him)
Engineering Manager, Release Engineering
Wikimedia Foundation
Hi All,
Welcome to the monthly MediaWiki Insights email!
Enable more people to learn MediaWiki and contribute effectively
In the last MW insights email
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Reports/October_2…>
we shared more about our approach to helping people contribute effectively
to MediaWiki
<https://www.mediawiki.org/wiki/MediaWiki_Product_Insights/Contributor_reten…>.
A few interesting data points:
The number of contributors to MediaWiki core who have more than 5 patches
continued to grow: since the start of the Foundation’s fiscal year in July,
we hit the goal of 20% for the first time, compared to the July-November
time period last year. This is exciting to see; now it’s about keeping the
momentum and continuing on that path.
Many thanks to all the people who have contributed to MediaWiki core!
The average and median time to first review for patches in MediaWiki core
decreased significantly in the period July 1st to Nov 30 compared to the
same time period one year earlier.
- Average time to first review dropped from 16.5 days to 4.5 days.
- Median time to first review dropped from 1.2 days to 0.6 days.
Many thanks to all the code reviewers of MediaWiki core patches!
Keep in mind that this is only one data point. Many factors play into the
experience of contributors; a helpful comment may be more relevant than a
fast +1/-1, etc.
Over the past weeks, we have been spending some time planning initiatives
to further support people in onboarding and contributing to MediaWiki:
- We are preparing for a WMF internal MediaWiki code jam in December to
try out a few things and focus specifically on the needs of teams.
- One thing we want to test in practice at the code jam is the “MediaWiki
Quick Install <https://phabricator.wikimedia.org/T347347>” guide. This
has been a collaboration between the Tech Docs team and the MediaWiki
Platform team; you can find the latest version of this experiment here:
https://www.mediawiki.org/wiki/Local_development_quickstart
- We discussed a possible focus project in the next quarter on improving
first time MediaWiki (core) contributors’ experience. We’re exploring a few
simple, small ideas that we could implement/try out in the next quarter
(ticket follows!).
Project snapshot: Analysis of MediaWiki execution timings, fixing issues
with logging in on mobile, progress on RESTBase deprecation, and more!
Performance: Piotr and Timo conducted an analysis of MediaWiki execution
timings <https://phabricator.wikimedia.org/T350593> and identified areas
for improvement. One of the fixes promises a 50ms improvement
<https://phabricator.wikimedia.org/T351807>! Timo and Derick worked on
BagOStuff improvements <https://phabricator.wikimedia.org/T336004> (cache
layer), shipped in MW 1.42. This work aims to lower the barriers for
contributors by making interfaces leaner and more intuitive, and reduces
storage access cost from 10ms to ~1ms. Thank you for your work!
More highlights:
MediaWikiIntegrationTestCase now automatically tracks what database tables
get touched during the integration test, removing the need for developers
to keep track (T342301 <https://phabricator.wikimedia.org/T342301>). Many
thanks to Daimona and others for their work on this!
Work towards PHP 8.2 support continues, with one helpful outcome being a
new DynamicPropertyTestHelper feature (T326466
<https://phabricator.wikimedia.org/T326466>). Many thanks to TK-999 and all
reviewers!
Gergö worked on solving a variety of problems with logging in on mobile
(see https://phabricator.wikimedia.org/T257852#9347008 and below). Many
thanks to Gergö and everyone who provided support!
RESTBase sunset: Wikifeeds now calls the Parsoid endpoint in MediaWiki core
rather than RESTBase. Many thanks to Yiannis and Daniel for their hard work
on making this happen! Cxserver is preparing a similar deployment soon
<https://gerrit.wikimedia.org/r/c/operations/deployment-charts/+/977983/>
(thank you, Language team!).
Upcoming:
There is an OutputTransform
<https://www.mediawiki.org/wiki/Parsoid/OutputTransform> pipeline being
introduced to replace ParserOutput::getText(). This pipeline initially
targets content that comes from the ParserCache before it is rendered (as a
1:1 getText() equivalent). The team is likely going to introduce another
layer of cacheability for this output, so that we can store richer
canonical Parsoid content and use this pipeline to transform it for final
rendering. Many thanks to Isabelle, CScott and Daniel for this work in
progress (Gerrit:967449
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/967449>)!
As one puzzle piece of our product research efforts and platform design
explorations, Moriel and others have been working on mapping high-level
essential user workflows, such as edit and patrol, against platform
components to explore workflow patterns and potential architectural
opportunities in the platform. One outcome of this will be a description of
the key challenges in trying to model our system. Many thanks to Moriel for
leading this work, and to Daniel, Timo, Subbu, James, Cindy, Emanuele and
Amir S for their support, great questions and ideas!
Up next: Presentations at Semantic MediaWikiCon
Semantic MediaWikiCon
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023#Program> is
coming up, virtual and in person, from Dec 11-13. We wrote about the
updates to the rdbms library in the last MW Insights email; if you want to
learn more about this work, check out Amir’s presentation at Semantic
MediaWikiCon
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Major_changes_on_i…>!
Subbu and C.Scott are also going to give their yearly update on the parser
unification work
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Updates_from_the_W…>,
Chris will be talking about Codex
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Codex,_the_Design_…>,
and Stef about automated testing for complex MediaWiki topologies
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/Automated_Testing_…>.
Since the theme of this edition is MediaWiki in the age of AI, Mike will be
presenting on the recent experiences with the experimental Wikipedia
ChatGPT plugin
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2023/The_Wikipedia_Chat…>.
The keynote speaker of this year’s Semantic MediaWikiCon is Markus Krötzsch
<https://www.korrekt.org/page/Short_biography>.
That’s the last insights email for 2023. The deployment train pauses for
the end of the year break, and so does the monthly MW Insights email!
We’ll be following up with a double-edition in January.
Thanks all for reading,
Birgit
--
Birgit Müller (she/her)
Director of Product, MediaWiki and Developer Experiences
Wikimedia Foundation <https://wikimediafoundation.org/>