I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.
tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)
(Is there a place on mw.org to put this tidbit?)
------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200
Hi Bartosz
In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.
The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.
Cheers
Brian
--
Matma Rex
We just released a new version of Research:FAQ on Meta [1], significantly
expanded and updated, to make our processes at WMF more transparent and to
meet an explicit FDC request to clarify the role and responsibilities of
individual teams involved in research across the organization.
The previous version – written from the perspective of the (now inactive)
Research:Committee, and mostly obsolete since the release of WMF's open
access policy [2] – can still be found here [3].
Comments and bold edits to the new version of the document are welcome. For
any question or concern, you can drop me a line or ping my username on-wiki.
Thanks,
Dario
[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953
*Dario Taraborelli*, Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
Hello everyone,
Wikimedia is gearing up to apply as a mentoring organization for Google
Summer of Code 2024
<https://www.mediawiki.org/wiki/Google_Summer_of_Code/2024> [1] and Outreachy
Round 28 <https://www.mediawiki.org/wiki/Outreachy/Round_28> [2].
Currently, we're crafting a list of exciting project ideas for the
application. If you have any suggestions for projects, whether coding or
non-coding (design, documentation, translation, outreach, research), please
share them by February 5th via this Phabricator task:
<https://phabricator.wikimedia.org/T354734> [3]. Note that for non-coding
projects eligible for Outreachy, slots are limited and will be allocated to
mentors on a first-come, first-served basis.
Timeline
In your role as a mentor, your involvement spans the application period for
both programs, taking place from March to April. During this time, you'll
guide candidates in making small contributions to your project and address
any project-related queries they may have. As the application period
concludes, you'll further intensify your collaboration with accepted
candidates throughout the coding period, which extends from May to August.
Your support and guidance are crucial to their success in the program.
Guidelines for Crafting Project Proposals:
- Follow this task description template when you propose a project in
  Phabricator: <https://phabricator.wikimedia.org/tag/outreach-programs-projects> [4].
  You can also use this workboard to pick an idea if you don't have one
  already. Add the #Google-Summer-of-Code (2024) or #Outreachy (Round 28) tag.
- A project should require an experienced developer ~15 days and a newcomer
  ~3 months to complete.
- Each project should have at least two mentors, including one with a
  technical background.
- Ideally, the project has no tight deadlines, a moderate learning curve,
  and fewer dependencies on Wikimedia's core infrastructure. Projects
  addressing the needs of a language community are most welcome.
*Learn more about the roles and responsibilities of Mentors for both programs:*
- Outreachy: <https://www.mediawiki.org/wiki/Outreachy/Mentors> [5]
- Google Summer of Code: <https://www.mediawiki.org/wiki/Google_Summer_of_Code/Mentors> [6]
Thank you,
Links:
[1] https://www.mediawiki.org/wiki/Google_Summer_of_Code/2024
[2] https://www.mediawiki.org/wiki/Outreachy/Round_28
[3] https://phabricator.wikimedia.org/T354734
[4] https://phabricator.wikimedia.org/tag/outreach-programs-projects
[5] https://www.mediawiki.org/wiki/Outreachy/Mentors
[6] https://www.mediawiki.org/wiki/Google_Summer_of_Code/Mentors
--
*Onyinyechi Onifade*
Technical Community Program Manager
Wikimedia Foundation <https://wikimediafoundation.org/>
Hi,
We (MediaWiki Platform Team) are calling on you to nominate individuals for
the Web Perf Hero award for 2024.
https://wikitech.wikimedia.org/wiki/Web_Perf_Hero_award
A nomination is as simple as sending an email to
mediawiki-platform-team(a)wikimedia.org with the relevant information; we
will curate these responses and develop a list once the nomination period
has closed. The nomination period runs from today, April 17, 2024, to
May 19, 2024.
When you nominate a person, include an example of where or how they made a
measurable performance improvement on Wikipedia or sister projects. This
can be as simple as links to Gerrit patch(es) or Phabricator task(s).
Nominations are open to all technical contributors (including WMF staff).
We really encourage you to nominate others, though self-nominations are
permitted. You are also free to nominate multiple developers if you have
more than one in mind.
You can have a look at
https://wikitech.wikimedia.org/wiki/Web_Perf_Hero_award for awards given
out in previous years. Thanks in advance for your nominations!
Derick,
On behalf of MediaWiki Platform Team
Hi everyone,
It’s time to nominate your favorite tool(s) for the fifth edition of the
Coolest Tool Award!! 🎉
The Coolest Tool Award seeks to spotlight the diverse range of tools
created by Wikimedia community members. These tools play a vital role in
improving the efficiency, accessibility, and functionality of Wikimedia
projects, ultimately enriching the experience for Wikimedia communities.
We’d like to invite you all to nominate your favorite & most used tools and
help us celebrate the people who create them!
To nominate your favorite tools, simply follow this link:
[https://wikimediafoundation.limesurvey.net/797991?lang=en]. Feel free to
submit multiple nominations by completing the form as many times as you'd
like. The deadline for nominations is *May 10th, 2024*. For further details
on the nomination and selection process, the Coolest Tool Award academy [1],
and the upcoming award ceremony, please visit:
https://meta.wikimedia.org/wiki/Coolest_Tool_Award.
We plan to award the coolest tools in a variety of categories (see last
year’s categories
<https://meta.wikimedia.org/wiki/Coolest_Tool_Award/2022#2022_Winners> [2]
for an example). Excitingly, we're returning to an in-person event for the
first time since 2019, with winners set to be unveiled at Wikimania 2024.
If you have any questions or suggestions, don't hesitate to reach out via
the Talk Page <https://meta.wikimedia.org/wiki/Talk:Coolest_Tool_Award>.
We're here to help!
Thank you for your participation and recommendations.
Regards,
Onyinyechi, for the 2024 Coolest Tool Academy
[1] https://meta.wikimedia.org/wiki/Coolest_Tool_Award#Coolest_Tool_Academy
[2] https://meta.wikimedia.org/wiki/Coolest_Tool_Award/2022#2022_Winners
Esteemed technical community,
The WMF’s SRE Observability team
<https://wikitech.wikimedia.org/wiki/SRE/Observability> invites you to join
our ongoing effort to migrate MediaWiki metrics to Prometheus
<https://wikitech.wikimedia.org/wiki/Prometheus>, utilizing StatsLib
<https://www.mediawiki.org/wiki/Manual:Stats>, an internally developed,
Prometheus-capable metrics interface. This initiative is fundamental to
unifying our metrics, improving MediaWiki observability, and reducing tool
fragmentation. Your participation is crucial to driving this effort forward.
The Ask
We invite you to contribute to this project
<https://phabricator.wikimedia.org/T350592>. Your expertise can drive the
success of this migration, helping us streamline and improve our monitoring
capabilities.
We appreciate your support in migrating your component’s metrics to
StatsLib (*T350592*) <https://phabricator.wikimedia.org/T350592>; this
involves:
- Look up your component, extension, or module in the task above, claim
  (or create) the corresponding sub-task for your metrics, and follow the
  examples/docs available to migrate your metrics to the new metrics
  interface.
- Help deprecate and clean up/remove outdated metrics that are not in use
  (or graphed in dashboards).
- Collaborate on testing and provide feedback for a seamless transition.
Why Prometheus <https://prometheus.io/docs/introduction/overview/>?
We have been using Prometheus in production for several years, as it offers
a number of benefits over Graphite
<https://prometheus.io/docs/introduction/comparison/>. Migrating MediaWiki
off Graphite ensures we stay ahead with a supported, scalable metrics
platform for more effective long-term, multidimensional metrics analysis
and storage. The new stack provides more robust data labeling, storage, and
query capabilities. This project facilitates the improvement of our
production metrics infrastructure and deprecates older systems. The general
thought process is outlined in T249164 RFC: Better interface for
generating metrics in MediaWiki <https://phabricator.wikimedia.org/T249164>.
Support and Resources:
We can assist via various channels such as email, Phabricator, and IRC
(#wikimedia-observability on Libera.Chat), including periodic technical
office hours (to be scheduled soon). Please contact me or visit the SRE
Observability Team Interface Page
<https://office.wikimedia.org/wiki/Team_interfaces/SRE_-_Observability> for
ways to get involved.
Timeline
Please prioritize and schedule this work within the next few quarters to
ensure a seamless transition and the sustainability of the MW production
ecosystem. Together, we can achieve a more efficient and reliable
Observability platform. Thank you in advance for your understanding and
support.
Hackathon 2024!
We will be at this year’s hackathon
<https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2024> for those
attending who are interested in participating or have any questions.
We appreciate your support and help; many thanks for your attention, and to
all those who have participated or are already participating!
Respectfully,
Leo
*Leo Mata* (he/him)
Engineering Manager - Observability
Wikimedia Foundation <https://wikimediafoundation.org/>
*[[Note: I've received feedback that the previous announcement email may
have been buried in older threads, so I'm resending this for visibility.
Apologies if you are seeing this more than once.]]*
Hello all,
The TDMP Retro Committee has finalized the gathering and analysis of
the Technical
Decision-Making Process Retrospective and Consultation phase
<https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision…>.
This document presents the findings of the survey conducted in July 2023
among the Wikimedia technical community regarding their perspectives on the
technical decision-making process.
The purpose of this document is to serve as a comprehensive summary,
offering insights from the retrospective collection of opinions and voices
within the technical community. It is not intended to provide a new
decision-making methodology or devise a fresh process.
It serves to inform the next steps in re-evaluating and adjusting the
current technical decision-making process.
The report is now available on MediaWiki at Retrospective and Consultation:
Results and Analysis
<https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision…>
Mark Bergsma (VP SRE), Tajh Taylor (VP Data Sci and Eng), and Suman
Cherukuwada (Senior Dir Feature Eng) are taking on the responsibility to
determine what changes to make to the TDMP. They will guide a process to
solicit input and feedback on a proposal that will be open to staff and
volunteer technical contributors.
Thank you,
Moriel, on behalf of the TDMP Retro Core Group.
Core group:
- Moriel Schottlender (chair)
- Daniel Kinzler
- Chris Danis
- Kosta Harlan
- Temilola Adeleye
--
Moriel Schottlender (she/her <https://pronoun.is/she>)
Principal Software Engineer
Wikimedia Foundation https://wikimediafoundation.org/
We operate Wikipedia <https://wikipedia.org> and its sister sites
Hi all!
- Results:
https://www.mediawiki.org/wiki/Developer_Satisfaction_Survey/2024
- The developer satisfaction survey launched in December 2023 and closed in
January 2024 [0].
- 171 members of the Wikimedia developer community responded to questions
about developer tools and processes.
____
The developer satisfaction survey results show what's important to our
developer community—the parts of the developer experience that work well,
and the parts that need more resources to improve.
➡️ Read it. Talk about it. And ask questions (please ;))!
____
🎉 Thanks to everyone who took the time to fill out our survey.
And huge thanks to the Wikimedia Research Team for helping Cloud Services,
Release Engineering, Technical Documentation, and others build the survey
and process the data. In particular, thanks to Yu-Ming Liou for
survey-building expertise, and to Caroline Myrick, who was an absolute hero
processing, exploring, graphing, and patiently explaining all the data.
<3
Tyler Cipriani (he/him)
Engineering Manager, Release Engineering
Wikimedia Foundation
[0]: <https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…>
i've never contributed to any mailing lists, or partaken much, so i don't know how to start this off...
hi? i've realized i can compress a dataset of lossless .png files down to between a third and a fourth of the initial size on disk.
that said: in compressing my own backups i find that lossless .png files converted to .xpm (not .ppm, not .bmp) before being compressed end up 3 to 4 times smaller on disk than the initial .png with its own compression --- regardless of any 'PNG optimization' done beforehand: the resulting .xpm files remain identical in size and compress to precisely the same size.
the following commands¹ can surely be tinkered with to greater effect:
---start of shell commands---
mkdir -p xpm
magick convert image.png xpm/image.xpm
mkdwarfs -i xpm/ -o compressed.dfs -l9
---end of shell commands---
now you have a compressed filesystem image, three to four times smaller on disk than the original .png, to inspect.
here i openly wonder how a comparison would be carried out - to verify that the resulting .xpm file and the lossless .png are indeed still the same picture.
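one way to check this - a minimal sketch, assuming ImageMagick 7 is installed (the same 'magick' used above) and reusing the file names from the earlier commands - is to decode both files to raw pixels and compare checksums, or to let imagemagick count differing pixels itself:
---start of shell commands---
# decode both files to raw 8-bit RGBA and compare checksums;
# identical checksums mean identical pixel data
magick image.png -depth 8 rgba:- | sha256sum
magick xpm/image.xpm -depth 8 rgba:- | sha256sum
# or count differing pixels directly (AE = absolute error);
# a result of 0 means the two images are pixel-identical
magick compare -metric AE image.png xpm/image.xpm null:
---end of shell commands---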
likewise i see this quality of compression extends to the .xgm and .xbm formats.
in sharing i wish to bring up the above observation to the best of my ability. insights are welcome as to why this happens and whether the image resulting from a lossless .png to .xpm conversion is indeed still the same - i.e. whether this compressed bitmap outperforms the PNG compression significantly without compromising image integrity. (do say if this is irrelevant information and/or presented inadequately in any way: i don't wish to bring red herrings to this mailing list.)
ultimately this is about whether a large portion of Wikimedia image data can indeed be compressed further by this process - in a 'Pareto improvement' kind of way.
-Ivy, 25
for interest: sources for the programs referenced, and my brief notes on these.
[1]
magick: https://imagemagick.org/index.php
mkdwarfs, part of the inappropriately named dwarfs toolset:
https://github.com/mhx/dwarfs/blob/main/doc/mkdwarfs.md
note that the -l flag is given the value 9 in the command - at this level this program uses LZMA.
also note that while i use 'mkdwarfs' with LZMA here, i get the same result with any other program using LZMA or XZ - like the following 'dar', which is better suited to single-file extraction from an archive and to analyzing individual file compression values en masse (a plain-xz sketch follows the dar command below).
dar: http://dar.linux.free.fr/doc/man/dar.html
unfortunately the project's website doesn't use HTTPS, so here is a Wayback Machine link with HTTPS:
https://web.archive.org/web/20240423233825/http://dar.linux.free.fr/doc/man…
---start of referenced dar command---
dar -c output -zxz9 -R input/
---end of referenced dar command---
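for a quick single-file data point without any archive tool, plain xz (assumed installed) should give a comparable result - a minimal sketch, reusing the file names from the commands above:
---start of shell commands---
# compress the original .png and the converted .xpm separately with xz;
# -9 is the highest preset, -k keeps the input files in place
xz -9 -k image.png
xz -9 -k xpm/image.xpm
# compare the resulting sizes on disk
ls -l image.png image.png.xz xpm/image.xpm.xz
---end of shell commands---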
lastly, two clarifications:
'a lossless .png file' here means an image file which has never been converted in a lossy way. a .jpg file converted to .png and used in this procedure produces an output taking more space on disk.
'the procedure' here refers to converting a lossless .png to the .xpm image file format and then compressing it with either LZMA or XZ in any program.
thank you for reading.