Following the process described in the Code of Conduct for Wikimedia
technical spaces <https://www.mediawiki.org/wiki/Code_of_Conduct>, the
Wikimedia Foundation’s Technical Collaboration team has selected five
candidates to form the first Code of Conduct Committee and five candidates
to become auxiliary members.
Their names are listed below in alphabetical order. For details about each
candidate, please see
https://www.mediawiki.org/wiki/Code_of_Conduct/Committee_members
Committee member candidates:
- Amir Sarabadani (Ladsgroup)
- Lucie-Aimée Kaffee (Frimelle)
- Nuria Ruiz (NRuiz-WMF)
- Sébastien Santoro (Dereckson)
- Tony Thomas (01tonythomas)
Auxiliary member candidates:
- Ariel Glenn (ArielGlenn)
- Caroline Becker (Léna)
- Florian Schmidt (Florianschmidtwelzow)
- Huji
- Matanya
This list of candidates is subject to a community review period of two
weeks starting today. If no major objections to any candidate are raised,
they will be appointed in six weeks.
You can provide feedback on these candidates via private email to
techconductcandidates(a)wikimedia.org. This feedback will be received by
the Community Health
<https://meta.wikimedia.org/wiki/Technical_Collaboration/Community_health>
group handling this process and will be treated confidentially.
We want to thank all the people who considered supporting the Code of
Conduct by participating in this Committee. 77 people were contacted
during the selection process, counting self-nominations and
recommendations. Of these, 21 made it to a shortlist of confirmed
candidates who were (by our estimation) a potentially good fit for the
Committee. Selecting the five candidates for the Committee was hard, as we
tried to form a diverse group that could work together effectively in
consolidating the Code of Conduct. Selecting the five auxiliary members
was even harder, and we know that we have left out candidates who could
have contributed just as much. Since these are the first people to assume
these roles, we leaned somewhat towards more technical profiles with good
knowledge of our technical spaces. We believe that future renewals will
offer better chances to other profiles (less technical and/or newer to
Wikimedia), adding greater diversity and variety of perspectives to the
mix.
On Thu, Mar 9, 2017 at 12:30 PM, Quim Gil <qgil(a)wikimedia.org> wrote:
> Dear Wikimedia technical community members,
>
> https://www.mediawiki.org/wiki/Code_of_Conduct
>
> The review of the Code of Conduct for Wikimedia technical spaces has been
> completed and now it is time to bootstrap its first committee. The
> Technical Collaboration team is looking for five candidates to form the
> Committee plus five additional auxiliary members. One of them could be you
> or someone you know!
>
> You can propose yourself as a candidate and you can recommend others
> *privately* at
> techconductcandidates AT wikimedia DOT org
>
> We want to form a very diverse list of candidates reflecting the variety
> of people, activities, and spaces in the Wikimedia technical community. We
> are also open to other candidates with experience in the field. Diversity
> in the Committee is also a way to promote fairness and independence in
> their decisions. This means that no matter who you are, where you come
> from, what you work on, or for how long, you could be a good member of
> this Committee.
>
> The main requirements to join the Committee are a will to foster an open
> and welcoming community and a commitment to making participation in
> Wikimedia technical projects a respectful and harassment-free experience
> for everyone. The Committee will handle reports of unacceptable behavior,
> analyze the cases, and resolve them according to the Code of
> Conduct. The Committee will also handle proposals to amend the Code of
> Conduct for the purpose of increasing its efficiency. The term of this
> first Committee will be one year.
>
> Once we have a list of 5 + 5 candidates, we will announce it here for
> review. You can learn more about the Committee and its selection process at
> https://www.mediawiki.org/wiki/Code_of_Conduct/Committee and you can ask
> questions in the related Talk page (preferred) or here.
>
> You can also track the progress of this bootstrapping process at
> https://www.mediawiki.org/wiki/Talk:Code_of_Conduct#Bootstrapping_the_Code_of_Conduct_Committee
>
> PS: We have many technical spaces, and reaching all potentially
> interested people is hard! Please help spread this call.
>
> --
> Quim Gil
> Engineering Community Manager @ Wikimedia Foundation
> http://www.mediawiki.org/wiki/User:Qgil
>
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi all,
Thanks for the feedback on my question. I see that the image got a
different template assigned to it and now the API response is fine. I'll
spread the news about that.
I have one follow-up question: when an image didn't get the right
template/license assigned, is there a tool/bot that people can use to
correct the mistake? For instance, can a bot remove images from places
where they aren't supposed to be? If my tool ever becomes widespread, I'd
like there to be an easy way to update images like the one we saw before
the weekend.
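For illustration, a minimal Pywikibot sketch for swapping a wrongly
assigned license template might look like this (the file title and
template names are hypothetical placeholders):

# Sketch: replace an incorrectly assigned license template on a Commons
# file page. File title and template names are hypothetical placeholders.
import pywikibot

site = pywikibot.Site('commons', 'commons')
page = pywikibot.FilePage(site, 'File:Example.jpg')

old_template = '{{Wrong-license}}'  # hypothetical incorrect template
new_template = '{{Cc-by-sa-4.0}}'   # the intended license template

if old_template in page.text:
    page.text = page.text.replace(old_template, new_template)
    page.save(summary='Fix incorrectly assigned license template')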
Thanks a lot,
Fako
Hi all!
Today I had a look at my GitHub profile page and noticed that some of the
commits I made to Wikimedia projects are not listed in the contribution
activity list. Because my ego is sometimes really big (;)), I started
looking around for where they are. My finding was (according to the GitHub
help[1]) that a commit and its repository must fulfill some requirements:
All of:
- The email address used for the commits is associated with your GitHub
account.
- The commits were made in a standalone repository, not a fork.
- The commits were made in the repository's default branch (usually
master) or in the gh-pages branch (for repositories with Project Pages
sites).
-> All of these are fulfilled by my commits.
And at least one of:
- You are a collaborator on the repository or are a member of the
organization that owns the repository.
- You have forked the repository.
- You have opened a pull request or issue in the repository.
- You have starred the repository.
Ok, that's problematic: because we use Gerrit for code review, I normally
don't fork any Wikimedia project on GitHub; why should I? So forking just
to have the contributions visible sounds like a bad solution to the
problem. I could open a pull request or an issue in every project, but
apart from the manual work of doing that, it sounds like wasted time too:
we don't use pull requests on GitHub, and we use Phabricator instead of
GitHub issues to organize work, so that isn't a solution either.
Starring all repositories would be semantically fine, but there are two
problems: the manual work sounds like wasted time too (though, as sketched
below, that part could be scripted), and personally I would use a star
only if I really like the repository, as I already did for some of the
projects (core, ConfirmEdit, ...). Starring just to get the contributions
visible on my profile feels wrong to me.
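If someone did want to go the starring route anyway, it could at least be
automated against the GitHub REST API. A rough sketch, assuming the
requests library and a personal access token in the GITHUB_TOKEN
environment variable:

# Sketch: star every repository in the 'wikimedia' GitHub organization.
# Assumes a personal access token (with suitable scope) in GITHUB_TOKEN.
import os
import requests

API = 'https://api.github.com'
HEADERS = {'Authorization': 'token ' + os.environ['GITHUB_TOKEN']}

page = 1
while True:
    # List the organization's repositories, 100 per page.
    resp = requests.get(API + '/orgs/wikimedia/repos',
                        params={'per_page': 100, 'page': page},
                        headers=HEADERS)
    resp.raise_for_status()
    repos = resp.json()
    if not repos:
        break
    for repo in repos:
        # PUT /user/starred/{owner}/{repo} stars a repo (204 on success).
        requests.put(API + '/user/starred/' + repo['full_name'],
                     headers=HEADERS)
    page += 1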
Being a collaborator is also probably not the right solution, as it would
require someone to invite me to every project under Wikimedia, which is a
lot of work for at least two people, and wasted time, too.
So, I was wondering if, and if so how, I, as a community contributor,
could become a member of the Wikimedia organization on GitHub. I searched
on mediawiki.org and Wikitech but couldn't find a policy or information
page about it, and that's what this thread should be all about :P
I asked hashar, and he looked at the people who are already part of the
org, found some volunteers among them, and so invited me, too. However, I
think we should at least have a broader discussion, and hopefully we can
create a help page or a public policy for whether, when, and how a
volunteer developer can become a member of the Wikimedia org on GitHub.
So I'm open to opinions and discussion; probably there's already some
information available and I just haven't found it? :)
Thanks for your time
Florian
[1]
https://help.github.com/articles/why-are-my-contributions-not-showing-up-on-my-profile/
GCP has a number of models-as-a-service
<https://cloud.google.com/products/machine-learning/> that might be useful.
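For instance, a rough sketch of requesting label suggestions for a Commons
image via the Cloud Vision REST API (the API key is a placeholder, and the
image URL is an arbitrary example):

# Sketch: ask the Cloud Vision API for label annotations on an image URL.
# API_KEY is a placeholder; error handling is omitted.
import requests

API_KEY = 'YOUR_API_KEY'  # placeholder
ENDPOINT = 'https://vision.googleapis.com/v1/images:annotate?key=' + API_KEY

body = {
    'requests': [{
        'image': {'source': {'imageUri':
            'https://upload.wikimedia.org/wikipedia/commons/a/a9/Example.jpg'}},
        'features': [{'type': 'LABEL_DETECTION', 'maxResults': 10}],
    }]
}

resp = requests.post(ENDPOINT, json=body)
for label in resp.json()['responses'][0].get('labelAnnotations', []):
    print(label['description'], label['score'])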
On Mon, Apr 3, 2017 at 6:46 PM Daniel Mietchen <
daniel.mietchen(a)googlemail.com> wrote:
> Hi Jordan,
> can your pipeline help with video or perhaps even audio as well?
> There are lots of such files as well that need categorization.
> Thanks,
> Daniel
>
> On Tue, Apr 4, 2017 at 12:05 AM, Jordan Adler <jmadler(a)google.com> wrote:
> > Looks like some of these images still need categorization. I think
> > there's still an unrealized opportunity here to use the results I shared
> > to work the backlog of the category on the Commons.
> >
> > On Thu, Aug 11, 2016 at 1:47 PM Pine W <wiki.pine(a)gmail.com> wrote:
> >>
> >> Forwarding.
> >>
> >> Pine
> >>
> >> ---------- Forwarded message ----------
> >> From: "Jordan Adler" <jmadler(a)google.com>
> >> Date: Aug 11, 2016 13:06
> >> Subject: [Commons-l] Programmatically categorizing media in the Commons
> >> with Machine Learning
> >> To: "commons-l(a)wikimedia.org" <commons-l(a)lists.wikimedia.org>
> >> Cc: "Ray Sakai" <rsakai(a)reactive.co.jp>, "Ram Ramanathan"
> >> <ramramanathan(a)google.com>, "Kazunori Sato" <kazsato(a)google.com>
> >>
> >> Hey folks!
> >>
> >>
> >> A few months back a colleague of mine was looking for some unstructured
> >> images to analyze as part of a demo for the Google Cloud Vision API.
> >> Luckily, I knew just the place, and the resulting demo, built by
> >> Reactive Inc., is pretty awesome. It was shared on-stage by Jeff Dean
> >> during the
> >> keynote at GCP NEXT 2016.
> >>
> >>
> >> I wanted to quickly share the data from the programmatically identified
> >> images so it could be used to help categorize the media in the Commons.
> >> There's about 80,000 images worth of data:
> >>
> >>
> >> map.txt (5.9MB): A single text file mapping id to filename in an "id :
> >> filename" format, one per line
> >>
> >> results.tar.gz (29.6MB): a tgz'd directory of JSON files representing
> >> the output of the API, in the format "${id}.jpg.json"
> >>
> >>
> >> We're making this data available under the CC0 license, and these links
> >> will likely be live for at least a few weeks.
> >>
> >>
> >> If you're interested in working with the Cloud Vision API to tag other
> >> images in the Commons, talk to the WMF Community Tech team.
> >>
> >>
> >> Thanks for your help!
Hello,
As of the release of service-runner v2.3.0 [1] earlier today, we no longer
support Node.js v0.1x platforms. The minimum Node version needed to power
your services is now v4.2.2, but we encourage the library's users to
develop and run their services on Node v6.x, the current Node LTS release.
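Dependent services can make the requirement explicit in their own
package.json via npm's standard "engines" field; a minimal sketch:

{
  "engines": {
    "node": ">=4.2.2"
  }
}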
If this change affects your services negatively, please let us know here
on-list or by filing a task in Phabricator against the service-runner
tag [2].
Best,
Marko Obrovac, PhD
Senior Services Engineer
Wikimedia Foundation
[1] https://github.com/wikimedia/service-runner/releases/tag/v2.3.0
[2] https://phabricator.wikimedia.org/project/board/1062/
https://www.mediawiki.org/wiki/Scrum_of_scrums/2017-04-19
= 2017-04-19 =
contact: https://www.mediawiki.org/wiki/Wikimedia_Engineering
== Call outs ==
* RelEng: if you have a scap3-deployed repo with an open patch in
https://gerrit.wikimedia.org/r/#/q/topic:T162814+%28status:open%29, please
merge it
* Analytics: Piwik is being upgraded tomorrow, April 20th, and may have a
30-minute downtime
* Analytics: Wikistats 2.0 prototype consultation going on at
https://www.mediawiki.org/wiki/Wikistats_2.0_Design_Project/RequestforFeedb…
== Product ==
=== Reading ===
==== iOS ====
* Last Week
** Continued work on 5.4.1 -
https://phabricator.wikimedia.org/project/view/2600/
*** Background feed loading & coalescing
*** Crash fixes & performance enhancements
** 5.5 - https://phabricator.wikimedia.org/project/view/2602/
*** Places
*** JavaScript consolidation with Android
*** Move footer content to WebView
* This Week
** Testing 5.4.1
** Continue work on 5.5 (Places, JS consolidation)
==== Android ====
* Beta release this week containing Wikidata title description editing
expanded to many more languages, as well as various offline UX improvements
* Further improving offline functionality and surrounding UX polish
* Continuing work on cross-platform consolidation of CSS & JS
* Beginning discussion of implementing offline ZIM collections (Q4 goal)
* Current release board:
https://phabricator.wikimedia.org/project/view/2352/
==== Reading Infrastructure ====
* TemplateStyles CR, familiarizing with OCG
* MCS: Finally updating Parsoid version requested by MCS to 1.3.0. Working
on refactoring mobile-sections to a new, intermediary, mobile HTML endpoint.
=== Web ===
* Wrapping up page previews work
* Beginning work on a print-specific stylesheet
=== Editing ===
==== Collaboration ====
* No deploys this week, but on Monday we plan to enable the new RC Filters
as a Beta Feature on English Wikipedia (which does have ORES), plus all
non-ORES wikis (with the possible exception of German Wikipedia).
* Preview for when deployments restart:
** Working on transforming Wikidata user IDs so propagated edits show the
user responsible
** Optimization: if we know a query will return 0 results, we skip the
query entirely. Some of these no-result queries have extremely poor
performance.
** Other bug fixes
==== Parsing ====
* Linter: Continuing to address bug reports and tweak the output. It was
disabled on large wikis last Friday because of performance issues
(https://phabricator.wikimedia.org/T148609). The problem is now fixed, and
it will be re-enabled next week. We decided to finish tweaking and
improving the output before a wider announcement.
==== Language ====
* ContentTranslation is disabled on all wikis due to high load on x1
during the DC switchover; see https://phabricator.wikimedia.org/T163344.
Ops/DBA are aware, and the team will debug it further.
* Work on CX + OOjs continues.
==== UI Standardization ====
* This week:
** Continued work to provide WikimediaUI Base variables in core
https://phabricator.wikimedia.org/T123359
* Updates:
** OOjs UI:
*** Release of v0.21.1 with 11 UI/a11y improvements
https://phabricator.wikimedia.org/diffusion/GOJU/browse/master/History.md –
among those:
**** MediaWiki theme: Ensure WCAG level AA contrast on unsupported
SelectFileWidget
**** MediaWiki theme: Make readonly TextInputWidget appearance clearer
**** MediaWiki theme: TagMultiselectWidget outlined UI improvements
**** MenuOptionWidget: Remove theme-independent 'check' icon (Prateek
Saxena)
**** DropdownInput-/RadioSelectInputWidget: Remove unnecessary ARIA
attributes
=== Wikidata ===
* Continuing work on federation and structured Wiktionary
* Deploying the geoshape data type on Wikidata next Monday
* Also enabling the Cognate extension (interwiki links) on Wiktionary next
Monday
== Technology ==
=== Security ===
* Reviews
** Ex:WikibaseMediaInfo
** TemplateStyles re-review
=== Services ===
* Blockers: none
* Updates:
** Services DC switchover yesterday
** RESTBase summary endpoint now allows 5 minutes of client-side caching
(a quick check is sketched below)
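A quick client-side check of the new caching behavior (a sketch; the
article title is an arbitrary example):

# Sketch: inspect the Cache-Control header on a RESTBase summary response.
import requests

resp = requests.get(
    'https://en.wikipedia.org/api/rest_v1/page/summary/Wikipedia')
# Per the update above, the response should now allow about 5 minutes of
# client-side caching (e.g. a max-age around 300 seconds).
print(resp.headers.get('cache-control'))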
=== Analytics ===
* Ongoing work on EventLogging analysis support in Hadoop
* Ongoing work on Wikistats 2.0 data back-end
* Piwik being upgraded tomorrow, will have a short (30-minute or so)
downtime
* Wikistats 2.0 consultation on the visual design prototype happening now:
https://www.mediawiki.org/wiki/Wikistats_2.0_Design_Project/RequestforFeedb…
(prototype at https://analytics-prototype.wmflabs.org )
* Dashiki configuration articles on Meta are all broken; we can't fix
them until the codfw-related deployment moratorium is over
=== RelEng ===
* Blockers: none
* Blocking: none?
* '''Updates'''
** If you have a scap3-deployed repo with an open patch in
https://gerrit.wikimedia.org/r/#/q/topic:T162814+%28status:open%29, please
merge it
=== Discovery ===
* No blockers
* New blog post about search:
https://blog.wikimedia.org/2017/04/10/searching-wikipedia/
* Made a plan to deploy archive search:
https://phabricator.wikimedia.org/T163235 (comments welcome)
* Portal updates: https://phabricator.wikimedia.org/T128546
* Building infrastructure for machine learning assisted ranking (aka
MjoLniR)
* Working on Wikidata search improvement
=== Fundraising Tech ===
* More Paypal Express Checkout fixes
* Coordinating with Comms to update the WMF logo in various places:
https://phabricator.wikimedia.org/T144254
* CentralNotice: Banner sequence feature is in code review
https://phabricator.wikimedia.org/T144453
* CiviCRM: getting rid of the rest of our local core hacks
=== Community Tech ===
* No blockers
* Pushed out Special:AutoblockList, enhancements coming
* Getting community feedback on LoginNotify (
https://www.mediawiki.org/wiki/Extension:LoginNotify)
* Analyzing cookie blocking on English Wikipedia prior to broader roll-out
to all wikis
* Work continuing on CodeMirror (syntax highlighting) (
https://www.mediawiki.org/wiki/Extension:CodeMirror)