Original version: https://www.mediawiki.org/wiki/Topic:Td5wfd70vptn8eu4
The Wikimedia Developer Summit 2017 is 13 weeks away!
PARTICIPANTS
* 68 people have requested an invitation.
* 19 of them have requested travel sponsorship as well.
Join us: https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit.
The deadline to request travel sponsorship is Monday, October 24th.
PROPOSALS
* 8 proposals submitted - https://phabricator.wikimedia.org/project/view/2205/
** 1 in backlog
** 2 in Unconference
** 2 missing basic information
** 3 to be pre-scheduled
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/Call_for_participation
The deadline for submitting new proposals is Monday, October 31.
MAIN TOPICS
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/2017/Program
On track:
* A plan for the Community Wishlist 2016 top results
* Building on Wikimedia services: APIs and Developer Resources
* How to manage our technical debt
Missing basic information:
* A unified vision for editorial collaboration
* Building a sustainable user experience together
* How to grow our technical community
Missing two facilitators:
* A unified vision for editorial collaboration
* How to grow our technical community
Missing one facilitator:
* Handling wiki content beyond plaintext
* Building a sustainable user experience together
* Artificial Intelligence to build and navigate content
ORGANIZATION
* Meet the Program committee. https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/Program_committee
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi,
YuviPanda, prtksxna, and I (with help from Tim and Aaron) have been
working on the UrlShortener extension, which is designed to implement the
URL shortener RfC[1] (specifically Tim's implementation suggestion).
I've filed T108557[2] to deploy the extension to Wikimedia wikis. We'd
like to use the "w.wiki" short domain, which the WMF already controls.
A test wiki mimicking what Wikimedia's configuration would be like has been
set up at http://urlshortener.wmflabs.org/, with an accompanying "short"
domain at us.wmflabs.org (e.g. http://us.wmflabs.org/3). Please play with it
and report any bugs you might find :)
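For readers curious about the general technique: a URL shortener of this kind
typically stores the long URL under an auto-incremented numeric ID and encodes
that ID with a compact alphabet to build the short path. A rough JavaScript
sketch of the idea (illustration only; the alphabet and scheme here are
placeholders, not necessarily what UrlShortener actually uses):

    // Illustrative only: a generic ID-to-short-code encoder. The alphabet
    // below is a placeholder, not necessarily what UrlShortener uses.
    var ALPHABET = '0123456789abcdefghijklmnopqrstuvwxyz';

    function encodeId(id) {
      var code = '';
      do {
        code = ALPHABET[id % ALPHABET.length] + code;
        id = Math.floor(id / ALPHABET.length);
      } while (id > 0);
      return code;
    }

    // A short URL like http://us.wmflabs.org/3 would then be resolved by
    // decoding "3" back to the numeric ID and looking up the stored URL.
    console.log(encodeId(3));      // "3"
    console.log(encodeId(12345));  // "9ix"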
[1] https://www.mediawiki.org/wiki/Requests_for_comment/URL_shortener
[2] https://phabricator.wikimedia.org/T108557
Thanks,
-- Legoktm
Hello all,
I am trying to develop a Chrome extension to work with Wikipedia.
I started exploring JavaScript for the login API and found this example:
https://www.mediawiki.org/wiki/Example_login_code_in_JS_%28using_JQuery%29
I stored the code as test.html, opened it in Chrome, and got the following
error:
No 'Access-Control-Allow-Origin' header is present on the requested
resource. Origin 'null' is therefore not allowed access.
How can I solve this error?
I looked at https://www.mediawiki.org/wiki/Manual:CORS, but the examples
there are not helpful.
Please guide me on how to log in to MediaWiki via JavaScript.
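The Origin 'null' in that message comes from opening test.html straight from
the local filesystem, so api.php refuses the cross-origin request. From a
Chrome extension page that lists the wiki under "permissions" in its
manifest.json (e.g. "https://en.wikipedia.org/*"), the same request is allowed
without any CORS headers. A rough sketch of the two-step login flow in plain
JavaScript, assuming MediaWiki 1.27+ (the wiki, user name, and password below
are placeholders):

    // Assumes manifest.json contains something like:
    //   "permissions": ["https://en.wikipedia.org/*"]
    // so the extension may call the API cross-origin. All names below are
    // placeholders for illustration.
    var api = 'https://en.wikipedia.org/w/api.php';

    // Step 1: fetch a login token; cookies carry the session.
    fetch(api + '?action=query&meta=tokens&type=login&format=json', {
      credentials: 'include'
    })
      .then(function (res) { return res.json(); })
      .then(function (data) {
        var token = data.query.tokens.logintoken;
        // Step 2: POST the credentials together with the token.
        var params = new URLSearchParams();
        params.append('action', 'clientlogin');
        params.append('username', 'ExampleUser');      // placeholder
        params.append('password', 'example-password'); // placeholder
        params.append('logintoken', token);
        params.append('loginreturnurl', 'https://example.org/');
        params.append('format', 'json');
        return fetch(api, { method: 'POST', body: params, credentials: 'include' });
      })
      .then(function (res) { return res.json(); })
      .then(function (data) { console.log(data.clientlogin); });

If the response status is "PASS", the session cookies are set and subsequent
API requests made with credentials included will be authenticated.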
Thanks.
--
Regards,
T.Shrinivasan
My Life with GNU/Linux : http://goinggnu.wordpress.com
Free E-Magazine on Free Open Source Software in Tamil : http://kaniyam.com
Get Free Tamil Ebooks for Android, iOS, Kindle, Computer :
http://FreeTamilEbooks.com
Hi all,
just a moment ago I noticed that Rob Lanphier seems to have (silently) stopped
working for the WMF. At least based on a comment on his WMF account's talk
page[1] and a commit in Gerrit[2], it seems he doesn't work at the WMF
anymore.
However: what happened? I haven't seen any public info about this. Can
someone explain? I'm really interested, as I really like RobLa, both as a
person (at least from what I've seen and read so far) and for his
professional/technical input and the things he did. It would be nice if
someone could bring some light into the darkness :P
Best,
Florian
[1] https://meta.wikimedia.org/wiki/User_talk:RobLa-WMF#Goodbye_and_thanks
[2] https://gerrit.wikimedia.org/r/#/c/318656/2
The Wikimedia continuous integration system will be unavailable while some
scheduled maintenance is done.
When: Thursday, November 3rd 2016, for two hours between 16:00 UTC and 18:00 UTC.
Impact: During that time, you will still be able to send patches to Gerrit, but
no CI jobs will run, nor will patches be automatically merged when someone
votes "Code-Review +2". All patches sent during the maintenance will be sent to
the CI system for you afterwards as a convenience.
Why: The maintenance will move the core of the CI system (Jenkins and Zuul)
from an aged server to a fresh new machine.
More info: It will be done by Antoine Musso, Tyler Cipriani and Daniel Zahn.
You will be able to watch progress on IRC in the #wikimedia-operations channel.
See also: https://phabricator.wikimedia.org/T95757
Hi!
Soooo, it's that wonderful time of year where we start prepping for a new
general release of MediaWiki! This one will be 1.28.0, and it'll be based on
all of the 1.28 wmf branches we've been doing over the past 6 months.
Step 1 is cutting the branch, which I plan to do tomorrow from the same
branch point from which we cut 1.28.0-wmf.23. This is slightly different, in
that we won't be cutting from master a few days after the WMF branch, and it
takes some of the pressure off of creating 1.29.0-wmf.1 the following week.
So here's the timeline:
Tomorrow (Oct 25) - Cut REL1_28 from wmf.23, master goes to 1.29-alpha
Tues (Nov 1) - First deployment of 1.29 to WMF [wmf.1, obviously]
Wednesday (Nov 2) - Do rc.0 [giving us a few days for any backports that came up in wmf.23 rollout]
Following two Wednesdays (Nov 9, 16) - Do rc.1 and rc.2
Wednesday (Nov 23) - Final release of MW
I'll be updating MW.org shortly.
Tyler Cipriani's assisting me with this release, so expect to see some RCs
with his name (and signatures) on them :)
-Chad
Hi all!
Last week .gitreview for MediaWiki branches and extensions switched from
targeting a specific branch to using track=1[0].
This is a change that, going forward, should make it easier to do weekly
branching and releases without being too disruptive for developer
workflows.
The git-review version that allows for this change is 1.25.0. I have
updated the docs to reflect the use of this version[1].
It is also important to note that to use git-review with track=1 you
must be on a local branch that tracks an upstream – detached-HEAD states
and branches that do not track an upstream will cause strange errors.
-- Tyler
[0]. <https://phabricator.wikimedia.org/T146293>
[1]. <https://www.mediawiki.org/wiki/Gerrit/git-review>
Hi!
The next CREDIT showcase is in two days - Wednesday, 2-November-2016 at
1800 UTC (1100 San Francisco).
https://www.mediawiki.org/wiki/CREDIT_showcase
Got a demo? Add it here:
https://etherpad.wikimedia.org/p/CREDIT
Last month (WebM
<https://commons.wikimedia.org/wiki/File:CREDIT_-_October_2016.webm>,
YouTube <https://www.youtube.com/watch?v=PCn-oeHQnpU&feature=youtu.be&t=6>)
we saw great demos of a Raspberry Pi-based network conditioner, Wikidata
credits for maps, extended OCR support for Indic Wikisource projects, an
intro to EventBus and ChangePropagation, data visualizations on maps, and
an alternative table of contents approach.
We're excited to see what's next! Whether you've just launched a new
feature or are just getting started with an idea, we welcome demos from
Wikimedia community members and staff alike.
See you soon. And if you would like to invite anyone to CREDIT, feel free
to use this template.
Hi <FNAME>
I hope all is well with you! I wanted to let you know about CREDIT, a
monthly demo series that we’re running to showcase open source tech
projects from Wikimedia Community, Reading, Editing, Discovery,
Infrastructure and Technology.
CREDIT is open to the public, and we welcome questions and discussion. The
next CREDIT will be held on November 2nd at 11am PT / 2pm ET / 18:00 UTC.
Here’s a link to the YouTube live stream
<https://www.youtube.com/watch?v=NmfqtP3pr2Y>, which will be available
shortly before the event starts. There’s more info on MediaWiki.org
<https://www.mediawiki.org/wiki/CREDIT_showcase>, and on Etherpad
<https://etherpad.wikimedia.org/p/CREDIT>, which is where we take notes and
ask questions. You can also ask questions on IRC in the Freenode chatroom
#wikimedia-office (web-based access here
<https://webchat.freenode.net/?channels=%23wikimedia-office>).
Please feel free to pass this information along to any interested folks.
Our projects tend to focus on areas that might be of interest to folks
working across the open source tech community: language detection,
numerical sort, large data visualizations, maps, and all sorts of other
things.
Thanks, and I hope to see you at CREDIT.
-
YOURNAME
Hi Guys,
I'm using CirrusSearch on MW 1.27 and wondering if there's a way to add
custom parameters to the Elastica back-end, such as, for example:
"sort": [ { "namespace_text": "asc" } ]
Thanks,
Aran
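Regarding the sort clause mentioned above: against a raw Elasticsearch
endpoint, a well-formed query carrying that sort would look roughly like the
sketch below (generic Elasticsearch, not CirrusSearch's Elastica integration;
the host, index name, and query are placeholders), which may help when
experimenting outside of CirrusSearch:

    // Placeholder endpoint and index; the point is the shape of the "sort" clause.
    var body = {
      query: { match: { text: 'example search terms' } },
      sort: [ { namespace_text: { order: 'asc' } } ]
    };

    fetch('http://localhost:9200/wiki_content/_search', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(body)
    })
      .then(function (res) { return res.json(); })
      .then(function (data) { console.log(data.hits.hits); });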
Hi everyone,
I'm organizing a contest for people in Romania willing to contribute
to Wikimedia code. [1] In order to automatically grade the
contributions, we're using a tool already developed by our partners,
ROSEdu, which reviews changes made on GitHub [2][3].
The current (GitHub-based) workflow is:
1. The admins add a number of repositories that qualify for the contest.
2. The participants log in with their GitHub account (using OAuth).
3. The software retrieves all the pull requests they made to the
relevant projects.
4. A number of points is assigned to each pull request using a
predefined formula (based on the number of touched lines, whether the
change was merged, etc.; this can be customized).
I need some guidance on how to replicate this workflow on Wikimedia's Gerrit.
I've read the API docs [4] and looked at the gerrit uploader [5], and it
seems that retrieving the reviews is fairly straightforward, since all the
reviews seem to be available through unauthenticated access.
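For example, a minimal sketch (Node.js, no extra dependencies) of pulling one
contributor's changes through that unauthenticated REST endpoint could look
like the following; the owner address and the query are placeholders, and note
that Gerrit prefixes its JSON responses with ")]}'" which has to be stripped
before parsing:

    var https = require('https');

    var owner = 'user@example.com';  // placeholder contributor address
    var query = encodeURIComponent('owner:' + owner + ' status:merged');
    var url = 'https://gerrit.wikimedia.org/r/changes/?q=' + query + '&o=DETAILED_ACCOUNTS';

    https.get(url, function (res) {
      var body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () {
        // Strip Gerrit's )]}' anti-XSSI prefix before parsing the JSON.
        var changes = JSON.parse(body.replace(/^\)\]\}'\n?/, ''));
        changes.forEach(function (change) {
          console.log(change._number, change.status, change.subject);
        });
      });
    });

The same /changes/ endpoint accepts the usual Gerrit search operators in q=,
so project: and branch: filters can narrow the results to the repositories
that qualify for the contest.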
The real issue is how to match the user in the tool with the reviews
without user intervention. Any ideas or advice are appreciated, but
here are my thoughts on the issue:
1. Gerrit does not seem to support OAuth authentication. I vaguely
remember that the Gerrit account used to be linked to the mw.org
account. Is there any way I could use the mw.org auth to retrieve the
Gerrit account and/or authenticate to Gerrit with it? The gerrit
uploader seems to only use the mw account to put the username in the
committer field and then uploads the change as itself.
2. The simplest (although not so secure) solution would be to ask
people to submit their changes using the same email address used for
their GitHub account. This will only work if the user is willing to
make their GitHub address public (I'm not doing that, for instance).
3. Another idea would be to match the Gerrit account with the GitHub
account. This sounds even less reliable.
4. Give up and ask the users to submit the email/user used for Gerrit
and check for cheaters manually (this should work as long as the
number of contributors is small).
Thanks,
Strainu
[1] https://www.mediawiki.org/wiki/Wikimedia_Challenge_powered_by_ROSEdu
[2] http://challenge.rosedu.org/
[3] https://github.com/rosedu/challenge
[4] https://gerrit.wikimedia.org/r/Documentation/rest-api.html
[5] https://github.com/valhallasw/gerrit-patch-uploader