Hi,
I think this is good news, but it also makes me wonder: we have seen it
happen a couple of times in recent months. Now that the app is in
official WMF hands, how long will it take before they kick the volunteers
out and let the WMF devs work on it?
I would rather have seen this (and other) volunteer projects stay in the
hands of volunteers. Every year the WMF asks for more money and takes
away more and more things the volunteers used to run for free.
I would rather see things moved to volunteers instead of the other way
around.
Paul Vlaan
Hey everyone,
I wanted to invite all of you to the OpenStreetMap hack weekend on July 2
and 3 in Bangalore, hosted at the Mapbox office. We will spend two
full days working on projects like iD, openstreetmap-website, and JOSM.
You can see all the projects and ideas on the wiki:
http://wiki.openstreetmap.org/wiki/Bengaluru_Hack_weekend_July_2016. Pick
one or propose your own.
If you are in town and want to contribute to OpenStreetMap, this is a great
opportunity to get started alongside expert developers on the open mapping
stack! Wikimedia projects have already started integrating maps from OSM
https://www.mediawiki.org/wiki/Maps and this can be a useful space to begin
work on ideas for greater collaboration between the communities.
Please RSVP (https://osmhackweekend.splashthat.com/) and let me know if you
have any questions.
Read more: https://www.mapbox.com/blog/osm-hackweekend/
Looking forward!
Cheers,
Jinal Foflia
== What is happening? ==
Secure connections to RCStream[1] currently use an SSL/TLS certificate[2]
specific to stream.wikimedia.org. To streamline certificate management, we
are moving RCStream behind our misc caching cluster, which will allow us to
use the wildcard certificate[3] for *.wikimedia.org, making the
RCStream-specific certificate redundant. This will reduce operating costs
and improve performance in certain cases.
== When will this happen? ==
June 23rd.
== How could this affect me? ==
This change requires updating the DNS record for stream.wikimedia.org. We
do not expect any service disruptions. It is conceivable (but unlikely)
that you will need to restart your client. If your client is based on one
of the published examples[4], you should be fine. If you are not sure, feel
free to get in touch with me (ori(a)wikimedia.org).
If you are connecting to RCStream over an insecure (http) connection, now
would be a great time to migrate to https. Access to RCStream over http
will eventually be disabled; migrating now will protect you from any
interruptions down the line. In most cases, making your client use https is
as simple as prefixing 'stream.wikimedia.org' with 'https://'. Sample
client code on Wikitech[4] has been updated.
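As a minimal illustration of how small this change usually is (the helper name below is my own, not part of the published client code), upgrading an endpoint to https can be a one-line URL rewrite:

```python
def to_https(url):
    """Upgrade an http:// RCStream endpoint URL to https://.

    Bare hostnames are given the https:// prefix; URLs that already
    use https:// are returned unchanged.
    """
    if url.startswith('http://'):
        return 'https://' + url[len('http://'):]
    if not url.startswith('https://'):
        return 'https://' + url
    return url

print(to_https('http://stream.wikimedia.org'))  # https://stream.wikimedia.org
print(to_https('stream.wikimedia.org'))         # https://stream.wikimedia.org
```

After rewriting the URL, the rest of your client logic should work unchanged.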
== How can I track this work? ==
By following https://phabricator.wikimedia.org/T134871.
[1]: https://wikitech.wikimedia.org/wiki/RCStream
[2]: https://en.wikipedia.org/wiki/Public_key_certificate
[3]: https://en.wikipedia.org/wiki/Wildcard_certificate
[4]: https://wikitech.wikimedia.org/wiki/RCStream#Clients
We just released a new version of Research:FAQ on Meta [1], significantly
expanded and updated, to make our processes at WMF more transparent and to
meet an explicit FDC request to clarify the role and responsibilities of
individual teams involved in research across the organization.
The previous version – written from the perspective of the (now inactive)
Research:Committee, and mostly obsolete since the release of WMF's open
access policy [2] – can still be found here [3].
Comments and bold edits to the new version of the document are welcome. For
any questions or concerns, you can drop me a line or ping my username on-wiki.
Thanks,
Dario
[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953
*Dario Taraborelli *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
On Thu, Feb 4, 2016 at 8:20 AM, MZMcBride <z(a)mzmcbride.com> wrote:
> Federico Leva (Nemo) wrote:
>>Login pretty much never does what I expect nowadays, but I'm not sure my
>>expectations are correct so I can't identify actual bugs.
>
> There are various open tasks in Phabricator about user sessions currently,
> such as <https://phabricator.wikimedia.org/T124440>. Being unexpectedly
> logged out lately has been a bit annoying, though I don't know if it's
> related to the Performance team or some other team.
The unexpected logouts stem from the AuthManager project, and
specifically from the SessionManager component that rolled out in
1.27.0-wmf.11 [0]. We had various issues related to the session
handling changes, including a bug that overloaded the storage
capacity of the Redis servers that store session data [1] and two
other issues that required rolling the wikis back to 1.27.0-wmf.10
[2][3].
Both rollbacks were accompanied by a run of the
"resetGlobalUserTokens.php" maintenance script, which updates each
user's CentralAuth records in such a way that their authentication
session will be considered invalid the next time it is used on a wiki.
This was done out of an abundance of caution concerning possible
issues with sessions that had been issued by the SessionManager
software. The reset script is not fast [4], so session invalidation
has slowly worked its way across the CentralAuth user table.
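Mechanically, the idea behind a per-user token reset can be pictured like this: each user record holds a token that every stored session must match, and rotating that token orphans all of the user's existing sessions. This is a rough sketch with invented field and function names, not the actual CentralAuth schema or script logic:

```python
import secrets

def reset_auth_token(user_record):
    # Rotate the user's stored auth token; any session still carrying
    # the old token fails validation the next time it is checked.
    user_record['auth_token'] = secrets.token_hex(16)
    return user_record

def session_is_valid(session, user_record):
    # A session is only honored while its token matches the user record.
    return session['token'] == user_record['auth_token']

user = {'auth_token': secrets.token_hex(16)}
session = {'token': user['auth_token']}
assert session_is_valid(session, user)      # valid before the reset
reset_auth_token(user)
assert not session_is_valid(session, user)  # invalidated afterwards
```

Because this has to touch every row in the user table one at a time, a full reset run is inherently slow on a table of that size.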
Part of the enhancements being applied to bring SessionManager back
to production with 1.27.0-wmf.13 is a new configuration setting that
gives us a nearly instant switch to throw to invalidate all active
sessions. This setting is actually included in 1.27.0-wmf.12, but the
configuration on the Wikimedia cluster has not yet been changed to
make use of it. Invalidating all user sessions is certainly not
something we plan to do for fun, but there have been in the past (and
likely will be in the future) software and configuration issues that
necessitate that heavy-hammer approach.
[0]: https://phabricator.wikimedia.org/T123451
[1]: https://phabricator.wikimedia.org/T125267
[2]: https://wikitech.wikimedia.org/wiki/Incident_documentation/20160123-Session…
[3]: https://tools.wmflabs.org/sal/log/AVKZtfQXW8txF7J0uNE2
[4]: https://phabricator.wikimedia.org/T124861
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hello,
From the Wikimedia Foundation Discovery department, here are this week's
updates.
* The Portal team turned off the A/B test for the new
languages-by-article-count display [1]
* The Portal team turned on the survey asking 'how did you arrive at
wikipedia.org?' (second run of the survey) [2]
* Maps has moved to shiny new servers [3]
* Wikidata and Commons can now use maps [4] [5]
* Russian Wikivoyage has switched to the new <mapframe> and <maplink> tags
* Various Discernatron updates: snippets open by default, an
open/close-all-snippets button, a larger click area for opening and
closing snippets, and snippet highlighting
* Started an A/B test of interwiki search based on language detection on
enwiki, frwiki, eswiki, dewiki and itwiki
* Did you know? Wikimedia wikis see about 31M full text searches a day.
[1] https://phabricator.wikimedia.org/T131526
[2] https://phabricator.wikimedia.org/T136874
[3] https://phabricator.wikimedia.org/T137620
[4] https://phabricator.wikimedia.org/T138030
[5] https://phabricator.wikimedia.org/T138029
----
Feedback and suggestions on this weekly update are welcome!
The full update, and archive of past updates, can be found on Mediawiki.org:
https://www.mediawiki.org/wiki/Discovery/Status_updates
--
Yours,
Chris Koerner
Community Liaison - Discovery
Wikimedia Foundation
I'm happy to announce a new mirror for datasets other than the XML dumps.
This mirror comes to us courtesy of the Center for Research Computing,
University of Notre Dame, and covers everything "other" [1], which
includes such goodies as Wikidata entity dumps, pageview counts, titles
of all files on each wiki (daily), titles of all articles on each wiki
(daily), and the so-called "adds-changes" dumps, among other things. You
can access it at http://wikimedia.crc.nd.edu/other/ -- so please do!
Ariel
[1] https://dumps.wikimedia.org/other/
The Wikimedia Language team has been assembling monthly reports about
language support activities for one year. You can read the latest
report at:
https://www.mediawiki.org/wiki/Wikimedia_Language_engineering/Reports/2016-…
Highlights for May include an edit summary field for Special:Translate
and modernized web font formats: woff2 is in, eot is out.
Due to the nature of our work, the Language team [1] (Amir, Kartik,
Pau, Runa, Santhosh, and myself) alone cannot adequately support all
the languages of the Wikimedia movement. That is why the report
includes work by volunteers. We have bolded the names of those who we
believe are contributing as volunteers.
This report focuses on technical activities. You won't find future
plans or high-level roadmap items in it. There is currently one major
omission: the i18n work in MediaWiki core itself. That is missing
because it is more difficult to filter those activities, and also
because we have not had much time for MediaWiki core i18n work.
To acknowledge the work of volunteers and to support them better, the
Language team released a statement of intent for code review [2] about
six months ago. To summarize: we aim to review patches from outside the
team within a week, and patches that have stalled for three months with
no updates after review will be abandoned -- unless we feel they are
worth fixing ourselves.
When we released the statement, we also agreed to reduce the existing
backlog of open patches. The results so far are positive, even though
it is easy to find examples where we have not been able to follow our
intent. The Translate extension had 35 open patches when we started in
February; at the end of May it had only 12 [3]. Universal Language
Selector went from 10 to 6, with fewer of them unreviewed. Content
Translation went from 15 to zero. Our jQuery repositories on GitHub
have not fared as well, but we hope to achieve similar results there
in the future.
We excluded many repositories from the statement of intent for fear
that we would add too much of a burden to ourselves. To our delight,
with the exception of MediaWiki core i18n, all those repositories have
had swift reviews, and I count only two open patches in them.
- Niklas (on behalf of the Language team)
[1] https://www.mediawiki.org/wiki/Wikimedia_Language_engineering
[2] https://www.mediawiki.org/wiki/Wikimedia_Language_engineering/Code_review_s…
[3] The numbers change constantly. As of 2016-06-17 Translate has 23
open patches, but only 10 of them not from our team. Universal
Language Selector has 13 patches, 5 of them not from our team. Content
Translation currently has 6, one of them not from our team.