Hi,
On Tue, Mar 1, 2016 at 3:36 PM, David Strine <dstrine(a)wikimedia.org> wrote:
> We will be holding this brownbag in 25 minutes. The Bluejeans link has
> changed:
>
> https://bluejeans.com/396234560
I'm not familiar with Bluejeans and may have missed a transition because
I wasn't paying enough attention. Is this some kind of experiment? Have
all meetings transitioned to this service?
Anyway, my immediate question is: how do you join without sharing your
microphone and camera?
Am I correct in thinking that this is an entirely proprietary stack
that's neither gratis nor libre and has no on-premise (not cloud)
hosting option? Are we paying for this?
-Jeremy
Hi,
YuviPanda, prtksxna, and I (with help from Tim and Aaron) have been
working on the UrlShortener extension, which is designed to implement
the URL shortener RfC [1] (specifically Tim's implementation suggestion).
I've filed T108557[2] to deploy the extension to Wikimedia wikis. We'd
like to use the "w.wiki" short domain, which the WMF is already in
control of.
A test wiki has been set up mimicking what Wikimedia's configuration
would be like: http://urlshortener.wmflabs.org/, and has an accompanying
"short" domain at us.wmflabs.org (e.g. http://us.wmflabs.org/3). Please
play with it and report any bugs you might find :)
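For anyone who wants to poke at it programmatically, here is a rough sketch of how a client might call the extension's API on the test wiki. The module name (action=shortenurl), parameter name, and api.php path are assumptions on my part and may not match the deployed extension exactly:

```python
# Hypothetical sketch of a shorten request against the test wiki.
# The "shortenurl" action and "url" parameter are assumed names.
from urllib.parse import urlencode

def build_shorten_request(target_url,
                          api="http://urlshortener.wmflabs.org/w/api.php"):
    """Return the POST endpoint and urlencoded form body for a shorten request."""
    body = urlencode({
        "action": "shortenurl",  # assumed module name
        "format": "json",
        "url": target_url,
    })
    return api, body

endpoint, payload = build_shorten_request(
    "https://en.wikipedia.org/wiki/Special:Version")
```

POSTing `payload` to `endpoint` should, if the module name is right, return JSON containing the short URL.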
[1] https://www.mediawiki.org/wiki/Requests_for_comment/URL_shortener
[2] https://phabricator.wikimedia.org/T108557
Thanks,
-- Legoktm
On Thu, Feb 4, 2016 at 8:20 AM, MZMcBride <z(a)mzmcbride.com> wrote:
> Federico Leva (Nemo) wrote:
>>Login pretty much never does what I expect nowadays, but I'm not sure my
>>expectations are correct so I can't identify actual bugs.
>
> There are various open tasks in Phabricator about user sessions currently,
> such as <https://phabricator.wikimedia.org/T124440>. Being unexpectedly
> logged out lately has been a bit annoying, though I don't know if it's
> related to the Performance team or some other team.
The unexpected logouts originate in the AuthManager project,
specifically the SessionManager component that rolled out in
1.27.0-wmf.11 [0]. We had various issues related to the session
handling changes, including a bug that was overloading the storage
capacity of the Redis servers that store session data [1] and two
other issues which required rolling the wikis back to 1.27.0-wmf.10
[2][3].
Both rollbacks were accompanied by a run of the
"resetGlobalUserTokens.php" maintenance script, which updates each
user's CentralAuth records so that their authentication session will
be considered invalid the next time it is used on a wiki. This was
done out of an abundance of caution over possible issues with sessions
that had been issued by the SessionManager software. The reset script
is not fast [4], so session invalidation has slowly worked its way
across the CentralAuth user table.
Among the enhancements being applied to bring SessionManager back to
production with 1.27.0-wmf.13 is a new configuration setting that
gives us a nearly instant switch to throw to invalidate all active
sessions. This setting is actually included in 1.27.0-wmf.12, but the
configuration on the Wikimedia cluster has not yet been changed to
make use of it. Invalidating all user sessions is certainly not
something we plan to do for fun, but there have been in the past (and
likely will be in the future) software and configuration issues that
necessitate that heavy-hammer approach.
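To illustrate the mechanism (this is a toy sketch, not the actual MediaWiki or CentralAuth code): each session carries a copy of a per-user token, and rotating the user's stored token makes every outstanding session fail validation the next time it is checked. All names here are invented for illustration:

```python
# Toy model of token-based session invalidation: rotating a user's
# token invalidates every session minted against the old token.
import secrets

users = {"alice": {"token": secrets.token_hex(16)}}
sessions = {}

def open_session(user):
    """Mint a session bound to the user's current token."""
    sid = secrets.token_hex(8)
    sessions[sid] = {"user": user, "token": users[user]["token"]}
    return sid

def session_valid(sid):
    """A session is valid only if its token still matches the user's."""
    s = sessions.get(sid)
    return s is not None and s["token"] == users[s["user"]]["token"]

def reset_user_token(user):
    """Conceptually what a per-user token reset does."""
    users[user]["token"] = secrets.token_hex(16)

sid = open_session("alice")
assert session_valid(sid)
reset_user_token("alice")
assert not session_valid(sid)  # invalid on next use, as described above
```

Because the real reset script walks the whole CentralAuth user table doing this one user at a time, the invalidation spreads gradually rather than instantly.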
[0]: https://phabricator.wikimedia.org/T123451
[1]: https://phabricator.wikimedia.org/T125267
[2]: https://wikitech.wikimedia.org/wiki/Incident_documentation/20160123-Session…
[3]: https://tools.wmflabs.org/sal/log/AVKZtfQXW8txF7J0uNE2
[4]: https://phabricator.wikimedia.org/T124861
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Dear readers of the Wikitech mailing list,
I am a member of the Wikipedia community and I have started a project to
reduce the environmental impact of the Wikimedia movement
<https://meta.wikimedia.org/wiki/Environmental_impact>. The main idea is to
use renewable energy for running the Wikimedia servers and the main reason
for this is that by doing so, Wikipedia can set a great example for
environmental responsibility in the entire internet sector.
My project was started after Greenpeace USA published a report
<http://www.greenpeace.org/usa/global-warming/click-clean/> in 2015
about the energy consumption of the biggest sites on the Internet, in
which Wikipedia, to my astonishment, performed poorly, receiving a "D"
score and only passing because of the Wikimedia Foundation's openness
about its energy consumption.
I would very much like to change that and set up a page called "Environmental
impact <https://meta.wikimedia.org/wiki/Environmental_impact>" on Meta. I
have already discussed the issue with a few people both from the Wikimedia
Foundation's management and from the Wikimedia community and have received
positive responses.
In order to further advance the project, I would like to learn more about
how much energy Wikipedia's servers use. As far as I can tell, these
figures are not public, but I believe they could very well be.
Also, I am interested in learning how changing a server site's energy
sources can be carried out on the operations side, since the United
States energy sector hasn't been completely deregulated yet.
So, thank you very much for any comments! Perhaps there is an even
better forum to discuss these questions?
Finally, if you would like to support my project, please consider adding
your name to this list
<https://meta.wikimedia.org/wiki/Environmental_impact#Show_your_support>.
Thank you.
Kind regards,
Lukas Mezger / User:Gnom <https://meta.wikimedia.org/wiki/User:Gnom>
Hi all,
I would like to introduce WikiToLearn to the Wikimedia community: a
FOSS project in which I participate and which has lately been getting
a lot of contributions and momentum.
It is a KDE project sponsored (among others) by Wikimedia Italy and
recently joined by institutions such as the HEP Software Foundation (CERN,
Fermilab, Princeton...) and universities such as the University of Pisa
and Milano-Bicocca. These institutions are already populating the website
with content.
We aim to provide a platform where learners and teachers can complete, refine
and re-assemble lecture notes in order to create free, collaborative and
accessible textbooks, tailored precisely to their needs.
Although the project is quite young (only a few months old), it is already
growing at an unexpected rate. Thanks to this, we can now count on nearly
40 developers (including content developers), and growing.
We are different from Wikipedia and other WMF projects in several ways, and in
a sense, complementary. Our focus is on creating complete textbooks (and not
encyclopedic articles), drawing from a professor’s or a student’s own notes,
either existing or that have to be written down.
We also have a strong focus on offline use: all the content of WikiToLearn
should be easily printable by any student for offline use and serious
studying.
Besides a good team for content development, we can count on a small but
motivated team of developers, and we would like to improve communication with
upstream (a.k.a. you ;-) ), because we found ourselves developing a few
features which could probably be made available to the general public, with
some generalization and polishing. ;-)
Is this the right place to start such a discussion?
We would like to help as much as we can, but we might need some mentoring in
how to best approach MediaWiki development, as many of us are relatively new
to OSS/Web development.
Bye,
-Riccardo
We have decided to officially retire the rest.wikimedia.org domain in
favor of /api/rest_v1/ at each individual project domain. For example,
https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
becomes
https://en.wikipedia.org/api/rest_v1/?doc
Most clients already use the new path, and benefit from better
performance from geo-distributed caching, no additional DNS lookups,
and sharing of TLS / HTTP2 connections.
We intend to shut down the rest.wikimedia.org entry point around
March, so please adjust your clients to use /api/rest_v1/ soon.
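For clients with many hard-coded URLs, the migration can be mechanical. This is a small helper sketch (not an official tool) that rewrites an old rest.wikimedia.org URL into the equivalent per-project path:

```python
# Rewrite https://rest.wikimedia.org/<project>/v1/<rest> to
# https://<project>/api/rest_v1/<rest>, preserving the query string.
from urllib.parse import urlsplit

def migrate_rest_url(old_url):
    parts = urlsplit(old_url)
    # Old path form: /<project-domain>/v1/<rest-of-path>
    project, version, *rest = parts.path.lstrip("/").split("/", 2)
    tail = rest[0] if rest else ""
    new = f"https://{project}/api/rest_{version}/{tail}"
    if parts.query:
        new += "?" + parts.query
    return new

migrate_rest_url("https://rest.wikimedia.org/en.wikipedia.org/v1/?doc")
# → "https://en.wikipedia.org/api/rest_v1/?doc"
```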
Thank you for your cooperation,
Gabriel
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
I've noticed that a lot of my fellow MediaWiki devs using OS X or Linux
don't have much experience with Windows 10's new 'Edge' browser, so I wrote
up some notes about testing with it and what to expect regarding versioning
and bug reporting:
https://www.mediawiki.org/wiki/Microsoft_Edge_browser_testing_notes
In particular, note that although versioning of the browser engine is tied
to the Windows 10 operating system, the OS is updated much more
aggressively than older versions of Windows.
The 'Windows Insider' preview release program also gives a chance to check
new engine features for bugs, or confirm that a reported bug has been fixed
correctly, before major OS updates go out. Not as good as the nightly
browser builds we get from Mozilla and Google, but it's a big improvement
from IE days. :)
-- brion
Hello,
The WMF’s Technology department has the goal this quarter of testing
and temporarily switching the main operational data centre from Eqiad
(located in Ashburn) to Codfw (located in Dallas) [1,2]. This includes
both back-end processing as well as serving live traffic from it.
As a part of this effort, we are scheduling a switch-over for RESTBase and
its back-end services, including: Parsoid, the Mobile Content Service,
CXServer, Mathoid, Citoid, Apertium and Zotero [3]. Technically, it will
not be a real switch-over per se, because we will keep all of those
services active in both DCs. However, external traffic will be directed to
the Dallas DC only.
=== When is it and what does it mean for me? ===
The switch-over test is planned for this Thursday, 2016-03-17. We have
allotted a three-hour window for this [4]. There is nothing users should
do before or after the switch; it will be transparent for them. There are
two things users should note, though:
1) At the time of the switch-over, users might receive error responses for
a while (both 4xx and 5xx status codes). While we will test most of the
things ahead of time, we cannot test the actual traffic shifting, so small
bumps might be noticed.
2) After the switch to the Dallas DC, users will likely see slightly
elevated response latencies. This is because all of the services
responding to live requests will still need to contact the main
MediaWiki cluster, which remains in Eqiad (the other DC) until a
complete switch-over of the infrastructure is performed. However, given
the multiple levels of caching, the ~40 ms penalty of going cross-DC
for an uncached API request should not be too taxing.
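Client authors who want to smooth over the brief error bumps during the traffic shift could wrap their calls in a simple retry with exponential backoff. A minimal sketch (the helper name and defaults are my own, not anything WMF ships):

```python
# Generic retry-with-backoff wrapper: absorb brief 4xx/5xx bumps
# during the switch-over window by retrying with growing delays.
import time

def with_retries(call, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Invoke call(); on failure back off 0.5s, 1s, 2s, ... then re-raise."""
    for i in range(attempts):
        try:
            return call()
        except Exception:
            if i == attempts - 1:
                raise
            sleep(base_delay * (2 ** i))

# Example: a call that fails twice with a transient error, then succeeds.
state = {"n": 0}
def flaky():
    state["n"] += 1
    if state["n"] < 3:
        raise RuntimeError("503 during switch-over")
    return "ok"

result = with_retries(flaky, sleep=lambda s: None)  # skip real sleeping
# result == "ok" after two retried failures
```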
=== Wait, what about my service X running in WMF production? ===
If you are a service owner of one of the aforementioned services, there are no
explicit actions you should take prior to, during or after the switch-over
test. This test could, however, affect your service, depending on
whether it usually serves live traffic or mostly performs work during
internal updates. MediaWiki and JobQueue processing will still be
performed in Eqiad, so in the latter case your service should not see
a change in its usage pattern. If, however, your service is mostly in
charge of responding to live requests coming through RESTBase, those
will be handled by instances in Codfw. As these services are full
replicas of their Eqiad counterparts and are stateless, no major
breakage should occur.
Should you have any questions or concerns, don’t hesitate to contact us
here or on IRC (#wikimedia-services @ freenode).
Best,
Marko Obrovac, PhD
Senior Services Engineer
Wikimedia Foundation
[1]
https://www.mediawiki.org/wiki/Wikimedia_Engineering/2015-16_Q3_Goals#Techn…
[2] https://phabricator.wikimedia.org/project/profile/1723/
[3] https://phabricator.wikimedia.org/T127974
[4]
https://wikitech.wikimedia.org/wiki/Deployments#Thursday.2C.C2.A0March.C2.A…
Hello everyone,
The Related Pages feature has been in beta for over two months now, and
the future of the feature depends on our discussions. While we currently
don't have a clear process for deciding collaboratively on a product for
all languages, Alsee and the Reading team have put together this document
on Meta [0] as a request for comment, seeking comments and ideas on the
modifications required and on how to further test the feature. In fact,
we are not sure whether an RfC is the best strategy for moving forward
with product decisions, but let's see how the discussion evolves; we
might explore the need for a different process as we move on with this
one.
We managed to translate a brief introduction to the topic; please feel
free to fully translate the document and/or further promote the
discussion on your wiki. We are trying hard to avoid an English-centric
discussion for a feature that could be available across all language
projects. While we don't have a clear solution for this, we are trying
this method as an experiment, so that at least our communities can leave
comments in their preferred language if they aren't comfortable writing
in English but can understand it.
Please check the page, help with translation or promotion on your
Wikipedia, and most importantly, comment on how you think it can evolve. :)
Let's see how this works!
All the best,
M
[0] https://meta.wikimedia.org/wiki/Requests_for_comment/Related_Pages