On Thu, Feb 4, 2016 at 8:20 AM, MZMcBride <z(a)mzmcbride.com> wrote:
> Federico Leva (Nemo) wrote:
>>Login pretty much never does what I expect nowadays, but I'm not sure my
>>expectations are correct so I can't identify actual bugs.
> There are various open tasks in Phabricator about user sessions currently,
> such as <https://phabricator.wikimedia.org/T124440>. Being unexpectedly
> logged out lately has been a bit annoying, though I don't know if it's
> related to the Performance team or some other team.
The unexpected logouts trace back to the AuthManager project, and
specifically to the SessionManager component that rolled out in
1.27.0-wmf.11. We had various issues related to the session
handling changes, including a bug that was overloading the storage
capacity of the Redis servers that store session data, and two
other issues which required rolling the wikis back to 1.27.0-wmf.10.
Both rollbacks were accompanied by a run of the
"resetGlobalUserTokens.php" maintenance script, which updates each
user's CentralAuth records in such a way that their authentication
session will be considered invalid the next time it is used on a wiki.
This was done out of an abundance of caution concerning possible
issues with sessions that had been issued by the SessionManager
software. The reset script is not fast, so session invalidation has
slowly worked its way across the CentralAuth user base.
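To illustrate the mechanism of a per-user token reset, here is a
conceptual sketch (not MediaWiki's actual code; every name in it is
invented): a session remembers a per-user token, and rotating the stored
token is what makes each outstanding session invalid on its next use.

```python
import secrets

class UserRecord:
    """Stands in for a user's stored authentication record."""
    def __init__(self):
        self.auth_token = secrets.token_hex(16)

def issue_session(user):
    # A session remembers the token value that was current at login time.
    return {"token": user.auth_token}

def session_is_valid(user, session):
    # On each use, the session's embedded token is checked against the
    # stored one; a mismatch means the session is rejected.
    return session["token"] == user.auth_token

def reset_token(user):
    # Conceptually what a per-user token reset does: replace the stored
    # token, orphaning every session issued with the old value.
    user.auth_token = secrets.token_hex(16)

user = UserRecord()
session = issue_session(user)
assert session_is_valid(user, session)
reset_token(user)
assert not session_is_valid(user, session)  # old session now invalid
```

This also shows why the script is slow at scale: the reset has to touch
every user record individually.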
Part of the enhancements being applied to bring SessionManager back to
production with 1.27.0-wmf.13 is a new configuration setting that
gives us a nearly instant switch to throw to invalidate all active
sessions. This setting is actually included in 1.27.0-wmf.12, but the
configuration on the Wikimedia cluster has not yet been changed to
make use of it. Invalidating all user sessions is certainly not
something we plan to do for fun, but there have been in the past (and
likely will be in the future) software and configuration issues that
necessitate the use of that heavy hammer.
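A minimal sketch of how such a near-instant switch could work, under
the assumption that it is implemented as a site-wide cutoff timestamp
(the setting name and shape here are invented for illustration, not the
actual MediaWiki configuration):

```python
import time

# Invented config shape: one site-wide cutoff; sessions created before
# it are rejected regardless of their other credentials.
config = {"session_reset_time": 0.0}

def new_session():
    return {"created": time.time()}

def session_is_valid(session):
    return session["created"] >= config["session_reset_time"]

session = new_session()
assert session_is_valid(session)

# Throwing the switch: move the cutoff past every live session's
# creation time, invalidating all of them at once.
config["session_reset_time"] = session["created"] + 1
assert not session_is_valid(session)
```

Unlike the per-user script, this is a single configuration change, which
is what makes it "nearly instant".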
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Dear readers of the Wikitech mailing list,
I am a member of the Wikipedia community and I have started a project to
reduce the environmental impact of the Wikimedia movement
<https://meta.wikimedia.org/wiki/Environmental_impact>. The main idea is to
use renewable energy for running the Wikimedia servers and the main reason
for this is that by doing so, Wikipedia can set a great example for
environmental responsibility in the entire internet sector.
My project was started after Greenpeace USA published a report
<http://www.greenpeace.org/usa/global-warming/click-clean/> about the
energy consumption of the biggest sites on the Internet in 2015, in
which Wikipedia, to my astonishment, performed poorly, receiving a "D"
score and only passing because of the Wikimedia Foundation's openness
about its energy consumption.
I would very much like to change that and set up a page called "Environmental
impact <https://meta.wikimedia.org/wiki/Environmental_impact>" on Meta. I
have already discussed the issue with a few people, both from the
Wikimedia Foundation's management and from the Wikimedia community, and
have received positive feedback so far.
In order to further advance the project, I would like to learn more about
how much energy Wikipedia's servers use. As far as I can tell, these
figures are not public, but I believe they could very well be.
Also, I am interested in learning how changing a server site's energy
sources can be carried out on the operations side, since the United
States energy sector has not been completely deregulated yet.
So, thank you very much for any comments! Perhaps there is an even
better forum to discuss these questions?
Finally, if you would like to support my project, please consider
adding your name to this list.
Lukas Mezger / User:Gnom <https://meta.wikimedia.org/wiki/User:Gnom>
The next CREDIT showcase will be Thursday, 12-May-2016 at 1800 UTC (1100 PDT).
For this one we'll use Hangouts on Air for presenters, and the customary
YouTube stream for viewers.
See you next month!
2016-04-12 14:01 GMT+03:00 Adrian Heine <adrian.heine(a)wikimedia.de>:
> Hi everyone,
> as some of you might know, I'm a software developer at Wikimedia
> Deutschland, working on Wikidata. I'm currently focusing on improving
> Wikidata's support for languages we as a team are not using on a daily
> basis. As part of my work I stumbled over a shortcoming in MediaWiki's
> message system that – as far as I see it – prevents me from doing the right
> thing(tm). I'm asking you to verify that the issue I see indeed is an issue
> and that we want to fix it. Subsequently, I'm interested in hearing your
> plans or goals for MediaWiki's message system so that I can align my
> implementation with them. Finally, I am hoping to find someone who is
> willing to help me fix it.
First of all, thanks for working on this issue. It is a real issue,
but not often requested. I think that is because manually checking in
every place whether the language code is unexpected (different from
the one in the current context) would be cumbersome, and always
outputting language codes on every tag would be bloated. It would be
best if this checking were automated in a templating library, but so
far templating hasn't seen much adoption in MediaWiki core. Of course,
this information needs to be exposed first, which is what I understand
you are working on.
> == The issue ==
> On Wikidata, we regularly have content in different languages on the same
> page. We use the HTML lang and dir attributes accordingly. For example, we
> have a table with terms for an entity in different languages. For missing
> terms, we would display a message in the UI language within this table. The
> corresponding HTML (simplified) might look like this:
> <div id="mw-content-text" lang="UILANG" dir="UILANG_DIR">
>   <table class="entity-terms">
>     <tr class="entity-terms-for-OTHERLANG1" lang="OTHERLANG1"
>         dir="OTHERLANG1_DIR">
>       <td class="entity-terms-for-OTHERLANG1-label">
>         <div class="wb-empty" lang="UILANG" dir="UILANG_DIR">
>           <!-- missing label message -->
>         </div>
>       </td>
>     </tr>
>   </table>
> </div>
> This works great as long as the missing label message is available in the UI
> language. If that is not the case, though, the message is translated
> according to the defined language fallbacks. In that case, we might end up
> with something like this:
> <div class="wb-empty" lang="arc" dir="rtl">No label defined</div>
> That's obviously wrong, and I'd like to fix it.
> == Fixing it ==
> For fixing this, I tried to make MessageCache provide the language a message
> was taken from. That's not too straightforward to begin with, but while
> working on it I realized that MessageCache is only responsible for following
> the language fallback chain for database translations. For file-based
> translations, the fallbacks are directly merged in by LocalisationCache, so
> the information is not there anymore at the time of translating a message. I
> see some ways to fix this:
> * Don't merge messages in LocalisationCache, but perform the fallback on
> request (possibly caching the result)
> * Tag message strings in LocalisationCache with the language they are in
> (sounds expensive to me)
> * Tag message strings as being a fallback in LocalisationCache (that way we
> could follow the fallback until we find a language in which the message
> string is not tagged as being a fallback)
> What do you think?
The current localisation cache implementation quite obviously trades
space for speed. In this light I would suggest option two, to tag the
actual language the string is in.
However, this trade-off might not make sense anymore, as we have more
languages and more messages, resulting in caches approaching a
gigabyte in size.
See also for example https://phabricator.wikimedia.org/T99740. I added
wikitech-l to CC in hopes that people who have worked on localisation
cache more recently would comment on whether option one, to not merge
messages, would make more sense nowadays.
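For the sake of discussion, here is a small sketch of what option one
(unmerged per-language stores with lookup-time fallback) could look
like when it also exposes the source language. The function name,
message key, and fallback chains are invented for illustration, not
MediaWiki's real API:

```python
# Invented fallback chains and per-language message stores, kept
# unmerged; the fallback chain is walked at lookup time instead.
FALLBACKS = {"arc": ["tr", "en"], "tr": ["en"]}

MESSAGES = {
    "en": {"wikibase-label-empty": "No label defined"},
    "tr": {},
    "arc": {},
}

def resolve_message(key, lang):
    """Return (text, actual_language) so callers can emit correct
    lang/dir attributes even when a fallback was used."""
    for candidate in [lang] + FALLBACKS.get(lang, []):
        text = MESSAGES.get(candidate, {}).get(key)
        if text is not None:
            return text, candidate
    raise KeyError(key)

text, actual = resolve_message("wikibase-label-empty", "arc")
assert (text, actual) == ("No label defined", "en")
# The caller can now emit lang="en" dir="ltr" instead of the wrong
# lang="arc" dir="rtl" from the example earlier in the thread.
```

Option two or three would instead bake the same `actual` value into the
merged cache at build time, trading cache size for lookup speed.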
>  https://gerrit.wikimedia.org/r/282133
PhpStorm, IntelliJ IDEA, ReSharper and other JetBrains users: we just
received free upgraded licenses for all of their products.
Check your account at https://account.jetbrains.com/licenses and log in
with that account inside your application to automatically use that
license. If you don't see the license, contact me or Sam Reed and we
will add you right away (on IRC: yurik or reedy, or via email). We
could also use a few more admins for these licenses. You will no
longer need to copy/paste any license keys.
* ReSharper: for C# and Microsoft Visual Studio C++
* CLion: for C++
Right now, MediaWiki has two pure-PHP engines to produce diffs (there's
also a native PHP extension, wikidiff2, but we're not discussing it
right now):
* DairikiDiff, which everybody uses, and
* Wikidiff3, an alternative implementation by Guy Van den Broeck that
has been around for 8 years but requires a configuration change to
enable.
While less battle-tested, Wikidiff3 offers vastly improved performance
on heavy diffs compared to DairikiDiff. The price, however, is that it
takes certain shortcuts if the diff is too complex. I ran 100K diffs
from the English Wikipedia through both engines, and 6% of the diffs
came out different. Many of the changes seem insignificant, but I need
your help determining whether they are acceptable.
I've built this tool
<https://diff-forge.wmflabs.org/wiki/Special:DiffCompare> to facilitate
the comparison. It displays two diffs from different algorithms side by
side (yeah, it can get too wide, I know :P). Which of them is which is
random. Parts that differ between the implementations are highlighted
in yellow. Below the diffs is the diff of the differences, for
reference. You can vote with the buttons above the diffs; no
registration is required. If you see a catastrophically bad diff,
please send me the link.
Unless the results are significantly worse, I'd like to go ahead and
make Wikidiff3 the only implementation.
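For reference, the comparison methodology described above can be
approximated in a few lines. This sketch uses Python's difflib as a
stand-in for the two PHP engines (two configurations of the same
algorithm here, purely for illustration), counting how often their
outputs disagree on the same revision pairs:

```python
import difflib

def diff_a(old, new):
    # First "engine": unified diff with the default three context lines.
    return list(difflib.unified_diff(old, new, lineterm=""))

def diff_b(old, new):
    # Second "engine": same algorithm with one context line, standing in
    # for a genuinely different implementation.
    return list(difflib.unified_diff(old, new, n=1, lineterm=""))

pairs = [
    (list("abcdefg"), list("abcXefg")),
    (["one"], ["one", "two"]),
]
disagreements = sum(
    1 for old, new in pairs if diff_a(old, new) != diff_b(old, new)
)
rate = disagreements / len(pairs)
print(f"{disagreements}/{len(pairs)} pairs differ ({rate:.0%})")
# prints: 1/2 pairs differ (50%)
```

A real harness would normalize formatting differences first, so that
only substantive disagreements (changed hunks, not context width) are
counted toward the 6% figure.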
Max Semenik ([[User:MaxSem]])