If you guys had a bitcoin option in your donation form, you'd get more
donations (like from me!). I don't have any balance in PayPal, but
sending some bitcoin would be easy.
Cross-posting this cool blog post about how the Discovery Analysis team
'puppetized' the dashboard backend and learned a ton in the process:
https://blog.wikimedia.org/2017/08/21/discovery-dashboards-puppet/.
Cheers,
Deb
--
deb tankersley
irc: debt
Product Manager, Discovery
Wikimedia Foundation
---------- Forwarded message ----------
From: Mikhail Popov <mpopov(a)wikimedia.org>
Date: Mon, Aug 21, 2017 at 12:14 PM
Subject: Re: [discovery] Puppetized Discovery Dashboards and Shiny Server
module for Puppet
To: A public mailing list about Wikimedia Search and Discovery projects <
discovery(a)lists.wikimedia.org>
Howdy, Discoverers!
The blog post describing the dashboard Puppetization process just went
up[1]. It explains Puppet and includes tips & resources for learning Puppet
for non-Ops people. If you've been curious about the technology, I
recommend you check out the post.
Cheers,
Mikhail
[1] https://blog.wikimedia.org/2017/08/21/discovery-dashboards-puppet/
On Tue, Jun 20, 2017 at 12:16 PM, Mikhail Popov <mpopov(a)wikimedia.org>
wrote:
> Howdy,
>
> Happy to report that production[1] and development[2] sets of Discovery
> Dashboards are up and running again, this time managed by Puppet. (There
> was a bug with web proxies and DNS settings that delayed this
> announcement.) Theoretically they should be snappier to use now because
> there is no longer an extra virtualization (Vagrant) layer and they are
> running directly on Labs instances.
>
> R is a programming language and environment mainly used for statistical
> inference, machine learning, and data wrangling & visualization. RStudio's
> Shiny[3] is a framework for developing web applications in R, and it's what
> Discovery's dashboards are written in.
>
> The Reading::Discovery::Analysis team (with guidance and help from
> Guillaume Lederrey) is proud to announce a new module available in Ops'
> Puppet repo: shiny_server[4], which installs & configures RStudio's Shiny
> Server[5] for serving R/Shiny applications. The module also provides
> resources for installing R packages from CRAN, GitHub, and other remote git
> repositories like Gerrit. For a practical example, refer to the Discovery
> Dashboards base[6] and production[7] profiles.
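>
> As a rough illustration only (the class and resource names below are
> assumptions for the sake of example, not the module's actual interface;
> see the base[6] and production[7] profiles for real usage), a profile
> built on such a module could look roughly like this:
>
>     # Hypothetical Puppet profile using a Shiny Server module.
>     class profile::example_dashboards {
>         # Install and configure RStudio's Shiny Server itself.
>         include ::shiny_server
>
>         # Install an R package from CRAN (illustrative defined type).
>         shiny_server::cran_pkg { 'ggplot2': }
>
>         # Install an R package from a remote git repository such as Gerrit
>         # (illustrative resource name and example repository URL).
>         shiny_server::git_pkg { 'mydashboardpkg':
>             url => 'https://gerrit.example.org/r/analytics/mydashboardpkg',
>         }
>     }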
>
> Cheers,
> Mikhail on behalf of Discovery Analysts
>
> [1] https://discovery.wmflabs.org
> [2] https://discovery-beta.wmflabs.org/
> [3] https://shiny.rstudio.com/
> [4] https://github.com/wikimedia/puppet/tree/production/modules/shiny_server
> [5] https://www.rstudio.com/products/shiny/shiny-server/
> [6] https://github.com/wikimedia/puppet/blob/production/modules/profile/manifests/discovery_dashboards/base.pp
> [7] https://github.com/wikimedia/puppet/blob/production/modules/profile/manifests/discovery_dashboards/production.pp
>
_______________________________________________
Discovery mailing list
Discovery(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/discovery
Sorry for cross-posting!
On Wednesday, August 23rd, 2017 at 3 pm UTC, we start with our weekly
Technical Advice IRC Meeting on #wikimedia-tech IRC channel.
The Technical Advice IRC meeting is open to all volunteer developers,
topics, and questions. These can be anything from "how do I get started?" to
"who would be the best contact for X?" to specific questions about your project.
If you already know what you would like to discuss or ask, please add your
topic to the next meeting:
https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
This meeting is offered by WMDE’s tech team. The hosts of the meeting are
@addshore and @CFisch_WMDE.
Hope to see you there!
Michi (for WMDE’s tech team)
--
Michael F. Schönitzer
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Imagine a world in which every human being can freely share in the sum of
all knowledge. Help us make that happen!
http://spenden.wikimedia.de/
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the
Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hi all!
So, Wikimania happened! Yay! It was a busy time; lots of things were discussed
and revived. My apologies for not writing a Radar mail for so long - it's so
hard to concentrate when there is so much going on at the conference. So, here
are the minutes from the last TWO weeks' TechCom meetings.
You can also find the minutes at
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes/2017-0…>
and
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes/2017-0…>.
See also the TechCom activity page on the RFC board
<https://phabricator.wikimedia.org/tag/mediawiki-rfcs/>.
On August 9th, we had our first public TechCom meeting during the Wikimania
Hackathon. Thanks to everyone who participated! Here is what we talked about:
* New RFC: Attribute anonymous contributions to the first IP address used in a
session <https://phabricator.wikimedia.org/T172477>
* Did NOT approve after last call, more discussion needed: HTML5 section ids
<https://phabricator.wikimedia.org/T152540>
* Input wanted: Allow anonymous recent changes to be filtered by IP range
<https://phabricator.wikimedia.org/T172376> (not resourced)
* Media format switching to webm for video (Brion)
* ContentTranslation to use VE, server-side adaptation logic (Roan)
* Progress on raising minimum PHP version for MediaWiki to 5.6/7
<https://phabricator.wikimedia.org/T172165>
* Input wanted: use-cases for delayed job execution
<https://phabricator.wikimedia.org/T172832>
* Active discussion: moving most of MediaWiki into a /core folder
<https://phabricator.wikimedia.org/T167038>
* TechCom to think about QA strategies/guidelines over the coming weeks (code
review, test coverage, etc).
I did not personally attend the meeting on August 16, as I was at an off-site.
Here is what I gathered from the notes:
* LAST CALL for Comments: PostgreSQL schema change for consistency with MySQL
<https://phabricator.wikimedia.org/T164898>. Should no pertinent objections
remain unaddressed by August 30th, this RFC will be approved for implementation.
Please comment on the ticket.
* Discussion (Roan): want to work on filtering edits by “revertedness”. Could
use change tags, maybe.
* Work started on making MediaWiki core PSR4 compatible. Involves renaming every
class in core! <https://phabricator.wikimedia.org/T166010> (Tim)
* Ongoing work on migrating Job Queue to EventBus/Kafka (Gabriel)
* Discussion on how <section> tags may be used in the context of HTML5 section
ID migration (T152540) <https://phabricator.wikimedia.org/T114072> (Timo)
* Active discussion: JSON schema validation:
<https://phabricator.wikimedia.org/T147137>
* Active discussion: load multi-file packages via ResourceLoader
<https://phabricator.wikimedia.org/T133462>
PS: Goats are the new kittens <https://phabricator.wikimedia.org/tag/goatification/>
--
Daniel Kinzler
Principal Platform Engineer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Hello All,
I'm reaching out to invite you to an ongoing discussion regarding MediaWiki
Technical Debt. The topic of technical debt has been discussed in a number
of forums, and there's even been some work towards reducing it.
This SIG's purpose is to get better alignment on what technical debt is,
how to approach reducing it, and what we can do to avoid it going
forward.
If you'd like to attend, please register here by August 22nd:
https://goo.gl/forms/6sFWuBxfgGfwNUZ93
I'm trying to get a sense of who is interested in attending and what their
time zones are, so that I can schedule sessions that are timezone-friendly.
Thanks, and I look forward to some good discussion.
Cheers,
JR
Jean-Rene Branaa (irc: jrbranaa)
RelEng Team
The Wikimedia Technical Committee is hereby issuing a last call for
comments on the RFC "PostgreSQL schema change for consistency with MySQL".
https://phabricator.wikimedia.org/T164898
If no new objections are raised, this RFC will be approved on August 30.
This RFC proposes to resolve differences between the PostgreSQL and
MySQL support in MediaWiki by reducing the use of PostgreSQL-specific
features.
Note that I'm not planning to implement this RFC in the current quarter.
I would welcome volunteer implementors.
-- Tim Starling
There are thousands of pages generating Lua errors when accessing Wikidata
with Lua (T170039 <https://phabricator.wikimedia.org/T170039>), and it
affects most or all wikis. The issue has been ongoing for a few weeks since
it was first reported.
No root cause has been confirmed yet, but the best current guess (AFAIK) is
that it is due to Lua-sandbox (T171166 <https://phabricator.wikimedia.org/T171166>).
[Please don't take it personally!]
1. What is the ETA for getting it deployed to production?
2. IMO it looks like there was a bit of miscommunication (between WMF &
WMDE?) in handling this case. For future cases, it would be nice if the
urgency of blocker tasks were communicated better, especially when more
than one organization is involved.
Thanks,
Eran
The deployment of MediaWiki and extensions version 1.30.0-wmf.14 is
blocked because an error-log message gradually worsened following the
roll-out of 1.30.0-wmf.14 to group1 wikis. The error:
Cannot flush pre-lock snapshot because writes are pending
is detailed on phabricator[0].
As of right now, group0 wikis are on php-1.30.0-wmf.14, group1 wikis are
on php-1.30.0-wmf.13 (excluding wikidatawiki, which is on
php-1.30.0-wmf.11), and the remaining wikis are on php-1.30.0-wmf.13.
The types of issues that will halt the train, and the procedures to follow
when the train is halted, are detailed on Wikitech[1].
The version currently deployed to each wiki is available on the wikiversions
Tool Labs page[2].
-- Tyler
[0]. <https://phabricator.wikimedia.org/T173462>
[1]. <https://wikitech.wikimedia.org/wiki/Deployments/Holding_the_train>
[2]. <https://tools.wmflabs.org/versions/>