Hi,
On Tue, Mar 1, 2016 at 3:36 PM, David Strine <dstrine(a)wikimedia.org> wrote:
> We will be holding this brownbag in 25 minutes. The Bluejeans link has
> changed:
>
> https://bluejeans.com/396234560
I'm not familiar with BlueJeans and may have missed a transition
because I wasn't paying enough attention. Is this some kind of
experiment? Have all meetings transitioned to this service?
Anyway, my immediate question at the moment is: how do you join without
sharing your microphone and camera?
Am I correct in thinking that this is an entirely proprietary stack
that's neither gratis nor libre and has no on-premise (not cloud)
hosting option? Are we paying for this?
-Jeremy
As of 950cf6016c, the mediawiki/core repo was updated to use DB_REPLICA
instead of DB_SLAVE, with the old constant left as an alias. This is part
of a string of commits that cleaned up the mixed use of "replica" and
"slave" by sticking to the former. Extensions have not been mass
converted. Please use the new constant in any new code.
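For example, new code reading from a replica might look like this (a
minimal sketch; the table, fields, and conditions are illustrative):

$dbr = wfGetDB( DB_REPLICA );
$row = $dbr->selectRow(
	'page',
	[ 'page_id', 'page_title' ],
	[ 'page_id' => $id ],
	__METHOD__
);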
The word "replica" is a bit more indicative of a broader range of DB
setups*, is used by a range of large companies**, and has more neutral
connotations.
Drupal and Django made similar updates (even replacing the word "master"):
* https://www.drupal.org/node/2275877
* https://github.com/django/django/pull/2692/files &
https://github.com/django/django/commit/beec05686ccc3bee8461f9a5a02c607a023…
I don't plan on doing anything to DB_MASTER, since it seems fine by itself,
like "master copy", "master tape", or "master key". This is analogous to a
master RDBMS database. Even multi-master RDBMS systems tend to have
stronger consistency than classic RDBMS slave servers, and present
themselves as one logical "master" or "authoritative" copy. Even in its
personified form, a "master" database can readily be thought of as
analogous to a "controller", "governor", "ruler", lead "officer", or such.***
* clusters using two-phase commit, Galera using certification-based
replication, multi-master circular replication, etc.
**
https://en.wikipedia.org/wiki/Master/slave_(technology)#Appropriateness_of_…
***
http://www.merriam-webster.com/dictionary/master?utm_campaign=sd&utm_medium…
--
-Aaron
Following the recent outage, we've had a new series of complaints
about the lack of improvements in CX, especially related to
server-side activities like saving/publishing pages.
Now, I know the team is involved in a long-term effort to merge the
editor with the VE, but is there an end in sight for that effort? Can
I tell people who ask, "look, six more months and then we'll have a much
better translation tool"?
Is there a publicly available roadmap for this project and, more
generally, for CX?
Thanks,
Strainu
Hi,
MediaWiki code search is a fully free software tool that lets you
easily search through all of the MediaWiki core, extension, and skin
repositories hosted on Gerrit. You can limit your search to specific
repositories or types of repositories. Regular expressions are
supported both in the search string and when filtering by path.
Try it out: https://codesearch.wmflabs.org/search/
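For example, a query like this (a hypothetical illustration of the
regex support) would find PHP callers of wfGetDB across all indexed
repositories:

search string: wfGetDB\s*\(
file path:     \.php$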
I started working on this because the only other options for searching
the entire MediaWiki codebase were either cloning everything locally
(which takes up space and must be kept up to date manually) or using
GitHub (not free software, and it indexes extraneous repositories). The
backend is powered by Hound, a code search tool written by Etsy, based
on Google's Code Search.
Please let me know what you think! More documentation and links are
at: <https://www.mediawiki.org/wiki/Codesearch>.
-- Legoktm
Work has begun to upgrade the base distribution for MediaWiki-Vagrant
to Debian Stretch [0]. The Wikimedia production cluster is preparing
for a similar upgrade [1] which will in part allow the Wikimedia wikis
to migrate to PHP 7 [2].
== What's new in the stretch-migration branch? ==
* Debian Stretch (Debian 9) base image
* Default PHP runtime is Zend PHP 7.0 (HHVM available via role)
* Database is MariaDB 10.1
* Provisioning via Puppet 4
Setting up a basic wiki (no roles enabled) seems to work pretty well.
Additional roles need testing and may require updated Puppet manifests
(Puppet syntax updates, ERB syntax updates, package name changes,
additional packages). Help is needed to test roles, file bugs, and
submit patches. With some help, I think we can be ready to switch to
Stretch as the default base image in early/mid January.
== Testing the Stretch base image and Puppet profiles ==
It's recommended to test with a fresh MediaWiki-Vagrant checkout, so
that if things go badly you can easily switch back to your original
install and keep working.
$ git clone --recursive https://gerrit.wikimedia.org/r/mediawiki/vagrant mwv-stretch
$ cd mwv-stretch
$ git checkout stretch-migration
$ ./setup.sh
$ vagrant up
You can run `vagrant roles list -e -1` to get a nice list of the roles
you have enabled on your normal Trusty VM install to copy over to your
Stretch testing VM. This one-liner might even do it for you:
$ cd mwv-stretch
$ vagrant roles enable $(cd ../vagrant; vagrant roles list -e -1)
$ vagrant provision
[0]: https://phabricator.wikimedia.org/T181353
[1]: https://phabricator.wikimedia.org/T174431
[2]: https://phabricator.wikimedia.org/T176370
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Manager, Cloud Services Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hello all!
Last night, Addshore merged the patch[1] that is the first major step towards
Multi-Content Revisions[2]: it completely guts the Revision class and turns it
into a thin proxy for the new RevisionStore service. The new code is now live
on beta.
This is our second attempt: The first one, on December 18th, thoroughly
corrupted the beta database. It took us some time and a lot of help from Aaron
and especially Roan to figure out what was happening. A detailed post-mortem by
Roan can be found at [3].
Anyway, this stage of MCR development introduces the new multi-revision-capable
interface for revision storage (and blob storage) [4]. It does not yet introduce
the new database schema; that will be the next step [5][6]. While doing the
refactoring, I tried to keep the structure of the existing code mostly intact,
just moving functionality out of Revision into the new classes, most importantly
RevisionRecord, RevisionStore, and BlobStore.
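For illustration, here is a minimal sketch of reading a revision through
the new service (assuming the usual MediaWikiServices wiring; check the
patch for the exact method names):

use MediaWiki\MediaWikiServices;

$store = MediaWikiServices::getInstance()->getRevisionStore();
// Returns a RevisionRecord rather than a Revision object.
$revRecord = $store->getRevisionById( $revId );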
Beware that with the next deployment (due January 2nd) the live sites will start
using the new code. Please keep an eye out for any strangeness regarding
revision handling. Adam greatly improved test coverage of the relevant code
(thanks Adam!), but it's always possible that we missed some edge case, maybe
something about archived revisions that were partially migrated from an old
schema, or something similarly fun.
Exciting times!
Cheers
Daniel
[1] https://gerrit.wikimedia.org/r/#/c/399174/
[2] https://www.mediawiki.org/wiki/Requests_for_comment/Multi-Content_Revisions
[3] https://phabricator.wikimedia.org/T183252#3853749
[4] https://phabricator.wikimedia.org/T174025
[5] https://phabricator.wikimedia.org/T174024
[6] https://phabricator.wikimedia.org/T174030
--
Daniel Kinzler
Principal Platform Engineer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Hi,
MediaWikiTestCase will now verify[1] that @covers tags are sane when
running normal PHPUnit tests. Previously, PHPUnit would only verify
them when it ran the twice-daily coverage job, and would fail if even a
single one was wrong.
This also affects extensions, even though we don't generate coverage
reports for them yet[2]!
If your tests extend the PHPUnit_Framework_TestCase class, you can
just add "use MediaWikiCoversValidator;" to invoke the trait, which
will still validate your @covers tags.
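For example, a minimal test class using the trait might look like this
(class and method names are illustrative):

class FooTest extends PHPUnit_Framework_TestCase {
	use MediaWikiCoversValidator;

	/**
	 * @covers Foo::bar
	 */
	public function testBar() {
		// ...
	}
}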
[1] https://gerrit.wikimedia.org/r/#/c/399775/
[2] https://phabricator.wikimedia.org/T71685
-- Legoktm
Hello!
MediaWiki-CodeSniffer 15.0.0 is now available for use in your MediaWiki
extensions and other projects. This release features a lot of
different improvements and new upstream features. Specifically,
there's a new comment syntax for enabling/disabling sniffs for
specific lines of code:
// phpcs:disable Some.Sniff.Name
weird_code();
// phpcs:enable Some.Sniff.Name
You can also use // phpcs:ignore Some.Sniff.Name
to ignore just the next line instead of using a disable/enable pair.
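For example, reusing the snippet above:

// phpcs:ignore Some.Sniff.Name
weird_code();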
The full conversion details are in the upstream release notes[1].
The full MediaWiki-CodeSniffer changelog since 14.1.0:
* Add sniff for using is_int over is_integer (Kunal Mehta)
* Allow _ in unit test method names (Gergő Tisza)
* Check function definitions for the same variable name (Kunal Mehta)
* Fix handling of alternative if in IfElseStructureSniff (Umherirrender)
* Forbid usage of extract() (Kunal Mehta)
* Ignore maintenance scripts in ClassMatchesFilenameSniff (Kunal Mehta)
* Improve phpdoc classname parsing (Gergő Tisza)
* Move phpcs.xml to .phpcs.xml (Umherirrender)
* Remove WhiteSpace.SpaceBeforeSingleLineComment.EmptyComment (Gergő Tisza)
* Replace PEAR with Packagist in README.md link (Ricordisamoa)
* Require that an explicit visibility is set on methods and properties
(Kunal Mehta)
* Rework ExtendClassUsageSniff to avoid private class member
(Umherirrender)
* Skip inner functions in FunctionCommentSniff::processReturn
(Umherirrender)
* Update PHP_CodeSniffer to 3.2.2 (Ricordisamoa, Kunal Mehta)
* Use backticks in HISTORY.md (Ricordisamoa)
* Use only PSR2.Files.EndFileNewline (Kunal Mehta)
* Use upstream Generic.Files.OneObjectStructurePerFile sniff (Kunal Mehta)
* Use upstream Generic.PHP.DiscourageGoto (Kunal Mehta)
* Warn on usage of create_function() (Kunal Mehta)
Thank you to all of the contributors for this release!
-- Legoktm
[1] https://github.com/squizlabs/PHP_CodeSniffer/releases/tag/3.2.0
Hi all,
TL;DR: TechCom has published a new version of the RFC process, at <
https://www.mediawiki.org/wiki/Requests_for_comment/Process>.
For comparison, here is the last revision before recent changes: <
https://www.mediawiki.org/w/index.php?oldid=2089457>.
## Highlights
* Reduce complexity of “Create a proposal” to only one required step.
(Create a Phabricator task.)
* Move responsibility to announce proposals from author to TechCom (via
TechCom Radar).
* Document the “Last Call” stage.
* No longer assign RFCs to individual TechCom members (aka “shepherding”).
* Document the role of each TechCom-RFC workboard column. Each stage is now
represented by a workboard column. The Phabricator workflow has been
simplified by removing and merging various columns.
Further below is a summary of other changes.
## Draft
This revision of our process document started last month, in early
November. Our workflows had rather drifted away from the documented
practice. We've been working to simplify the process and better reflect
the current consensus of the committee. This update formalizes the
improvements and simplifications we made over the past year.
The IRC meeting on 2017-11-08 was dedicated to gathering input on the draft
[1] and our process in general.
The meeting notes can be found in meetbot logs. [2]
Once the draft was ready, the updated process document went on Last Call
for two weeks, from December 6 to December 20. No concerns were raised
during this period.
## Expectations
In addition to reduced complexity and formality for RFC authors, we’ve set
more specific expectations for ourselves. This makes it easier to
understand how the process will work from start to end, and how long it can
take.
With the updated process, useful feedback on new RFCs is expected within
one or two weeks, and RFCs could be approved within 4-6 weeks.
This is based on weekly triaging and announcement of new RFCs, and a Last
Call period of typically two weeks.
## Summary
Notable changes, by section.
Section “Introduction”:
* Added an “Objective” section.
* Updated scope description to use the same wording as the new TechCom
Charter, adopted earlier this year. <
https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Charter>
Section “Create a proposal”:
* Reduce complexity to only one (required) step: Create a TechCom-RFC task
on Phabricator.
* Remove duty of RFC authors to announce their RFC on Wikitech-l.
Section “Review”:
* Update wording to use “should” and “must” terms.
* Add: TechCom must announce all new RFCs.
* Add: TechCom should triage RFCs from the Inbox within two weeks.
* Remove: Shepherd process, e.g. formal assignment of RFCs to individual
TechCom members.
* Remove: “Needs shepherd” column of the TechCom-RFC workboard on
Phabricator. (Merged with “Under discussion”)
* Remove: “In progress” column of the TechCom-RFC workboard on Phabricator.
(Merged with “Under discussion”)
* Remove: “TechCom-Has-Shepherd” workboard on Phabricator. (Archived the
tag and untagged open tasks)
Section “Last Call”:
This stage was adopted in 2016 and was already announced at the time
(“[wikitech-l] Last call: on the idea of last call”). The stage, however,
was not yet documented on the process page. This has now been corrected.
The name and principle of Last Call was inspired by similar processes used
by IETF, W3C, and Rust.
-- Timo Tijhof
[1] Draft was located at:
https://www.mediawiki.org/wiki/Requests_for_comment/Process/Draft
[2]
https://tools.wmflabs.org/meetbot/wikimedia-office/2017/wikimedia-office.20…