I apologise if some of the below does in fact exist; I'd gladly find out!
There are three major parts of my workflow as a project maintainer and
contributor that I find lacking in Differential right now, and they are a
strong reason for me to discourage anyone from migrating to it. In addition,
projects that have already migrated have, as a result, essentially become
invisible to my workflow. I've tried, but even accepting a loss in
productivity, I can't find a workaround for these.
1. Casual notifications about new patches and merged commits (IRC
notifications). - https://phabricator.wikimedia.org/T116330
2. Discover pending patches.
While Diffusion is a fine repo browser, it appears to lack any link back to
Differential. Even when going to Differential manually, there doesn't appear
to be any concept of a "repo". It only has a global concept of "diff" and
"reviewer" (e.g. you can list diffs awaiting your review, and your own open
diffs). This is unlike Maniphest (which has a concept of projects, each with
its own page and useful outgoing links). There is no list of projects with a
link to see a project's patches. This is unlike Gerrit or GitHub, where
projects link to searches for open patches or open pull requests. This makes
it very hard for potential consumers of our code and casual contributors to
find out about open patches. That is an important indicator for developers
to determine the health of a project. It's also an easy way in for new
reviewers, and an important interface for project maintainers to easily
list open patches from time to time. Without this, I expect our code review
habits to become even worse than they already are.
https://phabricator.wikimedia.org/differential/ (no projects listed) ->
https://phabricator.wikimedia.org/differential/query/all/ (no repos named
alongside the diffs) -> https://phabricator.wikimedia.org/D229 (links to
diffusion) -> https://phabricator.wikimedia.org/diffusion/GOJR/ (no link to
query to differential).
If I see this correctly, there is simply no way to naturally get to a list
of open patches of a project.
3. Notification about new patches and merged commits.
There appears to be no way to subscribe to a repo (e.g. as a maintainer
of that repo) so that I may be notified of new diffs and/or landed commits.
This is made worse by #2, which would otherwise have been a workaround
(albeit a costly one: polling N repos versus being pushed to once).
In addition to these workflow concerns, there is of course also continuous
integration. But that's a separate issue.
I'm bringing up these concerns now because contrary to what I expected,
Differential is being adopted by quite a few repos now despite it being
premature.
-- Krinkle
Hi all,
The next CREDIT showcase will be Thursday, 12-May-2016 at 1800 UTC (1100
SF).
https://www.mediawiki.org/wiki/CREDIT_showcase
For this one we'll use Hangouts on Air for presenters, and the customary
YouTube stream for viewers.
See you next month!
-Adam
I'm getting the following error when running `composer install` after
cloning MW core and checking out 1.25.5. I've gotten the error on two
separate systems:
1. Windows with PHP 5.4 and Composer version 1.2-dev
2. CentOS 7 with PHP 5.6 and Composer version 1.1.0
I did NOT get the error on the same Windows machine prior to running
`composer self-update`. It had Composer version 1.0-dev
(dbdd4978a7dd4cd29d0d7dfc912aab404245b736) prior to update.
*** Error message ***
Loading composer repositories with package information
Updating dependencies (including require-dev)
Your requirements could not be resolved to an installable set of packages.
Problem 1
- Installation request for wikimedia/composer-merge-plugin 1.0.0 ->
satisfiable by wikimedia/composer-merge-plugin[v1.0.0].
- wikimedia/composer-merge-plugin v1.0.0 requires composer-plugin-api
1.0.0 -> no matching package found.
Potential causes:
- A typo in the package name
- The package is not available in a stable-enough version according to
your minimum-stability setting
see <https://getcomposer.org/doc/04-schema.md#minimum-stability> for
more details.
Read <https://getcomposer.org/doc/articles/troubleshooting.md> for further
common problems.
*** End error message ***
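One plausible reading of this error (an assumption on my part, not verified here): composer-merge-plugin v1.0.0 pins `composer-plugin-api` to exactly `1.0.0`, while Composer 1.1+ advertises plugin API version 1.1.0, so the exact constraint can no longer be satisfied after a `self-update`. If that is the cause, requiring a newer plugin release with a range constraint in composer.json would be one workaround, e.g.:

```json
{
    "require": {
        "wikimedia/composer-merge-plugin": "^1.3"
    }
}
```

(The `^1.3` constraint is illustrative; downgrading Composer back to the 1.0 series would be the other obvious workaround.)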
Thanks,
James
Hi,
Sorry for the delay, I've had a few too many things to get done this week
so I haven't communicated as clearly as I should've.
The release branches (REL1_27) have been created for MW core, vendor,
all extensions and all skins. MediaWiki core is now on 1.28.0-alpha.
It's time to start testing out the release branch and make sure everything
is good and polished. You can do this in your Vagrant instance by navigating
to the mediawiki/ directory and checking out the branch.
If you find a problem in the release, please file a task in Phabricator or
submit a patch.
Minor reminder: please don't start landing large breaking changes on
master now that we've swapped... it makes porting patches between
branches way more difficult than it needs to be :)
Thanks and have a great week!
-Chad
Hey everyone,
Since we introduced ourselves, we have been quite busy reading
documentation and source code (of existing extensions), trying to understand
the data flow within MediaWiki.
Today we committed our first code, which demonstrates how we plan to
integrate our alpha version of the MOOC interface into a stand-alone
extension.
Your early feedback is crucial for us and thus highly appreciated, since
from now on we will start coding up functionality:
https://phabricator.wikimedia.org/diffusion/1892/browse/develop/
We started with the cookiecutter template for MediaWiki extensions.
In MOOC.hooks.php we first register a parser hook looking for {{#MOOC: }}.
Once the registered function is called, we register another hook, namely
OutputPageBeforeHTML, in which we use the $text variable to modify the HTML.
In our example we use a DOM parser to manipulate the CSS of h2 elements,
and plan to adapt other CSS this way in future development.
We use ResourceLoader to include our own stylesheet.
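For reviewers skimming the list, the hook flow described above could be sketched roughly like this (a hypothetical outline based on the description, not the actual MOOC.hooks.php code; class and method names are illustrative):

```php
<?php
// Hypothetical sketch of the described hook flow, not the actual extension code.
class MOOCHooks {

	// Registered via $wgHooks['ParserFirstCallInit'][] in the extension setup;
	// the 'MOOC' magic word must also be declared in the i18n magic-words file.
	public static function onParserFirstCallInit( Parser $parser ) {
		$parser->setFunctionHook( 'MOOC', array( __CLASS__, 'renderMooc' ) );
		return true;
	}

	// Handler for {{#MOOC: }}: defer the HTML manipulation until the page
	// output is assembled, by registering OutputPageBeforeHTML on demand.
	public static function renderMooc( Parser $parser ) {
		global $wgHooks;
		$wgHooks['OutputPageBeforeHTML'][] = array( __CLASS__, 'onOutputPageBeforeHTML' );
		return '';
	}

	// Manipulate the generated HTML, e.g. tag <h2> elements with an
	// extension-prefixed CSS class instead of rewriting inline styles.
	public static function onOutputPageBeforeHTML( &$out, &$text ) {
		$text = str_replace( '<h2', '<h2 class="mooc-section"', $text );
		return true;
	}
}
```

Registering the second hook lazily (only when {{#MOOC: }} is actually encountered) avoids touching the output of unrelated pages.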
Questions:
0.) Could anyone quickly review our code and give us feedback on whether
we understood the basic data flow correctly and are using the correct
workflows?
1.) Is there any argument against using Bootstrap or LESS for more
efficient CSS work?
2.) How stable is the HTML syntax of wiki pages? Wikitext will probably not
change in future MediaWiki versions; can we rely on the HTML structure not
changing either? If we introduce our own CSS classes, should they also
start with mw-, or should we use our own prefix, e.g. mooc-?
3.) Is there a better hook than OutputPageBeforeHTML that we should use for
our use case?
4.) We need meta information from other articles; what is the best way to
make database requests for it?
Thank you very much!
best regards Sebastian and Rene
--
www.rene-pickhardt.de
<http://www.beijing-china-blog.com/>
Skype: rene.pickhardt
mobile: +49 (0)176 5762 3618 office: +49 (0) 261 / 287 2765 fax: +49
(0) 261 / 287 100 2765
Hi,
When we deployed the first 1.28 release to the cluster yesterday, we got a
new error[0] relating to
unserialization of redis data. It's pretty spammy already, so I'm paranoid
about deploying wider until
we figure out why. Deploying some debugging work soon so we can figure out
what's going on.
If you've got any information you think would help, please chime in on the
bug.
-Chad
[0] https://phabricator.wikimedia.org/T134923
https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-05-11
= 2016-05-11 =
== Product ==
=== Reading ===
==== Web ====
Lazy-loaded images have been deployed on Bengali Wikipedia. Ops will see
less image traffic per page there. That's normal!
We have a change to NavigationTiming. FYI performance team.
Working with Fundraising towards an A/B test. Fundraising please be alert
in case we need you!
==== Android ====
* Login and editing were temporarily broken as a result of a networking
library bug after HTTP/2 was enabled. We found the bug Monday morning and
issued releases to work around the bug for now by forcing HTTP/1.1. For
more info see https://phabricator.wikimedia.org/T134817.
* Reading Lists are {{done}} and to be released in the next beta. On deck:
The Feed.
==== iOS ====
* 5.0.4
** Just launched Monday - a bugfix release
* 5.0.5
** Likely in 2 weeks so we can address a major crash caused by framework
loading on the iOS platform
** https://phabricator.wikimedia.org/T134805
* 5.1
** Scoped major features last week (Maps and better iPad support)
** Refining scope this week
** Planning beta for week of June 12
** Will have a dependency on the geosearch extension
==== Mobile Content Service ====
* Removed redirect handling from service. Still adding redirect=false query
param. https://phabricator.wikimedia.org/T134538
* As a result of a fix to remote config checking, MCS will soon serve most
Android app requests.
==== Reading Infrastructure ====
* Gergo emailing with Chad and Chris on RC timing with respect to
AuthManager
=== Editing ===
==== Collaboration ====
* '''Blocking''':
** Still working on External Store on Beta.
** Need to refactor Flow memcached for cross-data-center; just had another
meeting about that
* '''Blocked''':
* '''Updates''':
** Continuing notification work on:
*** Cross-wiki notifications coming by default tomorrow!
*** Echo plaintext email formatter done, HTML email formatter in-progress
*** Work continues on the new Echo MVC architecture
*** Echo recently enabled a feature to give users a web notification when
an email is sent to them. This has resulted in a few cases where users
have received the notification, but not the email. If someone in ops (and
anyone who wants to) could look at https://phabricator.wikimedia.org/T134886,
that would be helpful.
==== Language ====
* '''Blocking''':
* '''Blocked''':
* '''Updates''':
** cxserver migrated to scap3. Thanks to Marko and TechOps for guiding!
** Apertium to Jessie work will start this week; co-ordinate with
TechOps/Alex on this.
** Work on Compact Language Links migrating out of Beta in progress (
https://phabricator.wikimedia.org/T66793)
==== Multimedia ====
* '''Blocking''': None
* '''Blocked''': None
* '''Updates''':
** Offsite happened last week in Brooklyn
** Security review for ImageTweaks done, thanks security
** Thumbor still not deployed but Gilles is working on it, once that's done
we just need to finish polishing the extension
** 3D support coming to media viewer via Gilles, cool story
** Gallery mode for slideshows in progress, maybe rolling out this quarter
** Upload dialog "E/F" test (follow up to "A/B/C/D" test) coming soon to a
wiki near you
==== Parsing ====
* Tidy replacement work going on -- visual diff tests are helping us figure
out things to fix and deprecate in terms of behavior (e.g. self-closing tags
like <b/>, <div/> etc.). Still some ways to go. See
https://www.mediawiki.org/wiki/Parsing/Replacing_Tidy for details.
* Kunal working on refactoring Linker code -- work in progress, but linker
tests are passing with his new patches.
* OCG: C.Scott believes that he has unblocked all the ocg issues, and that
ops can proceed with re-imaging the ocg servers (which seems to be going on
already).
== Technology ==
=== Technical Operations ===
* '''Blocking''':
** None
* '''Blocked''':
** HHVM on jessie by https://phabricator.wikimedia.org/T58041
* '''Updates''':
** Support for HTTP/2 in all varnish cache clusters rolled out. SPDY
support removed
=== Security ===
* Reviews: TWL OAuth implementation
* Planning Authentication Service work
=== Research ===
* '''Blocking''':
** None
* '''Blocked''':
** ORES --> Production blocked on ops
*** We know about the datacenter migration work
*** We would appreciate any estimate for when Alex can work on ORES stuff
again
* '''Updates''':
** ORES now has advanced edit quality models for Russian, Wikidata and
Dutch. We've recently deployed some substantial performance improvements,
so scoring happens faster.
** New dashboard for ORES: https://grafana.wikimedia.org/dashboard/db/ores
=== Services ===
* RESTBase / Change-Prop / Parsoid outage on 2016-05-05:
https://phabricator.wikimedia.org/T134537
** most bugs fixed
** change-prop disabled for the time being, should bring it up soon(TM)
* Scap3 migration:
** migrated CXServer and Mobile Content Service today
** Graphoid, Maps, Parsoid still to go
=== Analytics ===
* '''Blocking''': none
* '''Blocked''': none
* Analytics Kafka cluster upgraded to 0.9
* Ops work on Druid for edit related stats
* Analytics event-schema changes: https://gerrit.wikimedia.org/r/#/c/288210/
=== Discovery ===
* '''Blocking''': none
* '''Blocked''': none
* Preparing for ElasticSearch 2.0 migration
* WDQS with geospatial search deployed
* A/B test for descriptive texts for wikis on portal launched
* Team offsite next week
==== Graphs ====
* New Vega versions deployed
* Fix for ImageMagick generation of images will be deployed today
=== Fundraising Tech ===
* coding new PayPal integration
* CiviCRM: making dedupe work in batches
* Using localStorage instead of cookies for more uses in CentralNotice
* more work towards replacing ActiveMQ
=== Release Engineering ===
* Blocking: Security release
* Blocked: nothing we know of right now
tl;dr starting today:
Use:
scap sync 'message for posterity'
instead of:
scap 'message for posterity'
Scap 3.2.0-1 is now alive and well in production, which means scap
subcommands are live.
All subcommands are documented[0]. Additional documentation can be
seen by running `scap --help` (or `scap [subcommand] --help`). If you
have any questions feel free to ask them on-list or in IRC on #scap3
or #wikimedia-releng.
Thanks!
Tyler Cipriani and the Deployment Working Group
[0] MediaWiki: https://doc.wikimedia.org/mw-tools-scap/scap2/commands.html
Scap3: https://doc.wikimedia.org/mw-tools-scap/scap3/deploy_commands.html
Tonight's RFC session on #wikimedia-office will be about "Overhaul Interwiki
map, unify with Sites and WikiMap"; see
https://phabricator.wikimedia.org/E171 and
https://phabricator.wikimedia.org/T113034
The meeting will start at 21:00 UTC
<https://www.timeanddate.com/worldclock/fixedtime.html?msg=RFC+session&iso=20160511T21&p1=1440&ah=1>
We have talked about overhauling interwiki before. Today, I would like to
revisit the topic, look at the current state of things, and discuss next steps
and open questions:
Status
-------
* Please review: //factor storage logic out of Interwiki//
<https://gerrit.wikimedia.org/r/#/c/250150/> (I7d7424345)
Next Steps
------------
* split CDB from SQL implementation
* implement array-based InterwikiLookup (loads from multiple JSON or PHP
files)
* indexes should be generated on the fly, if not present in the loaded data
* proposed structure: P3044
* that InterwikiLookup implementation should also implement SiteLookup.
Alternatively, only implement SiteLookup, and provide an adapter
(SiteLookupInterwikiLookup) that implements InterwikiLookup on top of a
SiteLookup.
* implement maintenance script that can convert between different interwiki
representations.
* use InterwikiLookup for (multiple) input sources (db/files),
InterwikiStore for output
* we want an InterwikiStore that can write the new array structure (as JSON
or PHP)
* we want an InterwikiStore that can write the old CDB structure (as CDB or
PHP)
* Provide a config variable for specifying which files to read interwiki info
from. If not set, use old settings and old interwiki storage.
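The array-based, multi-file lookup idea above could be sketched roughly like this (purely illustrative: the class name and data layout are made up here; the actual proposed structure is in P3044):

```php
<?php
// Illustrative sketch only; the real proposed data structure is in P3044.
class ArrayInterwikiLookup {
	private $map = array();

	/**
	 * @param string[] $files JSON files mapping interwiki prefix => info.
	 *        Later files override earlier ones, matching the proposed
	 *        family < language < local layering.
	 */
	public function __construct( array $files ) {
		foreach ( $files as $file ) {
			$data = json_decode( file_get_contents( $file ), true );
			if ( is_array( $data ) ) {
				$this->map = array_merge( $this->map, $data );
			}
		}
	}

	public function isValidInterwiki( $prefix ) {
		return isset( $this->map[$prefix] );
	}

	public function fetch( $prefix ) {
		return isset( $this->map[$prefix] ) ? $this->map[$prefix] : null;
	}
}
```

Any indexes (e.g. by group) would then be generated on the fly when not present in the loaded data, as noted above.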
Questions
-----------
* is this a good plan? (see below for rationale)
* how does interwiki/site info relate to local wiki config (wgConf/SiteMatrix/
WikiMap)?
* should all information always be loaded? (see also {T114772})
* do we need caching?
* do we need to support new features also for the SQL based InterwikiLookup?
* needs: interwiki_ids table, interwiki_groups table, and blob field with
JSON or an interwiki_props table.
* Should SiteMatrix continue to work based on wgConf, or should it be ported
to use Sites? Or combine both? Currently it has
[[https://gerrit.wikimedia.org/r/#/c/211119/|problems]] with
Wikimedia-specific configurations, e.g. for
[[https://meta.wikimedia.org/wiki/Special_language_codes|special language codes]].
Later
-------
* decide on how wikis on the WMF cluster should load their interwiki config
* proposal: three files: family (shared by e.g. all Wikipedias), language
(shared by e.g. all English wikis), and local.
* create a script that generates the family, language, and local files for all
the wikis (as JSON or PHP) based on config. Should work like dumpInterwiki.
* check this: generating CDB based on the relevant family/language/local
file for a given wiki should return the same CDB as dumpInterwiki for that
site.
* create a deployment process that generates PHP files from the checked-in
JSON files, for faster loading.
* action=siteinfo&siprop=interwikimap could be ported to Sites and expose more
information. The distinction from SiteMatrix then becomes somewhat unclear.
--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.