Hey everyone,
Since we introduced ourselves, we have been quite busy reading
documentation and source code (of existing extensions), trying to understand
the data flow within MediaWiki.
Today we committed our first code, which demonstrates how we plan to
integrate our alpha version of the MOOC interface into a stand-alone
extension.
Your early feedback is crucial for us and thus highly appreciated, since
from now on we will start coding up functionality:
https://phabricator.wikimedia.org/diffusion/1892/browse/develop/
We started with the cookiecutter template for MediaWiki
extensions.
In MOOC.hooks.php we first register a parser hook looking for {{#MOOC: }}.
Once the registered function is called, we register another hook, namely
OutputPageBeforeHTML, in which we use the $text variable to modify the HTML.
In our example case we use a DOM parser to manipulate the CSS attribute of
h2 elements, and we plan to adapt other CSS in this way in future development.
We use ResourceLoader to include our own stylesheet.
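For reviewers, here is a minimal, self-contained sketch of the DOM-manipulation step only (the MediaWiki hook wiring is omitted; the function name and the mooc-section class are illustrative assumptions, not our final code):

```php
<?php
// Sketch of the HTML-rewriting step we do in the OutputPageBeforeHTML hook.
// Function name and the "mooc-section" class are illustrative only.
function addMoocClassToSections( string $html ): string {
    $doc = new DOMDocument();
    // Suppress warnings about unknown tags and force UTF-8 interpretation.
    @$doc->loadHTML( '<?xml encoding="utf-8"?>' . $html );
    foreach ( $doc->getElementsByTagName( 'h2' ) as $h2 ) {
        $existing = $h2->getAttribute( 'class' );
        $h2->setAttribute( 'class', trim( $existing . ' mooc-section' ) );
    }
    // Return only the body contents, not the <html><body> wrapper
    // that DOMDocument adds around the fragment.
    $body = $doc->getElementsByTagName( 'body' )->item( 0 );
    $out = '';
    foreach ( $body->childNodes as $child ) {
        $out .= $doc->saveHTML( $child );
    }
    return $out;
}
```

In the actual hook, the $text reference passed to OutputPageBeforeHTML would be run through such a function before being written back.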
Questions:
0.) Could anyone quickly review our code and give us feedback on whether we
understood the basic data flow correctly and are using the correct
workflows?
1.) Is there any argument against using Bootstrap or LESS for more
efficient CSS hacking?
2.) How stable is the HTML syntax of wiki pages? Wikitext will probably not
change in future MediaWiki versions; can we rely on the fact that the HTML
structure also won't change? If we introduce our own CSS classes, should
they also start with mw-, or should we just provide our own prefix, e.g.
mooc-?
3.) Is there a better hook than OutputPageBeforeHTML that we should use for
our use case?
4.) We need meta information from other articles. Where is the best place to
include database requests?
Thank you very much!
Best regards,
Sebastian and Rene
--
www.rene-pickhardt.de
<http://www.beijing-china-blog.com/>
Skype: rene.pickhardt
mobile: +49 (0)176 5762 3618 office: +49 (0) 261 / 287 2765 fax: +49
(0) 261 / 287 100 2765
Hi,
When we deployed the first 1.28 release to the cluster yesterday, we got a
new error[0] relating to unserialization of Redis data. It's already pretty
spammy, so I'm paranoid about deploying wider until we figure out why.
We'll deploy some debugging work soon so we can figure out what's going on.
If you've got any information you think would help, please chime in on the
bug.
-Chad
[0] https://phabricator.wikimedia.org/T134923
https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-05-11
= 2016-05-11 =
== Product ==
=== Reading ===
==== Web ====
Lazy-loaded images have been deployed on Bengali Wikipedia. Ops will see
less image traffic per page there. That's normal!
We have a change to NavigationTiming. FYI, Performance team.
Working with Fundraising towards an A/B test. Fundraising please be alert
in case we need you!
==== Android ====
* Login and editing were temporarily broken as a result of a networking
library bug after HTTP/2 was enabled. We found the bug Monday morning and
issued releases to work around the bug for now by forcing HTTP/1.1. For
more info see https://phabricator.wikimedia.org/T134817.
* Reading Lists are {{done}} and to be released in the next beta. On deck:
The Feed.
==== iOS ====
* 5.0.4
** Just launched Monday - a bugfix release
* 5.0.5
** Likely in 2 weeks so we can address a major crash caused by framework
loading on the iOS platform
** https://phabricator.wikimedia.org/T134805
* 5.1
** Scoped major features last week (Maps and better iPad support)
** Refining scope this week
** Planning beta for week of June 12
** Will have a dependency on the geosearch extension
==== Mobile Content Service ====
* Removed redirect handling from service. Still adding redirect=false query
param. https://phabricator.wikimedia.org/T134538
* As a result of a fix to remote config checking, MCS will soon serve most
Android app requests.
==== Reading Infrastructure ====
* Gergo emailing with Chad and Chris on RC timing with respect to
AuthManager
=== Editing ===
==== Collaboration ====
* '''Blocking''':
** Still working on External Store on Beta.
** Need to refactor Flow memcached for cross-data-center; just had another
meeting about that
* '''Blocked''':
* '''Updates''':
** Continuing notification work on:
*** Cross-wiki notifications coming by default tomorrow!
*** Echo plaintext email formatter done, HTML email formatter in-progress
*** Work continues on the new Echo MVC architecture
*** Echo recently enabled a feature to give users a web notification when
an email is sent to them. This has resulted in a few cases where users
have received the notification, but not the email. If someone in ops (and
anyone who wants to) could look at https://phabricator.wikimedia.org/T134886,
that would be helpful.
==== Language ====
* '''Blocking''':
* '''Blocked''':
* '''Updates''':
** cxserver migrated to scap3. Thanks to Marko and TechOps for the guidance!
** Work on moving Apertium to Jessie will start this week; coordinate with
TechOps/Alex on this.
** Work on Compact Language Links migrating out of Beta in progress (
https://phabricator.wikimedia.org/T66793)
==== Multimedia ====
* '''Blocking''': None
* '''Blocked''': None
* '''Updates''':
** Offsite happened last week in Brooklyn
** Security review for ImageTweaks done, thanks security
** Thumbor still not deployed but Gilles is working on it, once that's done
we just need to finish polishing the extension
** 3D support coming to media viewer via Gilles, cool story
** Gallery mode for slideshows in progress, maybe rolling out this quarter
** Upload dialog "E/F" test (follow up to "A/B/C/D" test) coming soon to a
wiki near you
==== Parsing ====
* Tidy replacement work is ongoing -- visual diff tests are helping us figure
out things to fix and deprecate in terms of behavior (e.g. self-closing tags
like <b/>, <div/> etc.). Still some ways to go. See
https://www.mediawiki.org/wiki/Parsing/Replacing_Tidy for details.
* Kunal working on refactoring Linker code -- work in progress, but linker
tests are passing with his new patches.
* OCG: C.Scott believes that he has unblocked all the ocg issues, and that
ops can proceed with re-imaging the ocg servers (which seems to be going on
already).
== Technology ==
=== Technical Operations ===
* '''Blocking''':
** None
* '''Blocked''':
** HHVM on jessie by https://phabricator.wikimedia.org/T58041
* '''Updates''':
** Support for HTTP/2 rolled out in all Varnish cache clusters; SPDY
support removed.
=== Security ===
* Reviews: TWL OAuth implementation
* Planning Authentication Service work
=== Research ===
* '''Blocking''':
** None
* '''Blocked''':
** ORES --> Production blocked on ops
*** We know about the datacenter migration work
*** We would appreciate any estimate for when Alex can work on ORES stuff
again
* '''Updates''':
** ORES now has advanced edit quality models for Russian, Wikidata and
Dutch. We've recently deployed some substantial performance improvements,
so scoring happens faster.
** New dashboard for ORES: https://grafana.wikimedia.org/dashboard/db/ores
=== Services ===
* RESTBase / Change-Prop / Parsoid outage on 2016-05-05:
https://phabricator.wikimedia.org/T134537
** most bugs fixed
** change-prop disabled for the time being, should bring it up soon(TM)
* Scap3 migration:
** migrated CXServer and Mobile Content Service today
** Graphoid, Maps, Parsoid still to go
=== Analytics ===
* '''Blocking''': none
* '''Blocked''': none
* Analytics Kafka cluster upgraded to 0.9
* Ops work on Druid for edit related stats
* Analytics event-schema changes: https://gerrit.wikimedia.org/r/#/c/288210/
=== Discovery ===
* '''Blocking''': none
* '''Blocked''': none
* Preparing for ElasticSearch 2.0 migration
* WDQS with geospatial search deployed
* A/B test for descriptive texts for wikis on portal launched
* Team offsite next week
==== Graphs ====
* New Vega versions deployed
* Fix for ImageMagick generation of images will be deployed today
=== Fundraising Tech ===
* coding new PayPal integration
* CiviCRM: making dedupe work in batches
* Using localStorage instead of cookies for more uses in CentralNotice
* more work towards replacing ActiveMQ
=== Release Engineering ===
* Blocking: Security release
* Blocked: nothing we know of right now
tl;dr starting today:
Use:
scap sync 'message for posterity'
instead of:
scap 'message for posterity'
Scap 3.2.0-1 is now alive and well in production, which means scap
subcommands are live.
All subcommands are documented[0]. Additional documentation can be
seen by running `scap --help` (or `scap [subcommand] --help`). If you
have any questions feel free to ask them on-list or in IRC on #scap3
or #wikimedia-releng.
Thanks!
Tyler Cipriani and the Deployment Working Group
[0] MediaWiki:
https://doc.wikimedia.org/mw-tools-scap/scap2/commands.html Scap3:
https://doc.wikimedia.org/mw-tools-scap/scap3/deploy_commands.html
Tonight's RFC session in #wikimedia-office will be about "Overhaul Interwiki
map, unify with Sites and WikiMap"; see
https://phabricator.wikimedia.org/E171 and
https://phabricator.wikimedia.org/T113034
The meeting will start at 21:00 UTC <https://www.timeanddate.com/worldclock/
fixedtime.html?msg=RFC+session&iso=20160511T21&p1=1440&ah=1>
We have talked about overhauling interwiki before. Today, I would like to
revisit the topic, look at the current state of things, and discuss next steps
and open questions:
Status
-------
* Please review: //factor storage logic out of Interwiki// <https://
gerrit.wikimedia.org/r/#/c/250150/> (I7d7424345)
Next Steps
------------
* split CDB from SQL implementation
* implement array-based InterwikiLookup (loads from multiple JSON or PHP
files)
* indexes should be generated on the fly, if not present in the loaded data
* proposed structure: P3044
* that InterwikiLookup implementation should also implement SiteLookup.
Alternatively, only implement SiteLookup, and provide an adapter
(SiteLookupInterwikiLookup) that implements InterwikiLookup on top of a
SiteLookup.
* implement maintenance script that can convert between different interwiki
representations.
* use InterwikiLookup for (multiple) input sources (db/files),
InterwikiStore for output
* we want an InterwikiStore that can write the new array structure (as JSON
or PHP)
* we want an InterwikiStore that can write the old CDB structure (as CDB or
PHP)
* Provide a config variable for specifying which files to read interwiki info
from. If not set, use old settings and old interwiki storage.
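To make the proposed array-based lookup concrete, here is a minimal sketch of a lookup that builds its prefix index on the fly when it is not present in the loaded data. The class shape and field names are assumptions for illustration, not MediaWiki's actual InterwikiLookup interface:

```php
<?php
// Illustrative sketch of an array-based interwiki lookup; the real
// InterwikiLookup interface in MediaWiki core looks different.
class ArrayInterwikiLookup {
    /** @var array[] rows like [ 'prefix' => 'en', 'url' => '...', 'local' => true ] */
    private $rows;
    /** @var array|null map of prefix => row index, generated lazily */
    private $index = null;

    public function __construct( array $rows ) {
        // $rows could be the merged contents of multiple JSON or PHP files.
        $this->rows = $rows;
    }

    private function getIndex(): array {
        if ( $this->index === null ) {
            // Generate the index on the fly if it was not shipped with the data.
            $this->index = [];
            foreach ( $this->rows as $i => $row ) {
                $this->index[$row['prefix']] = $i;
            }
        }
        return $this->index;
    }

    public function isValidInterwiki( string $prefix ): bool {
        return isset( $this->getIndex()[$prefix] );
    }

    public function fetch( string $prefix ): ?array {
        $index = $this->getIndex();
        return isset( $index[$prefix] ) ? $this->rows[$index[$prefix]] : null;
    }
}
```

A maintenance script converting between representations would then only need to read rows from one store and write them to another.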
Questions
-----------
* is this a good plan? (see below for rationale)
* how does interwiki/site info relate to local wiki config (wgConf/SiteMatrix/
WikiMap)?
* should all information always be loaded? (see also {T114772})
* do we need caching?
* do we need to support new features also for the SQL based InterwikiLookup?
* needs: interwiki_ids table, interwiki_groups table, and blob field with
JSON or an interwiki_props table.
* Should SiteMatrix continue to work based on wgConf, or should it be ported
to use Sites? Or combine both? Currently it has [[https://
gerrit.wikimedia.org/r/#/c/211119/|problems]] with Wikimedia-specific
configurations, e.g. for [[https://meta.wikimedia.org/wiki/
Special_language_codes|special language codes]].
Later
-------
* decide on how wikis on the WMF cluster should load their interwiki config
* proposal: three files: family (shared by e.g. all wikipedias), language
(shared by e.g. all english wikis), and local.
* create a script that generates the family, language, and local files for all
the wikis (as JSON or PHP) based on config. Should work like dumpInterwiki.
* check this: generating CDB based on the relevant family/language/local
file for a given wiki should return the same CDB as dumpInterwiki for that
site.
* create a deployment process that generates PHP files from the checked-in
JSON files, for faster loading.
* action=siteinfo&siprop=interwikimap could be ported to Sites and expose more
information. Distinction from SiteMatrix is becoming somewhat unclear then.
--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
New study (US only) by the Knight Foundation:
https://medium.com/mobile-first-news-how-people-use-smartphones-to ,
summarized here:
http://www.theatlantic.com/technology/archive/2016/05/people-love-wikipedia…
"People spent more time on Wikipedia’s mobile site than any other news
or information site in Knight’s analysis, about 13 minutes per month
for the average visitor. CNN wasn’t too far behind, at 9 minutes 45
seconds per month. BuzzFeed clocked in third at 9 minutes 21 seconds
per month. (BuzzFeed, however, slays both CNN and Wikipedia in time
spent with the sites’ apps, compared with mobile websites. BuzzFeed
users devote more than 2 hours per month to its apps, compared with
about 46 minutes among CNN app users and 31 minutes among Wikipedia
app loyalists.)
Another way to look at Wikipedia’s influence: Wikipedia reaches almost
one-third of the total mobile population each month, according to
Knight’s analysis, which used data from the audience-tracking firm
Nielsen."
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Over the past few months the TCB team at WMDE has been working on
refactoring code in core surrounding watchlists.
You can find a full blog post about what we did, why we did it and how we
did it at the link below:
http://addshore.com/2016/05/refactoring-around-watcheditem-in-mediawiki
tl;dr This work was aimed at making it easier to introduce expiring
watchlist entries. Code was removed from various API modules, special pages
and other random places. The new extracted code now has essentially 100%
unit and integration test coverage.
--
Addshore
Hi, I had two changes deployed today that change how UploadWizard and
the upload dialog are configured on testwiki and test2wiki. I don't
think this affects anybody but us at Multimedia, but just in case:
* test2wiki no longer has UploadWizard [1] enabled. The configuration
has always been somewhat broken and hardly anyone used it. It's
still enabled on testwiki. [2]
* test2wiki can now upload files cross-wiki to testwiki using the
upload dialog [3]. Upload dialog uploads from testwiki now also
go to testwiki (locally). Previously they both uploaded to Commons,
like production wikis. [4]
You can think of these as testwiki acting more like Commons and
test2wiki acting more like a Wikipedia site. Have fun testing!
[1] https://www.mediawiki.org/wiki/UploadWizard
[2] https://gerrit.wikimedia.org/r/287944
[3] https://www.mediawiki.org/wiki/Upload_dialog
[4] https://gerrit.wikimedia.org/r/285708
--
Bartosz Dziewoński
2016-04-12 14:01 GMT+03:00 Adrian Heine <adrian.heine(a)wikimedia.de>:
> Hi everyone,
>
> as some of you might know, I'm a software developer at Wikimedia
> Deutschland, working on Wikidata. I'm currently focusing on improving
> Wikidata's support for languages we as a team are not using on a daily
> basis. As part of my work I stumbled over a shortcoming in MediaWiki's
> message system that – as far as I see it – prevents me from doing the right
> thing(tm). I'm asking you to verify that the issue I see indeed is an issue
> and that we want to fix it. Subsequently, I'm interested in hearing your
> plans or goals for MediaWiki's message system so that I can align my
> implementation with them. Finally, I am hoping to find someone who is
> willing to help me fix it.
First of all, thanks for working on this issue. It is a real issue,
but not often requested. I think that is because manually checking in
every place whether the language code is unexpected (different from
the one in the current context) would be cumbersome, and always outputting
language codes on every tag would be bloated. It would be best if this
checking were automated in a templating library, but so far templating
hasn't been much adopted in MediaWiki core. Of course, this
information needs to be exposed first, which is what I understand you
are doing.
> == The issue ==
>
> On Wikidata, we regularly have content in different languages on the same
> page. We use the HTML lang and dir attributes accordingly. For example, we
> have a table with terms for an entity in different languages. For missing
> terms, we would display a message in the UI language within this table. The
> corresponding HTML (simplified) might look like this:
>
> <div id="mw-content-text" lang="UILANG" dir="UILANG_DIR">
> <table class="entity-terms">
> <tr class="entity-terms-for-OTHERLANG1" lang="OTHERLANG1"
> dir="OTHERLANG1_DIR">
> <td class="entity-terms-for-OTHERLANG1-label">
> <div class="wb-empty" lang="UILANG" dir="UILANG_DIR">
> <!-- missing label message -->
> </div>
> </td>
> </tr>
> </table>
> </div>
>
> This works great as long as the missing label message is available in the UI
> language. If that is not the case, though, the message is translated
> according to the defined language fallbacks. In that case, we might end up
> with something like this:
>
> <div class="wb-empty" lang="arc" dir="rtl">No label defined</div>
>
> That's obviously wrong, and I'd like to fix it.
>
> == Fixing it ==
>
> For fixing this, I tried to make MessageCache provide the language a message
> was taken from [1]. That's not too straight-forward to begin with, but while
> working on it I realized that MessageCache is only responsible for following
> the language fallback chain for database translations. For file-based
> translations, the fallbacks are directly merged in by LocalisationCache, so
> the information is not there anymore at the time of translating a message. I
> see some ways to fix this:
>
> * Don't merge messages in LocalisationCache, but perform the fallback on
> request (possibly caching the result)
> * Tag message strings in LocalisationCache with the language they are in
> (sounds expensive to me)
> * Tag message strings as being a fallback in LocalisationCache (that way we
> could follow the fallback until we find a language in which the message
> string is not tagged as being a fallback)
>
> What do you think?
The current localisation cache implementation quite obviously trades
space for speed. In this light I would suggest option two, to tag the
actual language the string is in.
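To illustrate option two with a toy example: merge message arrays along a fallback chain, tagging each string with the language it actually came from, so callers can emit correct lang/dir attributes. The array shapes here are invented for this sketch; LocalisationCache's real data format differs:

```php
<?php
// Toy illustration of option 2: merge messages along a fallback chain
// while recording the language each string actually came from.
// The array shapes are invented for this sketch; LocalisationCache differs.
function mergeWithLanguageTags( array $chain, array $messagesByLang ): array {
    $merged = [];
    // Walk the chain from most-preferred to least-preferred language;
    // the first language that defines a key wins and is recorded as its source.
    foreach ( $chain as $lang ) {
        foreach ( $messagesByLang[$lang] ?? [] as $key => $text ) {
            if ( !isset( $merged[$key] ) ) {
                $merged[$key] = [ 'text' => $text, 'lang' => $lang ];
            }
        }
    }
    return $merged;
}
```

With this information available, Wikidata could wrap a fallback message in lang="en" dir="ltr" instead of inheriting the UI language's attributes.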
However, this trade-off might not make sense anymore, as we have more
languages and more messages, resulting in almost gigabyte-sized caches.
See also, for example, https://phabricator.wikimedia.org/T99740. I added
wikitech-l to the CC in the hope that people who have worked on the
localisation cache more recently will comment on whether option one, not
merging messages, would make more sense nowadays.
>
> [1] https://gerrit.wikimedia.org/r/282133
>
-Niklas