Hi!
PHP 7.0 is now available and usable in CI. As a small start, we have
added a job that runs "composer test" using PHP 7.0 for all repositories
that run some form of MediaWiki, unit, or composer tests. It's not
enabled by default, so you'll need to run "check experimental" on a
patch to see whether it passes (though I doubt there will be many issues).
You can follow [1] for progress on enabling those jobs by default, and
[2] for the next step of running MediaWiki unit tests under PHP 7.
Thank you to ops and the CI team for all of their help and work on this :)
[1] https://phabricator.wikimedia.org/T144961
[2] https://phabricator.wikimedia.org/T144962
-- Legoktm
Dear users, developers and all people interested in semantic wikis,
We are very happy to announce the program of the 13th Semantic MediaWiki
Conference in Frankfurt am Main, Germany! People from business, academia
and non-profit organizations will give a variety of very interesting
talks and presentations about applying semantic wikis, as well as about
the newest developments in the field.
We would also like to remind you that the early bird registration period
ends on September 14, 2016. Please register via Tito [0] if you have not
done so already.
Important facts reminder:
* Dates: September 28th to 30th 2016 (Wednesday to Friday)
* Location: German Institute for International Educational Research
(DIPF), Schloßstraße 29, Frankfurt am Main, Germany.
* Conference page: https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g. users, developers, consultants, business
representatives and researchers.
News on the program:
* The tutorial program has been announced and made available [1]
* The conference program has now also been announced and made available
[2]. We are very happy that the keynote will be delivered by Prof. Dr.
Sören Auer of the Fraunhofer Institute for Intelligent Analysis and
Information Systems IAIS, Enterprise Information Systems.
* A social program is also being prepared [3].
* We encourage contributions for poster sessions about semantic wikis;
for a list of topics, see [4]. Please add your proposals by e-mail to
one of the program chairs. (CC)
* Presentations will generally be video and audio recorded and made
available for others after the conference.
Organization:
* Deutsches Institut für Internationale Pädagogische Forschung (DIPF)
[5] and Open Semantic Data Association e. V. [6] have become the
official organisers of SMWCon Fall 2016.
* Very special thanks go to Wikimedia Deutschland [7] for being our
platinum supporter, as well as ArchiXL [8] for being our gold supporter.
If you have questions you can contact Sabine Melnicki, Kendra Sticht and
Christoph Schindler (Program Chairs), Karsten Hoffmeyer (General Chair)
or Lia Veja (Local Chair) by e-mail (CC).
We will be happy to see you in Frankfurt!
Sabine Melnicki, Kendra Sticht and Christoph Schindler (Program Board)
[0] <https://ti.to/smwconfall2016/frankfurt>
[1] <https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016/Tutorial_Day>
[2]
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016/Conference_Days>
[3]
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016#Social_program>
[4] <https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016/Announcement>
[5] <http://www.dipf.de/en/about-us/contact>
[6] <http://www.opensemanticdata.org/>
[7] <http://www.wikimedia.de/>
[8] <http://www.archixl.nl/>
Dear community, this week we enabled <maplink> support on all Wikipedia
and sister projects. This means that an article can now link to a map,
and that map may contain highlighted regions and popups with information.
[1],[3]
Our next step is to add an informational sidebar to the map, similar to
what is shown on the "geohack" page (the map link in the upper right
corner of most location articles). Check out the proposed screenshots [2].
We now also have a geoshapes service. So if the OpenStreetMap community
has defined a region and assigned it a Wikidata ID, you can draw it on the
map with that ID. Or you can use the Wikidata Query Service (via SPARQL)
to query for those IDs and draw them on the map, coloring them and adding
popup information. [3] Geoshapes can also be used in graphs. [4]
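As a rough illustration of the workflow, here is a sketch that builds a request URL for the geoshapes service. The endpoint and the "ids" parameter name are assumptions on my part, so please check the Kartographer docs [1] before relying on them:

```python
from urllib.parse import urlencode

# Hypothetical helper for the geoshapes service. The endpoint and the
# "ids" query parameter (comma-separated Wikidata IDs) are assumptions
# based on this announcement, not a verified API contract.
GEOSHAPE_ENDPOINT = "https://maps.wikimedia.org/geoshape"

def geoshape_url(wikidata_ids):
    """Build a URL that should return GeoJSON for the given Wikidata IDs."""
    return GEOSHAPE_ENDPOINT + "?" + urlencode({"ids": ",".join(wikidata_ids)})

print(geoshape_url(["Q1384"]))
```

The same service is described as accepting a SPARQL query instead of explicit IDs, which is what makes the WDQS integration mentioned above possible.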
[1] Maps docs: https://www.mediawiki.org/wiki/Help:Extension:Kartographer
[2] sidebar prototype: https://phabricator.wikimedia.org/T131907#2616275
[3] US Governors
https://en.wikipedia.org/wiki/User:Yurik/maplink#/maplink/0
[4] Graph geoshapes
https://www.mediawiki.org/wiki/Template:Graph:Country_with_regions_and_capi…
--------------------------------------------
On Thu, 9/8/16, Rob Lanphier <robla(a)wikimedia.org> wrote:
Subject: Re: [Wikitech-l] Opening up MediaWiki dev summit in January?
To: "Wikimedia developers" <wikitech-l(a)lists.wikimedia.org>
Date: Thursday, September 8, 2016 13:57
On Thu, Sep 8, 2016 at 7:53 AM, Nuria Ruiz <nuria(a)wikimedia.org> wrote:
>
> Seems that defining what areas we want to cover in the summit is
> a prerequisite to do what Brion was asking for in the initial e-mail, to
> open the summit beyond WMF.
>
Good point. Let's say that we had to pick only one of these questions to
answer at WikiDev17. Which would you choose?
* how do we make manipulating the information on our websites easier and
more useful? (both for humans and computers)
* how can we better distribute the information on our websites?
* how can we help our software development community be more inclusive and
productive (and attract many more people)?
* how can we improve the quality of our software, and pay down the
technical debt we've accumulated over the years?
* how can we make our websites better learning environments?
* how can we make our websites better support languages other than English
(and character sets other than Latin)?
Rob
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
I am Harisandy, a student at UIN Suska Riau.
I am a programmer at PT. Sela Express Tour. I have experience with Java
desktop development and with the PHP frameworks CodeIgniter, Yii, and
Laravel. I am interested in several of the projects on offer through
Google, and I hope you can respond to my request. Thank you.
Dear Wikimedia,
I am Taufik Hidayat, a bachelor's student at the State Islamic University
Sultan Syarif Kasim of Riau, Indonesia. I am currently studying computer
science, especially text processing. My profile is at
orcid.org/0000-0002-4524-1882.
I would love to submit a proposal for Google Summer of Code 2017 with the
Wikimedia Foundation on "Automatic Summarization of Articles", which I
found at https://phabricator.wikimedia.org/T127038.
My skills are PHP, Java, NodeJS, and SQL databases such as MariaDB and
PostgreSQL. I also know HTML5, CSS, and JavaScript.
I hope you can share advice on what I should do or prepare, and which
idea(s) would be a good fit for my skills. Thank you very much in advance.
Best regards,
Taufik Hidayat
I am Arif Hidayat from UIN Sultan Syarif Kasim Riau, Pekanbaru. I have
learned about CMSes, and I am very interested in CMS development,
especially MediaWiki.
My favorite programming language is PHP, but I can also use Java and other
web technologies such as CSS and JS.
I hope I can submit my Google Summer of Code proposal for a MediaWiki
project.
Thanks
https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-09-07
= 2016-09-07 =
== Product ==
=== Reading ===
Q2 FY 2016-2017 (October - December 2016) goals drafted:
https://www.mediawiki.org/wiki/Wikimedia_Engineering/2016-17_Q2_Goals#Readi…
==== Reading Web ====
Current sprint:
* Lazy loaded images fixes
* Related pages improvements
* Shipping wikidata descriptions in mobile web to some wikis
* Disabled lazy loaded references, plan to analyze data
==== Mobile Content Service ====
* Services team added ChangeProp for Wikidata descriptions to
mobile-sections endpoint in addition to the /page/summary endpoint. :)
==== Android native app ====
Current sprint: https://phabricator.wikimedia.org/project/view/2178/
Next sprint: to be created
* Navigation overhaul continues
==== iOS native app ====
Current release board: https://phabricator.wikimedia.org/project/view/1736/
Next release board: https://phabricator.wikimedia.org/project/view/2150/
* 5.1 was released (iPad explore & article layout, find in page), as was
5.1.1 with crash fixes
* Working on 5.2 (iOS 10 Today Widgets - Top Read & Continue Reading) -
Released to beta, target to be released with iOS 10
==== Reading Infrastructure ====
* blocked: waiting on Security to validate the approach in
https://gerrit.wikimedia.org/r/#/c/306133/
** AB emailed bawolff during today's SoS, CC'ing dpatrick, anomie, tgr
* blocking: none
=== Community Tech ===
* Numeric sorting has been deployed to English Wikipedia, investigating
other languages
* PageAssessments has been deployed to English Wikivoyage, will do a
functional roll-out this afternoon
* Working on cross-wiki watchlist backend
* No blockers
=== Security ===
* Darian on PTO Sept. 6-9
** E-mail security(a)wikimedia.org for urgent issues
** E-mail security-team(a)wikimedia.org for non-urgent issues
* Security reviews outstanding for this week:
** Youdao MT (Brian is handling)
=== Analytics ===
* Andrew demoing Event Bus events flowing through kasocki (kafka ->
socket.io interface) in the CREDIT showcase, after this meeting
* Luca put up https://yarn.wikimedia.org which allows all folks with LDAP
to access yarn logs for hadoop jobs
* some problems loading pageview data left it a couple of days behind;
Joseph reran those jobs and that data is now loaded
* rotating scrum of scrums participation from now on: Nuria, Dan, Marcel,
Joseph
* tentative: the new AQS cluster ready by next week
* tentative: edit history reconstructed by next week, Erik Z will vet it
but if others are interested, let us know
** as a reminder, this is all data collected in mediawiki databases,
reconstructed as an append-only event store
** logging table was mined for page deletes and moves, user deletes,
renames, rights changes, and blocks
** archive table and revision table were merged and their user and page
fields historified (values at the time of the event were reconstructed)
** everything is written under a single denormalized schema of the form
{metadata, data that was modified (page|user|revision)}
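The denormalized event shape described above can be sketched as follows; all field names here are illustrative assumptions, not the actual Analytics schema:

```python
# Toy sketch of the "{metadata, data that was modified}" event shape for an
# append-only edit-history store. Field names are invented for illustration.
def make_event(entity_type, entity, timestamp, cause):
    """Wrap one modified entity (page, user, or revision) with event metadata."""
    assert entity_type in ("page", "user", "revision")
    return {
        "meta": {"entity": entity_type, "timestamp": timestamp, "cause": cause},
        # State of the entity reconstructed as of the time of the event:
        entity_type: entity,
    }

move_event = make_event(
    "page",
    {"page_id": 42, "title": "Old_title", "namespace": 0},
    timestamp="2016-09-07T00:00:00Z",
    cause="move",
)
```

A page move, a user rename, and a revision edit would all share this one schema, differing only in which entity key is populated.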
=== Services ===
* New format for the section transform API
**
https://en.wikipedia.org/api/rest_v1/#!/Transforms/post_transform_section_c…
* Change-Prop processing transclusions now
* Wikidata description added to summary endpoint
* Some optimisations in feeds endpoint
* Config deploys with scap3 under way
=== Editing ===
==== Language ====
* Blocked: Waiting for updates on security review of Youdao MT.
* Blocking: None.
* Updates:
** Work on ContentTranslation (template handling) continues.
** Kartik fixing last bits in Apertium packaging.
==== Parsing (No one around to attend meeting, update below) ====
* Waiting on feedback from the VE team on Parsoid's native implementation
of the <gallery> extension ( https://gerrit.wikimedia.org/r/#/c/264026/ )
* Work ongoing to support language variants for read view in Parsoid.
* Work ongoing with the refactoring of parser tests framework (merging the
php unit and parser tests scripts)
* Patch implementing magic-word based opt-out of global user pages awaiting
review ( https://gerrit.wikimedia.org/r/#/c/303912/ )
==== Collaboration ====
* '''Blocking''':
** Continuing work on Flow caching rewrite for multi-DC. This is being
reviewed now both within-team and by Aaron.
* '''Blocked''':
* '''Updates''':
** Continuing work on MessagePoster.
=== RelEng ===
* Blocking
** Add a section for long-running tasks on the Deployment page (especially
for database maintenance)
*** greg had to take a half day yesterday, will catch up on it today
*** https://phabricator.wikimedia.org/T144661
* Blocked
** Move contint1001 to production network with public IP -
https://phabricator.wikimedia.org/T140257
*** Public IP is Blocked on ops
=== Technical Operations ===
* '''Blocked'''
** None
* '''Blocking'''
** None
* Updates
** varnish 4 upgrade process ongoing
** labsdbs will probably be reimaged and the entire service restructured in
the next Quarter
=== ArchCom ===
* Full status: https://www.mediawiki.org/wiki/ArchComStatus
* Last week: [[https://phabricator.wikimedia.org/E266|E266]]
* This week [[https://phabricator.wikimedia.org/E269|E269]]
** Ad hoc ArchCom office hour
*** Briefly touching on Shadow Namespaces ([[Phab:T91162]])
*** (last minute addition) [[Phab:T141938]]: Wikimedia Developer Summit
Program
=== Fundraising Tech ===
* Switched over four queues from ActiveMQ to Redis
* monitoring for hiccups, working on switching remaining three
* pushing more code into a library that doesn't depend on MediaWiki or
Drupal/CiviCRM
* fixing a couple GeoIP issues in CentralNotice
* more polishing for CiviCRM batch de-duplication
=== Discovery ===
* No blockers
* BM25 A/B test is on
* Working on multiwiki indexes
* Turned off "did you mean" on wikidata due to it not being very useful
* SPARQL Workshop on September 8th:
https://office.wikimedia.org/wiki/SPARQL_workshop
* Discernatron & BM25 demo / luncheon on Sept 13th (remote-friendly, except
food)
* CREDIT demos today
=== Interactive team ===
* <maplink> is now available on all wikis
* maps got WDQS support (Demoing
https://www.mediawiki.org/wiki/User:Smalyshev_(WMF)/Govmap at CREDIT)
Hi all!
Getting MCR support into MediaWiki is going to be one of the big challenges of
the next half year or so, and I'm really happy that quite a few people seem to
be interested in seeing it done. If you are interested in helping with the
effort, with discussion, review, coding, or organizing, please let me know.
Anyway, I have now put up a design draft for Multi-Content-Revision support
here: <https://www.mediawiki.org/wiki/Multi-Content_Revisions>. The top level
page explains the concept and rationale, and it links to several subpages
discussing different parts of the overall design. Some of these parts are
already fairly mature, while others are still just collections of notes.
This is the first time I have made a writeup of how all the parts fit together,
and I'd be really grateful for feedback. I'm particularly interested in what you
think of the proposed database schema described at
<https://www.mediawiki.org/wiki/Multi-Content_Revisions/Content_Meta-Data>. Is
it appropriate? feasible? scalable?
The content meta-data storage design is the heart of MCR. While it doesn't
necessarily have to be done first, we need commitment on this before we do any
of the other parts.
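To make the idea concrete, here is a toy sketch (with invented names, not the proposed schema from the wiki page) of how a multi-content revision maps to several named content slots, each with its own content model, instead of a single text blob:

```python
# Illustrative sketch only: under MCR, one revision references multiple
# named "slots", each with its own content model and a blob address
# resolvable via the BlobStore. Slot names, models, and addresses below
# are hypothetical.
revision = {
    "rev_id": 123456,
    "slots": {
        "main": {"model": "wikitext", "address": "tt:9001"},
        "categories": {"model": "json", "address": "tt:9002"},
    },
}

def slot_models(rev):
    """Map each slot name to its content model."""
    return {name: slot["model"] for name, slot in rev["slots"].items()}
```

The design questions on the wiki page are about exactly this mapping: how the revision-to-slot association is stored and whether that storage scales.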
I have, however, tried to isolate the different components to a large
degree, so they can be implemented independently of each other, as
described at
<https://www.mediawiki.org/wiki/Multi-Content_Revisions#Implementation_Roadm…>
Another part besides the content meta-data that is already quite mature and
ready for implementation is the BlobStore facility
<https://www.mediawiki.org/wiki/Multi-Content_Revisions/Blob_Storage>. But I'm
also interested in your thoughts on the design of the higher level interfaces
described in
<https://www.mediawiki.org/wiki/Multi-Content_Revisions/Revision_Retrieval> and
<https://www.mediawiki.org/wiki/Multi-Content_Revisions/Page_Update_Controll…>.
I have linked to some code experiments throughout the design document. Note that
these experiments often do not completely line up with the draft document. In
some cases, the experiment has progressed since the draft, in others, the draft
contains lessons learned from the experiment that never made it into code. The
canonical proposal is always the one on the wiki.
Please comment on the talk pages.
Thanks!
--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.