Hi,
this is an announcement to let you know that the service
($lang).planet.wikimedia.org, an RSS feed aggregator for all Wikimedia-related
blogs, has switched software.
You can find the English version at https://en.planet.wikimedia.org/
Other existing languages are listed on
https://wikitech.wikimedia.org/wiki/Planet.wikimedia.org#Which_languages_ex…
Today we moved away from planet-venus to a newer package called
"rawdog" that does the same thing as before: fetching a bunch of RSS feeds
and combining them into a single page and feed.
The reason is that planet-venus has been dropped in Debian stable (stretch)
because it was unmaintained, so we had to find an alternative to be able to
upgrade the underlying servers to a current OS version.
If you have never heard of Planet, you can find more info here:
https://wikitech.wikimedia.org/wiki/Planet.wikimedia.org
If you already use it but just subscribe to the "feed of feeds" then
nothing should change for you.
(Though note that we support RSS 2.0 but not a separate Atom feed anymore.
We are redirecting the old atom.xml URL to the new (and old) URL rss20.xml.)
If you already use it and look at the web UI, enjoy the new theme that
Paladox imported from KDE to make it look about 150% better than before.
(Thanks to him for that theming work!)
We also applied patches to make it look more like our former planet for a
smooth transition. A "wmf1" package has been built and uploaded at
https://apt.wikimedia.org/wikimedia/pool/main/r/rawdog/
If you want to know more about "rawdog":
https://offog.org/code/rawdog/
https://packages.debian.org/stretch/rawdog
If you want to add your blog feed, feel free to upload changes or just drop
me a mail.
Bugs can be reported here:
https://phabricator.wikimedia.org/project/view/413/
Tickets are: https://phabricator.wikimedia.org/T180498 ,
https://phabricator.wikimedia.org/T168490
Cheers,
Daniel
--
Daniel Zahn <dzahn(a)wikimedia.org>
Operations Engineer
TL;DR:
Scripts that rely on XML files numbered 1 through 4 should be updated to
check for 1 through 6.
Explanation:
A number of wikis have stubs and page content files generated 4 parts at a
time, with the appropriate number added to the filename. I'm going to be
increasing that this month to 6.
The reason for the increase is that near the end of the run there are
usually just a few big wikis taking their time to complete. If they run
with 6 processes at once, they'll finish up a bit sooner.
If you have scripts that rely on the number 4, just increase it to 6 and
you're done.
This will go into effect for the June 1 run and all runs afterwards.
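For scripts that scan the dump directory, a minimal sketch of the updated check (the filename pattern here is an illustrative assumption; consult your wiki's actual dump listing):

```python
import re
from pathlib import Path

# Part-numbered stub/page-content files carry their part number just
# before ".xml", e.g. "enwiki-20180601-stub-meta-history3.xml.gz".
# (Illustrative filename only.)
PART_RE = re.compile(r"(\d+)\.xml")

def dump_parts(paths, max_part=6):
    """Collect part files numbered 1..max_part (formerly 4, now 6)."""
    found = {}
    for p in paths:
        m = PART_RE.search(p.name)
        if m and 1 <= int(m.group(1)) <= max_part:
            found[int(m.group(1))] = p
    return [found[i] for i in sorted(found)]

files = [Path(f"enwiki-20180601-stub-meta-history{i}.xml.gz")
         for i in range(1, 7)]
print(len(dump_parts(files)))  # 6 parts instead of the old 4
```

The only change a script with a hard-coded bound needs is bumping that bound from 4 to 6.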
Thanks!
https://www.mediawiki.org/wiki/Scrum_of_scrums/2018-05-30
=2018-05-30=
== Callouts ==
* From last week, all Windows NT operating systems are under the "Windows"
os family in analytics metrics
* Security: Security review for Wikidata queries data release proposal
https://phabricator.wikimedia.org/T190875
== Audiences ==
=== Readers ===
==== iOS native app ====
* Blocked by:
* Blocking:
* Updates:
** Continuing work on tech debt release, 5.8.2 (
https://phabricator.wikimedia.org/project/view/3358/ )
** Starting work on next major release, 5.9 (
https://phabricator.wikimedia.org/project/view/3238/ )
==== Android native app ====
* Blocked by:
* Blocking:
* Updates:
** Released maintenance update to production (improvements to reading
list syncing / fix no-such-project errors)
** Finalizing multilingual features based on user testing at Hackathon
-- on track for release in ~1 week.
==== Readers Web ====
* Blocked by:
* Blocking:
* Updates:
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Draft/Programs/Product#Program_2:_Better_Encyclopedia|Outcome
1, Objective 4]]: Continue improving the ways that users can download
articles of interest for later consumption
*** Reading Web depends on SRE, RelEng, Reading Infra
==== Readers Infrastructure ====
* No updates this week.
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Draft/Programs/Product#Program_2:_Better_Encyclopedia|Outcome
1, Objective 4]]: Continue improving the ways that users can download
articles of interest for later consumption
*** Reading Web depends on SRE, RelEng, Reading Infra
**[[Wikimedia Audiences/2017-18 Q4 Goals#Readers|Increase code sharing of
client apps by coalescing and moving more logic to the server]]
***Reading Infra depends on Parsing, Services
===== Maps =====
* Blocked by:
* Blocking:
* Updates:
==== Multimedia ====
* Blocked by:
* Blocking:
* Updates:
** Looking into OOUI migration for Wikibase per discussions at the
hackathon/elsewhere
** UploadWizard work for multilingual captions pretty much complete
** Indexing wikibase statements (searchable via haswbstatement:XXXX), work
continues on quantities
*Quarterly goal dependency update:
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Objective 3.1]] Prepare
for launch of the first Structured Data on Commons feature (multilingual
file captions)
***SDC depends on Multimedia, SRE, WMDE, Search Platform, MediaWiki
Platform, Research
** [[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Objective 2.1]]
Integrate structured file captions into search
*** SDC depends on Search Platform, Multimedia
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Structured_Data#Segment_4:_Programs|Segment
4, Outcome 2]]: Develop a better understanding of existing needs for
Structured Commons- T171252
***Research depends on Multimedia
=== Contributors ===
==== Community Tech ====
* Blocked by:
* Blocking:
* Updates:
** Working on PageTriage improvements
==== Anti-Harassment Tools ====
* Blocked by:
* Blocking:
* Updates:
** Working on Blocking Tools
==== Editing ====
* Blocked by:
* Blocking:
* Updates:
** None
==== Parsing ====
* Blocked by:
* Blocking:
* Updates:
** Tidy -> RemexHtml: final switch planned for June 27th / July 11th
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Programs/Product#Program_3:_Increase_device_support_for_editing|Goal
3.6]] Support work towards unifying MediaWiki's parser implementations, in
liaison with Technology's MediaWiki team
*** Parsing depends on MediaWiki Platform, Services
**[[Wikimedia Audiences/2017-18 Q4 Goals#Readers|Increase code sharing of
client apps by coalescing and moving more logic to the server]]
*** Reading Infra depends on Parsing, Services
**[[Wikimedia Technology/Goals/2017-18 Q4#Program 7. Smart tools for better
data|Outcome 2: Objective 1]]: Revision storage scaling
*** Services depends on SRE, Parsing
==== Collaboration ====
* Blocked by:
* Blocking:
* Updates:
** Patches to solve most of Echo's multi-DC incompatibilities now awaiting
review https://phabricator.wikimedia.org/T164860
** Deleted Echo's rspec tests to unbreak CI
==== Language ====
* Blocked by:
* Blocking:
* Updates:
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Programs/Product#Program_3:_Increase_device_support_for_editing|Goal
3.1]] Improve and consolidate our unified editing platform so that it's
great on all devices
***Language depends on Editing
=== Audiences Design ===
==== UI Standardization ====
* Blocked by:
* Blocking:
* Updates:
** OOUI – v0.27.1 released yesterday:
https://phabricator.wikimedia.org/diffusion/GOJU/browse/master/History.md;v…
*** 1 deprecation in this release, plus 9 style amendments, 9 code
improvements & 2 accessibility improvements. Among them:
**** Toolbar: Add a required 'name' property to toolgroup configs;
deprecating change (Ed Sanders)
**** Clarify and align TabSelectWidget focus (Volker E.)
https://phabricator.wikimedia.org/T194863
**** Allow dropdown menus to be larger than their handles (Ed Sanders)
https://phabricator.wikimedia.org/T195257
** Continuing work on Design Style Guide, this week starting 'Resources'
page with all repo resources
** Continuing work on minor UI/UX issues in AdvancedSearch with WMDE team
== Technology ==
=== Analytics ===
* Blocked by:
* Blocking:
* Updates:
** Updated user agent string parsing regexes in eventlogging and refinery
** Migrated zookeeper to new hardware
** Migrating druid to debian stretch, should be finished this week
** Revision score now available in eventstreams
=== Cloud Services ===
* Blocked by:
* Blocking:
* Updates:
=== Fundraising Tech ===
* Blocked by:
* Blocking:
* Updates:
** Working on getting Ingenico Connect API campaign ready.
** Still working on backend pipeline for CentralNotice EventLogging stats
** Working on a Donor Deletion tool for Civi in response to GDPR
=== MediaWiki Platform ===
* Blocked by:
* Blocking:
* Updates:
* MCR:
** ar_rev_id deduplication is merged and maintenance script has been run
** API "templated parameters" patch was merged
* TemplateStyles:
** Implemented a feature request.
** An enwiki RFC to enable it (as soon as Remex is enabled there) is
passing.
* MediaWiki core master now requires PHP 7.0 or HHVM
* MediaWiki-CodeSniffer 19.0.0 and 20.0.0 released, to allow for safe PHP 7
features to be used
* Bug triage and fixes for MassMessage and GlobalUserPage
* Finished development of "CoverMe": https://tools.wmflabs.org/coverme/ (
https://blog.legoktm.com/2018/05/29/introducing-coverme-find-the-most-calle…)
* Deployed postgres in CI, filed T195807 for failures
* Work towards PSR-4 in MediaWiki core, including enabling PSR-4 autoloader
for more directories, and implementing a structure test to validate PSR-4
compliance
* Hackathon projects:
** MySQL client wrapper to replace "sql" shell script:
https://gerrit.wikimedia.org/r/#/c/434188/
** Refactor API parameter validation:
https://gerrit.wikimedia.org/r/q/topic:%2522bug%252FT142080-api-param-valid…
** Script to import Phabricator task information into a wiki:
https://gerrit.wikimedia.org/r/#/c/433919/
* The usual code review and bug work
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Programs/Product#Program_3:_Increase_device_support_for_editing|Goal
3.6]] Support work towards unifying MediaWiki's parser implementations, in
liaison with Technology's MediaWiki team
***Parsing depends on MediaWiki Platform, Services
*** Planning underway for next FY Platform Evolution program
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Prepare for launch of the
first Structured Data on Commons feature]] (multilingual file captions)
***SDC depends on Multimedia, SRE, WMDE, Search Platform, MediaWiki
Platform, Research
*** MCR capabilities in active development
** [[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Objective 1.1]] Assist
with deploying MultiContent Revisions on Commons
[[phab:T174022|T174022]] Implement multi-content revisions,
[[phab:T174023|T174023]] Implement MCR storage layer,
[[phab:T174045|T174045]] DB schema migration for MCR,
[[phab:T174044|T174044]] Deploy MCR storage layer,
[[phab:T174043|T174043]] Deploy Multi-Content Revisions
***SDC depends on MediaWiki Platform, WMDE
*** MCR capabilities in active development
=== Performance ===
* Blocked by:
* Blocking: Readers Web (review of CitationUsage) - will be done this week
* Updates:
** performance perception survey live on several wikis
** Trialing mobile performance testing on SauceLabs device lab
** More fixes to ChronologyProtector
** Still pulling jQuery deps out of base javascript modules
** Bunch of resourceloader bug fixes
** mcrouter expected to be ready in prod this week, will test on testwiki
next week
=== Release Engineering ===
* Blocking
**
* Blocked
**
* Updates
** Train status: https://phabricator.wikimedia.org/T191051
** 1.32.0-wmf.5 got held up last week by an incident that happened during
the scheduled Thursday train deployment window.
***
https://wikitech.wikimedia.org/wiki/Incident_documentation/20180524-wikidata
*** We should be back on track this week, wmf.5 is clear of blockers and
wmf.6 should be rolling out as usual
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Draft/Programs/Product#Program_2:_Better_Encyclopedia|Outcome
1, Objective 4]]: Continue improving the ways that users can download
articles of interest for later consumption
*** Reading Web depends on SRE, RelEng, Reading Infra
=== Research ===
* Blocked by: Performance: https://gerrit.wikimedia.org/r/#/c/432534/
* Blocking: None
* Updates:
** Deploying the gapfinder-tools app and gathering section mappings.
** Have been gathering synonym mappings.
** Analyzed the first incoming labels from WikiLabels citations campaign
*Quarterly goal dependency update:
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Prepare for launch of the
first Structured Data on Commons feature]] (multilingual file captions)
***SDC depends on Multimedia, SRE, WMDE, Search Platform, MediaWiki
Platform, Research
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Structured_Data#Segment_4:_Programs|Segment
4, Outcome 2]]: Develop a better understanding of existing needs for
Structured Commons- [[phab:T171252|T171252]]
***Research depends on Multimedia
=== Scoring Platform ===
* Blocked by:
* Blocking:
* Updates:
=== Search Platform ===
* Blocked by: Security: https://phabricator.wikimedia.org/T190875
* Blocking:
* Updates:
** Wikidata reindexed and now external ID and string properties can be
searched for with haswbstatement keyword
** Deep category search enabled on all wikis except private ones:
https://phabricator.wikimedia.org/T194260
** all: keyword enabled on all wikis:
https://phabricator.wikimedia.org/T165110
** Exploring ideas for applying NLP to search:
https://www.mediawiki.org/wiki/User:TJones_(WMF)/Notes/Potential_Applicatio…
** Looking into fixing regex highlighting that does not time out as
expected: https://phabricator.wikimedia.org/T195491
** Working on query parsing refactoring:
https://phabricator.wikimedia.org/T185108
** Working on Polish analyzer: https://phabricator.wikimedia.org/T186046
** Working on fulltext search for Lexemes
*Quarterly goal dependency update:
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Prepare for launch of the
first Structured Data on Commons feature]] (multilingual file captions)
***SDC depends on Multimedia, SRE, WMDE, Search Platform, MediaWiki
Platform, Research
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Objective 2.1]]
Integrate structured file captions into search
*** SDC depends on Search Platform, Multimedia
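As a hedged illustration of the haswbstatement: keyword noted above, such a search can be issued through the standard MediaWiki search API; the statement value used here (P31=Q5) is only an example:

```python
from urllib.parse import urlencode

# Build a Wikidata fulltext search for entities carrying a given
# statement; action=query, list=search and srsearch are standard
# MediaWiki API parameters.
params = urlencode({
    "action": "query",
    "list": "search",
    "srsearch": "haswbstatement:P31=Q5",  # example statement only
    "format": "json",
})
url = "https://www.wikidata.org/w/api.php?" + params
print(url)
```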
=== Security ===
* Blocked by:
* Blocking:
* Updates:
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Programs/Product#Program_3:_Increase_device_support_for_editing|Goal
3.6]]: Support work towards unifying MediaWiki's parser implementations, in
liaison with Technology's MediaWiki team
***Parsing depends on MediaWiki Platform, Services
=== Services ===
* Blocked by:
** Parsing on language variants transformation support
** Echo on a non-JSON-serializable job
https://phabricator.wikimedia.org/T192945
** Who knows who's responsible for the GWToolset extension? It has a
non-JSON-serializable job https://phabricator.wikimedia.org/T192946
* Blocking: none?
* Updates:
** Kafka queue enabled for all jobs everywhere, with a few exceptions
including CirrusSearch
** revision-score event exposed via event streams
*Quarterly goal dependency update:
**[[Wikimedia Audiences/2017-18 Q4 Goals#Readers|Increase code sharing of
client apps by coalescing and moving more logic to the server.]]
***Reading Infra depends on Parsing, Services
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Final/Programs/Product#Program_3:_Increase_device_support_for_editing|Goal
3.6]] Support work towards unifying MediaWiki's parser implementations, in
liaison with Technology's MediaWiki team
***Parsing depends on MediaWiki Platform, Services
**[[Wikimedia Technology/Goals/2017-18 Q4#Program 7. Smart tools for better
data|Outcome 2: Objective 1]]: Revision storage scaling
*** Services depends on SRE, Parsing
=== Site Reliability Engineering ===
* Blocked by:
** Collaboration for flow, T172025
* Blocking:
** None
* Updates:
** Had a wikidata outage
https://wikitech.wikimedia.org/wiki/Incident_documentation/20180524-wikidata
** row C move+upgrade went quite well
** mcrouter to be deployed this week
*Quarterly goal dependency update:
**[[metawiki:Wikimedia_Foundation_Annual_Plan/2017-2018/Draft/Programs/Product#Program_2:_Better_Encyclopedia|Outcome
1, Objective 4]]: Continue improving the ways that users can download
articles of interest for later consumption
*** Reading Web depends on SRE, RelEng, Reading Infra
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Prepare for launch of the
first Structured Data on Commons feature]] (multilingual file captions)
***SDC depends on Multimedia, SRE, WMDE, Search Platform, MediaWiki
Platform, Research
**[[Wikimedia Technology/Goals/2017-18 Q4#Program 7. Smart tools for better
data|Outcome 2: Objective 1]]: Revision storage scaling
*** Services depends on SRE, Parsing
== Wikidata ==
* Blocked by:
* Blocking:
* Updates:
*Quarterly goal dependency update:
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Prepare for launch of the
first Structured Data on Commons feature]] (multilingual file captions)
***SDC depends on Multimedia, SRE, WMDE, Search Platform, MediaWiki
Platform, Research
**[[Wikimedia Audiences/2017-18 Q4 Goals#Programs|Objective 1.1]] Assist
with deploying MultiContent Revisions on Commons
[[phab:T174022|T174022]] Implement multi-content revisions,
[[phab:T174023|T174023]] Implement MCR storage layer,
[[phab:T174045|T174045]] DB schema migration for MCR,
[[phab:T174044|T174044]] Deploy MCR storage layer,
[[phab:T174043|T174043]] Deploy Multi-Content Revisions
***SDC depends on MediaWiki Platform, WMDE
== German Technical Wishlist ==
* Blocked by:
* Blocking:
* Updates:
== SoS Meeting Bookkeeping ==
* Updates:
INTRODUCTION
Machine-utilizable lexicons can enhance a great number of speech and natural language technologies. Scientists, engineers and technologists – linguists, computational linguists and artificial intelligence researchers – eagerly await the advancement of machine lexicons which include rich, structured metadata and machine-utilizable definitions.
Wiktionary, a collaborative project to produce a free-content multilingual dictionary, aims to describe all words of all languages using definitions and descriptions. The Wiktionary project, brought online in 2002, includes 139 spoken languages and American Sign Language [1].
This letter hopes to inspire exploration into and discussion regarding machine wiktionaries, machine-utilizable crowdsourced lexicons, and services which could exist at https://machine.wiktionary.org/ .
LEXICON EDITIONING
The premise of editioning is that one version of the resource can be more or less frozen, e.g. a 2018 edition, while wiki editors collaboratively work on the next version, e.g. a 2019 edition. Editioning can provide stability for complex software engineering scenarios utilizing an online resource. Some software engineering teams, however, may choose to utilize fresh dumps or data exports of the latest edition.
SEMANTIC WEB
A machine-utilizable lexicon could include a semantic model of its contents and a SPARQL endpoint.
MACHINE-UTILIZABLE DEFINITIONS
Machine-utilizable definitions, available in a number of knowledge representation formats, can be granular, detailed and nuanced.
There exist a large number of use cases for machine-utilizable definitions. One use case is providing natural language processing components with the capabilities to semantically interpret natural language, to utilize automated reasoning to disambiguate lexemes, phrases and sentences in contexts. Some contend that the best output after a natural language processing component processes a portion of natural language is each possible interpretation, perhaps weighted via statistics. In this way, (1) natural language processing components could process ambiguous language, (2) other components, e.g. automated reasoning components, could narrow sets of hypotheses utilizing dialogue contexts, (3) other components, e.g. automated reasoning components, could narrow sets of hypotheses utilizing knowledgebase content, and (4) mixed-initiative dialogue systems could also ask users questions to narrow sets of hypotheses. Such disambiguation and interpretation would utilize machine-utilizable definitions of senses of lexemes.
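The four-step pipeline above can be sketched in miniature; every name in this sketch is hypothetical, invented only to illustrate weighted hypotheses being narrowed by context:

```python
from dataclasses import dataclass

@dataclass
class Interpretation:
    sense_id: str   # hypothetical machine-lexicon sense identifier
    weight: float   # statistical plausibility from the NLP component

def narrow(interpretations, plausible_senses):
    """Steps (2)/(3): drop hypotheses inconsistent with dialogue or
    knowledgebase context, then renormalize the remaining weights."""
    kept = [i for i in interpretations if i.sense_id in plausible_senses]
    total = sum(i.weight for i in kept) or 1.0
    return [Interpretation(i.sense_id, i.weight / total) for i in kept]

# Step (1): an ambiguous lexeme yields every interpretation, weighted.
hypotheses = [Interpretation("bank/finance", 0.6),
              Interpretation("bank/river", 0.4)]
# Context (a river is under discussion) narrows the hypothesis set.
print(narrow(hypotheses, {"bank/river"}))
```

Step (4), asking the user a clarifying question, would simply be another call to the same narrowing function with the sense set the answer implies.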
CONJUGATION, DECLENSION AND THE URL-BASED SPECIFICATION OF LEXEMES AND LEXICAL PHRASES
A grammatical category [2] is a property of items within the grammar of a language; it has a number of possible values, sometimes called grammemes, which are normally mutually exclusive within a given category. Verb conjugation, for example, may be affected by the grammatical categories of: person, number, gender, tense, aspect, mood, voice, case, possession, definiteness, politeness, causativity, clusivity, interrogativity, transitivity, valency, polarity, telicity, volition, mirativity, evidentiality, animacy, associativity, pluractionality, reciprocity, agreement, polypersonal agreement, incorporation, noun class, noun classifiers, and verb classifiers in some languages [3].
By combining the grammatical categories from each and every language together, we can precisely specify a conjugation or declension. For example, the URL:
https://machine.wiktionary.org/wiki/lookup.php?edition=2018&language=en-US&…
includes an edition, the language of the lemma, the lemma, a lexical category, and (elided by the ellipsis) the grammemes that conjugate the verb in a language-independent manner.
We can further specify, via URL query string, the semantic sense of a grammatical element:
https://machine.wiktionary.org/wiki/lookup.php?edition=2018&language=en-US&…
Specifying a grammatical item fully in a URL query string, as indicated in the previous examples, could result in a redirection to another URL.
That is, the URL:
https://machine.wiktionary.org/wiki/lookup.php?edition=2018&language=en-US&…
could redirect to:
https://machine.wiktionary.org/wiki/index.php?edition=2018&id=12345678
or to:
https://machine.wiktionary.org/wiki/2018/12345678/
and the URL with a specified semantic sense:
https://machine.wiktionary.org/wiki/lookup.php?edition=2018&language=en-US&…
could redirect to:
https://machine.wiktionary.org/wiki/index.php?edition=2018&id=12345678&sens…
or to:
https://machine.wiktionary.org/wiki/2018/12345678/4/
The URL https://machine.wiktionary.org/wiki/2018/12345678/ is intended to indicate a conjugation or declension with one or more meanings or senses. The URL https://machine.wiktionary.org/wiki/2018/12345678/4/ is intended to indicate a specific sense or definition of a conjugation or declension. A benefit of having URLs for both conjugations or declensions and for specific meanings or senses is that HTTP request headers [4] can specify the languages and content types of the output desired for a particular URL.
The provided examples are intended to indicate that each complete, language-independent conjugation or declension can have an ID number, as opposed to each headword or lemma. Instead of one ID number for all variations of “fly”, there is one ID number for “flew”, another for “have flown”, another for “flying”, and one for each other conjugation or declension. Reasons for indexing the conjugations and declensions instead of traditional headwords or lemmas include that, at least for some knowledge representation formats, the formal semantics of the definitions vary per conjugation or declension.
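Since the example URLs above are truncated, the following sketch only suggests how such a lookup URL might be assembled; every query parameter beyond edition, language and lemma is an assumption:

```python
from urllib.parse import urlencode

# Hypothetical endpoint from this letter; nothing here is a real API.
BASE = "https://machine.wiktionary.org/wiki/lookup.php"

def lookup_url(edition, language, lemma, **grammemes):
    """Fully specify a conjugated or declined form via the query string."""
    params = {"edition": edition, "language": language, "lemma": lemma}
    params.update(grammemes)  # assumed keys: tense, person, number, ...
    return BASE + "?" + urlencode(params)

print(lookup_url(2018, "en-US", "fly", tense="past", person=3, number="sg"))
```

A server receiving such a fully specified URL could then issue the redirects described above, first to the ID-numbered form and optionally on to a specific sense.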
CONCLUSION
This letter broached machine wiktionaries and some of the services which could exist at https://machine.wiktionary.org/ . It is my hope that this letter indicated a few of the many exciting topics with regard to machine-utilizable crowdsourced lexicons.
REFERENCES
[1] https://en.wiktionary.org/wiki/Index:All_languages#List_of_languages
[2] https://en.wikipedia.org/wiki/Grammatical_category
[3] https://en.wikipedia.org/wiki/Grammatical_conjugation
[4] https://en.wikipedia.org/wiki/List_of_HTTP_header_fields#Request_fields
Sorry for cross-posting!
Reminder: Technical Advice IRC meeting again **tomorrow, Wednesday 3-4 pm
UTC** on #wikimedia-tech.
The Technical Advice IRC meeting is open to all volunteer developers,
topics and questions. These can be anything from "how to get started",
through "who would be the best contact for X", to specific questions about your project.
If you know already what you would like to discuss or ask, please add your
topic to the next meeting:
https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
Hope to see you there!
Michi (for WMDE’s tech team)
--
Michael F. Schönitzer
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Stellen Sie sich eine Welt vor, in der jeder Mensch an der Menge allen
Wissens frei teilhaben kann. Helfen Sie uns dabei!
http://spenden.wikimedia.de/
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/681/51985.
Hello,
Thanks to Kunal Mehta, CI now supports running MediaWiki tests using the
Postgres database! This has been requested since 2012 (T39602) and has
finally come to fruition.
Unsurprisingly, a few tests are failing and require some assistance.
Anyone is welcome to help on that front. See T195807 for a list.
Please celebrate Kunal!
[T39602] Jenkins: Set up PHPUnit testing on PostgreSQL backend
https://phabricator.wikimedia.org/T39602
[T195807] Fix failing MediaWiki core tests on Postgres database backend
https://phabricator.wikimedia.org/T195807
--
Antoine "hashar" Musso
One of the projects that I worked on during the Hackathon was resolving the
11-year old request to add a page creation log to MediaWiki (
https://phabricator.wikimedia.org/T12331). Until now, we've logged page
deletions and page moves, but not page creations. Having such a log is
important for two reasons:
* Currently, it is difficult for non-administrators to identify abusive
spammer/PR accounts since much of the paper trail of their work is hidden
by page deletions.
* Currently, it is difficult to get data/metrics about page creation since
the page table doesn't include information about when an article was
created (or by whom) and the revision and recentchanges tables don't
include deleted revisions.
The patch to add this logging still needs to be reviewed/merged and any
assistance with that would be greatly appreciated:
https://gerrit.wikimedia.org/r/#/c/399897/
The feature has been put behind a feature flag so that any wikis worried
about logging bloat (i.e. Wikidata) can keep it disabled.
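Once the patch is merged and the flag enabled on a wiki, the new log should be readable like any other log via the standard logevents API; the "create" log type below is an assumption based on the proposal:

```python
from urllib.parse import urlencode

# Query the most recent page-creation log entries (sketch).
params = urlencode({
    "action": "query",
    "list": "logevents",
    "letype": "create",   # assumed log type for the new page creation log
    "lelimit": 10,
    "format": "json",
})
url = "https://en.wikipedia.org/w/api.php?" + params
print(url)
```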