We are halfway through this year's Google Code-in contest:
https://www.mediawiki.org/wiki/Google_Code-in_2015
Thanks to our GCI contributors (and the help of Wikimedia's 31 mentors)
we have seen more than 150 tasks resolved already!
To list only some of the students' achievements:
* More than a dozen MediaWiki extensions converted to use extension
registration
* Confirmation dialogs in UploadWizard + TimedMediaHandler use OOjs-UI
* Vagrant roles created for the EmbedVideo and YouTube extensions
* Two more scraping functions in the html-metadata node.js library
(used by Citoid)
* Many MediaWiki documentation pages marked as translatable
* lc, lcfirst, uc and ucfirst magic words implemented in jqueryMsg
* Screenshots added to some MW extension homepages on mediawiki.org
* ConfirmEdit's ReCaptchaNoCaptcha uses the UI language for the
captcha
* MobileFrontend, MultimediaViewer, UploadWizard, Newsletter, Huggle,
and Pywikibot received numerous improvements (too many to list)
* ...and many, many, many more.
Please join me in congratulating and thanking our contributors and
mentors for working hard on improving Wikimedia!
I'd also like to thank everybody on the #mediawiki and #wikimedia-dev
IRC channels who has helped with onboarding and answering questions.
Sound interesting? Got a small, well-defined task in mind that you are
willing to mentor? Join and become a mentor if you aren't already! Read
https://www.mediawiki.org/wiki/Google_Code-in_2015#Mentors.27_corner
and contact us if you need help!
Cheers,
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
tl;dr: What's the right way for a tag extension to execute JavaScript provided by the user? (On a private wiki without Internet access.)
Details:
I run a private wiki for developers (not accessible from the Internet) that lets any wiki page author run JavaScript on a page by adding a tag:
<javascript> alert("hi"); </javascript>
(We understand the security implications, which is why the wiki isn't accessible to the world.) When we upgraded from MediaWiki 1.24 to 1.26, a problem appeared: the <javascript> tag stopped recognizing the "mediawiki" and "mw" objects, though it otherwise still works. The following code reports an undefined variable "mw":
<javascript> mw.loader.using(....) </javascript>
I assume this is because the <javascript> extension builds a <script> tag as a string and uses the SkinAfterBottomScripts hook to add it to the page, rather than using ResourceLoader. However, I cannot figure out how to use ResourceLoader to add JavaScript provided on the wiki page like my small examples above. We can't use the array $wgResourceModules[$name]['scripts'] because the JavaScript isn't in a static file.
So... what's the right method for injecting author-supplied JavaScript in this manner?
I've already tried using ResourceLoader to add 'mediawiki' to $wgResourceModules[$name]['dependencies']. It didn't work, complaining that 'mediawiki' was not a known dependency. I also read https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension… but did not find an answer.
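For context, in recent MediaWiki versions inline page scripts can execute before ResourceLoader's startup module has defined `mw`, so a raw `<script>` string added via SkinAfterBottomScripts runs too early. A minimal sketch of one workaround, assuming the page defines `window.RLQ` (the ResourceLoader bottom-queue present on MediaWiki 1.26+ pages); the helper name here is hypothetical, not part of any real extension:

```javascript
// Hypothetical sketch: instead of emitting the user's code directly,
// wrap it so it runs only after ResourceLoader has defined `mw`.
// `window.RLQ` is a queue MediaWiki drains once base modules are loaded;
// pushing a function defers execution until then.
function wrapUserScript(userCode) {
  return '<script>(window.RLQ = window.RLQ || []).push(function () {\n' +
         userCode +
         '\n});</script>';
}

// The extension would emit this string instead of a bare <script> tag.
var html = wrapUserScript('mw.loader.using("mediawiki.api");');
```

This keeps the existing string-building approach intact; only the emitted wrapper changes, so the user's `mw.loader.using(...)` calls see a defined `mw`.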
Thanks for any advice!
DanB
The Wikimedia Developer Summit starts this Monday, Jan. 4!
There will be an information and discussion session about the
in-progress Code of Conduct for technical spaces
(https://www.mediawiki.org/wiki/Code_of_Conduct/Draft) on Monday.
Thanks,
Matt Flaschen
Sorry I forgot to copy this list.
---------- Forwarded message ----------
From: *James Salsman* <jsalsman(a)gmail.com>
Date: Tuesday, December 22, 2015
Subject: Database administration support (was Re: IRC office hours: Shared
hosting)
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
On Sunday, December 20, 2015, Brian Wolff <bawolff(a)gmail.com> wrote:
> If you want to get Dispenser his hard disk space, you should take it
> up with the labs people, or at the very least some thread where it
> would be on-topic.
>
The labs people are so understaffed that two extremely important anti-spam
bots recently had to be taken offline for much longer than in recent years.
I propose Foundation management allocate the necessary resources and
recommend the hiring of sufficient personnel and purchasing of sufficient,
non-NSA-compatible (i.e., discount and homebrew-style) equipment
to properly support both existing infrastructural bots and similar projects
such as Dispenser's reflinks cache.
I would also like to propose that the Foundation oppose the TPP provisions
deleterious to our interests, and that this position be endorsed on the
Public Policy list.
> Then by definition it wouldn't be a third-party spam framework if WMF
> was running it.
I am not proposing that the WMF take the bots over, just meet their
necessary service level requirements.
Sincerely,
Jim
Hi,
I had a project running named algo-news. I can still login and "become"
algo-news, but all my files seem to have disappeared. There was a Python
app in ~/www/python/src/ as suggested by the docs, but not any more. Any
idea what happened? And how can I prevent this in the future?
Best, Fako
Hello!
MediaWiki-Codesniffer 0.5.1 is now available for use in your MediaWiki
extensions and other projects. This release shouldn't contain any
changes in the ruleset or sniffs, so upgrading should be trivial.
Here are the notable changes since the last release (0.5.0):
* Avoid in_array for performance reasons (Thiemo Mättig)
* Remove dead code from SpaceBeforeSingleLineCommentSniff (Thiemo Mättig)
* Revert "CharacterBeforePHPOpeningTagSniff: Support T_HASHBANG for HHVM >=3.5,<3.7" (Legoktm)
* Simplify existing regular expressions (Thiemo Mättig)
* Update squizlabs/php_codesniffer to 2.5.0 (Paladox)
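For context, the in_array change reflects a common micro-optimization: a linear scan per membership test is replaced by a constant-time hash lookup (in PHP, isset() on an array_flip()'d array). An analogous sketch in JavaScript, with illustrative token names (not code from MediaWiki-Codesniffer itself):

```javascript
// Illustrative token list; names are made up for the example.
const allowedTokens = ['T_IF', 'T_ELSE', 'T_WHILE', 'T_FOR'];

// Linear scan: O(n) per lookup, analogous to PHP's in_array().
function isAllowedSlow(token) {
  return allowedTokens.indexOf(token) !== -1;
}

// Hash lookup: O(1) per lookup, analogous to isset() on a flipped array.
const allowedSet = new Set(allowedTokens);
function isAllowedFast(token) {
  return allowedSet.has(token);
}
```

The difference only matters when the lookup runs in a hot loop, which is exactly the situation a sniff checking every token in a file is in.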
Special thanks to Thiemo Mättig for looking into the performance of PHPCS
and to Paladox for following the upstream changelog.
-- Legoktm
On Mon, Dec 28, 2015 at 12:51 PM, Henning Schlottmann <h.schlottmann(a)gmx.net> wrote:
> Switching dead links to the archive is a move to a dead end, instead of
> looking for
>
> a) the new correct URL, as many links were just moved.
> b) alternative sources for the same fact.
>
An automated process can't reliably do either of those, while having the
archive link available will make it easier for human editors to do both of
those since they'll have the actual content of the dead link available
rather than just what information is preserved in the citation (URL, title,
author, maybe a short quotation).
As part of T113210 [1], which is a broader discussion on track for the
developer summit, I am hosting an IRC office hour [2] this Thursday at
19:00 UTC.
Since shared hosting is a broad topic, this session will focus specifically
on brainstorming ways to reconnect with the shared hosting community.
Shared hosting MediaWiki users are currently underrepresented in the
greater MediaWiki community. We rarely run into them in Phabricator, on
Gerrit, or on the mailing lists, which means that people often have to think
on their behalf about their use cases and issues instead of getting direct
input.
There must be practical ways to bring those thousands of mediawiki users
back into the fold, so to speak. Hopefully we can come up with interesting
ideas to achieve that.
And if you happen to be a shared hosting user, by all means, please join
this IRC office hour :)
[1] https://phabricator.wikimedia.org/T113210
[2] https://meta.wikimedia.org/wiki/IRC_office_hours#Upcoming_office_hours
Hi,
I want to implement something similar to ~~~~ expansion or the pipe trick
in internal links in an extension.
That is, I need to execute code
- on save and preview only,
- on wikitext only,
- on page content only,
- before the pre-save parser treats links in the wikitext,
- altering the wikitext of the page.
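For context, the pipe trick itself is a pure text transform over the wikitext, which is why a save/preview-only hook is the natural place for it. A minimal sketch of that transform (JavaScript for illustration; a real extension would do this in PHP, and MediaWiki's actual pre-save transform handles more edge cases such as interwiki prefixes and fragments):

```javascript
// [[Target|]] becomes [[Target|Label]], where the label is the target
// minus any namespace prefix and any trailing parenthetical.
function expandPipeTrick(wikitext) {
  return wikitext.replace(/\[\[([^\[\]|]+)\|\]\]/g, function (m, target) {
    var label = target
      .replace(/^[^:]+:/, '')         // drop "Namespace:" prefix
      .replace(/\s*\([^()]*\)$/, ''); // drop trailing " (disambiguator)"
    return '[[' + target + '|' + label + ']]';
  });
}

expandPipeTrick('[[Help:Pipe (computing)|]]');
// → '[[Help:Pipe (computing)|Pipe]]'
```

Running this only on save and preview, before link processing, rewrites the stored wikitext rather than the rendered output, matching the requirements above.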
I tried several hooks and ideas, but was unable to make it specific
enough.
Help is appreciated.
Thank you.
Purodha
https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-12-23
= 2015-12-23 =
== Reading ==
=== Web ===
Read More: morelike query and caching optimizations discussed at
https://phabricator.wikimedia.org/T121254
=== Mobile Content Service ===
Rolling out progressively for beta Android app
Pregeneration will be a speed win (prereq for stable channel app) - resume
discussion in mid-January?
=== Android ===
Nothing to report, final release of calendar year shipped
=== iOS ===
org.wikimedia.wikipedia TestFlight Beta recruitment has started - please
join per
https://lists.wikimedia.org/pipermail/mobile-l/2015-December/010011.html
=== Reading Infrastructure ===
* Security: If you could squeeze in a look at
https://gerrit.wikimedia.org/r/#/c/259066 and related patches (it's another
thing needed for AuthManager that isn't "in" AuthManager itself), we'd
appreciate it.
== Infrastructure ==
=== Technical Operations ===
* Apologies, could not make it today
* Blockers: none
* Blocking: none
* Updates:
* cxserver update with Language Engineering
* some outages in mathoid/citoid
* code freeze week
=== Services ===
* CXServer moved to service-runner
:* standardised development and deployment == easier to maintain
* mediawiki + services in containers -
https://github.com/wikimedia/mediawiki-containers
* AQS - need to move to the new config on next deploy
:* https://phabricator.wikimedia.org/T122249
:* we need to coordinate
* EventBus proxy service deployed via Scap3!
=== Release Engineering ===
* *Blocking*: (none)
* *Blocked by*: (none)
* *Updates*:
** Scap3 refactoring and tech debt cleanup
*** https://phabricator.wikimedia.org/project/view/1449/
*** Pybal mocking/testing being worked on in Beta Cluster
*** New command for checking whether security patches are applied
** Investigating rise in production errors since around Dec 18
*** https://grafana.wikimedia.org/dashboard/db/releng-kpis
** Completed upgrades of browser-test suites to use MW-Selenium 1.x
(finally!)
*** One pending merge for CirrusSearch will be looked at today
== Discovery ==
* Completion suggester beta feature released as scheduled, 1300+ enabled on
en.wiki so far
** See also slides from the Lightning talk on it:
https://docs.google.com/presentation/d/1n1_NKzMvmaKtZhWnywhs2XjljtkM0gLdgnT…
* Two new team members, for the Ops (Guillaume Lederrey, Feb 1) and PM
(Deborah Tankersley, Jan 4) positions
* Portal A/B test will need to be rerun (data collection failed); target
date: January 4
** Improved docs on A/B tests to try to avoid this in the future:
https://meta.wikimedia.org/wiki/Discovery/Testing
* Working on alternative language detectors - TextCat (ported to PHP) and
Cybozu ES plugin
* Analytics->ES communication enabled, will need to talk about moving it to
Services/Kafka infrastructure when ready
* No blockers
== Maps & Graphs ==
* Re-imported database, attempting to automate data updates
** Had a short term maps outage
* Waiting for Kartographer extension security check
* Waiting for Ops for the 16 varnish servers
== Fundraising Tech ==
* deployed ipv6 geolocation fix (thanks Timo!)
* Central Notice translation bugfix
* continuing CiviCRM and internal dashboard work
* examining anti-fraud rules for existing and backup processor
* investigating miscellaneous weirdness reported by donors
=== Collaboration ===
* Continuing work on Echo notification, including cross-wiki and messaging
refinements, and a multi-wiki MediaWiki-Vagrant setup to work on this
locally.
* Fixed a production issue with a page on English Wikipedia
* We had a problem with Nuke in production that wasn't caught locally, so we
had to revert the release branch. The fix is about to be merged.
* That revealed our local setup needs to be closer to production. I
started working on that. Preliminary step was to have MWV stop using root
for DB access, which was planned anyway. Later we will have Flow and Echo
use a separate DB locally.