Hi,
YuviPanda, prtksxna, and I (with help from Tim and Aaron) have been
working on the UrlShortener extension, which is designed to implement the
URL shortener RfC[1] (specifically Tim's implementation suggestion).
I've filed T108557[2] to deploy the extension to Wikimedia wikis. We'd
like to use the "w.wiki" short domain, which the WMF is already in
control of.
A test wiki mimicking Wikimedia's eventual configuration has been set up
at http://urlshortener.wmflabs.org/, with an accompanying "short" domain
at us.wmflabs.org (e.g. http://us.wmflabs.org/3). Please play with it and
report any bugs you might find :)
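
If you'd rather poke at it programmatically, here is a minimal sketch of
what a request against the test wiki might look like. The 'shortenurl'
action name and the api.php path are my assumptions about the extension's
API module, so double-check them against the extension documentation:

    import requests

    # Sketch: shorten a URL via the UrlShortener extension's API module on
    # the test wiki. The action name and api.php path are assumptions.
    resp = requests.post('http://urlshortener.wmflabs.org/w/api.php', data={
        'action': 'shortenurl',
        'url': 'https://www.mediawiki.org/wiki/Extension:UrlShortener',
        'format': 'json',
    })
    print(resp.json())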
[1] https://www.mediawiki.org/wiki/Requests_for_comment/URL_shortener
[2] https://phabricator.wikimedia.org/T108557
Thanks,
-- Legoktm
Hi all,
I would like to introduce the Wikimedia community to WikiToLearn, a FOSS
project I participate in, which has lately been gaining a lot of
contributions and momentum.
It is a KDE project sponsored (among others) by Wikimedia Italy, and
recently joined by institutions such as the HEP Software Foundation (CERN,
Fermilab, Princeton...) and universities such as the University of Pisa and
Milano-Bicocca. These institutions are already populating the website with content.
We aim to provide a platform where learners and teachers can complete, refine
and re-assemble lecture notes in order to create free, collaborative and
accessible textbooks, tailored precisely to their needs.
Although the project is quite young (only a few months old), it is already
attracting interest at an unexpected rate. Thanks to this we can now count
on nearly 40 developers (including content developers), and that number is
still growing.
We are different from Wikipedia and other WMF projects in several ways and,
in a sense, complementary to them. Our focus is on creating complete
textbooks (rather than encyclopedic articles), drawing on a professor’s or a
student’s own notes, whether these already exist or still have to be written
down.
We also have a strong focus on offline use: all the content of WikiToLearn
should be easily printable by any student for offline use and serious
studying.
Besides a good team for content development, we can count on a small but
motivated team of developers, and we would like to improve communication with
upstream (a.k.a. you ;-) ), because we have found ourselves developing a few
features which, with some generalization and polishing, could probably be
made available to the general public.
Is this the right place to start such a discussion?
We would like to help as much as we can, but we might need some mentoring in
how to best approach MediaWiki development, as many of us are relatively new
to OSS/Web development.
Bye,
-Riccardo
We have decided to officially retire the rest.wikimedia.org domain in
favor of /api/rest_v1/ at each individual project domain. For example,
https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
becomes
https://en.wikipedia.org/api/rest_v1/?doc
Most clients already use the new path, and benefit from better
performance thanks to geo-distributed caching, no additional DNS lookups,
and shared TLS / HTTP2 connections.
We intend to shut down the rest.wikimedia.org entry point around
March, so please adjust your clients to use /api/rest_v1/ soon.
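
For most clients the change is just a base-URL swap: everything after the
old per-project prefix stays the same. A small illustrative sketch (the
page/html route and the example title are mine, not from this announcement):

    # Old:  https://rest.wikimedia.org/{domain}/v1/{route}
    # New:  https://{domain}/api/rest_v1/{route}
    def new_rest_url(domain, route):
        """Build the per-project REST URL that replaces rest.wikimedia.org."""
        return 'https://{}/api/rest_v1/{}'.format(domain, route)

    print(new_rest_url('en.wikipedia.org', 'page/html/Earth'))
    # -> https://en.wikipedia.org/api/rest_v1/page/html/Earth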
Thank you for your cooperation,
Gabriel
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
Hi,
we are considering a policy for REST API endpoint result format
versioning and negotiation. The background and considerations are
spelled out in a task and a mediawiki.org page:
https://phabricator.wikimedia.org/T124365
https://www.mediawiki.org/wiki/Talk:API_versioning
Based on the discussion so far, we have come up with the following
candidate solution:
1) Clearly advise clients to explicitly request the expected mime type
with an Accept header. Support older mime types (with on-the-fly
transformations) until usage has fallen below a very low percentage,
with an explicit sunset announcement.
2) Always return the latest content type if no explicit Accept header
was specified.
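
To make (1) concrete, a client that pins itself to a specific result format
would send something along these lines; the profile URI and version number
below are purely illustrative placeholders, not a decided spec:

    import requests

    # Sketch of explicit content negotiation: the client requests a specific
    # format version via the Accept header. The profile URI and version
    # number here are illustrative placeholders.
    headers = {
        'Accept': 'text/html; charset=utf-8; '
                  'profile="https://www.mediawiki.org/wiki/Specs/HTML/1.x.y"'
    }
    r = requests.get('https://en.wikipedia.org/api/rest_v1/page/html/Earth',
                     headers=headers)
    print(r.headers.get('content-type'))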
We are interested in hearing your thoughts on this.
Once we have reached rough consensus on the way forward, we intend to
apply the newly minted policy to an evolution of the Parsoid HTML
format, which will move the data-mw attribute to a separate metadata
blob.
Gabriel Wicke
Google Code-in 2015 has come to an end.
Thanks to our students for resolving 461 Wikimedia tasks. Thanks to our
35 mentors for being available, even on weekends & holidays. Thanks to
everybody on IRC for your friendliness, patience, and help provided to
new contributors.
Some more achievements, apart from those already mentioned in
https://lists.wikimedia.org/pipermail/wikitech-l/2015-December/084421.html :
* The CommonsMetadata extension parses vcards in the src field
* The MediaWiki core API exposes "actual watchers" as in "action=info"
* MediaWiki image thumbnails are interlaced whenever possible
* Kiwix is installable/moveable to the SD card, automatically opens
the virtual keyboard for "find in page", (re)starts with the last
open article
* imageinfo queries in MultimediaViewer are cached
* Twinkle's set of article maintenance tags was audited and its XFD
module has preview functionality
* The RandomRootPage extension got merged into MediaWiki core
* One can remove items from Gather collections
* A new MediaWiki maintenance script imports content from text files
* Pywikibot has action=mergehistory support implemented
* Huggle plays a tone when someone writes something
* Many i18n issues fixed and strings improved
* Namespace aliases added to MediaWiki's export dumps
* The Translate extension is compatible with PHP 7
The Grand Prize winners & finalists will be announced on February 8th.
Again congratulations everybody, and thanks for the hard work.
See you around on IRC, mailing lists, Gerrit, and Phabricator!
Cheers,
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
The Winter break took longer than expected. Busy times!
You can help develop the next summary.
*Developer Relations focus*
* Wrapping up Google Code-in 2015
* Wikimedia Hackathon 2016 travel sponsorship requests
* Wikimedia Developer Summit 2016/Lessons Learned
* Developer Relations strategy and annual plan
There is a lot more at
https://www.mediawiki.org/wiki/Developer_Relations/Weekly_summary#2016-01-26
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Since 2016-01-01 I've been working in my user space on a little redesign
project for the Main Page on wikitech [0] and three key sub-pages it points
to. Tonight I decided that, although it is far from perfect, it is better
enough. I hope that some of you like it better than the old page and that
none of you hate it with a fiery passion that compels you to revert it
rather than helping me make it better.
[0]: https://wikitech.wikimedia.org/wiki/Main_Page
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hello everyone,
The public Parsoid endpoint http://parsoid-lb.eqiad.wikimedia.org is
being decommissioned [1] once we migrate all remaining references to it,
possibly as soon as three weeks from now.
As far as we know, there are very few requests to that endpoint right
now, but if you have been using it, please switch over to the RESTBase
service instead. You can access Parsoid HTML for the Wikimedia wikis via
their REST API endpoints. For example,
https://en.wikipedia.org/api/rest_v1/?doc is the REST API URL for
English Wikipedia content.
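
Concretely, fetching Parsoid HTML for an article through the per-project
REST API looks roughly like this; the page/html route and the example title
are mine, so treat this as a sketch rather than gospel:

    import requests

    # Fetch Parsoid HTML for a page via the per-project REST API instead of
    # parsoid-lb.eqiad.wikimedia.org. Route and title are illustrative.
    r = requests.get('https://en.wikipedia.org/api/rest_v1/page/html/MediaWiki')
    print(r.status_code)
    print(r.text[:200])  # start of the Parsoid HTML document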
Thanks,
Subbu.
1. https://phabricator.wikimedia.org/T110474
Within the next few weeks, I'll be deploying VisualEditor for 1000+ authors here at Cimpress/Vistaprint, in a wiki with 225,000 articles.
I plan to collect feedback from our users on what works and what doesn't. What would be the best way to communicate this feedback to Wikimedia?
Here are some early, informal comments from our pilot group of 20 users:
"I am in love!"
"TemplateData integration is awesome."
"How do I insert <code> tags?" (He didn't notice the "More" link in the font menu.)
"How do I set the width of a table column?"
"How do I copy and paste table rows?"
"Editing links is fiddly" (https://phabricator.wikimedia.org/T124305)
"I didn't see the editing tools at first because they're white on a white background."
"Can we save the page without having a pop-up?"
Thanks,
DanB
I am writing to announce mediawiki-containers [1], a simple installer for
MediaWiki with VisualEditor, Parsoid, RESTBase and other services, using
Linux containers.
The main goal of this project is to make it really easy to set up and
maintain a fully featured MediaWiki system on a wide range of platforms.
The project is in an early stage, but already supports full installation on
Ubuntu, Debian and other systemd-based distributions, as well as starting
containers on OS X via the Docker toolbox [6].
These are the basic steps involved in setting up your own MediaWiki
instance with VisualEditor:
1) Get a Linux VM in labs or from a hosting provider, and select Debian
(Jessie or newer) or Ubuntu 15.04+ as the distribution. Commercial VMs with
reasonable specifications cost about $5 per month [2].
2) Log into your VM, and run this command [7]:
curl
https://raw.githubusercontent.com/wikimedia/mediawiki-containers/master/med…
| sudo bash
3) Answer the questions in the installer.
Here is a screencast of an installer run, illustrating steps 2) and 3):
https://people.wikimedia.org/~gwicke/mediawiki-containers-install.ogv
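
If you want a quick sanity check once the installer finishes, a minimal
sketch like the following should do; it assumes the wiki answers over plain
HTTP on the domain (or IP) you configured during the installer run:

    import requests

    # Minimal post-install smoke test. 'localhost' is a placeholder; use the
    # domain or IP you configured when the installer asked its questions.
    base = 'http://localhost'
    r = requests.get(base + '/', timeout=10)
    print(r.status_code)             # expect 200
    print('MediaWiki' in r.text)     # expect True for a fresh install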
Under the hood, mediawiki-containers uses several Docker containers:
- wikimedia/mediawiki [3] with MediaWiki 1.27-wmf9 and VisualEditor.
- wikimedia/mediawiki-node-services [4] with Parsoid and RESTBase running
in a single process to minimize memory use.
- MariaDB as the database backend [5].
Data and configurations are stored on the host system in
/srv/mediawiki-containers/data, which means that upgrading is as simple as
fetching the latest container images by re-running the installer.
Optionally, the installer can set up automated nightly updates, which helps
to keep your wiki installation up to date.
The project is brand new, so there is a fair chance that you will encounter
bugs. Please report issues at
https://phabricator.wikimedia.org/maniphest/task/create/?projects=mediawiki…
.
Here are some ideas we have for the next steps:
- Forward `/api/rest_v1/` to RESTBase & configure RESTBase updates. Enable
Wikitext / HTML switching in VE.
- Improve security:
- Run each container under a different, unprivileged user.
- Secure the install / update process with signatures.
- Add popular extensions, and streamline the support for custom extensions.
- Add services like mathoid, graphoid.
- Use the HHVM PHP runtime instead of Zend, possibly using ideas from
https://github.com/kasperisager/php-dockerized.
- Support developer use cases:
- Optionally mount code volumes from the host system.
- Improve configuration customization support.
- Support for more distributions.
Let us know what you think & what you would like to see next at
https://phabricator.wikimedia.org/T92826.
Happy holidays,
Gabriel Wicke and the Services team
[1]: https://github.com/wikimedia/mediawiki-containers
[2]:
http://serverbear.com/compare?Sort=BearScore&Order=desc&Server+Type=VPS&Mon…
[3]: https://github.com/wikimedia/mediawiki-docker and
https://hub.docker.com/r/wikimedia/mediawiki
[4]: https://github.com/wikimedia/mediawiki-node-services and
https://hub.docker.com/r/wikimedia/mediawiki-node-services/
[5]: https://hub.docker.com/_/mariadb/
[6]: https://docs.docker.com/mac/step_one/
[7]: We agree that `curl | bash` has its risks, but it is hard to beat for
simplicity. The Chef project has a good discussion of pros, cons &
alternatives at
https://www.chef.io/blog/2015/07/16/5-ways-to-deal-with-the-install-sh-curl…
.