Hi,
We need around 20 minutes of downtime for all services that write to db9 for
replication maintenance. Outside of services that support the ops team,
this primarily means bugzilla, etherpad, and civicrm. It will remain
available for read queries, however, so read usage of services such as the
tech blog should continue unaffected. I'm planning to start this on Weds
at 18:00 PST and will send follow-up mail at the start and completion of
maintenance.
-Asher
The improvements to the Upload Wizard are very welcome, but socially,
I think it is still broken. Please correct me if I am overlooking
something, or overlooking another extension that already does this.
Socially, I believe many MediaWiki extensions need a way to ask for
images on a topic page, provide an upload wizard, AND display the
results on the topic page. Presently, even if the image is added to
the wiki or a Commons repository, from the perspective of the
contributing image author it simply disappears into a black hole.
I believe it is possible to have a wizard option which does the following:
* store the page context from which it was called
* upload images to the local wiki or a repository
* open the page context in edit mode
* search for some form of new-images section
** a possible implementation could be a div with id=newimages
containing a gallery tag
* if the new-images section exists, add the images; if not, create it
with the new images
* save the context page
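The page-editing step above could be sketched as a small text-manipulation helper. This is a minimal sketch only: the div id, the gallery markup, and the function name are all illustrative assumptions, not part of any existing extension.

```python
def add_to_new_images_section(wikitext, filename):
    """Insert an uploaded image into the page's new-images section.

    If a <div id="newimages"> containing a <gallery> already exists,
    append the image to that gallery; otherwise create the section at
    the end of the page.  (The div id and layout are illustrative.)
    """
    entry = "File:%s" % filename
    marker = '<div id="newimages">'
    if marker in wikitext:
        # Append inside the first gallery's closing tag within the div.
        return wikitext.replace("</gallery>", entry + "\n</gallery>", 1)
    # No section yet: create one containing a gallery with the new image.
    section = (
        '\n<div id="newimages">\n'
        "<gallery>\n" + entry + "\n</gallery>\n"
        "</div>\n"
    )
    return wikitext + section
```

A second upload would then only append to the existing gallery rather than create a duplicate section.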
Presently, WMF is possibly the biggest driver of open content (CC
BY/CC BY-SA) but is able to collect images only from the small
population that is the intersection of the people able to edit
MediaWiki and the huge population able to provide quality images.
The new-images-section solution would probably not work directly for
Wikipedia itself; there, some more complex review mechanism would be
needed (the new-images gallery would be shown only to some users,
including the image uploader, or similar), perhaps in combination with
FlaggedRevs. I view this feature, however, as potentially a two-step
process: first implement direct addition to the page, then optimize
for FlaggedRevs.
However, I think something like the described feature is needed;
presently, crowdsourced images are mostly collected by projects that
use either no open-content license at all or, at best, an NC license.
WMF is not able to exert its potential pull towards open content in
this area.
Also reported as
https://bugzilla.wikimedia.org/show_bug.cgi?id=33234
Gregor
Hi,
Danny B. and I adapted the script used for non-existing wikis
(trunk/tools/web-scripts/missing.php) so it redirects to the respective
test wiki at Wikimedia Incubator. This would be a huge usability
improvement (though only for languages configured in Wikimedia). For
Wikiversity & Wikisource, the "wiki does not exist" message remains,
because BetaWV and OldWikisource don't have a logical page naming system to
redirect to.
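As a sketch of the redirect logic (not the actual missing.php code), assuming Incubator's Wx/langcode page-naming convention and a small, illustrative prefix table:

```python
# Map each project family to its Incubator test-wiki prefix.
# (Prefixes follow the Wikimedia Incubator naming convention; treat
# this table as illustrative.)
INCUBATOR_PREFIXES = {
    "wikipedia": "Wp",
    "wiktionary": "Wt",
    "wikibooks": "Wb",
    "wikinews": "Wn",
    "wikiquote": "Wq",
}

def incubator_redirect(project, lang):
    """Return the Incubator URL for a non-existing wiki, or None when
    no sensible target exists (e.g. Wikiversity and Wikisource, which
    keep the old "wiki does not exist" message)."""
    prefix = INCUBATOR_PREFIXES.get(project)
    if prefix is None:
        return None
    return "https://incubator.wikimedia.org/wiki/%s/%s" % (prefix, lang)
```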
So I would like to ask if someone can review & deploy this (the commits are
here:
https://www.mediawiki.org/wiki/Special:Code/MediaWiki?path=/trunk/tools/web…
though it may be easier to just review the current trunk version). That would be
great :)
The relevant bug report is
https://bugzilla.wikimedia.org/show_bug.cgi?id=30206 (the initial idea was
to do it via DNS/Apache config)
Thank you,
SPQRobin
Two new committers have extensions access:
Vedmaka will be working on SocialProfile and Semantic Social Profile.
You can see the current work at http://vedmaka-smw.dyndns.org and
git://github.com/vedmaka/smp-autocomplete.git (flomaster branch).
Van de Bugger (van-de-bugger) will be working on SubPageList, Semantic
MediaWiki, Semantic Forms, other extensions, and localizing extensions
to Russian.
Welcome, Van and Vedmaka!
The commit access queue is now empty. I'm on vacation next week so Tim
Starling will be taking care of commit access requests next week, and
then I'll take over again the first week of January. Some backlog may
accumulate but we'll take care of it in January.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
You are invited to the Pune Wikimedia hackathon.
Dates: 10-12 February 2012
Venue: Symbiosis Institute of Computer Studies & Research (SICSR) at
Symbiosis International University, Pune
Extremely rough event page, soon to get more details:
https://www.mediawiki.org/wiki/Pune_Hackathon_Feb_2012
A Wikimedia hackathon is a chance to learn how to develop using
MediaWiki, Phonegap, and our other technologies, and to work alongside
experts. Software engineers, designers, and translators are welcome.
We're planning to focus on internationalisation and localisation, mobile
Wikipedia access, and the JavaScript-based gadgets framework.
Registration link: http://is.gd/rjpNOA
If you're interested, please register to request an invitation, and feel
free to spread the word. Thanks!
(If you need a travel subsidy to attend this event, apply for a
Participation Grant: https://meta.wikimedia.org/wiki/Grants:Participation )
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hi,
I wrote a Nagios parser for wmflabs. It downloads the instance list
from labsconsole and, using a simple application, parses it and
generates Nagios configs. It currently runs periodically on
nagios.wmflabs.org, so Nagios is updated whenever you create a new
instance. It also parses the gerrit classes from Nova and defines
services accordingly; currently it only knows about the apache
service, but I can of course add more classes. The source code is
located in trunk/tools/nparser. Thanks to Ryan Lane for the semantic
query used to get the list.
If you find any problems with the parser or source, feel free to fix them!
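The generation step could look roughly like the following sketch. The field names (`name`, `ip`, `classes`) and the restriction to an `apache` class are assumptions mirroring the description above, not the parser's actual data model.

```python
def generate_nagios_config(instances):
    """Emit Nagios host definitions for each instance, plus an HTTP
    service check for instances whose classes include 'apache' (the
    only service class the parser currently knows about)."""
    blocks = []
    for inst in instances:
        blocks.append(
            "define host {\n"
            "    host_name  %s\n"
            "    address    %s\n"
            "}\n" % (inst["name"], inst["ip"])
        )
        if "apache" in inst.get("classes", []):
            blocks.append(
                "define service {\n"
                "    host_name            %s\n"
                "    service_description  HTTP\n"
                "    check_command        check_http\n"
                "}\n" % inst["name"]
            )
    return "\n".join(blocks)
```

Running this periodically and writing the result into the Nagios config directory is what keeps Nagios in sync with newly created instances.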
To make life easier for the fundraising translators, I added CLDR
support to XML::languageSelector(). This means you can now request a
language select list in any language you want (rather than just getting
the native names). This would also be useful for things like the Upload
Wizard.
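The behaviour the patch adds can be illustrated with a toy lookup. The data below is a hypothetical excerpt (the real names come from the CLDR extension's data files), and the function name is mine, not the actual `XML::languageSelector()` signature.

```python
# Hypothetical excerpt of language-name data.  NATIVE_NAMES holds each
# language's own name; CLDR_NAMES holds translations of language names
# into other languages (real data comes from the CLDR extension).
NATIVE_NAMES = {"de": "Deutsch", "fr": "français", "ja": "日本語"}
CLDR_NAMES = {
    "en": {"de": "German", "fr": "French", "ja": "Japanese"},
    "fr": {"de": "allemand", "ja": "japonais"},
}

def language_select_options(codes, in_language=None):
    """Build (code, label) pairs for a language-select list, using the
    CLDR translation in `in_language` where available and falling back
    to the native name -- the pre-patch behaviour."""
    translated = CLDR_NAMES.get(in_language, {})
    return [(c, translated.get(c, NATIVE_NAMES.get(c, c))) for c in codes]
```

With `in_language=None` the list degrades gracefully to the old native-names-only behaviour, which is what keeps the change backwards compatible.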
Unfortunately, we only have 1 available deployment day left before the
holidays, so if there's any chance someone could review the revision
before 2pm tomorrow, it would be greatly appreciated.
http://www.mediawiki.org/wiki/Special:Code/MediaWiki/106912
The change should be 100% backwards compatible. Thanks.
Ryan Kaldari
MediaWiki is a mature, living breathing beast. When making lots and lots of
little tweaks, please try to make sure they actually work correctly prior
to committing.
I'm becoming convinced that a huge benefit of the git migration will be in
moving to pre-commit review. If it doesn't work, don't let it into core
yet... The very idea that we can have a big "review backlog" on things
ALREADY COMMITTED is a large part of why we have so many regressions when
we finally catch up and update.
That is all; please go on about your business, citizens.
</rant>
-- brion vibber (bvibber @ wikimedia.org)
Everyone,
I wanted to let everyone know that we recently began testing some new
versions of the Article Feedback tool. As you may remember, the first
version of the tool (launched earlier this year), focused on having readers
provide feedback on the quality of articles [1]. The new versions try a
different approach, based in part on feedback from the community. Rather
than ask readers for feedback on quality, the new feedback forms prompt the
user to help improve the encyclopedia. For example, one version we're
testing asks the reader "Did you find what you're looking for?". If the
reader answers "No", the tool prompts them to explain what is missing. The
intent is to give editors some insight into what readers are actually
hoping to see when they read a Wikipedia article, so that the editing
community may incorporate that feedback when developing the article.
Hopefully, some of these readers will also become editors.
Here is the blog post with more details:
http://blog.wikimedia.org/2011/12/20/a-new-way-to-contribute-to-wikipedia/
Right now, there are three test versions running on approximately 10,000
randomly selected articles.
Is this type of feedback actually going to be helpful? We don't know. So
the next step is to evaluate the comment streams coming in from each test
version to see which one yields the most constructive comments with the
least noise [2]. We'll be doing this with
community members, so if you'd like to be involved, please drop either me
or Oliver Keyes (okeyes at wikimedia dot org) a line. We're also tracking
progress on the project page [3].
Howie
[1]
http://blog.wikimedia.org/2011/07/15/%E2%80%9Crate-this-page%E2%80%9D-is-co…
[2] The comment streams aren't going to be viewable by the public yet. One
of the next phases of the project is to design a "Feedback Page" which
displays the comment stream on a per article basis. We'll be collaborating
closely with the editing community on the design of this page.
[3] http://en.wikipedia.org/wiki/Wikipedia:Article_Feedback_Tool/Version_5
I've been informally mentoring André, Tiago, Diego, and César. They
are four students at Minho University who are currently working on a
project to improve DB2 database support in MediaWiki.
So far, they've:
- Fixed several outstanding issues with DB2 support involving
character encoding, Windows vs Linux, etc
- Added DB2 support to the new MediaWiki 1.17 Installer and Updater
- Put in the appropriate Updater sql patches to reflect database
schema changes since 1.14
MediaWiki already had some DB2 support, but it had been broken since
1.15 and was never complete. As a result of their work, it's now possible
to successfully install MediaWiki on DB2 out of the box and to use the
core wiki features.
I'll shortly commit their first patch using my SVN account (leonsp).
I've taken some care to look over the code and make sure it abides by
the MediaWiki code guidelines.
Regards,
Leons Petrazickis
http://lpetr.org/blog/