Hi there,
When I asked for a task to volunteer on, Sumana encouraged me to look at
the topic of community metrics. She pointed to
https://wikitech.wikimedia.org/view/Pentaho as a starting point.
After a first look at Pentaho and what some colleagues at the MeeGo
project did with it [1], I searched (a bit) for any wiki pages or
discussions about community metrics here, but couldn't find any.
http://www.mediawiki.org/wiki/User:Qgil/MediaWiki_Community_Metrics
Edits welcome!
I'm looking for feedback, help, and a first prototype of an
automatically refreshed report, hopefully sooner rather than later.
Something simple to build upon.
Even though it's tempting to define the first prototype around the tools
or data already available, I encourage you to start by proposing the
questions you actually want answered. What community trends do you want
to know about?
See
http://www.mediawiki.org/wiki/User:Qgil/MediaWiki_Community_Metrics#Trends_…
- in a few days we should have agreed on the first and most important
trends we want to visualize.
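To make "trends" concrete, here is one possible metric, active editors per month, sketched in Python. The event format and the activity threshold are my own assumptions for illustration, not anything Pentaho-specific:

```python
from collections import defaultdict
from datetime import datetime

def active_editors_per_month(events, threshold=5):
    """Count editors with at least `threshold` edits in each month.

    `events` is an iterable of (username, ISO-8601 timestamp) pairs,
    e.g. extracted from the recentchanges feed or a database dump.
    """
    edits = defaultdict(lambda: defaultdict(int))  # month -> user -> count
    for user, ts in events:
        month = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").strftime("%Y-%m")
        edits[month][user] += 1
    return {m: sum(1 for c in users.values() if c >= threshold)
            for m, users in edits.items()}
```

The same shape works for other trends (new committers per month, bug reports per month) by swapping the event source.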
[1] http://wiki.meego.com/Metrics/Dashboard
--
Quim
Hi everyone,
You might've seen Gerrit go down for about 5 minutes a little while
ago. We deployed a custom 2.4.2 build that includes a patch from
OpenStack[0]. This was needed to support Jenkins + Zuul.
Where's 2.5? We're still testing the release and hope to have an
upgrade date in place sometime soon. Right now there's a pretty
major regression in LDAP support that needs fixing before the
production instance can be upgraded.
Happy coding,
-Chad
[0] https://gerrit-review.googlesource.com/#/c/37930/
Hello,
Wikimedia is holding a technical meetup in Bangalore, India at the Indian
Institute of Management campus. This DevCamp[1] is a chance to participate
in development of JavaScript-based internationalization and localization
tools as well as mobile applications using PhoneGap and LAMP technologies,
and to work alongside experts. Software engineers, UX/UI designers, and
translators are welcome!
Contributors from technical and non-technical backgrounds with a focus on
browser and mobile technologies are welcome at Wikimedia DevCamp. Prior
experience with MediaWiki software is not required, but you can contribute
the most if you're an active developer or UI designer. Translators with a
linguistic background who can help improve language support for Wikimedia
technology projects are also welcome.
Registrations are open. If you have suggestions on topics that people who
are new to the Wikimedia universe can work on during the event, please
feel free to add them on the topics page[2].
[1] https://www.mediawiki.org/wiki/Bangalore_DevCamp_November_2012
[2] https://www.mediawiki.org/wiki/Bangalore_DevCamp_November_2012/Topics
--
Srikanth L
Wikimedia Language Engineering Team
Hi,
I reuse media from Commons extensively, and I'm unable to simply
(automatically) get the related author and license "attached" to each
document I copy. As far as I know this is not possible (for example
using the API, or dealing directly with information in the DB coming
with the dumps).
I'm sorry if this topic was already well discussed in the past. If this
is the case, please share the right pointer with me and simply ignore
the rest.
So, I fail to respect the license/copyright in my derivative works. I'm
not comfortable with that situation. This is a problem for me... but
because our goal, as a movement, is to provide reusable content, I
consider this a global problem too. This information is mandatory to
respect the law; we should provide a way to retrieve it easily.
I do not see any solution without saving both license and author in the
database for each document... and afterwards building code to deal with
these new properties. Do we have a project in that direction? Maybe
decisions were already taken on this topic?
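For what it's worth, if an imageinfo query with the extmetadata property is available (field names such as Artist and LicenseShortName come from that API; the sample response below is illustrative, not real data), extracting a credit line is straightforward:

```python
def credit_line(api_response):
    """Extract author and license from an imageinfo/extmetadata API
    response (action=query&prop=imageinfo&iiprop=extmetadata).

    Returns (author, license) strings; either may be None if missing.
    """
    pages = api_response["query"]["pages"]
    for page in pages.values():
        meta = page.get("imageinfo", [{}])[0].get("extmetadata", {})
        return (meta.get("Artist", {}).get("value"),
                meta.get("LicenseShortName", {}).get("value"))
    return None, None

# Illustrative response, trimmed to the fields we read.
sample = {"query": {"pages": {"123": {"imageinfo": [{"extmetadata": {
    "Artist": {"value": "Example Photographer"},
    "LicenseShortName": {"value": "CC BY-SA 3.0"},
}}]}}}}
```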
Regards
Emmanuel
I said I would lay out my thoughts regarding MW releases this weekend,
so here goes.
First: I want to provide a regular schedule so users know what to
expect, but something that a volunteer (me, for now) can achieve.
Second: I want to provide something that Linux distributors can
incorporate into their distributions.
To fulfill the first point, I think a release twice a year -- like
Ubuntu releases -- makes a lot of sense. This schedule also works for
Linux distributors like Ubuntu, Fedora, and openSUSE.
Since I started out using Debian (which has now adopted a 2 year freeze
cycle), I think it also makes sense to provide LTS support. Platonides
and I (but mostly Platonides) have been working with the Debian
developers to get 1.19 into Wheezy which was frozen in June.
With that in mind, here is what I propose:
1.18.0 | Security updates till 1.20
1.19.x | April 2012 (LTS)
1.20.0 | October 2012
1.21.0 | April 2013 (Start in May)
1.22.0 | October 2013 (Start in September)
1.23.0 | April 2014 (LTS)
1.24.0 | October 2014
1.25.0 | April 2015
1.26.0 | October 2015
1.27.0 | April 2016 (LTS)
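The table above follows a simple rule: a release every six months (April/October), with every fourth release an LTS. A small sketch that generates the same (version, date, LTS) triples, assuming first-of-the-month dates as placeholders:

```python
from datetime import date

def release_schedule(first_minor=19, first=date(2012, 4, 1), count=9):
    """Generate the proposed twice-yearly schedule: one release every
    six months, with every fourth release (starting at 1.19) an LTS."""
    releases = []
    for i in range(count):
        months = first.month - 1 + 6 * i
        when = date(first.year + months // 12, months % 12 + 1, 1)
        releases.append((f"1.{first_minor + i}.0", when, i % 4 == 0))
    return releases
```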
LTS releases will receive updates until (at least) the next LTS release.
This means security updates, but also other updates that don't require
schema changes, if people are interested in providing them. Since a
couple of people have put the 1.20.0 milestone on a handful of bugs, I'm
assuming that they think those are worth merging into the 1.20 series.
I'd like to get the fixes backported to 1.19 as well, if possible.
Well, that's pretty much what I was thinking. How does this sound to
you?
--
http://hexmode.com/
Any time you have "one overriding idea", and push your idea as a
superior ideology, you're going to be wrong. ... The fact is,
reality is complicated -- Linus Torvalds <http://hexm.de/mc>
Axel Thimm and Patrick Uiterwijk are working on packaging MediaWiki 1.19
for Fedora and have asked for advice on how to enable their MediaWiki
package to support WikiFarms out of the box.[0]
I really don't know how to help them, but I'm sure that someone on this
list has something that would be helpful.
If you have experience, please join in the discussion!
[0-short] http://hexm.de/mh
[0-long]
http://lists.wikimedia.org/pipermail/mediawiki-distributors/2012-October/00…
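For context, the usual wiki-farm approach (in MediaWiki it lives in the entry-point settings file) is a tiny dispatcher: inspect the requested hostname and load a per-wiki configuration. A language-neutral sketch of that dispatch in Python; all hostnames and paths here are hypothetical:

```python
# Map each served hostname to a per-wiki configuration directory.
# (Hypothetical names; a real package would generate this mapping.)
WIKI_MAP = {
    "wiki-a.example.org": "/etc/mediawiki/farm/wiki-a",
    "wiki-b.example.org": "/etc/mediawiki/farm/wiki-b",
}

def settings_path(server_name, default="/etc/mediawiki/farm/default"):
    """Pick the settings directory for the wiki being requested,
    falling back to a default for unknown hosts."""
    return WIKI_MAP.get(server_name, default)
```

The packaging question then reduces to where the map and the per-wiki directories live on a distro filesystem.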
--
http://hexmode.com/
Any time you have "one overriding idea", and push your idea as a
superior ideology, you're going to be wrong. ... The fact is,
reality is complicated -- Linus Torvalds <http://hexm.de/mc>
Sorry for the spam, but the ContentHandler changes especially may affect
you -- if you have any time this weekend or next week to do some
testing, we'd appreciate it. Thanks.
-Sumana
-------- Original Message --------
Subject: Please notice and report big glitches - changes coming
Date: Fri, 12 Oct 2012 17:14:05 -0400
From: Sumana Harihareswara <sumanah(a)wikimedia.org>
Organization: Wikimedia Foundation
To: Coordination of technology deployments across languages/projects
<wikitech-ambassadors(a)lists.wikimedia.org>
On Monday we start deploying a new version of MediaWiki, 1.21wmf2, to
the sites, starting with mediawiki.org and 2 test wikis
(https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap). 1.21wmf2 will
have 3 big new things in it and we need your help to test on the "beta"
test site http://deployment.wikimedia.beta.wmflabs.org/wiki/Main_Page
now to see if there are any really critical bugs.
1) The new ContentHandler (
https://www.mediawiki.org/wiki/ContentHandler ) might affect handling of
CSS and JavaScript pages, import/export (including PDF export), and API
stuff, especially when rendering and editing. I'd suggest we also look
out for issues in template rendering, images and media handling,
localisation, and mobile device access. (merged on Oct 9)
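One cheap API-level smoke test: with ContentHandler merged, revision queries can report each page's content model (rvprop=contentmodel, if that property name holds in this build). A sketch that flags unexpected models in a query response; the sample response shape is assumed:

```python
def unexpected_models(api_response, expected=("wikitext", "css", "javascript")):
    """Return (title, model) pairs for pages whose reported content
    model is not in `expected`, given a
    action=query&prop=revisions&rvprop=contentmodel response."""
    odd = []
    for page in api_response["query"]["pages"].values():
        model = page.get("revisions", [{}])[0].get("contentmodel")
        if model not in expected:
            odd.append((page.get("title"), model))
    return odd

# Illustrative response, trimmed to the fields we read.
sample = {"query": {"pages": {
    "1": {"title": "Common.css", "revisions": [{"contentmodel": "css"}]},
    "2": {"title": "Broken page", "revisions": [{"contentmodel": None}]},
}}}
```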
2) High-resolution image support. This work-in-progress will try to
give higher-res images to high-density screens that can support it, like
new Retina displays. More info at
https://gerrit.wikimedia.org/r/#/c/24115/ . One of the bigger risks of
the high res stuff is load-based, since we may see substantial new load
on our image scalers. So *all* image scaling might be impacted. (merged
on Oct 11)
3) "Sites" is a new backend to represent and store information about
sites and site-specific configuration. This code is meant to replace
the current interwiki code, but does not do so just yet. Still, keep an
eye out for site-specific configuration or interwiki issues.
Right now the version of MediaWiki on the beta cluster dates from 9 Oct
and thus has ContentHandler but not the high-res image support or Sites.
So please test on the beta sites now and look out for these issues on
your sites in the weeks ahead.
https://www.mediawiki.org/wiki/Category:MediaWiki_test_plans has some
ideas on how to find errors.
Thanks! With your help we can find bugs early and get them fixed before
they affect lots of readers and editors.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
This was just posted on [[en:Wikipedia:Village Pump (technical)]]:
<start message>
Hi,
I am the admin for an ISP and we are now deploying IPv6 to some customers.
bits.wikimedia.org is failing over IPv6 from the range 2a02:3d8::/32.
The routing is broken upstream of our primary IPv6 provider, between
tele2.net and Wikimedia, so it may also be affecting other IPv6 address ranges.
[bminish@redbox ~]$ tracepath6 bits.wikimedia.org
1?: [LOCALHOST] 0.017ms pmtu 1500
1: gw6.mayo.lan 0.178ms
1: gw6.mayo.lan 0.146ms
2: 2a02:3d8:1:ffff::1:1 0.757ms
3: brendan1-brendan2.westnet.ie 0.983ms
4: isl-kw-brendan1.westnet.ie 1.766ms
5: 2a02:3d8:ffff:104::1 2.549ms
6: ktm12-kw.westnet.ie 4.917ms
7: piglet-eth2v3006.westnet.ie 5.308ms
8: mole-ge2.westnet.ie 12.328ms
9: 2001:978:2:60::3:1 11.503ms
10: te3-7.ccr01.dub01.atlas.cogentco.com 27.049ms asymm 17
11: te1-4.ccr01.man01.atlas.cogentco.com 26.786ms asymm 17
12: te1-6.ccr02.lhr01.atlas.cogentco.com 26.927ms asymm 17
13: 2001:978::112 27.599ms asymm 17
14: peer-as174.cbv.tele2.net 27.216ms
15: cbv-core-2.gigabiteth4-4.tele2.net 29.172ms
16: cbv-core-3.tengige0-0-0-0.tele2.net 35.459ms !N
Resume: pmtu 1500
<end message>
The message was unsigned, but posted from 2A01:7B8:2000:A6:0:0:0:10.
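When triaging reports like this, the useful bits are the final hop and its error flag (!N generally indicates an unreachable error at that hop). A small parser for pasted tracepath output, with the line format assumed from the trace above:

```python
import re

# Matches a tracepath hop line: hop number, host, latency, then any
# trailing annotations ("asymm 17", "!N", "pmtu 1500", ...).
HOP_RE = re.compile(r"^\s*(\d+)\??:\s+(\S+)\s+([\d.]+)ms(.*)$")

def last_hop(trace_text):
    """Return (hop, host, flag) for the final parsed hop of a pasted
    tracepath trace; flag is e.g. '!N', or None if the hop had none."""
    result = None
    for line in trace_text.splitlines():
        m = HOP_RE.match(line)
        if m:
            flag = re.search(r"!\w+", m.group(4))
            result = (int(m.group(1)), m.group(2),
                      flag.group(0) if flag else None)
    return result
```

Run against the trace above, this would single out hop 16 at tele2.net with the !N flag, which matches the reporter's diagnosis.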
--
Erwin Dokter
On 10/14/2012 12:14 PM, Platonides wrote:
> As these look a bit like "shipping with ugly bugs", I have done a quick
> check of their severity:
Thank you so much for this. I'd like to get an assessment like this of
these issues incorporated into the release notes. I think it will help
people better understand whether an issue affects them.
(As far as "shipping with ugly bugs" goes... I think there are some bugs
in Bugzilla that have proven very irksome to *some* users, so I don't
think we're any worse off by saying publicly that we have scheduled this
particular set to be fixed.)
> PS: I have a tag 1.20.0rc1 pointing to 1.12.0rc1 (a 2008 revision), but
> doesn't appear on
> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=tags
> Tagging failure or user error?
PEBKAC? I don't know enough about how Gerrit handles tags from the
non-MW group, but I suspect it would require intervention from someone
in that group (I'm not a member) for you to see tags that someone else
put on a revision.
Mark.
When I was hired as QA Lead almost seven months ago, WMF lacked a test
environment where
* code was routinely deployed ahead of production
* the test environment emulated the production environment closely
* aspects of the test environment (config, permissions, etc.) could be
easily and reliably manipulated for testing purposes
Today I am happy to announce that beta labs fulfills those needs.
Beta labs is intended to host the upcoming release of MediaWiki, plus those
extensions scheduled for deployment to production, for the purpose of
testing and investigation.
As of a little while ago, MediaWiki, AFTv5, New Pages Feed/Page Curation,
and UploadWizard are being deployed to beta labs from git automatically and
reliably. The configurations for those extensions are also being managed
in git. The environment itself is managed via puppet, and emulates
production to the greatest extent possible. Many many thanks to Antoine
Musso for making this possible.
As of this week, all these extensions are up, running, and configured to be
useful. Note that they are not perfect, just useful. For example, right
now on beta enwiki both AFTv4 and AFTv5 input forms appear on the same page
in many cases, because I was experimenting with what happens when these
extensions are not configured correctly. Some actions from the Page
Curation toolbar never complete. As these glitches become important to
testing, we will get them working correctly, and likely will find out some
interesting things about the software along the way.
The timing for this announcement is excellent, because new QA Engineers
will be joining WMF soon (more on that next week), and beta labs will be a
prime target for the browser-level end-to-end automated tests we will
shortly be creating. Also, we have been wanting to retire the 'prototype'
host for some time, and having AFTv5 etc. on beta labs should make that
possible.
In summary, beta labs is up and running with current code for MediaWiki and
critical extensions, and at this point the best way to improve beta labs is
to use it.
http://en.wikipedia.beta.wmflabs.org/wiki/Special:ArticleFeedbackv5
http://en.wikipedia.beta.wmflabs.org/wiki/Special:NewPagesFeed
http://commons.wikimedia.org/wiki/Special:UploadWizard