Hey All,
Upgraded from 1.20 to 1.23rc3 yesterday, ran update.php, everything worked
fine -- except for Special:SpecialPages. Disabled all extensions to see if
that was doing it, but no change. It comes back with a Connection Reset
error. I went to Apache's error logs and came across this for each failed
attempt:
[Thu Jun 05 09:10:51 2014] [notice] child pid 18983 exit signal
Segmentation fault (11)
Uh oh.
Thought that maybe 1.23.0 would fix the issue, woke up this morning,
updated again, and the page loaded!! However, with warnings:
PHP Warning: Illegal offset type in isset or empty in
/var/www/ops_dev/includes/User.php on line 1390 (x5)
I thought, great! Maybe one of the extensions is causing the error. So I
went and disabled all extensions, and reloaded the page and got the
'Connection reset'.
[Thu Jun 05 09:15:11 2014] [notice] child pid 19200 exit signal
Segmentation fault (11)
Re-enabled the extensions; seg fault still.
Tried again a few seconds ago: Warning: Illegal offset type in isset or
empty in /var/www/ops_dev/includes/User.php on line 1390 (x10)
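From what I can tell, PHP emits this warning when something other than an
integer or string is used as an array key inside isset(); a generic repro
(not the actual User.php code, just an illustration) looks like:

    $cache = array();
    $key = array( 'not', 'a', 'scalar' );
    if ( isset( $cache[$key] ) ) { // PHP Warning: Illegal offset type in isset or empty
        // never reached
    }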
Any help would be greatly appreciated!
Thanks,
Tom
---------------------------------------------------------------more info
MediaWiki 1.23.0 <https://www.mediawiki.org/wiki/MediaWiki_1.23> (1346cdb)
<https://git.wikimedia.org/commit/mediawiki%2Fcore.git/1346cdbba7e7f4560b572…>
16:57, 4 June 2014
PHP 5.3.6-13ubuntu3.10 (apache2handler)
MySQL 5.1.52
Ubuntu 11.10
Server version: Apache/2.2.20 (Ubuntu)
Server built: Mar 8 2013 15:58:04
Server's Module Magic Number: 20051115:28
Server loaded: APR 1.4.5, APR-Util 1.3.12
Compiled using: APR 1.4.5, APR-Util 1.3.12
Architecture: 64-bit
Server MPM: Prefork
threaded: no
forked: yes (variable process count)
My logging changes [0][1][2][3] are getting closer to being mergeable
(the first has already been merged). Tony Thomas' Swift Mailer change
[4] is also progressing. Both sets of changes introduce the concept of
specifying external library dependencies, both required and suggested,
to mediawiki/core.git via composer.json. Composer can be used by
people directly consuming the git repository to install and manage
these dependencies. I gave an example set of usage instructions in the
commit message for my patch that introduced the dependency on PSR-3
[0]. In the production cluster, on the Jenkins job runners, and in the
tarball releases, we will want a different solution.
My idea of how to deal with this is to create a new gerrit repository
(mediawiki/core/vendor.git?) that contains a composer.json file
similar to the one I had in patch set 7 of my first logging patch [5].
This composer.json file would be used to tell Composer the exact
versions of libraries to download. Someone would manually run Composer
in a checkout of this repository and then commit the downloaded
content, the composer.lock file, and the generated autoload.php to the
repository for review. We would then be able to branch and use this
repository as a git submodule in the wmf/1.2XwmfY branches that are
deployed to production and ensure that it is checked out along with
mw-core on the Jenkins nodes. By placing this submodule at $IP/vendor
in mw-core we would be mimicking the configuration that direct users
of Composer will experience. WebStart.php already includes
$IP/vendor/autoload.php when present, so integration with the rest of
mw-core should follow from that.
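That hook is the key integration point; paraphrased from memory (not the
exact core code), it amounts to something like:

    if ( file_exists( "$IP/vendor/autoload.php" ) ) {
        require_once "$IP/vendor/autoload.php";
    }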
It would also be possible to add this repo to the tarballs for
distribution. That process will probably need some adjustments, however,
and the final result may be that release branches
update the mediawiki/core composer.json and provide a composer.lock
along with a pre-populated vendor directory. I would be glad to
participate in discussions of that use case, but we will have about 6
months before we need to solve it (and a new release management RFC to
resolve between now and then).
There are several use cases to consider for the general solution:
== Adding/updating a library ==
* Update composer.json in mediawiki/core/vendor.git
* Run `composer update` locally to download the library (and its dependencies)
* Run `composer dump-autoload --optimize` to make an optimized autoload.php
* Commit changes
* Push changes for review in gerrit
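For illustration, the composer.json in mediawiki/core/vendor.git would pin
exact versions; a minimal sketch (the package and version here are just an
example, not a decision):

    {
        "require": {
            "psr/log": "1.0.0"
        }
    }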
== Hotfix for an external library ==
At some point we will run into a bug or missing feature in a Composer
managed library that we need to work around with a patch. Obviously we
will attempt to upstream any such fixes (otherwise what's the point of
this whole exercise?). To keep from blocking things for our production
cluster, we would fork the upstream, add our patch for local use, and
submit the patch upstream. While the patch is pending review upstream, we
would use our locally patched version in production and on Jenkins.
Composer provides a solution for this with its "repository" package
source. The Composer documentation actually gives this exact example
in its discussion of the "vcs" repository type [6]. We would create
a gerrit repository tracking the external library, add our patch(es),
adjust the composer.json file in mediawiki/core/vendor.git to
reference our fork, and finally run Composer in
mediawiki/core/vendor.git to pull in our patched version.
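Sketched with hypothetical names (the fork URL, package name, and branch
are placeholders following the pattern in [6]):

    {
        "repositories": [
            {
                "type": "vcs",
                "url": "https://gerrit.wikimedia.org/r/mediawiki/libs/some-library"
            }
        ],
        "require": {
            "upstream/some-library": "dev-our-fix"
        }
    }

Composer would then install our branch of the fork instead of the
upstream release.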
== Adding a locally developed library ==
The Platform Core team has been talking about extracting libraries
from mw-core and/or extensions to be published externally. This may be
done for any and all of the current $IP/includes/libs classes and
possibly other content from core such as FormatJson.
My idea for this would be to create a new gerrit repository for each
exported project. The project repo would contain a composer.json manifest
describing the project so that it can be published at packagist.org, like
most Composer-installable libraries. In the
mediawiki/core/vendor.git composer.json file we would pull these
libraries just like any third-party developed library. This isn't
functionally much different from the way we use git submodules today;
there is just one extra level of indirection when a library changes:
mediawiki/core/vendor.git will have to be updated with the new library
version before the hash for the mediawiki/core/vendor.git submodule is
updated in a deploy or release branch.
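As a concrete (but entirely hypothetical) example, such a library's
manifest might look like:

    {
        "name": "mediawiki/example-utils",
        "description": "Utility classes extracted from MediaWiki core",
        "license": "GPL-2.0+",
        "require": {
            "php": ">=5.3.2"
        },
        "autoload": {
            "psr-4": { "MediaWiki\\ExampleUtils\\": "src/" }
        }
    }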
== wmf/1.XwmfY branches ==
The make-wmf-branch script (found in mediawiki/tools/release.git) is
used to create the weekly release branches that are deployed by the
"train" on each Thursday. This script would be updated to branch the
new mediawiki/core/vendor.git repository and add the version-appropriate
branch as a submodule of mediawiki/core.git on the wmf/*
branch. This is functionally exactly what we do for extensions today.
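In plain git terms (repo URL abbreviated, with wmf/1.2XwmfY standing in
for the real branch name), the extra work amounts to roughly:

    git clone https://gerrit.wikimedia.org/r/mediawiki/core/vendor
    cd vendor
    git checkout -b wmf/1.2XwmfY
    git push origin wmf/1.2XwmfY
    cd ../core
    git submodule add -b wmf/1.2XwmfY \
        https://gerrit.wikimedia.org/r/mediawiki/core/vendor vendor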
== Updating a deployment branch ==
SWAT deploys often carry bug fixes for extensions and core that can't
wait for the next train release. It is a near certainty that
mediawiki/core/vendor.git will have the same need. The process for
updating mediawiki/core/vendor.git will be almost the same as updating
an extension.
* Follow the adding/updating library or hotfix instructions to get the
changes merged into the mediawiki/core/vendor.git master branch.
* Cherry-pick the change into the proper deployment branch
* Merge the cherry-pick
* Update the git submodule for mediawiki/core/vendor.git in the
appropriate deployed branch
* Pull update to tin
* sync-dir to deploy to the cluster (rough command-line sketch below)
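Roughly, on the command line (branch names, hashes, and paths are
placeholders, written from memory):

    # in the checkout of the deployed core branch
    cd php-1.2XwmfY/vendor
    git fetch origin
    git checkout <merged-cherry-pick-hash>
    cd ..
    git add vendor
    git commit -m "Bump vendor for <fix>"
    # then on tin, after pulling the update:
    sync-dir php-1.2XwmfY/vendor 'Bump vendor for <fix>'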
== Security fixes ==
This is a special case of upstreaming a patch. A security patch would
be applied directly on the deployed branch of
mediawiki/core/vendor.git as we would do for any extension. The
vulnerability and patch must then be submitted upstream in a
responsible manner and tracked for resolution.
== Jenkins ==
The Jenkins jobs that check out and run tests involving mediawiki/core
would need to be amended to also check out mediawiki/core/vendor.git in
the appropriate location before running
tests.
What use cases did I miss? What other concerns do we have for this process?
[0]: https://gerrit.wikimedia.org/r/#/c/119939/
[1]: https://gerrit.wikimedia.org/r/#/c/119940/
[2]: https://gerrit.wikimedia.org/r/#/c/119941/
[3]: https://gerrit.wikimedia.org/r/#/c/119942/
[4]: https://gerrit.wikimedia.org/r/#/c/135290/
[5]: https://gerrit.wikimedia.org/r/#/c/119939/7/libs/composer.json,unified
[6]: https://getcomposer.org/doc/05-repositories.md#vcs
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hello,
I upgraded Zuul (our gateway between Jenkins and Gerrit) a few minutes
ago. The main change is that merging patches with the tip of the target
branch is now done in a separate process (zuul-merge).
Besides that new feature, I don't expect any trouble. But you never know,
hence this short announcement =)
--
Antoine "hashar" Musso
So here I am working late night / early morning trying to get the Signpost published, and I see something new at the top of image pages on English Wikipedia, such as https://en.wikipedia.org/wiki/File:Wikidata.png
We now have "View on Wikimedia Commons" and "Add local description" tabs. Cool!
Can we get the local description tab to appear on Commons also?
Also, because the local description tab opens a free-text entry space, can that tab be split into "Add location" and "Add description" with the former opening up places to enter geolocation data, nearby landmarks, or an address?
Pine
Should the Config and GlobalConfig classes and the associated
RequestContext methods be reverted from 1.23 as an incomplete feature?
As far as I can tell, they are not yet used anywhere, so reverting them
should be easy.
getConfig() was added to IContextSource in 101a2a160b05[1]. Then
the method was changed to return a new class of object (Config) instead
of a SiteConfiguration object in fbfe789b987b[2]; however, the Config
class faces significant changes in I5a5857fc[3].
[1]: https://gerrit.wikimedia.org/r/#/c/92004/
[2]: https://gerrit.wikimedia.org/r/#/c/109266/
[3]: https://gerrit.wikimedia.org/r/#/c/109850/
--
Kevin Israel - MediaWiki developer, Wikipedia editor
http://en.wikipedia.org/wiki/User:PleaseStand
Good news Everyone <ref>https://www.youtube.com/watch?v=T2BNmn8TYdE</ref>
We just released a stable version of Huggle 3, which is likely full of
some "unstable features" (some call these bugs).
DWRTLD (don't want to read too long documents):
Huggle is a fast-diff browser for MediaWiki that allows you to revert
bad edits very quickly. It works on many WMF projects and can be
installed on any MediaWiki wiki. It works on all major operating
systems, and you can find out more here:
https://en.wikipedia.org/wiki/WP:Huggle
EOF
We didn't release any Mac bundle, because nobody on our team has a Mac.
So if you have a Mac and some programming skills, you can figure out how
to build Huggle (you'll need Qt and CMake); once you do, let us know how
you did it so that we can update the manual.
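(Our guess, untested, is that it's the usual out-of-source CMake routine
against the project's GitHub repository, something like:)

    git clone https://github.com/huggle/huggle3-qt-lx.git
    cd huggle3-qt-lx
    mkdir build && cd build
    cmake ..
    make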
If you are willing to contribute more permanently, you can become the
release manager for Mac builds and provide them for download with every
new release. Let us know by e-mail or at irc://chat.freenode.net/#huggle
Thank you
Given a Wikimedia Commons description page URL - such as:
https://commons.wikimedia.org/wiki/File:Van_Gogh_-_Starry_Night_-_Google_Ar…
I would like to be able to programmatically retrieve the information in the
"Summary" header. (Values for "Artist", "Title", "Date", "Medium",
"Dimensions", "Current location", etc.)
I believe all this information is in "Template:Artwork". I can't figure
out how to get the wikitext/json-looking template data.
If I use the API and call:
https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File…
Then I don't get the information I'm looking for. This shows the most
recent revision, and its changes. Unless the most recent revision changed
this data, it doesn't show up.
To see all the information I'm looking for, it seems I'd have to specify
rvlimit=max and go through all the past revisions to figure out which is
most current. For example, if I do so and I look at revid 79665032, that
includes: "{{Artwork | Artist = {{Creator:Vincent van Gogh}} | . . . | Year
= 1889 | Technique = {{Oil on canvas}} | . . ."
Isn't there a way to get the current version in whatever format you'd call
that - the wikitext/json-looking format?
In my API call, I can specify rvexpandtemplates, which even with only the
most recent revision gives me the information I need, but it's largely in
HTML tables/divs/etc. format rather than wikitext/json/xml/etc.
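Or is there a query along these lines (a guess on my part; title elided)
that would return the raw wikitext of just the latest revision?

    https://commons.wikimedia.org/w/api.php?action=query&prop=revisions&rvprop=content&rvlimit=1&format=xml&titles=File:...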
Minutes and slides from last week's quarterly review of the
Foundation's Mobile Contributions team are now available at
https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_r…
On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller <erik(a)wikimedia.org> wrote:
> Hi folks,
>
> to increase accountability and create more opportunities for course
> corrections and resourcing adjustments as necessary, Sue's asked me
> and Howie Fung to set up a quarterly project evaluation process,
> starting with our highest priority initiatives. These are, according
> to Sue's narrowing focus recommendations which were approved by the
> Board [1]:
>
> - Visual Editor
> - Mobile (mobile contributions + Wikipedia Zero)
> - Editor Engagement (also known as the E2 and E3 teams)
> - Funds Dissemination Committee and expanded grant-making capacity
>
> I'm proposing the following initial schedule:
>
> January:
> - Editor Engagement Experiments
>
> February:
> - Visual Editor
> - Mobile (Contribs + Zero)
>
> March:
> - Editor Engagement Features (Echo, Flow projects)
> - Funds Dissemination Committee
>
> We’ll try doing this on the same day or adjacent to the monthly
> metrics meetings [2], since the team(s) will give a presentation on
> their recent progress, which will help set some context that would
> otherwise need to be covered in the quarterly review itself. This will
> also create open opportunities for feedback and questions.
>
> My goal is to do this in a manner where even though the quarterly
> review meetings themselves are internal, the outcomes are captured as
> meeting minutes and shared publicly, which is why I'm starting this
> discussion on a public list as well. I've created a wiki page here
> which we can use to discuss the concept further:
>
> https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_r…
>
> The internal review will, at minimum, include:
>
> Sue Gardner
> myself
> Howie Fung
> Team members and relevant director(s)
> Designated minute-taker
>
> So for example, for Visual Editor, the review team would be the Visual
> Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.
>
> I imagine the structure of the review roughly as follows, with a
> duration of about 2 1/2 hours divided into 25-30 minute blocks:
>
> - Brief team intro and recap of team's activities through the quarter,
> compared with goals
> - Drill into goals and targets: Did we achieve what we said we would?
> - Review of challenges, blockers and successes
> - Discussion of proposed changes (e.g. resourcing, targets) and other
> action items
> - Buffer time, debriefing
>
> Once again, the primary purpose of these reviews is to create improved
> structures for internal accountability, escalation points in cases
> where serious changes are necessary, and transparency to the world.
>
> In addition to these priority initiatives, my recommendation would be
> to conduct quarterly reviews for any activity that requires more than
> a set amount of resources (people/dollars). These additional reviews
> may however be conducted in a more lightweight manner and internally
> to the departments. We’re slowly getting into that habit in
> engineering.
>
> As we pilot this process, the format of the high priority reviews can
> help inform and support reviews across the organization.
>
> Feedback and questions are appreciated.
>
> All best,
> Erik
>
> [1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
> [2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>
--
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB
Hi,
I'm looking for someone to review and (hopefully) accept a very small
(3-line) patch I wrote for the ImageMap extension. The patch improves the
behavior of image maps with links to [[Media:...]] files. Instead of
linking to the image page, these links should go directly to the media
file (comparable to how regular [[Media:...]] links in wikitext work).
My patch has been open for over three months now, and there is no
registered maintainer of the ImageMap extension (
https://www.mediawiki.org/wiki/Developers/Maintainers#MediaWiki_extensions_…).
I've tried to ping some folks whom I thought might be able to review my
patch, but none of them has responded.
The patch is in Gerrit, and can be viewed at
https://gerrit.wikimedia.org/r/#/c/114439/
Kind regards,
Remco de Boer