Good morning,
I'm glad to announce git-review 1.20 is now available in the FreeBSD port tree.
Installation:
cd /usr/ports/devel/git-review/
make install
Vocabulary:
- a port in FreeBSD is a Makefile "cookbook" to download, compile when
needed, and install an application;
- a package is the binary distribution prepared from this port; it is
generated automatically by the FreeBSD build machines.
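A sketch of the two install routes described above. Note the pkg_add
route is an assumption on my part: it only works once the binary
package has actually been built on the FreeBSD build machines.

```shell
# Two install routes for git-review (FreeBSD only).
# "pkg_add -r git-review" is an assumption -- it works only once the
# prebuilt package is available on the FreeBSD package servers.
os=$(uname -s)
if [ "$os" = "FreeBSD" ]; then
	# Route 1: build from the port (compiles from source):
	cd /usr/ports/devel/git-review && make install clean
	# Route 2: install the prebuilt binary package:
	# pkg_add -r git-review
else
	echo "skipping: FreeBSD-specific commands (running on $os)"
fi
```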
I will maintain this port.
--
Best Regards,
Sébastien Santoro aka Dereckson
http://www.dereckson.be/
tl;dr: Please help in getting https://bugzilla.wikimedia.org/38638
resolved and ensure that your code doesn't cause additional issues for
translators.
At translatewiki.net we have an [ask question] button for translators,
and they have made heavy use of it to report various kinds of issues
with the messages they are translating. In fact, we barely have time to
sort those reports into the correct places, let alone poke developers
to act on them. We are experimenting with different ways to improve
this process - the latest experiment is using Semantic MediaWiki to
track open issues, see [1].
Basically the backlog is growing, and we need some help to reverse the
trend. We have tried filing bugs and poking developers on IRC and via
email, but that no longer scales; moreover, bugs and pokes are
sometimes ignored [2].
There are many ways you can help:
1. Follow i18n best practices, like documenting new messages as they
are added [4]. Some of you may have noticed that we have started to
review i18n and L10n changes more closely. This is not to annoy you; it
is to help you write code whose messages can be translated well and
without too many questions from translators.
2. Have a look at [1] and act on issues.
3. Have a look at [1] and poke someone else to act on issues.
4. We need maintainers for [[Support]] [3] to sort stuff into the
correct places.
5. When committing code which has i18n changes, add Nikerabbit (me) or
Siebrand to reviewers.
6. Ideas on how to improve this process are welcome.
We think that we are all responsible for providing translators with the
information they need. We feel it's our (translatewiki.net staff's)
obligation to make sure translators can do their work without too many
distractions and wasted effort.
[1] https://translatewiki.net/wiki/Support/Open_requests
[2] https://bugzilla.wikimedia.org/38638
[3] https://translatewiki.net/wiki/Support
[4] https://www.mediawiki.org/wiki/Localisation
--
Niklas Laxström
Siebrand Mazeland
Raimond Spekking
Federico Leva
Amir E. Aharoni
Hi, here is a first draft about MediaWiki Groups and, implicitly,
MediaWiki reps:
http://www.mediawiki.org/wiki/User:Qgil/MediaWiki_groups
MediaWiki groups organize open source community activities within the
scope of specific topics and geographical areas. They extend the
capacity of the Wikimedia Foundation in events, training, promotion and
other technical activities benefiting Wikipedia, the Wikimedia movement
and the MediaWiki software.
Imagine MediaWiki Germany Group, MediaWiki Lua Group...
These groups may become a significant source of growth and wider
diversity of our community.
Please bring your ideas to the discussion page - or here. Thank you!
--
Quim Gil
Technical Contributor Coordinator
Wikimedia Foundation
Hello all,
After the new version of LabeledSectionTransclusion (LST) was deployed on
itwikisource, performance issues popped up. itwikisource's main page makes
heavy use of LST, and the new version is clearly heavier than the old one.
In this mail, I'll try to describe the aims of the new version, how the old
version worked and how the new version works.
Aims
-------
In the old situation, it was possible to transclude sections of pages by
marking them with <section> tags. However, it was impossible to emit
those tags from within a template. I.e. given
page P: something before <section begin="a" />something with a<section
end="a" /> something after
page Q: {{#lst:P|a}}
then Q was rendered as
something with a
However, it was not possible to do something like:
page O: ===<section begin="header" />{{{1}}}<section end="header" />===
page P: {{O|Some header text}}
page Q: {{#lst:P|header}}
Changes in the #lst parser
--------------------------------------
This was because in the old situation, the #lst mechanism did something
along these lines:
1) get the DOM using $parser->getTemplateDom( $title ); note that this
is a non-expanded DOM, i.e. templates are not expanded
2) traverse this DOM, find the section tags, and call
$parser->replaceVariables(....) on the relevant sections
In the new situation, the #lst mechanism does something like:
1) get expanded wikitext using
$parser->preprocess("{{:page_to_be_transcluded}}")
2) get the DOM by calling $parser->preprocessToDom() on the expanded
wikitext
3) traverse this DOM, find section tags, and call
$parser->replaceVariables(....)
on the relevant sections (unchanged)
One obvious performance issue is that (1) and (2) are not cached:
neither within one response (so if a page {{#lst}}'s the same page
twice, that page is processed twice), nor between responses.
In general, I think it would be preferable not to do a full parse, but
just to expand the DOM of the templates. Unfortunately, I have not been
able to find a simple way to do this: PPFrame::expand expands the
templates to their final form, not to an 'expanded DOM'.
I don't know MediaWiki caching well enough to say something about which
caches are used (or not), and what would be an effective caching strategy.
Any ideas on how to do LST without bluntly doing a full page parse for
every transcluded page, or on caching strategies, would be very welcome.
Best,
Merlijn
Hey all,
For a while now we have had .jshintrc rules in the repository and have
been able to run node-jshint locally.
TL;DR: jshint is now running from Jenkins on mediawiki/core
(joining the linting sequence for php and puppet files).
I cleaned up the last old lint failures in the repo yesterday in
preparation to enable it from Jenkins (like we already do for PHP and
Puppet files). After some quick testing in a sandbox job on Jenkins to
confirm it passes/fails accordingly, this has now been enabled in the
main Jenkins job for mediawiki/core.
Right now only master and REL1_20 pass (REL1_19 and the wmf branches do
not; the next wmf branch will, however).
Therefore it has only been enabled on the master branch for now.
Example success:
* https://gerrit.wikimedia.org/r/#/c/24249/
* https://integration.mediawiki.org/ci/job/MediaWiki-GIT-Fetching/7730/console
22:16:41 Running "jshint" task
22:16:48 OK
22:16:48
22:16:48 Done, without errors.
Example failure:
* https://gerrit.wikimedia.org/r/#/c/34433/
* https://integration.mediawiki.org/ci/job/MediaWiki-GIT-Fetching/7732/console
22:24:01 Running "jshint" task
22:24:08 >> resources/mediawiki/mediawiki.js: line 5, col 5, Identifier 'bla_bla' is not in camel case.
22:24:08 >> resources/mediawiki/mediawiki.js: line 5, col 12, 'bla_bla' is defined but never used.
22:24:08 >>
22:24:08 >> 2 errors
22:24:08 Warning: Task "jshint" failed.
So if your commit is marked as a failure, handle it just like failures
from phplint, puppetlint or phpunit: click the link from jenkins-bot
and follow the trail.
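To avoid the round trip through Jenkins, you can run the same check
locally before pushing. A minimal sketch; the sample file path and the
global npm install are assumptions of mine, not part of the repo
tooling:

```shell
# A minimal local pre-check mirroring what Jenkins now runs.
# Assumes the node-jshint CLI (npm install -g jshint); the sample
# file path /tmp/jshint-sample.js is arbitrary.
cat > /tmp/jshint-sample.js <<'EOF'
( function () {
	'use strict';
	var fooBar = 1; // camelCase and actually used, so this passes jshint
	return fooBar;
}() );
EOF
if command -v jshint >/dev/null 2>&1; then
	jshint /tmp/jshint-sample.js && echo "jshint: no errors"
else
	echo "jshint not installed; install with: npm install -g jshint"
fi
```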
-- Timo Tijhof
I am trying to merge a bunch of changes that have occurred in the master
branch of the MobileFrontend extension to a remote branch (esisupport), but
git review errors out, failing on a change that had been abandoned in the
remote branch.
First I merged the changes from master to a local checkout of the remote.
After fixing merge conflicts, I ran 'git review esisupport' and this is
what happened:
You have more than one commit that you are about to submit.
The outstanding commits are:
503bd3d (HEAD, esisupport) getArticleUrl returns canonical url (bug 41286)
8c70e8b Fix adding unnecessary callbacks in mf-cleanuptemplates
<snip>
Is this really what you meant to do?
Type 'yes' to confirm: yes
remote: Resolving deltas: 100% (315/315)
remote: Processing changes: refs: 1, done
To ssh://
awjrichards@gerrit.wikimedia.org:29418/mediawiki/extensions/MobileFrontend.git
! [remote rejected] HEAD -> refs/publish/esisupport/bug/41286 (change
32896 closed)
error: failed to push some refs to 'ssh://
awjrichards@gerrit.wikimedia.org:29418/mediawiki/extensions/MobileFrontend.git
'
What is going on? Am I doing this wrong? Thanks for any help!
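For context, git review offers to submit every commit on the local
branch that is not yet on the target branch, and that set can be
previewed with plain git. A self-contained illustration in a throwaway
repo (the path and commit messages are made up):

```shell
# Preview the commits git review would offer to submit, using only git.
# /tmp/gr-demo-repo and the commit messages are illustrative.
rm -rf /tmp/gr-demo-repo
git init -q /tmp/gr-demo-repo
cd /tmp/gr-demo-repo
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "base commit"
git branch -m base                    # name the starting branch "base"
git checkout -q -b esisupport
git commit -q --allow-empty -m "getArticleUrl fix (illustrative)"
git commit -q --allow-empty -m "callback cleanup (illustrative)"
# These are the outstanding commits relative to the target branch:
git log --oneline base..esisupport
```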
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
== Situation ==
In Wikimedia Bugzilla you can set a priority for a bug report.
Some people and teams set highest priority often (meaning "These issues
should get fixed first in the next weeks").
Some don't set it at all (and, likely related, some teams don't really
use Bugzilla but other tools).
Some are in-between.
Each of these usages might work well within a given team.
* Currently there are 15 open tickets with highest priority:
https://bugzilla.wikimedia.org/buglist.cgi?priority=Highest&resolution=---&…
(You might only see 14 if you don't have access to security bugs)
* 5 of them have highest priority for more than 30 days:
https://bugzilla.wikimedia.org/buglist.cgi?priority=Highest&resolution=---&…
* 2 of them for more than 90 days (Huggle vs IPv6; moving to EQIAD):
https://bugzilla.wikimedia.org/buglist.cgi?priority=Highest&resolution=---&…
The latter imply either missing maintainership (Huggle?) or
long-running tasks (EQIAD) that could be broken down into subtasks.
== Problem ==
Currently "Highest Priority" has no single (cross-team) meaning.
This makes it hard for people outside of a team (e.g. Engineering
Management) to see at a glance what's most important and urgent for each
team.
== Proposal ==
Proposing the following definitions for Priority:
* highest: Needs to be fixed as soon as possible, a week at the
most. A human assignee should be set in the "Assigned to" field.
* high: Should be fixed within the next four weeks.
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/