In the next RFC meeting, we will discuss the following RFCs:
* Request timeouts and retries
<https://phabricator.wikimedia.org/T97204>
* Re-evaluate varnish-level request-restart behavior on 5xx
<https://phabricator.wikimedia.org/T97206>
The meeting will be on the IRC channel #wikimedia-office on
chat.freenode.net at the following time:
* UTC: Wednesday 21:00
* US PDT: Wednesday 14:00
* Europe CEST: Wednesday 23:00
* Australia AEST: Thursday 07:00
-- Tim Starling
I've noticed that the image previews in Hovercards (the 'Popups'
extension) do not take high-density displays into account and can end up
a little blurry as a result.
While patching the extension, someone recommended that I bracket the
detected density to the values we use for default thumb generation on the
wiki (the 1, 1.5, and 2x densities we specify in the 'srcset' attribute
on <img> elements), so that browsers zoomed slightly off from the default,
or devices not quite at the most common densities, don't force extra
thumbnail renders.
Do folks have any preference for whether I should add that as a separate
function like $.bracketedDevicePixelRatio() or just directly bracket the
output of the $.devicePixelRatio wrapper function?
A quick look at code using $.devicePixelRatio() indicates most callers
multiply an image size by it to pick a thumbnail size, so bracketing
there might be convenient, but I don't want to cause surprises. ;)
Task: https://phabricator.wikimedia.org/T97935
Core patch: https://gerrit.wikimedia.org/r/#/c/208820/
Hovercards patch: https://gerrit.wikimedia.org/r/#/c/208515/
The current version of the patch adds a separate
$.bracketedDevicePixelRatio().
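For clarity, the behavior I have in mind is roughly the sketch below;
the bracket list and the round-up-to-avoid-blur choice are my working
assumptions, not necessarily what the final patch will do:

// Sketch: bracket the detected ratio to the densities we generate
// thumbs for (1, 1.5, 2), rounding up so images never render blurrier
// than the display can show.
$.bracketedDevicePixelRatio = function () {
    var i,
        brackets = [ 1, 1.5, 2 ],
        ratio = $.devicePixelRatio();
    for ( i = 0; i < brackets.length; i++ ) {
        if ( ratio <= brackets[ i ] ) {
            return brackets[ i ];
        }
    }
    // Denser than our largest bracket: just use the 2x thumbs.
    return brackets[ brackets.length - 1 ];
};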
-- brion
I made a patch [0] for T39665 [1] about six months ago. It has been
rotting in Gerrit since then.
The core bug is related to glibc's iconv implementation and PHP (and, I
think, HHVM as well). To work around the iconv bug I wrote a little
helper function that uses mb_convert_encoding() instead when it is
present; a rough sketch of its shape follows the links below. In review,
PleaseStand pointed out that the libmbfl used by mb_convert_encoding()
differs from iconv in the character sets it supports and in character
set naming [2].
I was hoping that someone on this list could step in and either convince
me to abandon this patch and pretend I never investigated the problem,
or help design a solution that papers over these differences in a
reasonable way.
[0]: https://gerrit.wikimedia.org/r/#/c/172101/
[1]: https://phabricator.wikimedia.org/T39665
[2]: https://php.net/manual/en/mbstring.encodings.php
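To give a sense of the approach, the helper is shaped roughly like this
(a sketch only, not the exact code in [0]; the function name is mine):

// Prefer mb_convert_encoding() when the mbstring extension is loaded,
// and fall back to iconv() otherwise. Caveat from review: libmbfl and
// glibc iconv differ in supported character sets and their names [2],
// so the two paths are not strictly equivalent.
function convertEncoding( $string, $toEncoding, $fromEncoding ) {
    if ( function_exists( 'mb_convert_encoding' ) ) {
        return mb_convert_encoding( $string, $toEncoding, $fromEncoding );
    }
    // The real patch may pass flags such as '//TRANSLIT' here; glibc's
    // handling of those flags is the heart of the bug being worked around.
    return iconv( $fromEncoding, $toEncoding, $string );
}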
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
On Wed, May 6, 2015 at 12:13 AM Greg Grossmeier <greg(a)wikimedia.org> wrote:
> Quick general question: are you proposing this for pywikibot only? I
> think the answer is yes, just making sure.
>
No, I'm proposing to do this in general. I never mentioned pywikibot as
the goal; I just said I did a test in pywikibot and it worked well.
> <quote name="Amir Ladsgroup" date="2015-05-05" time="07:05:48 +0000">
> > Hey,
> > GitHub has a huge community of developers, and collaborating with
> > them can be beneficial for both them and us, but Wikimedia code is in
> > Gerrit (and in the future in Phabricator) and our bug tracker is in
> > Phabricator. Sometimes it feels like we are on another planet.
> > Wikimedia has a mirror on GitHub, but we close pull requests
> > immediately and we barely check issues raised there. There is also a
> > big notice on GitHub [1]: "if you want to help, do it our way". It
> > occurred to me that if we could synchronize GitHub activity with
> > Gerrit and Phabricator, it would let others help us in their own way.
> > I got so excited that I wrote a bot yesterday that automatically
> > duplicates the patches of pull requests in Gerrit and comments on the
> > pull request stating that we made a patch in Gerrit. I did a test in
> > pywikibot and it worked well [2][3].
> >
> > Note that the bot doesn't create a pull request for every Gerrit
> > patch; it creates a Gerrit patch for every (open) pull request.
> >
> > But before I go on, we need to discuss several important aspects of
> > this idea:
> > 1- Is it really necessary to do this? Do you agree we need something
> > like that?
> > 2- I think a bot that duplicates pull requests is not the best idea,
> > since it creates them under the bot account rather than the original
> > user's account. We could create a plugin for Phabricator to do that,
> > but issues like privacy would bother us (using OAuth wouldn't be a
> > bad idea). What do you think? What do you suggest?
> > 3- Even if we create a plugin, a bot to synchronize comments and code
> > reviews is still needed. I wrote my original code in a way that lets
> > me expand it to do this job too, but do you agree we need to do this?
> > 4- We could also expand this bot to create a Phabricator task for
> > each issue that is created (except pull requests). Is that okay?
> >
> > I published my code at [4].
> >
> > [1]: https://github.com/wikimedia/pywikibot-core "Github mirror of
> > "pywikibot/core" - our actual code is hosted with Gerrit (please see
> > https://www.mediawiki.org/wiki/Developer_access for contributing"
> > [2]: https://github.com/wikimedia/pywikibot-core/pull/5
> > [3]: https://gerrit.wikimedia.org/r/208906
> > [4]: https://github.com/Ladsgroup/sync_github_bot
> >
> > Best
>
> --
> | Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
> | identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
>
Hello,
A quick reminder about the Wikimedia Language Engineering team's IRC
office hour later today at 1430 UTC [1] on #wikimedia-office. Please see
below for the original announcement, local time, and agenda. We will
post logs on Meta-Wiki [2] after the event.
Thanks
Runa
[1] http://www.timeanddate.com/worldclock/fixedtime.html?iso=20150505T1430
[2] https://meta.wikimedia.org/wiki/IRC_office_hours#Office_hour_logs
---------- Forwarded message ----------
From: Runa Bhattacharjee <rbhattacharjee(a)wikimedia.org>
Date: Thu, Apr 30, 2015 at 7:29 PM
Subject: [x-post] Next Language Engineering IRC Office Hour is on 5th May
2015 (Tuesday) at 1430 UTC
To: MediaWiki internationalisation <mediawiki-i18n(a)lists.wikimedia.org>,
Wikimedia developers <wikitech-l(a)lists.wikimedia.org>, Wikimedia Mailing
List <wikimedia-l(a)lists.wikimedia.org>, "Wikimedia & GLAM collaboration
[Public]" <glam(a)lists.wikimedia.org>
[x-posted announcement]
Hello,
The next IRC office hour of the Language Engineering team of the Wikimedia
Foundation will be on May 5, 2015 (Tuesday) at 1430 UTC on
#wikimedia-office. We missed a few of our regular monthly office hours, but
from May onwards we will be back on schedule.
There has been significant progress around Content Translation[1] and it is
now available as a beta feature on several Wikipedias[2]. We’d love to hear
comments, suggestions and any feedback that will help us make this tool
better.
Please see below to check local time and event details. Questions can also
be sent to me ahead of the event.
Thanks
Runa
[1] http://blog.wikimedia.org/2015/04/08/the-new-content-translation-tool/
[2]
https://www.mediawiki.org/wiki/Content_translation/Languages#Available_lang…
Monthly IRC Office Hour:
========================
# Date: May 5, 2015 (Tuesday)
# Time: 1430 UTC (Check local time:
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20150505T1430 )
# IRC channel: #wikimedia-office
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation
Hi all --
I wanted to give you an update on the Community Tech team. We've posted
job descriptions for open positions on our jobs page and would like to
bring them to your attention:
Community Tech Developer
<https://boards.greenhouse.io/wikimedia/jobs/62666?t=m5pcy0#.VUgfhjvF_FI>
Community Tech Engineering Manager
<https://boards.greenhouse.io/wikimedia/jobs/62669?t=d51bks#.VUgfhjvF_FI>
Please encourage qualified folks to apply!
I want to say that I'm really excited to be working with Luis to help build
this team. I'm very appreciative that Lila and the other execs have
identified this gap in our community support and have made resources
available to address it.
-Toby
After a lot of work, we're ready to provide a more sensible data layout for
format=json results (and also format=php). The changes are generally
backwards-compatible for API clients, but extension developers might have
some work to do. If your extension is maintained in Gerrit, much of the
necessary conversion has already been done for you (the major exception
being booleans that were violating the old API output conventions).
The general theme is that ApiResult arrays now carry more metadata,
which is used to apply a backwards-compatible transformation for clients
that need it, and optional transformations so that JSON output needn't
be limited by the restrictions of XML. At the same time, improvements
were made to ApiResult and ApiFormatXml that should make them easier for
developers to use.
Relevant changes include (a short usage sketch follows this list):
- Several ApiResult methods were deprecated. If your extension is
maintained in Gerrit, these should have already been taken care of for you
(with the exception of T95168 <https://phabricator.wikimedia.org/T95168>
where work is ongoing), but new code will need to avoid the deprecated
methods.
- All ApiResult methods that operate on a passed-in array (rather than
internal data) are now static, and static versions of all relevant data-
and metadata-manipulation methods are provided. This should reduce the need
for passing ApiResult instances around just to be able to set metadata.
- Properties with names beginning with underscores are reserved for API
metadata (following the lead of existing "_element" and "_subelements"),
and will be stripped from output. Such properties may be marked as
non-metadata using ApiResult::setPreserveKeysList(), if necessary.
   - PHP arrays can now be tagged with "array types" to indicate whether
they should be output as arrays or hashes. This is particularly useful to
fix T12887 <https://phabricator.wikimedia.org/T12887>.
- The "*" property is deprecated in favor of a properly-named property
and special metadata to identify it for XML format and for
back-transformation. Use ApiResult::setContentValue() instead of
ApiResult::setContent() and all the details are handled for you.
- ApiFormatXml will no longer throw an exception if you forget to call
ApiResult::setIndexedTagName()!
   - ApiFormatXml will now reversibly mangle tag and attribute names
that are not valid XML, instead of irreversibly mangling spaces and
outputting invalid XML for other problematic content.
- ApiResult will now validate data added (e.g. adding resources or
non-finite floats will throw an exception) and auto-convert objects. The
ApiSerializable interface can be used to control object conversion, if
__toString() or cast-to-array is inappropriate.
- Actual booleans should now be added to ApiResult, and will be
automatically converted to the old convention (empty-string for true and
absent for false) when needed for backwards compatibility. Code that was
violating the old convention will need to use the new
ApiResult::META_BC_BOOLS metadata property to prevent this conversion.
- Modules outputting as {"key":{"*":"value"}} to avoid large strings in
XML attributes can now output as {"key":"value"} while still maintaining
<container><key>value</key></container> in XML format, using
ApiResult::META_BC_SUBELEMENTS. New code should use
ApiResult::setSubelementsList() instead.
- Modules outputting hashes as
[{"name":"key1","*":"value1"},{"name":"key2","*":"value2"}] (due to the
keys being invalid for XML) can now output as
{"key1":"value1","key2":"value2"} in JSON while maintaining <container><item
name="key1">value1</item><item name="key2">value2</item></container> in
XML format, using array types "kvp" or "BCkvp".
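To make the list above concrete, a hypothetical module fragment using
the new conventions might look like this (the field names and values are
invented for illustration):

// Real booleans are fine now; they are converted back to the old
// empty-string convention automatically when a client needs BC output.
$data = [
    'enabled' => true,
    'items' => [ 'foo', 'bar' ],
];
// Tag the list so ApiFormatXml knows what element name to use for each
// entry; forgetting this no longer throws an exception.
ApiResult::setIndexedTagName( $data['items'], 'item' );
// setContentValue() gives the content a real property name ('text') in
// JSON while keeping it as element content in XML output.
ApiResult::setContentValue( $data, 'text', 'a long string value' );
$this->getResult()->addValue( null, $this->getModuleName(), $data );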
I apologize for forgetting to announce this sooner. If developers need
assistance with API issues or code review for API modules, please do reach
out to me.
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
The Wikimedia Reading Infrastructure team [0] was formed during the
recent Wikimedia Foundation Engineering reorganization [1]. The team
currently consists of former members of the Wikimedia MediaWiki API
team [2] which was formed (briefly) from the Wikimedia MediaWiki Core
and Multimedia teams [3]. The new team's mission has a slightly
different scope than the API team's did, but the security, stability,
and performance of the API remain top-tier goals in support of the
Reading team's projects and the projects of other WMF and community
developers.
Towards that end, the team would like to put itself (and particularly
Brad "anomie" Jorsch) forward as an available consulting resource for
all other WMF teams and volunteer contributors who are enhancing the
MediaWiki API by adding new code to, or updating, API modules in core
or extensions. Brad has a long history of both consuming and maintaining
API-related code. For several years he has been considered the "go-to
guy" by his peers on the former MediaWiki Core team for reviewing API
changes, and his name is likely well known to those of you who regularly
work with the API and other projects like Scribunto [4].
We aren't asking to be the sole arbiters or implementers of API-related
changes. Rather, we would like to have a chance to help ease
implementation pains and provide insight on both the good and bad
patterns that recur in typical API module development. Chances are
good that the Gerrit watches we have will notice patches as they move
through the review process even without explicit inclusion, but we
would appreciate being invited into these conversations when possible.
[0]: https://www.mediawiki.org/wiki/Wikimedia_Reading_Infrastructure_team
[1]: https://lists.wikimedia.org/pipermail/wikimedia-l/2015-April/077619.html
[2]: https://www.mediawiki.org/wiki/Wikimedia_MediaWiki_API_Team
[3]: https://lists.wikimedia.org/pipermail/wikitech-l/2015-March/081357.html
[4]: https://www.mediawiki.org/wiki/Extension:Scribunto
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hi,
We're planning on deploying a global user merge tool to Wikimedia sites
shortly. As the name suggests, it merges multiple users into one.
This means that if your extension is storing user ids or user names, it
will need to listen to one of the UserMerge hooks
(UserMergeAccountFields, MergeAccountFromTo,
UserMergeAccountDeleteTables, or DeleteAccount) to make sure it isn't
referring to non-existent users; a minimal handler sketch follows the
links below. Reedy and I did an audit of all deployed extensions last
November; however, new ones have been deployed since then. Please check
your extension(s), and if they need updating, file bugs that block
T49918 [1] and T69758 [2].
[1] https://phabricator.wikimedia.org/T49918
[2] https://phabricator.wikimedia.org/T69758
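For extension authors, a handler for one of those hooks might look
roughly like the sketch below. The table and column names are invented;
only the hook name and its signature come from the UserMerge extension:

// Re-point rows that our (hypothetical) extension keys by user id from
// the merged-away account to the merge target.
$wgHooks['MergeAccountFromTo'][] = function ( User &$oldUser, User &$newUser ) {
    $dbw = wfGetDB( DB_MASTER );
    $dbw->update(
        'myext_data',                        // hypothetical table
        [ 'md_user' => $newUser->getId() ],  // SET: the merge target
        [ 'md_user' => $oldUser->getId() ],  // WHERE: rows owned by old user
        __METHOD__
    );
    return true;
};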
-- Legoktm