Hi,
On 01/18/2014 03:42 AM, Matthew Walker wrote:
> We've just finished our second sprint on the new PDF renderer. A
> significant chunk of renderer development time this cycle was on non-Latin
> script support, as well as puppetization and packaging for deployment. We
> have a work in progress pipeline up and running in labs which I encourage
> everyone to go try and break.
Seeing breakage of PDF downloads on pl.wikisource reported in
https://bugzilla.wikimedia.org/show_bug.cgi?id=65298 I got curious what
the status of the new PDF renderer is.
https://www.mediawiki.org/wiki/PDF_rendering#Status only links to the
quoted email from January 2014. Has anything happened in the last four
months that would be worth adding as a status update?
For the record, Nemo asked on the Talk page for a test instance, and I
support the idea of having a bugday on PDF rendering once public testing
infrastructure for the new PDF renderer is available.
Open tickets to potentially re-test:
https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&component=Collect…
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
Hi,
For the nth time, in #mediawiki, I've had someone ask how to mark a bug as resolved, claim it, or mark it as a duplicate of another bug. I conceptually know that this means "getting editbugs permissions", but it happens so infrequently that I never know where to go.
Usually what happens is this:
1. I wrack my brain trying to remember the process for about 30 seconds
2. Failing, I try to ping Andre, Quim, and Sumana (none of whom are in the channel, sadly)
3. I search with duckduckgo and pull up nothing of any use
4. I search MediaWiki.org and find outdated status reports about Greasemonkey scripts but nothing useful
5. I go to the developer hub pages and look at the welcome-to-the-community process but again find nothing describing this process
Solution: We've made every editbugs user able to add editbugs to an account. I've documented the process here: https://www.mediawiki.org/wiki/Bugzilla#Why_can.27t_I_claim_a_bug_or_mark_i…
Thanks to Chad for the quick resolution on this, hopefully this will be a positive change overall.
--
Mark Holmquist
Software Engineer, Multimedia
Wikimedia Foundation
mtraceur(a)member.fsf.org
https://wikimediafoundation.org/wiki/User:MHolmquist
Thanks Dan and Erik.
Pine
> Date: Thu, 29 May 2014 13:36:12 -0700
> From: Dan Garry <dgarry(a)wikimedia.org>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Subject: Re: [Wikitech-l] 404 errors
> Message-ID:
> <CAOW03MEodri_EPiTH2cUVcwjPEsvDOjFSOXxuyqDOjs=HUhAiQ(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> I'm getting these errors too.
>
> Judging by the chatter in #wikimedia-operations, this is being actively
> looked in to.
>
> Dan
>
>
> On 29 May 2014 13:34, ENWP Pine <deyntestiss(a)hotmail.com> wrote:
>
> > Hi, I'm getting some 404 errors consistently when trying to load some
> > English Wikipedia articles. Other pages load ok. Did something break?
> >
> > Pine
> >
> Dan Garry
> Associate Product Manager for Platform and Mobile Apps
> Wikimedia Foundation
>
>
> Date: Thu, 29 May 2014 13:36:00 -0700
> From: Erik Moeller <erik(a)wikimedia.org>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Subject: Re: [Wikitech-l] 404 errors
> Message-ID:
> <CAEg6ZHkZJjCQk0xFKUhq9-cHJhW7xHVz0YqMQ0-zzeQgUChsVQ(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> It's being investigated, see #wikimedia-operations on irc.freenode.net.
>
> Erik
>
>
> On Thu, May 29, 2014 at 1:34 PM, ENWP Pine <deyntestiss(a)hotmail.com> wrote:
>
> > Hi, I'm getting some 404 errors consistently when trying to load some
> > English Wikipedia articles. Other pages load ok. Did something break?
> >
> > Pine
> >
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
>
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
>
> ------------------------------
>
Hi,
We were having discussions regarding putting the new
SwiftMailer[1] lib in/out of core, after the merge of the change
https://gerrit.wikimedia.org/r/#/c/135290. Tyler recommends adding the
installer code to composer.json rather than adding the SwiftMailer code to
core. This creates the swiftmailer lib in core/vendors/swiftmailer.
After discussing with Bryan (https://dpaste.de/XVkL/raw), it looks
like *maintaining a separate repo* for external libraries is the best
solution, so that it uses composer 'properly' and still works for
wmf-deployment, rather than pulling the whole library into core. It could
thus be deployed via trebuchet to the cluster and the autoloader pulled in
in CommonSettings (quoting Bryan).
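As a rough illustration of the composer.json approach, the dependency could be declared like this (the version constraint below is an assumption for the sketch, not necessarily the one in the actual patch):

```json
{
    "require": {
        "swiftmailer/swiftmailer": "~5.2"
    }
}
```

Running `composer install` would then fetch the library into the vendor directory and generate an autoloader that core (or, for WMF deployment, CommonSettings) can require.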
Since the entire mail structure is being reworked to use
SwiftMailer, and it's a crucial dependency (there will be no alternate mail
systems in UserMailer.php, as both the SMTP and non-SMTP cases are handled
effectively by SwiftMailer), I think we will need to have a separate repo
for that. Please go through the patch set above and comment with your
opinions on including external libraries.
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=63483
Thanks,
Tony Thomas <http://tttwrites.in>
FOSS@Amrita <http://foss.amrita.ac.in>
*"where there is a wifi, there is a way"*
Hi,
Wikidata needs to monitor dispatch statistics per Bug 65291 (
https://bugzilla.wikimedia.org/show_bug.cgi?id=65291).
We would like to use Icinga to send an IRC alert if the dispatch lag
exceeds a critical threshold. A Nagios plugin for this (check_dispatch)
has been written in Perl and we would like to have this reviewed and
tested.
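The actual plugin is written in Perl; as a rough sketch of what a Nagios-style lag check does (the function name and the threshold values here are illustrative, not those of the real check_dispatch):

```python
# Standard Nagios plugin exit codes
OK, WARNING, CRITICAL = 0, 1, 2

def check_dispatch_lag(lag_seconds, warn=120, crit=300):
    """Map a Wikidata dispatch lag (in seconds) to a Nagios status.

    The warn/crit defaults are illustrative placeholders, not the
    thresholds used by the real check_dispatch plugin.
    """
    if lag_seconds >= crit:
        return CRITICAL, "CRITICAL - dispatch lag is %d s (>= %d s)" % (lag_seconds, crit)
    if lag_seconds >= warn:
        return WARNING, "WARNING - dispatch lag is %d s (>= %d s)" % (lag_seconds, warn)
    return OK, "OK - dispatch lag is %d s" % lag_seconds
```

A real plugin would additionally fetch the lag (e.g. from the API), print the status message on stdout, and exit with the status code so that Icinga can act on it.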
Where and by whom could this script be reviewed?
Christopher Johnson
(Firstly, apologies for breaking threading, and not quoting things
as fully as I normally would. I only just subscribed to this list,
so don't have previous mails in my mail client.)
I'm the main author of the Erudite skin[0], which Daniel Friesen
helped a good deal with when shepherding it into the gerrit git
system. So consider me biased to prefer the way that skin does
everything ;)
I think Bartosz' proposal to clean this stuff up is good and
important. In particular, making the official skins use their own
directories (and not skinname.php & skinname/*) is nice for several
reasons, not the least being to provide a good model for others
developing skins to follow.
Bartosz' second proposal, "$IP/skins/SkinName/ for both assets and
PHP files" is a good one I think. I agree with Tyler and Bartosz'
points regarding the extensions/ vs. skins/ directories; namely that
the skins/ directory is a sensible place for skins, and the fact
that in practice they aren't fundamentally different from extensions
is largely irrelevant.
To Tim Starling's point about making skin installation/upgrading
automatable: that shouldn't be much more challenging if skins live in
a separate directory from extensions; it can't really be a code reuse
issue, unless it's really bad code ;)
To Tyler Romeo's point about how skins are needlessly complex, and a
templating system would be much nicer, I don't really agree. Being
regular PHP gives a lot of flexibility, and in practice the vast
majority of a skin can already be just HTML with <?php tags for
inserting the appropriate content. Have a skim of the Erudite code
to see what I mean[1]. Granted I'm probably a bit code-blind from
having spent a long time with that bit of PHP, but to me it feels
really clean and straightforward. I'm sure there is some work that
could be done to reduce boilerplate code a bit (as Daniel
mentioned), but beyond that I think it's quite a good system.
As for the upper vs lower vs camel case issues, I'd like them to
magically go away, but I don't have a good suggestion for that.
Well, my suggestion would be lowercase everywhere, but that
conflicts with general MediaWiki style. So, again, I personally am
inclined to generally agree with whatever Daniel says on the issue
;)
Nick
0. https://www.mediawiki.org/wiki/Skin:Erudite
1. https://git.wikimedia.org/blob/mediawiki%2Fskins%2Ferudite/fc5b86835e7b942d…
Hi everyone,
I'd like to welcome Dan Duvall to Wikimedia Foundation in his role as
Automation Engineer in our Release and QA group. Dan comes to us most
recently from Giant Rabbit, where he worked for a few months doing web
development consulting, and before that, at National Novel Writing
Month (NaNoWriMo), where he led their engineering work. One big
accomplishment from his tenure at NaNoWriMo was porting their website
code from Drupal to Ruby on Rails.
Dan will be working closely with Chris McMahon and Željko Filipin on
improving our browser test automation. Our browser test work is
largely written in Ruby, and of course, large portions of MediaWiki
are written in PHP. Thus, Dan's deep experience both with Ruby and PHP
will be incredibly helpful in this role. We also anticipate his Ruby
knowledge will come in quite handy in other pockets of our system that
rely on Ruby skills (e.g. our Vagrant-based developer test environment
system).
Dan tells me this is by far the biggest place he's ever worked, so
please bear with him as he learns to navigate our expansive office
campus and figures out the intricacies of our intraoffice shuttle
system :-)
Welcome Dan!
Rob