Following up on disabling JavaScript support for IE6 [1], here is some
additional research on other browsers. I'd appreciate it if people with
experience testing/developing for/with these browsers would jump in
with additional observations. I think we should hold off on adding other
browsers to the blacklist until the IE6 change has been rolled out,
which may expose unanticipated consequences (it already exposed that
Common.js causes errors in blacklisted browsers, which should be fixed
once [2] is reviewed and merged).
As a reminder, the current blacklist is in <resources/src/startup.js>.
As a quick test, I exercised basic browsing/editing operations on English
Wikipedia with various browsers. Negative results don't necessarily
indicate that we should disable JS support for these browsers, but
they do indicate the quality of testing that currently occurs for
those browsers. Based on a combination of test results, unpatched
vulnerabilities and usage share, an initial recommendation for each
browser follows.
Note that due to the heavy customization through gadgets/site scripts,
there are often site-specific issues which may not be uncovered
through naive testing.
== Microsoft Internet Explorer 7.x ==
Last release in series: April 2009
- Browsing: Most pages work fine (some styling issues), but pages with
audio files cause JavaScript errors (problem in TMH).
- Editing: Throws a JS error immediately (problem in RefToolbar)
Neither of these errors occurs in IE8.
Security vulnerabilities:
Secunia reports 15 out of 87 vulnerabilities as unpatched, with the
most serious one rated "moderately critical" (the same rating as for
IE6; the most serious IE8 vulnerability is rated "less critical").
Usage: <1%
Recommendation: Add to blacklist
== Opera 8.x ==
Last release in series: September 2005
Browsing/editing: Works fine, but all JS fails due to a script
execution error (which at least doesn't cause a pop-up).
Security: Secunia reports 0 unpatched vulnerabilities (out of 26).
Usage: <0.25%
Recommendation: Add to blacklist
== Opera 10.x-12.x ==
Last release in series: April 2014
Browsing/editing: Works fine, including advanced features like
MediaViewer (which does not work in 10.x)
Security: No unpatched vulnerabilities in 12.x series according to
Secunia, 2 unpatched vulnerabilities in 11.x ("less critical") and 1
unpatched vulnerability in 10.x ("moderately critical")
Usage: <1%
Recommendation: Maintain basic JS support, but monitor the situation for
10.x and add that series to the blacklist if the maintenance cost becomes too high
== Firefox 3.6.* ==
Last release in series: March 2012
Browsing/editing: Works fine (MediaViewer disables itself)
Security: 0 unpatched vulnerabilities according to Secunia
Recommendation: Maintain basic JS support
== Firefox 3.5.* ==
Last release in series: April 2011
Browsing/editing: Works fine (MediaViewer disables itself)
Security: 0 unpatched vulnerabilities according to Secunia
Recommendation: Maintain basic JS support
== Safari 4.x ==
Last release in series: November 2010
Browsing/editing: Works fine
Security: 1 unpatched "highly critical" vulnerability according to
Secunia ("exposure of sensitive information")
Recommendation: Maintain basic JS support, but monitor
== Safari 3.x ==
Last release in series: May 2009
Browsing/editing: Completely broken; it looks like CSS doesn't get loaded at all
Security: 2 unpatched vulnerabilities, "highly critical"
Usage share: The usage reports for Safari in [3] are broken (all Safari
versions are reported as "0.0"). However, [4] suggests that Safari 3
usage is negligible to non-existent.
Recommendation: The styling issue may be worth investigating in case it
affects other browsers and/or is caused by JS. Otherwise this browser
can probably be safely ignored.
[1] http://lists.wikimedia.org/pipermail/wikitech-l/2014-August/077952.html
[2] https://gerrit.wikimedia.org/r/#/c/152122/
[3] http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm
[4] http://stackoverflow.com/questions/12655363/what-is-the-most-old-safari-ver…
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Minutes and slides from last week's quarterly review of the
Foundation's Release Engineering and QA team are now available at
https://meta.wikimedia.org/wiki/WMF_Metrics_and_activities_meetings/Quarter…
.
On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller <erik(a)wikimedia.org> wrote:
> Hi folks,
>
> to increase accountability and create more opportunities for course
> corrections and resourcing adjustments as necessary, Sue's asked me
> and Howie Fung to set up a quarterly project evaluation process,
> starting with our highest priority initiatives. These are, according
> to Sue's narrowing focus recommendations which were approved by the
> Board [1]:
>
> - Visual Editor
> - Mobile (mobile contributions + Wikipedia Zero)
> - Editor Engagement (also known as the E2 and E3 teams)
> - Funds Dissemination Committee and expanded grant-making capacity
>
> I'm proposing the following initial schedule:
>
> January:
> - Editor Engagement Experiments
>
> February:
> - Visual Editor
> - Mobile (Contribs + Zero)
>
> March:
> - Editor Engagement Features (Echo, Flow projects)
> - Funds Dissemination Committee
>
> We’ll try doing this on the same day or adjacent to the monthly
> metrics meetings [2], since the team(s) will give a presentation on
> their recent progress, which will help set some context that would
> otherwise need to be covered in the quarterly review itself. This will
> also create open opportunities for feedback and questions.
>
> My goal is to do this in a manner where even though the quarterly
> review meetings themselves are internal, the outcomes are captured as
> meeting minutes and shared publicly, which is why I'm starting this
> discussion on a public list as well. I've created a wiki page here
> which we can use to discuss the concept further:
>
> https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_r…
>
> The internal review will, at minimum, include:
>
> Sue Gardner
> myself
> Howie Fung
> Team members and relevant director(s)
> Designated minute-taker
>
> So for example, for Visual Editor, the review team would be the Visual
> Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.
>
> I imagine the structure of the review roughly as follows, with a
> duration of about 2 1/2 hours divided into 25-30 minute blocks:
>
> - Brief team intro and recap of team's activities through the quarter,
> compared with goals
> - Drill into goals and targets: Did we achieve what we said we would?
> - Review of challenges, blockers and successes
> - Discussion of proposed changes (e.g. resourcing, targets) and other
> action items
> - Buffer time, debriefing
>
> Once again, the primary purpose of these reviews is to create improved
> structures for internal accountability, escalation points in cases
> where serious changes are necessary, and transparency to the world.
>
> In addition to these priority initiatives, my recommendation would be
> to conduct quarterly reviews for any activity that requires more than
> a set amount of resources (people/dollars). These additional reviews
> may however be conducted in a more lightweight manner and internally
> to the departments. We’re slowly getting into that habit in
> engineering.
>
> As we pilot this process, the format of the high priority reviews can
> help inform and support reviews across the organization.
>
> Feedback and questions are appreciated.
>
> All best,
> Erik
>
> [1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
> [2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>
> _______________________________________________
> Wikimedia-l mailing list
> Wikimedia-l(a)lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
--
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB
Minutes and slides from the first ever quarterly review of the
Foundation's Technical Operations team (held on August 28) are now
available at https://meta.wikimedia.org/wiki/WMF_Metrics_and_activities_meetings/Quarter…
.
--
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB
My (belated) final report for my OPW project is up on my progress
reports page: https://www.mediawiki.org/wiki/Evaluating_and_Improving_MediaWiki_web_API_c…
. Thank you to everyone who has helped me learn as much and contribute
as much as I have this summer, and special thanks to my mentors and
technical advisers. This has been a great experience, and I'm happy to
be leaving API:Client code in better shape than I found it!
I hope to stick around the MediaWiki development community as I move
on in my career as a software developer. I've applied for the January
batch of Hacker School. Until that starts, I'll be working on API
applications for Growstuff (http://growstuff.org): a gardening site
that collects crowdsourced data from local gardeners around the world
and freely licenses the resulting data. Wherever I end up after this,
I'm glad to have gotten my start on MediaWiki.
-Frances
Hi all.
During the RFC discussion today, the question came up of how the performance of
creating closures compares to that of creating objects. This is particularly relevant
for closures/objects created by bootstrap code which is always executed, e.g.
when registering with a CI framework.
Attached is a benchmark I quickly hacked up. It indicates that creating objects
is about 40% slower on my setup (PHP 5.4.9). I'd be curious to know how it
compares on HHVM.
In absolute numbers though, creating an object seems to take about one
*micro*second. That seems fast enough that we don't really have to care, I think.
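For reference, the benchmark is roughly along these lines (a simplified
sketch of the idea, not the attached script verbatim; the class and the
iteration count are just placeholders):

<?php
// Simplified sketch: time creating N closures vs. creating N small objects.

class Thing {
	private $value;
	public function __construct( $value ) {
		$this->value = $value;
	}
}

$iterations = 1000000;

$start = microtime( true );
for ( $i = 0; $i < $iterations; $i++ ) {
	$f = function () use ( $i ) {
		return $i;
	};
}
$closureTime = microtime( true ) - $start;

$start = microtime( true );
for ( $i = 0; $i < $iterations; $i++ ) {
	$o = new Thing( $i );
}
$objectTime = microtime( true ) - $start;

// ratio > 1 means object creation is slower than closure creation
printf( "closures: %.3fs  objects: %.3fs  ratio: %.2f\n",
	$closureTime, $objectTime, $objectTime / $closureTime );
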
Anyone want to try?
Cheers,
daniel
Hi,
I want to check if https://gerrit.wikimedia.org/r/157836/ is
live on dewp.
So I would get MediaWiki core's SHA1 from
http://de.wikipedia.org/w/api.php?action=query&meta=siteinfo,
extract the extension name from the Gerrit page, look up the
SHA1 of core's extensions/$EXTENSION submodule at that
$VERSION_SHA1, and see whether, in the extension's repository,
$CHANGE_SHA1 is an ancestor of $EXTENSION_SHA1.
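Spelled out as a rough, untested sketch (assuming the deployment branch
of mediawiki/core tracks extensions as git submodules, that local clones
of core and of the extension repository are available, and that siteinfo
exposes the git hash of the running checkout; paths, extension name and
change SHA1 below are placeholders):

<?php
$extension  = 'Cite';                 // extracted from the Gerrit change page
$changeSha1 = '0123456789abcdef...';  // SHA1 of the merged change (placeholder)
$coreRepo   = '/srv/mediawiki/core';
$extRepo    = "/srv/mediawiki/extensions/$extension";

// 1) SHA1 of the MediaWiki core checkout running on dewiki
$siteinfo = json_decode( file_get_contents(
	'http://de.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=json'
), true );
$coreSha1 = $siteinfo['query']['general']['git-hash'];

// 2) SHA1 of the extensions/$EXTENSION submodule at that core revision
//    (ls-tree prints "<mode> commit <sha1>\textensions/<name>" for submodules)
$line = shell_exec( "git -C $coreRepo ls-tree $coreSha1 extensions/$extension" );
preg_match( '/commit ([0-9a-f]{40})/', $line, $m );
$extensionSha1 = $m[1];

// 3) Is the change an ancestor of the deployed extension revision?
exec( "git -C $extRepo merge-base --is-ancestor $changeSha1 $extensionSha1",
	$output, $status );
echo $status === 0 ? "live on dewiki\n" : "not (yet) deployed\n";
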
That sounds simple, but like work. Has someone done that
already? (Web tools are welcome as well.)
Tim
Hi all!
tl;dr: How to best handle the situation of an old parser cache entry not
containing all the info expected by a newly deployed version of code?
We are currently working to improve our usage of the parser cache for
Wikibase/Wikidata. For example, we are attaching additional information related to
language links to the ParserOutput, so we can use it in the skin when generating
the sidebar.
However, when we change what gets stored in the parser cache, we still need to
deal with old cache entries that do not yet have the desired information
attached. Here are a few options for what to do if the expected info isn't in
the cached ParserOutput:
1) ...then generate it on the fly. On every page view, until the parser cache is
purged. This seems bad especially if generating the required info means hitting
the database.
2) ...then invalidate the parser cache for this page, and then a) just live with
this request missing a bit of output, b) generate it on the fly, or c) trigger a
self-redirect.
3) ...then generate it, attach it to the ParserOutput, and push the updated
ParserOutput object back into the cache. This seems nice, but I'm not sure how
to do that.
4) ...then force a full re-rendering and re-caching of the page, then continue.
I'm not sure how to do this cleanly.
So, the simplest solution seems to be 2, but it means that we potentially
invalidate the parser cache of *every* page on the wiki (though we will not hit the
long tail of rarely viewed pages immediately). It effectively means that any
such change requires all pages to be re-rendered eventually. Is that acceptable?
Solution 3 seems nice and surgical, just injecting the new info into the cached
object. Is there a nice and clean way to *update* a parser cache entry like
that, without re-generating it in full? Do you see any issues with this
approach? Is it worth the trouble?
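For illustration, option 3 might look roughly like this (a sketch only,
untested; SidebarLinks::generateFor() is a made-up placeholder for whatever
produces the missing data, and I'm not sure this is the sanctioned way to
write an entry back):

<?php
// Sketch of option 3: inject the missing info into the cached ParserOutput
// and push the updated object back, without a full re-parse.
// $wikiPage is the WikiPage being viewed; the extension data key and the
// SidebarLinks helper are placeholders.

$parserCache = ParserCache::singleton();
$parserOptions = $wikiPage->makeParserOptions( 'canonical' );
$parserOutput = $parserCache->get( $wikiPage, $parserOptions );

if ( $parserOutput && $parserOutput->getExtensionData( 'wikibase-sidebar-links' ) === null ) {
	// Old cache entry from before the deployment: generate the missing bit...
	$links = SidebarLinks::generateFor( $wikiPage->getTitle() );

	// ...attach it to the cached object...
	$parserOutput->setExtensionData( 'wikibase-sidebar-links', $links );

	// ...and write the updated object back into the parser cache.
	$parserCache->save( $parserOutput, $wikiPage, $parserOptions );
}
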
Any input would be great!
Thanks,
daniel
--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Since we had the luxury of having several people in the office,
Trevor, Juliusz, Rob Moen, Ed Sanders, Shahyar, May, Monte and I sat
down to talk about the problem that we currently have no standard
way to create icons. Here is my write-up of this meeting; again, if
you attended, please add to or correct me on anything, and if you
were not there, please ask for clarification where needed.
Currently we have two ways of creating icons in MediaWiki:
1) Using a font
2) Using SVGs with PNG fallbacks
and the markup varies depending on which extension you look at.
We discussed both approaches and the advantages and disadvantages of each.
One of the major disadvantages of the WikiFont is the additional HTTP
request it creates to download the font, and the fact that it cannot be
embedded in the stylesheet using data URIs like SVGs can (due to URL
size restrictions).
One of the major advantages of the WikiFont is that you can design a
grayscale icon and style it using the font colour. Shahyar was happy to
move Flow to using SVG-based fonts if we could build grayscale SVGs and
change their colours using ResourceLoader. One concrete example is an
icon used in a constructive anchor: the icon needs to be green, but a
lighter green when hovered over.
Another advantage brought up by May was that she currently finds it
much easier to build icons this way, and that having to maintain
separate coloured versions of the SVGs is a pain point for her.
We decided that we should push towards using SVGs that can be built
into fonts for the purpose of the app.
As next steps
1) Monte to explore using SVGs in the Wikipedia apps. He will create a
font from the SVGs. He will work with May to ensure her workflow is as
easy as before.
2) Trevor is going to look into how we can automatically generate
different colour versions of SVGs and automatically create PNGs from
them (a rough sketch of one possible approach follows this list).
3) I am going to aim to agree on a standard icon HTML markup for
mediawiki.ui. We still have an issue with the fact that everyone is
using different HTML markup to create icons. After reviewing all our
current approaches [1], it was clear that we are relatively closely
aligned and that it is simply a case of agreeing on class names.
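For item 2, one very rough way to do this (an untested sketch; the colour
values, file names and the assumption that the grayscale source SVG uses a
single fill="#000000" are all just for illustration) would be:

<?php
// Untested sketch: generate colour variants of a grayscale source SVG and
// rasterize PNG fallbacks with rsvg-convert. Colours, file names and the
// single fill="#000000" assumption are placeholders.

$variants = array(
	'constructive'       => '#008000', // green
	'constructive-hover' => '#00c000', // lighter green for :hover
	'destructive'        => '#cc0000',
);

$source = file_get_contents( 'icons/edit.svg' );

foreach ( $variants as $name => $colour ) {
	$svg = str_replace( 'fill="#000000"', 'fill="' . $colour . '"', $source );
	$svgFile = "icons/edit-$name.svg";
	file_put_contents( $svgFile, $svg );

	// PNG fallback for browsers that cannot render SVG
	shell_exec( 'rsvg-convert -o ' . escapeshellarg( "icons/edit-$name.png" ) .
		' ' . escapeshellarg( $svgFile ) );
}
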
We aim to get all of the above done by Sept 15th, 2014, so please poke us
on the mailing list if you haven't had a follow-up by then.
Full disorganised notes can be found here [2].
[1] https://www.mediawiki.org/wiki/Icon_standardisation
[2] http://etherpad.wikimedia.org/p/Icon_standardisation