Forwarded to legal.
---------- Forwarded message ----------
From: Kevin Israel
Date: Tuesday, March 20, 2012
Subject: [Wikitech-l] WURFL licensing concerns and Git migration
To: wikitech-l(a)lists.wikimedia.org
Our MobileFrontend extension, which is currently deployed on Wikimedia
sites, uses WURFL to detect the mobile devices it targets. However, I
recently became aware that the version of the WURFL data files we use
has a rather restrictive license.
http://tech.groups.yahoo.com/group/wmlprogramming/message/34311
The license seems to suggest that we are not even permitted to
redistribute verbatim copies or to install the data files on multiple
servers rather than making only "[...] one copy [...]", or at the very
least fails to grant such permission. Currently, the files are in our
Subversion repository
and are going to end up in Git soon.
I am not a lawyer, and I realize this is probably a matter for the
Wikimedia Foundation to handle, albeit one of urgent importance to us.
If I am not mistaken, proper removal of infringing material from Git
repositories is somewhat painful in that it causes all child SHA-1
hashes to change, so I feel resolution of the above licensing concern
blocks Git migration of at least the MobileFrontend extension.
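To illustrate why this blocks the migration: removing a file from Git history requires rewriting every descendant commit, since each commit hash covers its tree and its parents. A minimal sketch using `git filter-branch` (the `wurfl.xml` filename here is hypothetical; the actual data file paths may differ):

```shell
# Rewrite every commit on every branch to drop the offending file.
# All descendant commit hashes change as a side effect.
git filter-branch --force --index-filter \
    'git rm --cached --ignore-unmatch wurfl.xml' \
    --prune-empty --tag-name-filter cat -- --all

# Anyone with an existing clone must then re-clone (or hard-reset),
# because the rewritten history no longer matches their refs.
```

This is exactly why it is far cheaper to resolve the licensing question before the SVN history is converted than after.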
--
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
--
Sent from my iPad
I've been fiddling with a new iPad, with its notoriously high-resolution
display (2048x1536, roughly similar to the iPhone 4's earlier 2x resolution
jump on the small screen but on something "real" sized). Text renders
stunningly sharp. And you know what else?
SVG.
Graphics.
Look.
Totally.
Awesome.
On this screen!
I've got a little user script which replaces rasterized SVGs with their
scalable originals:
https://en.wikipedia.org/wiki/User:Brion_VIBBER/hidpi.js
Even without fixing up any of the pure-raster images yet, this gives a
visible improvement to pages containing maps, flags, site icons, etc.,
which are often SVG.
Similar resolution screens will likely be coming to laptops sooner
rather than later, so we should definitely start looking into making our
UI and our images look awesome. (It'll also help when printing or when
zooming in the browser -- for instance, zoom in on the tables at
https://en.wikipedia.org/wiki/Olympic_Games#Host_nations_and_cities and
you'll actually see the flags, not just blurry little piles of pixels.)
I've filed a bunch of bugs about low-resolution PNG and GIF icons in our
user interface, under the high-density tracking bug:
https://bugzilla.wikimedia.org/showdependencytree.cgi?id=32101&hide_resolve…
Over time we should make sure that our UI is consistently scalable;
using native SVG icons with PNG fallbacks should do the trick.
We'll have to do some experiments to determine a good way of doing
fallbacks, deciding when to render things out fully, etc. It may make sense
to have some per-image controls -- for instance files that are known to
render very slowly we might prefer to serve as rasters; non-SVG drawings
may also benefit from loading at higher resolution on high res displays.
Anyway, more fun stuff for people to think about. :)
-- brion
As you all probably know, the Gerrit migration is going to happen in a
little over 24 hours. For the SVN repositories being migrated (core
and all WMF-deployed extensions), I am doing three special things
today.
First, I am identifying all unreviewed revisions and reviewing those
that 1) I am capable of reviewing and 2) that I feel are small enough
and/or important enough to review.
Second, I am reverting all remaining unreviewed revisions and tagging
them with the 'gerritmigration' tag. These reverts are temporary, and
I will resubmit the reverted revisions into Gerrit after the
switchover. They will then have to go through the code review process
in Gerrit. It will be a bit weird and awkward to review so many
revisions, but this ensures that we can migrate to Gerrit with a clean
repository free of unreviewed code. I'll be doing these reverts in
logical chunks where possible, but I will eventually revert everything
that's unreviewed, so don't take it personally :) .
Third, I am declaring a code freeze effective immediately. It's not a
complete freeze, but stricter than the preceding slush (so maybe it's
froyo or something?). Essentially, the freeze means that as of right
now, we are pretty much going to do pre-commit review. If you want to
commit something between now and the Gerrit migration, you will need
to get your patch reviewed by someone *before* you commit it, and that
person will have to OK the revision in CodeReview shortly after it's
committed. I will periodically check the repository, and any new
commits that are unreviewed and have been sitting in the repo for more
than an hour will be reverted on sight, tagged with 'gerritmigration'
and resubmitted into Gerrit after the migration.
I realize that this is kind of a sudden and unilateral decree that I'm
imposing here, and I apologize for that. However, it's only for 24
hours and it's what needs to be done in order for the Gerrit migration
tomorrow to happen smoothly, so I hope you'll understand.
Roan
P.S.: Any help with the reviewing, reverting, tagging and/or
resubmitting revisions would be appreciated, but please find me on IRC
before you start helping me; if I am not on IRC, that means I'm not
working and you can go right ahead.
P.P.S.: Before anyone asks: in the (IMO unlikely) event that the
Gerrit migration doesn't happen tomorrow and is delayed to next week
or beyond, the code freeze automatically ends at 23:59 PDT (that's Mar
22, 06:59 UTC).
I talked with Tim, Chad, Roan, and RobLa to nail down how we're going to
manage membership in Gerrit's project owner groups (people with the
permissions to merge code into the master branch).
Caution: long. TL;DR: we'll start off with a small set of Gerrit
project owners for everything, and then aim to be consistent,
thoughtful, and transparent in how we add people to and remove people
from those groups.
Three major areas to cover:
* MediaWiki core
* Extensions that WMF deploys
* Other extensions/projects, including new repositories
== MediaWiki core ==
This is core.git,
https://gerrit.wikimedia.org/r/#admin,project,mediawiki/core,branches or
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=summary .
=== Who gets to merge things into core, and how? ===
Right now, it's shell & root. It's the people in
https://gerrit.wikimedia.org/r/#admin,group,11 once that group gets set
up properly (see https://bugzilla.wikimedia.org/show_bug.cgi?id=35148 ).
That group right now:
From Ops (via an LDAP group):
fvassard
dzahn
jeluf
kate
lcarr
mark
midom
py
robh
ariel
laner
asher
jgreen
ben
sara
pdhanda
From development:
brion
tfinc
tstarling
catrope
andrew
awjrichards
aaron
nikerabbit
nimishg
rfaulk
demon
hashar
reedy
preilly
robla
(removed zak and neilk as they are now dormant in our projects)
(added myself so I can do administrivia like adding & removing people)
So, we are currently limiting this to those who have cluster access,
because these will be the people who have to get interrupted to fix
things when something is screwed up. They will be married to the
consequences of their mistakes. This helps change the current
situation, where any developer can merge bad code in but it costs
developers time to back it out.
However, we may expand the group in the future, or change how we do
branches, to allow smaller-scoped commits to get merged more easily. As
Aaron put it, large commits are scary no matter who writes and commits
them, whereas some authors we can trust to merge in smaller changes.
Also to avoid:
* creating circular communities of people who give hasty, shoddy reviews
to each other's work and sidestep quality control
* the habit of cherry-picking friends' or colleagues' commits to review
and leaving strangers' commits to rot in the merge request queue
* being so cautious about adding people to the Gerrit project owners
groups that we build up an unsustainable merge request backlog
* letting big changesets into the codebase without accompanying unit
tests (counterexample: testing on FileBackend helped a lot)
* being jerks
=== How will we add Gerrit project owners? ===
The current thinking is that individuals can request to be added to the
project owner groups for specific Gerrit projects. We will create a
queue ( https://bugzilla.wikimedia.org/show_bug.cgi?id=35347 ) to
process requests from people who want to join the MediaWiki core Gerrit
project owners group, and I'll manage the process as I now manage the
commit access requests process. When someone requests membership, I'll
contact the existing project owners, and if the candidate gets zero
vetoes and at least one yes from the existing project owners, then we'll
approve the candidate.
We will also need to proactively add new people into the Gerrit project
when they get suggested by existing project owners. The shortlist would
also include experienced developers with good MediaWiki code review
skills, like Timo, Trevor, and Nikerabbit. We've already added
Platonides to the Gerrit project owners group since he fits these criteria.
After we run through the usual suspects, we should make regular efforts
to find underpublicized high-caliber code reviewers to suggest as
candidates. We should develop some kind of proxy for "has done a bunch
of good code review work" so we can check statistics to find candidates
-- perhaps "number of statuschanges they OK that do not later get
FIXMEs, proportional to the lines-of-code size of changes reviewed," or
something like that. Please do not bikeshed this right now! Chad will
propose something far more sensible sometime in April, I think.
Also, in the future, we may create another "unstable" branch (more
forgiving than master but still not wide-open). The purpose would be to
pull in all reasonable-quality pushes, and to provide a venue for code
reviewers to gain experience so they can eventually graduate to master.
=== Why might a project owner be removed? ===
We are also creating some social and technical procedures for removing
members from Gerrit project owner groups. After all, if someone
chronically breaks the deployment, there have to be some
consequences. Reasons to consider removing members:
* inactivity (a few months or more without commits, code review
comments, or merges)
* lack of respect for senior community members, and anticollaborative
behavior
* breaking things through negligence or incompetence
When do we revoke access? We're still figuring that out. Of course in
most cases the other Gerrit project group owners would warn the relevant
person first, but after that, what?
I floated a "two strikes" rule (the second time a big problem happens,
we strongly consider revoking ownership) but we're iffy about it. After
all, the severity of an incident is often not proportional to
negligence! And who defines "big problem"? So, the criteria are likely
to be somewhat subjective, but we aim to nevertheless apply them fairly,
consistently, and we hope rarely!
== Extensions that Wikimedia Foundation deploys ==
There's a big list of them:
https://gerrit.wikimedia.org/r/#admin,projects under mediawiki/extensions/ .
Each extension gets its own Gerrit project. All MediaWiki core project
owners are also project owners for all the Gerrit projects of
WMF-deployed extensions. Additionally, if someone is the maintainer of
an extension, they should generally have Gerrit project owner rights on
that extension.
How do we determine who is the maintainer or the primary developer? We
look for the person credited in that extension's $wgExtensionCredits or
in a CREDITS file; for example, for CentralAuth, per
http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/CentralAuth/Cent…
, it'd be Brion Vibber. But there are some edge cases here:
* There are a few extensions, like ParserFunctions, that don't have
specific reviewers.
* Sometimes the primary author is not experienced or skilled enough to
be a Gerrit project owner if that means basically being able to approve
something for deployment.
* Sometimes there are very old extensions, or extensions where the
original authors/maintainers listed in the credits have left the project
(like ThomasV with ProofreadPage).
So, over the next few weeks, the current MediaWiki core Gerrit project
owners will go through the ~100 extensions that are deployed on WMF
sites, and make judgment calls based on the extension owners' reputation
and experience. In most cases those extension maintainers will also
become Gerrit project owners, but in some cases they will simply be
reviewers who get asked to review important changes.
The ones that are most important to cover sooner are the ones that have
been under active development in the last few months.
=== Changes ===
If someone wants to apply to be a Gerrit project owner of an extension
that the Wikimedia Foundation deploys, then we'll follow the same
procedure that we do for MediaWiki core, with that queue (follow
https://bugzilla.wikimedia.org/show_bug.cgi?id=35347 ) and checking with
existing Gerrit project owners. And removal will work the same way as
mentioned for core.
== Other MediaWiki extensions ==
As mentioned in
https://www.mediawiki.org/wiki/Git/Conversion#Affected_development_projects
, the extensions that WMF doesn't deploy (there are about 675 of them)
have some more time before deciding whether to switch to Git or move to
another repository. We do want to be proactive about communicating with
them (emailing people mentioned in their credits) and finding guinea pigs. :-)
For example, we know who the Semantic MediaWiki/Semantic extensions
developers are, so we could easily reach out to them and know who the
Gerrit project owners would be. In contrast, for somewhat older
extensions with little traffic, we can wait a little longer. RobLa
suggested that, in those cases, we can just let people have Gerrit
project ownership of the extensions if they're listed in
$wgExtensionCredits or CREDITS and have done a substantial percentage of
the commits in the last three months or so; if not, we'd poke around a
little and ask for an OK from the people who are listed as the
extensions' authors.
=== Changes ===
The addition procedure will probably be through that same queue. And as
for removing people from project owner status of a non-WMF deployed
extension, we would basically follow the same guidelines as core --
vandalism or proprietary licensing or anticollaborative behavior --
except that the "breaking the deployment" reason would not be applicable.
=== Newly created extensions/Gerrit projects ===
When anyone wants to create a new extension, they will by default be an
owner of it.
We aren't going to let people create new repositories completely
automatically, since that might lead to pointlessness, spam, or
non-Wikimedia-related code living on our servers -- see the model in
action at https://www.mediawiki.org/wiki/Git/New_repositories . In most
cases a request will be rubberstamped, as long as the Gerrit project
will be under an open source license and the work is related to
MediaWiki and/or Wikimedia in some way.
I hope this has been helpful. I suspect we won't have too much to
discuss but if there are clarifications needed please speak up; then in
a day or two I can put this up on the wiki to make it easier to reference.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
I've removed the "Download from SVN" link from the mw.org sidebar. Any
suggestions on its replacement? Should it point to [[Git/Workflow]], or
is a separate page needed? Also, there are more SVN-related links in
the sidebar, but I left them untouched for now as they're useful and
we don't have replacements for them.
--
Best regards,
Max Semenik ([[User:MaxSem]])
This year I want the Wikimedia technical community to have a strong
presence at Open Source Bridge <http://opensourcebridge.org/> in
Portland, Oregon, USA, June 26-29. OSB is tech talks & hack sessions
with exactly the kind of hands-on technologists we want, both for
Foundation staff recruiting (the Portland tech scene has good people
looking for jobs) and for volunteer recruiting and collaboration (tons
of Mozilla people went there last year). Good talks, clueful people,
great food. :-)
If you submit a talk and it gets accepted, tell me, and Wikimedia
Foundation will partially subsidize or fully pay for your flight and
hotel. If you submit a talk and it doesn't get accepted but you still
want to go, talk with me and I'll see what I can do.
Call for talks:
<http://opensourcebridge.org/blog/2012/01/announcing-the-2012-call-for-propo…>
Ideas: the parser rewrite, Wikimedia Labs, how we scale and optimize
performance on a shoestring budget, our git/gerrit migration, securing
PHP-based webapps, various approaches to making our data more
structured/semantic, collaborative design, lessons from our communities,
JS hacks, hetdeploy, offline/mobile, geodata...
Please forward.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hi everyone,
MediaWiki core and WMF-deployed extensions are now completely read-only
in SVN--please begin submitting your changes through Gerrit now for those
projects.
Might be a few minor issues to iron out (.gitreview files aren't everywhere
just yet), so feel free to ping me on IRC if you have any questions.
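For anyone new to the workflow, a rough sketch of submitting a change through Gerrit (assuming the `git-review` tool is installed; where a repo lacks its `.gitreview` file, `git review -s` can be pointed at the remote manually):

```shell
# Clone MediaWiki core from Gerrit:
git clone https://gerrit.wikimedia.org/r/mediawiki/core.git
cd core

# Work on a local topic branch:
git checkout -b my-fix
# ...edit files, then commit...
git commit -a -m "Fix something"

# Push the commit to Gerrit for review
# (git-review reads .gitreview to find the remote):
git review
```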
-Chad
As some may know, we've restricted videos on Wikimedia sites to the
freely-licensed Ogg Theora codec for some years, with some intention to
support other non-patent-encumbered formats like WebM.
One of our partners in pushing for free formats was Mozilla; Firefox's
HTML5 video supports only Theora and WebM.
The prime competing format, H.264, has potential patent issues - like other
MPEG standards there's a patent pool and certain licensing rules. It's also
nearly got an exclusive choke hold on mobile - so much so that Mozilla is
considering ways to adopt H.264 support to avoid being left behind:
http://blog.lizardwrangler.com/2012/03/18/video-user-experience-and-our-mis…
Is it time for us to think about H.264 encoding on our own videos?
Right now, users of millions of mobile phones and tablets have no access
to our audio and video content, and our old desktop fallback of using a
Java applet is unavailable on those devices.
In theory we can produce a configuration with TimedMediaHandler to produce
both H.264 and Theora/WebM transcodes, bringing Commons media to life for
mobile users and Apple and Microsoft browser users.
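For illustration only -- TimedMediaHandler drives its own transcode pipeline, and the flags, bitrates, and filenames below are hypothetical, not its actual configuration -- the dual-format idea amounts to something like:

```shell
# From one source file, produce both a WebM (VP8/Vorbis) derivative
# for free-format browsers and an H.264/AAC MP4 for mobile and for
# Apple/Microsoft browsers, so every client has a playable format:
ffmpeg -i source.ogv -c:v libvpx -b:v 1M -c:a libvorbis out.webm
ffmpeg -i source.ogv -c:v libx264 -b:v 1M -c:a aac out.mp4
```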
What do we think about this? What are the pros and cons?
-- brion
In the past few months, I've needed to test gadgets that are used on WMF
sites, but didn't have an easy way to copy them from, say, Commons to
Beta in Labs.
I hacked something together and then forgot about it until Chris McMahon
(the WMF's new QA guy) wanted something similar for one of his test
wikis a couple of weeks ago. I couldn't find it, so last night I
rewrote the script from scratch.
The script is pretty ugly and, normally, I might commit it to SVN and
work on it there, but we're in the process of moving to Git, so I've
gone ahead and created a gitorious repository:
https://gitorious.org/wiki-gadgets
Lots of hard-coding that could be cleaned up, and it could use the API
to copy from one wiki to another, but it's a start. I'm sharing it here
so that other people can begin to use it.
It just blats the importable file out on standard output for now, and
it only copies from Commons, so it should be used in the following way:
$ php dupe-gadgets.php > import-this-file.xml
[ Go to Special:Import on your wiki and import. ]
I hope that it helps someone!
--
Mark A. Hershberger
Bugmeister
Wikimedia Foundation
mah(a)wikimedia.org