Minutes and slides from last Thursday's quarterly review of the
Foundation's Editing (formerly VisualEditor) team are now available at
(A separate but related quarterly review meeting of the Parsoid team
took place on Friday; those minutes should be up tomorrow.)
On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller <erik@wikimedia.org> wrote:
> Hi folks,
> to increase accountability and create more opportunities for course
> corrections and resourcing adjustments as necessary, Sue's asked me
> and Howie Fung to set up a quarterly project evaluation process,
> starting with our highest priority initiatives. These are, according
> to Sue's narrowing focus recommendations which were approved by the
> Board:
> - Visual Editor
> - Mobile (mobile contributions + Wikipedia Zero)
> - Editor Engagement (also known as the E2 and E3 teams)
> - Funds Dissemination Committee and expanded grant-making capacity
> I'm proposing the following initial schedule:
> - Editor Engagement Experiments
> - Visual Editor
> - Mobile (Contribs + Zero)
> - Editor Engagement Features (Echo, Flow projects)
> - Funds Dissemination Committee
> We’ll try doing this on the same day or adjacent to the monthly
> metrics meetings, since the team(s) will give a presentation on
> their recent progress, which will help set some context that would
> otherwise need to be covered in the quarterly review itself. This will
> also create open opportunities for feedback and questions.
> My goal is to do this in a manner where even though the quarterly
> review meetings themselves are internal, the outcomes are captured as
> meeting minutes and shared publicly, which is why I'm starting this
> discussion on a public list as well. I've created a wiki page here
> which we can use to discuss the concept further:
> The internal review will, at minimum, include:
> Sue Gardner
> Howie Fung
> Team members and relevant director(s)
> Designated minute-taker
> So for example, for Visual Editor, the review team would be the Visual
> Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.
> I imagine the structure of the review roughly as follows, with a
> duration of about 2 1/2 hours divided into 25-30 minute blocks:
> - Brief team intro and recap of team's activities through the quarter,
> compared with goals
> - Drill into goals and targets: Did we achieve what we said we would?
> - Review of challenges, blockers and successes
> - Discussion of proposed changes (e.g. resourcing, targets) and other
> action items
> - Buffer time, debriefing
> Once again, the primary purpose of these reviews is to create improved
> structures for internal accountability, escalation points in cases
> where serious changes are necessary, and transparency to the world.
> In addition to these priority initiatives, my recommendation would be
> to conduct quarterly reviews for any activity that requires more than
> a set amount of resources (people/dollars). These additional reviews
> may however be conducted in a more lightweight manner and internally
> to the departments. We’re slowly getting into that habit in
> As we pilot this process, the format of the high priority reviews can
> help inform and support reviews across the organization.
> Feedback and questions are appreciated.
> All best,
>  https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
>  https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
Senior Operations Analyst (Movement Communications)
IRC (Freenode): HaeB
We (the Wikidata team) ran into an issue recently with the value that gets
passed as $baseRevId to Content::prepareSave(), see bug 67831. This value
comes from WikiPage::doEditContent() and, in core, is nearly always set to
false (e.g. by EditPage).
We interpreted this revision ID as the nominal base revision of the edit, and
implemented an edit conflict check based on it. This works with the way we
use doEditContent() for Wikibase on Wikidata, and with most code in core
(which generally has $baseRevId = false). But as it turns out, it does not
work with rollbacks: WikiPage::commitRollback() sets $baseRevId to the ID of
the revision we revert *to*.
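For reference, the call flow, paraphrased from 1.24-era core (a sketch from
memory, not verbatim source):

    // In core, $baseRevId defaults to false and is only handed through:
    $page->doEditContent( $content, $summary, $flags, $baseRevId, $user );
    // ... which internally calls ...
    $status = $content->prepareSave( $page, $flags, $baseRevId, $user );
    // ... and fires hooks that also receive $baseRevId. No core Content
    // implementation actually inspects the value.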
Now, is that correct, or is it a bug? What does "base revision" mean?
The documentation of WikiPage::doEditContent() is unclear about this (yes, I
wrote this method when introducing the Content class - but I copied the
interface from WikiPage::doEdit(), and mostly kept the code as it was). And
in the code, $baseRevId is not used at all except for passing it to hooks and
to Content::prepareSave(), which doesn't do anything with it for any of the
Content implementations in core. Only in Wikibase did we try to implement a
conflict check here - which should really be in WikiPage, I think.
So, what *does* $baseRevId mean? If you happen to know when and why $baseRevId
was introduced, please enlighten me. I can think of three possibilities:
1) It's the edit's reference revision, used to detect edit conflicts (this is
how we use this in Wikibase). That is, an edit is done with respect to a
specific revision, and that revision is passed back to WikiPage when saving, so
a check for edit conflicts can be done as close to the actual edit as possible
(ideally, in the same DB transaction). Compare bug 56849.
2) The edit's "physical parent": that would be the same as (1), unless there is
a conflict that was detected early and automatically resolved by rebasing the edit.
E.g. if an edit is performed based on revision 11, but revision 12 was added
since, and the edit was successfully rebased, the "parent" would be 12, not 11.
This is what WikiPage::doEditContent() calls $oldid, and what gets saved in
rev_parent_id. Since WikiPage::doEditContent() makes the distinction between
$oldid and $baseRevId, this is probably not what $baseRevId was intended to be.
3) It could be the "logical parent": this would be identical to (2), except for
a rollback: if I revert revisions 15 and 14 back to revision 13, the new
revision's logical parent would be rev 13's parent. The idea is that you are
restoring rev 13 as it was, with the same parent rev 13 had. Something like this
seems to be the intention of what commitRollback() currently does, but the way
it is now, the new revision would have rev 13 as its logical parent (which, for
a rollback, would have identical content).
So what commitRollback() currently does is none of the above, and I can't
see how it makes sense.
I suggest we fix it, define $baseRevId to mean what I explained under (1),
and implement a "late" conflict check right in the DB transaction that
updates the revision (or page) table; a rough sketch follows below. This
might confuse some extensions, though; we should double-check AbuseFilter,
if nothing else.
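To make that concrete, here is a minimal sketch of such a late check,
assuming it runs inside the same transaction that inserts the new revision
(hypothetical code, not what core currently does):

    // Hypothetical "late" edit-conflict check; not actual core code.
    $dbw->startAtomic( __METHOD__ );
    // Re-read the page's latest revision ID, locking the row.
    $latest = $dbw->selectField( 'page', 'page_latest',
        array( 'page_id' => $page->getId() ),
        __METHOD__,
        array( 'FOR UPDATE' )
    );
    if ( $baseRevId !== false && $latest != $baseRevId ) {
        // A newer revision was saved since the edit was started:
        // report an edit conflict instead of silently overwriting it.
        $dbw->endAtomic( __METHOD__ );
        return Status::newFatal( 'edit-conflict' );
    }
    // ... insert the new revision row and update page_latest here ...
    $dbw->endAtomic( __METHOD__ );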
Is that a good approach? Please let me know.
Is there some effective way to do this? We are using only MediaWiki APIs in
the latest Huggle, and somehow it happens that when users are logged out of
MediaWiki, it still works (edits are made using the IP instead).
How can I ensure that an API query will fail unless the user is logged in?
Is there some variable for that? Huggle executes a huge number of API
queries in multiple threads, so checking whether the user is logged in
before every single query would be too slow.
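If I remember correctly, the API has a parameter for exactly this: assert
(originally from the AssertEdit extension, in core since 1.23). Adding
assert=user to a request makes it fail instead of attributing the edit to
the IP; roughly like this (URL and response wording approximate):

    # Request; assert=user makes it require a logged-in session:
    https://en.wikipedia.org/w/api.php?action=edit&assert=user&...
    # Logged-out response carries the error code "assertuserfailed":
    {"error":{"code":"assertuserfailed","info":"Assertion that the user is logged in failed"}}

That way no separate "am I logged in?" check per query is needed.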
I like to sort items by priority, but whenever I change this, when I reopen
the page it's lost and I have to change it again.
Is there a way to save this in Bugzilla so that I always have bugs sorted by
priority instead of by component? I have cookies enabled, but it doesn't help.
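If I remember Bugzilla's buglist URLs correctly, the chosen sort order is
encoded in an order= query parameter, so one workaround is to bookmark the
list with the sort you want, e.g. (hypothetical query):

    https://bugzilla.wikimedia.org/buglist.cgi?product=MediaWiki&order=priority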
Hello and welcome to the latest edition of the WMF Engineering Roadmap
and Deployment update.
The full log of planned deployments next week can be found at:
A quick list of notable items...
== Tuesday ==
* MediaWiki deploy
** group1 to 1.24wmf10: All non-Wikipedia sites (Wiktionary, Wikisource,
Wikinews, Wikibooks, Wikiquote, Wikiversity, and a few other sites)
* The "In other projects" sidebar Beta Feature will be enabled.
== Wednesday ==
* The updated Android Wikipedia app will be released via Google Play
== Thursday ==
* MediaWiki deploy
** group2 to 1.24wmf10 (all Wikipedias)
** group0 to 1.24wmf11 (test/test2/testwikidata/mediawiki)
Thanks and as always, questions and comments welcome,
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
It's a pleasure to inform you that the feature for rating and reviewing
extensions has been deployed on WikiApiary.com. It would be great if you all
could submit some reviews of the extensions known to you, to enrich the site
and help in data collection. Please visit the individual pages of extensions
and submit your rating (for example, ParserFunctions).
The second step involves copying these ratings back to MW.o pages. For this
it is important to have some good data collected at WikiApiary.
Thanks and Regards,
a recent discussion in
revealed that parts of the SVG standard are deliberately broken on Commons. While I see some reasons not to adhere fully to the standard - e.g. external resources might break over time if they are moved or deleted - I don't feel it's good to break the standard as hard as it's done right now. It puts a burden on creators and on the principle of sharing within the Wikimedia environment; overall, it's even technically inferior and leads, or might lead, to useless duplication of content.
The SVG standard defines an image element. The image resource is linked using the xlink:href attribute. Optionally, the image is embedded into the SVG using the data URI scheme (https://en.wikipedia.org/wiki/Data_URI_scheme).
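For illustration, the two variants look roughly like this (file names and
sizes made up):

    <!-- external reference, as the standard allows: -->
    <image width="100" height="100"
           xlink:href="https://example.org/photo.jpg" />
    <!-- the same bitmap base64-embedded via a data URI: -->
    <image width="100" height="100"
           xlink:href="data:image/jpeg;base64,/9j/4AAQSkZJRg..." />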
Combining SVGs with traditional bitmap images is useful in several ways: It allows creators to share the way an image is manipulated, and it eases future modifications that are hard or even impossible with traditional bitmap/photo editing. It basically has the same advantages that mash-up web content has over static content: each layer or element can be modified individually without destroying the other elements. It's easy to see that a proper SVG offers more to its potential users than a classic JPG or PNG with only one layer that is the result of all image operations.
These reasons point out the necessity for barrier-free access to the image element.
Currently, Commons cripples this access as laid out in the standard and originally implemented by librsvg: it disables the handling of HTTP(S) resources. Users needing the same bitmap in more than one SVG are forced to base64-embed their source, and hence duplicate it, in each individual SVG. Indeed, there is quite some burden on creators and on Wikimedia servers, which duplicate lots of data right now and potentially even more in the future. Note that this duplication of data goes unnoticed by the routines specifically in place for bitmaps right now, which check uploads for MD5 collisions and reject an upload on duplicate detection. Space might be cheap as long as donations are flowing, but reverting bad practice once it is common is harder than promoting good practice /now/ by adhering to the standard as closely as possible.
Therefore I advocate changing librsvg in one of the two ways laid out in comment 3 of the bug report given above, and (re)supporting linking to external bitmaps in SVGs. Two strategies that come to mind to prevent the disappearance of an external resource on the web are:
1) cache external refs on thumbnail generation; check for updates on the external server on thumbnail re-generation
2) allow external refs to images residing on Wikimedia servers only
Point 2) should be considered the easiest implementation; 1) is harder to implement but gives even more freedom to SVG creators and would adhere more closely to the SVG standard. However, another argument for 2) is licensing: it ensures that only images that have been properly licensed by Commons users and the upload process are linked to (and if a license violation is detected and the linked-to bitmap is removed from Commons, the SVG using such a bitmap breaks gracefully).
Point 2) should be considered the easiest implementation, 1) is harder to implement but gives even more freedom to SVG creators and would adhere more closely to SVG standard. However, another argument for 2) would be the licensing issue: It ensures that only images are linked to that have been properly licensed by commons users and the upload process (and if a license violation is detected and the linked-to bitmap removed from commons, the SVG using such a bitmap breaks gracefully).