Hello,
I have set up wgCodeReviewDeferredPaths on mediawiki.org, which lets us
automatically defer revisions based on their path.
The initial list is currently:
+$wgCodeReviewDeferredPaths['mediawiki'] = array(
+ '%^/trunk/extensions/SemanticForms%',
+ '%^/trunk/WikiWord%',
+);
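Each entry is a PCRE pattern (here with % as the delimiter) matched
against a revision's path, and the array is keyed by repository name.
To defer another path, one would just append to the list; a
hypothetical example, with a made-up extension path:

  // Hypothetical addition: defer everything under this path too.
  $wgCodeReviewDeferredPaths['mediawiki'][] = '%^/trunk/extensions/SomeExtension%';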
The bug report was:
https://bugzilla.wikimedia.org/23494
--
Ashar Voultoiz
Hello,
We have a couple of bugs requiring shell access. They are mostly about
tweaking namespaces, enabling an extension, or configuring MediaWiki.
It seems those bugs are mostly handled by RobH, Jeluf, and myself. I am
wondering if we could create a working group in Bugzilla and assign
the shell bugs directly to this group? That would let us track changes
more easily.
--
Ashar Voultoiz
Hi everyone,
As you saw, we pulled the Bugmeister listing from the list of open
positions (though it's still available here:
http://wikimediafoundation.org/wiki/Job_openings/Bugmeister ). We
talked to some excellent candidates, but we found it was a trickier
spot to fill than we first anticipated, and that we need to rethink
our hiring order a little bit.
However, as was pointed out on IRC, we've had this hire in our plan
for quite some time, and the need is pretty great. Rather than just
continue to let this work go undone, we've asked Mark Hershberger (a
man of many hats) to step in and fill this role while we sort out our
longer term plan. After he wraps up his work on the 1.17 release,
he's going to dive into the bug database and start sorting.
It's not going to be his exclusive duty. For example, I'm sure he'll
also be tagging some things in Code Review, and he's planning to help
me out with some of the code review tracking. I'm sure other things
here and there will come up. However, Bugmeister will be his primary
role for a while.
More about Mark on his user page:
http://en.wikipedia.org/wiki/User:MarkAHershberger
Thank you Mark for taking on this role!
Rob
It started under the subject "What % of WMF is en:wp?".
2011/1/13 David Gerard <dgerard(a)gmail.com>:
> - I'd thought en:wp was still about 30% of everything - ~1/3 the
> edits, ~1/3 the articles, ~1/3 the page hits, etc.
I'm sorry about the shameless plug, but I just had to mention that the
number of edits in all the Wikipedias will change quite significantly
when bug 15607 is closed (
https://bugzilla.wikimedia.org/show_bug.cgi?id=15607 ). Currently the
bulk of the edits in the minor-language Wikipedias is made by interwiki
bots, and I've got a hunch that en.wp is the de facto hub for adding
interlanguage links. This is the workflow, more or less:
1. A human editor creates the article [[Ira Cohen]] in, say, Slovenian
(sl), after it already exists in 20 other Wikipedias.
2. A human editor adds [[sl:Ira Cohen]] to the article [[Ira Cohen]] in en.wp.
3. A human editor waits for the interwiki bots to pick it up and
propagate it to the 20 other Wikipedias in which the article already exists.
That makes it:
* 1 human edit in sl.wp.
* 1 human edit in en.wp.
* 20 bot edits in other Wikipedias.
After the Interlanguage extension is enabled, it will be:
* 1 human edit in sl.wp.
* 1 human edit in en.wp.
* 0 bot edits (some behind-the-scenes magic pushes the changes to 20
wikis, but it's not seen in Recent Changes.)
This is a major reason to have the Interlanguage extension finally
enabled. Besides a MAJOR cleanup of Recent Changes in all the
Wikipedias, it will give a somewhat clearer picture of the activity in
the smaller ones.
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
"We're living in pieces,
I want to live in peace." - T. Moore
Browsing the HTML source of pages, I found this statement in an HTML
comment:
Expensive parser function count: 0/500
I'd like to use this figure to evaluate the "lightness" of a page,
mainly to test how expensive its templates are. In your opinion, given
that 0/500 would be the ideal value, what are reasonable limits for a
good page, a moderately complex one, and a complex one, just as a
starting point to work from? What is a really alarming value that
needs fast fixing?
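For reference, the 500 in the denominator is configurable: it comes
from $wgExpensiveParserFunctionLimit in LocalSettings.php (I believe
MediaWiki's stock default is lower, so a wiki showing 0/500 has
presumably raised it):

  // Cap the number of "expensive" parser function calls (e.g.
  // {{#ifexist:}}) allowed per page render; 500 matches the
  // denominator quoted above.
  $wgExpensiveParserFunctionLimit = 500;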
And wouldn't it be a good idea to display this figure on the page
itself, perhaps as a very small mark or string in a corner, to allow
fast feedback?
Alex
http://www.readwriteweb.com/archives/what_will_wikipedia_look_like_in_anoth…
"The most important thing," Wales told us on a press call today, "is
the increased diversity in languages." According to Wales, around 30%
of Wikipedia articles had been in English and already that number has
dropped to 20%. "We're going to see very very large projects in
languages where we've never seen such things before," he explained.
- I'd thought en:wp was still about 30% of everything - ~1/3 the
edits, ~1/3 the articles, ~1/3 the page hits, etc.
What are the various numbers? Is there anywhere to look them up, or
something to look them up from which they can be derived?
- d.
Earlier today, /a filled up with binlogs on db27, which was the s3 &
s7 master. Nagios had warned too early and nobody noticed. Slaves
lagged, locks piled up, and the wikis ground to a halt.
Revisions between 6:50 and 8:20 pm UTC were lost (although they can be
manually reimported from db27).
The new s3 and s7 master is db17, with only one slave: db25.
After the master switch, we started having problems with cached
revision text in memcached, caused by the duplication of old_id
values, so we made the wikis read-only until UTC midnight.
We decided not to disable $wgRevisionCacheExpiry but to remove the
faulty entries, thus I quickly prepared the script
maintenance/purgeStaleMemcachedText.php to clean them.
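For the curious, the core of it is just a loop over the affected
old_id range, deleting the corresponding keys. A minimal sketch,
assuming the revision text cache key format used by
Revision::loadText() at the time, with hypothetical $firstBadId /
$lastBadId bounds:

  // Drop cached revision text for the duplicated old_id range so
  // the next read falls through to the database. $wgMemc is the
  // global memcached client.
  for ( $oldId = $firstBadId; $oldId <= $lastBadId; $oldId++ ) {
      $wgMemc->delete( wfMemcKey( 'revisiontext', 'textid', $oldId ) );
  }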
There were problems on hewiki, where the data didn't get cleaned: on
one instance, $wgMemc->get kept returning a value even after a
$wgMemc->delete on that same key (???).
Other than the hewiki issues, it seemed to run fine. There will be
lots of wrong entries in the diff and parser caches needing a manual
action=purge, but a purge will clean them.
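(For anyone unfamiliar: you trigger that by appending action=purge to
a page's URL, e.g.
http://he.wikipedia.org/w/index.php?title=Some_Page&action=purge
with Some_Page as a placeholder, which discards the cached output for
that page and re-renders it.)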
FlaggedRevs caches were not touched. Wikis using it may show the wrong
content (with the additional fun of some users seeing the right one).
There are also PPFrame_DOM->expand errors that started around the same
time, even on wikis on a different cluster. They usually happen only
once, and simply reloading succeeds.
https://bugzilla.wikimedia.org/show_bug.cgi?id=26429
http://blog.chromium.org/2011/01/html-video-codec-support-in-chrome.html
Chromium will support only Theora and VP8, Chrome to follow. Same as Firefox 4.
(How close are we to allowing VP8 on WMF? I understand Greg Maxwell,
as one of those actually working on making the VP8 code drop usable,
had some qualms about the code at the time.)
- d.
For folks who have not been following the saga on
http://wikitech.wikimedia.org/view/Dataset1
we were able to get the raid array back in service last night on the XML
data dumps server, and we are now busily copying data off of it to
another host. There's about 11T of dumps to copy over; once that's done
we will start serving these dumps read-only to the public again.
Because the state of the server hardware is still uncertain, we don't
want to do anything that might put the data at risk until that copy has
been made.
The replacement server is on order and we are watching that closely.
We have also been working on deploying a server to run one round of
dumps in the interim.
Thanks for your patience (which is a way of saying, I know you are all
out of patience, as am I, but hang on just a little longer).
Ariel