Right now I'm seeing that image downloads on Wikimedia Commons have
slowed down by a factor of ten or so since last weekend. I used to
download about 700 images an hour, so it's possible that my bandwidth
is being throttled, but it might be something affecting other people too.
Something I notice is that sometimes I can download a file very quickly:
[paul@haruhi ~]$ curl -O http://upload.wikimedia.org/wikipedia/commons/7/7b/Barabinsk_station1.jpg
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  807k  100  807k    0     0   797k      0  0:00:01  0:00:01 --:--:--  863k
----------------
Then sometimes the download takes orders of magnitude longer:
[paul@haruhi ~]$ curl -O http://upload.wikimedia.org/wikipedia/commons/7/7b/Barabinsk_station1.jpg
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  807k  100  807k    0     0  16431      0  0:00:50  0:00:50 --:--:--  14436
----------------
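To get a clearer picture of the variance, timing repeated fetches of
the same file would show the spread; a rough sketch in PHP (the loop
count and pause are arbitrary):

<?php
// Rough sketch: time repeated fetches of the same test file to see
// the spread between the fast and the slow transfers.
$url = 'http://upload.wikimedia.org/wikipedia/commons/7/7b/Barabinsk_station1.jpg';
for ( $i = 1; $i <= 10; $i++ ) {
	$start = microtime( true );
	$data = file_get_contents( $url );
	if ( $data === false ) {
		echo "run $i: fetch failed\n";
		continue;
	}
	$elapsed = microtime( true ) - $start;
	$kib = strlen( $data ) / 1024;
	printf( "run %2d: %7.1f KiB in %6.2f s (%7.1f KiB/s)\n",
		$i, $kib, $elapsed, $kib / $elapsed );
	sleep( 5 ); // pause so consecutive runs don't overlap
}

A handful of runs like this should make it obvious whether the
slowdowns are persistent or intermittent.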
I'm wondering if there's some specific problem here, if I'm being
throttled, or if there's just a general need for more bandwidth &
servers. In any case I'd be pretty happy to make a donation that would
cover any operational costs that my activities incur and then some.
My fault. Will apply as soon as I get in front of a
computer if no one beats me to it.
On Sep 22, 2010 9:43 AM, "Asia Jedrzejewska-Szmek" <asia(a)fuw.edu.pl> wrote:
The script to dump pages fails with the following error:
zbyszek@escher:/srv/www/eduwiki$ sudo -u www-data php
maintenance/dumpBackup.php --full > /tmp/dump-full2
PHP Fatal error: Call to undefined method DumpFilter::DumpFilter() in
/home/srv/mediawiki/mediawiki-trunk/maintenance/backup.inc on line 306
This behaviour is the same in 1.16-wmf4 and trunk@72382.
This is easily corrected with the following patch:
diff --git a/maintenance/backup.inc b/maintenance/backup.inc
index 9b2ff89..88da915 100644
--- a/maintenance/backup.inc
+++ b/maintenance/backup.inc
@@ -303,7 +303,7 @@ class BackupDumper {
 class ExportProgressFilter extends DumpFilter {
 	function ExportProgressFilter( &$sink, &$progress ) {
-		parent::DumpFilter( $sink );
+		parent::__construct( $sink );
 		$this->progress = $progress;
 	}
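For context: DumpFilter now defines a PHP 5 __construct() rather than
a PHP 4-style DumpFilter() method, so the old-style parent call no
longer resolves to anything. A minimal standalone reproduction (the
class bodies are just a sketch):

<?php
// Parent class has only a PHP 5 constructor.
class DumpFilter {
	function __construct( $sink ) {
		$this->sink = $sink;
	}
}

class ExportProgressFilter extends DumpFilter {
	// PHP 4-style constructor; PHP 5 still runs it on instantiation.
	function ExportProgressFilter( $sink ) {
		parent::DumpFilter( $sink );    // Fatal: Call to undefined method
		// parent::__construct( $sink ); // the fix
	}
}

new ExportProgressFilter( 'stdout' );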
HTH,
Asia Jedrzejewska-Szmek
2010/9/21 David Gerard <dgerard(a)gmail.com>:
> Is this the problem that broke InstantCommons for pretty much everyone
> a few days ago?
Unless there's another obvious source of this information, I think it
would be useful for folks using the InstantCommons feature to be
included in a public list somewhere -- I just started such a list
here:
http://www.mediawiki.org/wiki/Sites_using_InstantCommons
I think this would help create more visibility and awareness of
this functionality, and greater sensitivity to accidental breakage.
--
Erik Möller
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
In the last few days I've sporadically noticed that API calls
time out on Wikimedia Commons, and also that file downloads
from Wikimedia Commons are sometimes really slow. Are there any
performance problems going on?
As I mentioned in a prior email, I have been converting
UsabilityInitiative/* extensions to work with ResourceLoader, and in the
process removing their dependency on UsabilityInitiative.php. Some of
these extensions were not really using any functionality in
UsabilityInitiative.php, so their "conversion" was more like cleanup and
removing a line that included it unnecessarily. The remaining extensions
(WikiEditor, Vector, ClickTracking and PrefStats) are now only
compatible with MediaWiki 1.17 because they depend on ResourceLoader
functionality.
I'm hoping these extensions, especially WikiEditor and Vector, can
serve as examples of how to make use of ResourceLoader in extensions. Please
note: if you are working on code that is aimed at deployment this year,
you should not depend on ResourceLoader. We hope to have it running live
in November, but that's not a guarantee.
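For anyone converting their own extension, the registration ends up
looking roughly like this (a minimal sketch; the module name and file
paths are made up):

<?php
// In the extension's setup file: register a ResourceLoader module.
// 'ext.myExtension' and the paths below are hypothetical.
$wgResourceModules['ext.myExtension'] = array(
	'scripts' => 'ext.myExtension.js',
	'styles' => 'ext.myExtension.css',
	'dependencies' => array( 'mediawiki.util' ),
	'localBasePath' => dirname( __FILE__ ) . '/modules',
	'remoteExtPath' => 'MyExtension/modules',
);

// Later, wherever the extension builds its output ($out is an
// OutputPage), queue the module for delivery:
$out->addModules( 'ext.myExtension' );

ResourceLoader then takes care of bundling, minification and caching,
which replaces the manual script and style inclusion these extensions
used to get through UsabilityInitiative.php.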
Please see the README for UsabilityInitiative
(http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/UsabilityInitiat…)
which you will notice ends by calling out that these extensions should
be moved. I've already moved Vector and WikiEditor, and the remaining
ones will likely be moved soon if there are no objections.
This should put the last nail in the coffin for UsabilityInitiative
(the extension, that is; the grant has been over for ages!).
- Trevor
Hi all,
I've sent this to wikitech-l before, but I see now in the online
archive that it didn't create a new thread but was instead recognized
as a reply to an old thread. Not sure what happened, so here it is again:
It's been roughly three years since I first saw this topic filed on
Bugzilla[1], and before that it was often raised on IRC and on-wiki
in discussions about how clumsy and impractical it is to
systematically patrol uploads. Back then, from my point of view, this
was about local uploads.
Nowadays I'm much more active on and for Wikimedia Commons, and not
so much with local uploads.
Obviously, with more and more wikis moving towards Commons and the
growth of the wikis themselves, it's about time we had at least some
kind of method to indicate that a file has been 'checked'. Or, to be
more specific, to know what hasn't been checked.
On Commons there are several review systems for common external
resources from which material is imported (such as Picasa and
Flickr), and those work very well. Bots crawl recent uploads, and
whenever a reference to Flickr is found the file is tagged as needing
review. The easy ones are even reviewed by bots (something unique to
Picasa and Flickr, since they are machine-readable and license info
can be verified automatically), and everything else (false matches
and errors) is reviewed manually.
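For those unfamiliar with the machine-readable part: the check boils
down to asking the external API for the declared license and comparing
it against a whitelist. A rough sketch of what such a bot check might
look like (the API key, photo ID and whitelist are placeholders;
flickr.photos.getInfo is Flickr's actual metadata method):

<?php
// Rough sketch: verify a Flickr photo's declared license the way a
// review bot might.
$apiKey  = 'YOUR_API_KEY';
$photoId = '1234567890';

// Flickr license IDs considered acceptable; this whitelist is
// illustrative (4 = CC-BY, 5 = CC-BY-SA, 7 = no known restrictions).
$allowed = array( '4', '5', '7' );

$url = 'http://api.flickr.com/services/rest/'
	. '?method=flickr.photos.getInfo'
	. '&api_key=' . $apiKey
	. '&photo_id=' . $photoId;

$xml = simplexml_load_file( $url );
$license = (string)$xml->photo['license'];

echo in_array( $license, $allowed, true )
	? "License $license: OK, can be auto-reviewed\n"
	: "License $license: needs manual review\n";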
However, this covers only a tiny fraction of all the files on Commons.
Last March I raised the topic of edit patrol on Commons [2], and
that has been a great success. We've got a team together, and every
single anonymous edit made after April 1st, 2010 has been or will soon
be patrolled [3]. Not once has an edit gone past the 30-day expiration
time of the recentchanges table.
The same has been kept up for new-page patrol as well, for several years.
Commons being primarily a media site, it's a bit awkward to say
that we are totally unable to patrol uploads effectively.
We can't filter out uploads by bots or trusted users. We can't filter
out what's been patrolled by patrollers. It's just an incredible mess
that sits there.
Several attempts have been made in the past to work around the
software, but no matter how you try, a patrol flag would make things a
whole lot easier.
Once it's possible to click a link and *poof* toggle that
unpatrolled boolean, I'm sure it won't take long before nice
AJAX tools appear to make this easier en masse, and a checklist / team
will be formed to get the job done.
Alrighty, enough ranting. What needs to be done for an implementation?
When I asked about this on IRC, somebody said that, although a bit of
a workaround, we can do this already by means of new-page patrol in
the File namespace.
Unless it's well hidden, this is false: uploads don't create a
patrollable entry for the upload log action, nor for the description
page creation. As a matter of fact, the creation of those description
pages isn't registered in the recentchanges table at all
(Special:NewPages / Special:RecentChanges).
Depending on how uploads become patrollable, the above could
actually be a good thing, since having to patrol both would be
inefficient, and users don't necessarily associate uploading a file
with creating a page anyway. Plus, it would mean duplicate entries
in Special:RecentChanges (upload action / page creation).
Log actions are already present in the recentchanges table, so I'm
guessing it wouldn't take much of a change to make uploads
patrollable.
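For what it's worth, the storage side looks simple: edit patrol
already keeps an rc_patrolled field on each recentchanges row, so a
hypothetical core change might boil down to flipping that flag on the
upload's log row. The function below is purely illustrative:

<?php
// Hypothetical sketch: mark the recentchanges row belonging to an
// upload log entry as patrolled, reusing the rc_patrolled field that
// edit patrol already uses. $rcId would come from a
// [mark as patrolled] link, just as it does for edits.
function markUploadPatrolled( $rcId ) {
	$dbw = wfGetDB( DB_MASTER );
	$dbw->update(
		'recentchanges',
		array( 'rc_patrolled' => 1 ),
		array( 'rc_id' => intval( $rcId ) ),
		__METHOD__
	);
}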
One interesting thing about uploading (the same is true of moving
and (un)protecting a page) is that it is also listed in the page
history (instead of just in the logs), which means it is already very
accessible to users and doesn't require a new system for deciding
where the [mark as patrolled] links should appear: for re-uploads, on
the "diff" page (as with edits), and for new uploads, on the first
revision. (The latter may be subject to this bug:
https://bugzilla.wikimedia.org/show_bug.cgi?id=15936, which I hope
will be solved, though it's not a show-stopper; as long as there is
any way at all to get there, even if it requires going through
Special:RecentChanges, it would be an incredible improvement over the
current situation.)
Greetings,
Krinkle
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=9501
[2] http://commons.wikimedia.org/wiki/Commons:Village_pump/Archive/2010Mar#Mark…
[3] http://commons.wikimedia.org/wiki/Commons:Counter_Vandalism_Unit#Anonymous_…
Please join me in congratulating Mark Bergsma on his promotion last week
to Operations EPM. Mark has been a volunteer since 2004, and a paid
Network Engineer on our team since August 2006. He's been helping us
with our extreme scaling issues (by debugging and tuning our Squid
setup, creating our Netherlands caching center, and generally developing
our network strategy) since the very beginning. For some time now Mark
has been unofficially in charge of managing the entire Ops Team's
deliverables including designing and implementing our new Primary Data
Center in Ashburn, VA, and the other Ops activities mentioned at
http://www.mediawiki.org/wiki/WMF_Projects. Mark has expressed an
interest in gaining some experience with people management skills as a
logical next step in his career, and to that end we will gradually add
direct reports under Mark over the next year, starting with the Data
Center Ops crew. He will continue to report to me until we hire a
Director of Technical Operations.
I know you will do all you can to support Mark in his new role.
Danese Cooper
CTO, Wikimedia Foundation
Sorry about this, but I think it is a reasonable request for help.
Google seems to be delivering Wikipedia URLs for pages with a 302
interwiki link, e.g.
http://www.google.co.uk/search?q=schools+wikipedia&ie=utf-8&oe=utf-8&aq=t&c…
returns en.wikipedia.org/wiki/SchoolsWP%3Aindex%3Ahome as the URL, with
http://schools-wikipedia.org/ as the content.
As far as I can see, interwiki redirects are no longer used by the
Wikimedia software, which just delivers straight links to the target
from the Wikipedia pages. So nothing links through the 302, but Google
returns it anyway.
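The behaviour is easy to confirm; a quick check along these lines
shows the status code being served (a rough sketch using PHP's
get_headers(); the URL is the one Google returns):

<?php
// Rough sketch: inspect the redirect status the wiki serves for the
// interwiki page. A 302 tells Google the target is temporary, so it
// keeps indexing the wikipedia.org URL; a 301 would pass the listing
// on to the target site instead.
$url = 'http://en.wikipedia.org/wiki/SchoolsWP%3Aindex%3Ahome';
$headers = get_headers( $url );
echo $headers[0] . "\n"; // expected: something like "HTTP/1.1 302 Found"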
Could we therefore disable the temporary redirects to fix this
problem, or not serve them to Google? Or, if there is some reason why
the redirects have to stay, show Google permanent redirects instead?
(It is not as if we will want the redirects to point anywhere else.)
Thanks for any help
Andrew Cates