So I have my little mediawiki 1.4rc1 running OK on my PB G4
(Panther at the moment, Tiger coming up), and am looking at the
bugzilla list, but it's somewhat unclear how to choose what to look
at - priorities and severities seem kind of random.
I have some bigger projects in mind, but this is my first foray
into web apps, so I'd like to try some simpler hackery to begin
with. Any suggestions?
I'm sure this is not an original thought, and it has probably been
discussed on IRC recently.
I'm curious about the outcome, though:
Would it be possible to make dumps every one or two days, so that we are
less dependent on good fortune next time?
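For what it's worth, the mechanical half of this is small; the real
cost is the load a dump puts on the database servers. A minimal sketch
in Python, assuming a MySQL backend, a local mysqldump binary, and
credentials supplied via ~/.my.cnf (the database name and output
directory are placeholders):

    import gzip
    import subprocess
    import time

    def dump_wiki(db="wikidb", outdir="/var/backups"):
        # Pipe mysqldump straight into a dated, compressed file.
        stamp = time.strftime("%Y%m%d")
        outfile = "%s/%s-%s.sql.gz" % (outdir, db, stamp)
        proc = subprocess.Popen(
            ["mysqldump", "--single-transaction", db],
            stdout=subprocess.PIPE)
        with gzip.open(outfile, "wb") as out:
            # Read the dump in 1 MB chunks so we never hold the
            # whole database in memory.
            for chunk in iter(lambda: proc.stdout.read(1 << 20), b""):
                out.write(chunk)
        proc.wait()
        return outfile

Run daily from cron, something like that would cap our losses at about
a day of edits; whether the servers can afford a dump that often is the
real question.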
> some simpler hackery to begin with. Any suggestions?
Welcome! :) Do join the #mediawiki channel on irc.freenode.org - that's
where the rants and raves are aired, so you can keep up with the latest
development and see bugzilla tickets as they come in. :)
The most productive discussions happen live ;-)
I've noticed increasing levels of vandalism via anonymizing proxies. We
turned off the automatic proxy-scanning some time ago because of
complaints from the clue-deficient, who saw the scans as attacks.
However, it might be a good idea to do the following:
* whenever an admin _blocks_ a user, the IP they were editing from
should be automatically proxy-scanned, and blocked indefinitely if it is
an open proxy (_in addition to_ the username/IP block that would have
been applied anyway) -- a sketch of the scan step is at the end of this
message.
Restricting proxy scans to proven vandals will cut the scan rate to a
few dozen a day (from tens of thousands before), and the resulting
trickle of complaints will be trivial enough to auto-reply to or
ignore. It will also allow the reply to be very clear: "we detected
abuse from your user, verified that it was coming from an unsecured
proxy on your network, and took appropriate action".
99% of the code for this already exists, so it should be trivial to put
in place -- however, I'm aware that our heroic developers are somewhat
busy at the moment.
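For the curious, the scan step itself amounts to very little. A rough
Python sketch of the kind of check involved (the port list and
example.org are illustrative only; a real scanner fetches a page
carrying a known token from a server we control, so that an ordinary
web server answering for itself doesn't count as a hit):

    import socket

    def looks_like_open_proxy(ip, ports=(1080, 3128, 8080), timeout=5):
        # Ask the suspect host, on each common proxy port, to fetch a
        # page from an unrelated site.  An HTTP reply means it relayed
        # the request for us.
        request = (b"GET http://example.org/ HTTP/1.0\r\n"
                   b"Host: example.org\r\n\r\n")
        for port in ports:
            try:
                with socket.create_connection((ip, port),
                                              timeout=timeout) as s:
                    s.sendall(request)
                    reply = s.recv(64)
            except OSError:
                continue        # closed or filtered; try the next port
            if reply.startswith(b"HTTP/"):
                return True     # the host relayed our request
        return False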
Well, here again NSK wrote:
> Wikipedia/WikiCommons/Wikibooks are all dangerously slow. I recently installed
> the Zend Optimiser 2.5.7 on my site www.wikinerds.org . Zend Optimiser speeds
> up php by 40%. I think you should install it too if you haven't done so already.
Have you tried Turck? It's mentioned on every page you'll find about
MediaWiki performance tuning: http://turck-mmcache.sourceforge.net/.
And yes, the Wikimedia sites do run a bytecode cache. You might try the
other caching facilities (memcached, Squid, ...) as well; then you
could announce that wikinerds.org is running xxx% faster ;-)
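To make the memcached suggestion concrete, here's a minimal cache-aside
sketch in Python using the python-memcached client; render_page is a
hypothetical stand-in for whatever expensive work sits behind the
cache:

    import memcache                      # the python-memcached client

    mc = memcache.Client(["127.0.0.1:11211"])

    def render_page(title):
        # Stand-in for the expensive work being cached: parsing,
        # database queries, and so on.
        return "<html><body>%s</body></html>" % title

    def get_rendered_page(title):
        # Cache-aside: serve from memcached on a hit, rebuild and
        # store for five minutes on a miss.
        key = "page:" + title.replace(" ", "_")
        html = mc.get(key)
        if html is None:
            html = render_page(title)
            mc.set(key, html, time=300)
        return html

Squid does the same get-or-rebuild trick one layer further out, for
whole HTTP responses.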
Thanks to Brion and all who helped out with the outage. Bad things
happen despite the best of preparations, and even in hindsight I doubt
if anything could have been done aside from the costly and impractical
alternative of having a fully redundant site off in a bunker somewhere.
Not all failures can be foreseen.
In the interest of harm reduction for the future, I would like to
recommend the excellent services of the folks at easydns.com, who
provided reliable secondary services and mail queuing for me for some
years while I was handling IT stuff for an employer. Their prices are
low, they have competent technical support, and they have a network
of secondaries on at least two continents. I would suggest that we
include them in our plans even if other measures are taken.
The Uninvited Co., Inc.
(a Delaware corporation)
On http://en.wikipedia.org/wiki/User_talk:David_Gerard/1.0 , en:User:Payo1
raises this interesting question:
Would it be possible to find some schools in developing countries that
do have internet access and track what articles their students access
for a couple of months? We might even be able to identify gaps in
content by looking at their failed searches.--Payo1 10:21, 23 Feb 2005
This is regarding the selection of articles for a paper Wikipedia. Is
such a query even feasible to frame and run? It might be of use!
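The mechanical half, at least, looks easy if the schools' proxies (or
our servers) keep ordinary access logs. A rough sketch in Python -- the
URL patterns assume a default MediaWiki setup and are illustrative
only:

    import re
    from collections import Counter

    # Assumed URL shapes: the default article path (/wiki/Title) and
    # the search form (/w/index.php?search=term).
    ARTICLE = re.compile(r'"GET /wiki/([^ ?"]+)')
    SEARCH = re.compile(r'"GET /w/index\.php\?[^" ]*search=([^&" ]+)')

    def tally(logfile):
        # Count article views and raw search terms from an
        # Apache-style access log.
        articles, searches = Counter(), Counter()
        with open(logfile) as fh:
            for line in fh:
                m = ARTICLE.search(line)
                if m:
                    articles[m.group(1)] += 1
                    continue
                m = SEARCH.search(line)
                if m:
                    searches[m.group(1)] += 1
        return articles, searches

Search terms that never match an existing title would then approximate
the "failed searches"; the hard parts are getting the logs at all and
the privacy questions around doing so.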