Hello,
I have created a new mailing list, issues-l
(http://mail.wikipedia.org/mailman/listinfo/issues-l). Notifications of
new site issues will be sent to this list. Developers might want to
subscribe to it.
Kate.
According to the hardware orders page[1], a major bottleneck is page rendering.
Does any phase of this stand out? DB fetch? Text parse?
Request/Response bandwidth?
I assume this has already been profiled quite a bit...
[1]
http://meta.wikimedia.org/wiki/Hardware_ordered_August_30%2C_2005
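For concreteness, this is roughly how per-phase timing is usually gathered with MediaWiki's built-in profiler; wfProfileIn()/wfProfileOut() exist in the source, but the wrapper function and section name below are my own illustration, not a claim about how the live site is instrumented:

# Sketch only: wrap a suspect phase (DB fetch, text parse, ...) in the
# built-in profiler calls and enable profiling in LocalSettings to get
# per-section timings. The function and section name here are made up.
function exampleMeasurePhase( $callback ) {
	wfProfileIn( 'example-suspect-phase' );
	$result = call_user_func( $callback );   // the phase being measured
	wfProfileOut( 'example-suspect-phase' );
	return $result;
}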
I am having some trouble running the sample Special Page despite
following the steps described, i.e. having included the file in
LocalSettings and making appropriate changes to names, etc. Any
pointers on how to deal with this issue will be greatly appreciated.
The sample Special Page I'm referring to is at:
http://meta.wikimedia.org/w/index.php?title=Writing_a_new_special_page
After including the file in LocalSettings, I get the following
error from PHP:
Fatal error: Call to undefined function: includable() in
/var/www/mediawiki-1.4.9/extensions/ExampleSpecialPage.php on line 27
None of the SpecialPage files inside the includes/ directory define an
includable() function, so I just went ahead and commented out that call,
and then I receive the following error:
Fatal error: Cannot redeclare class specialpage in
mediawiki-1.4.9/includes/SpecialPage.php on line 101
which refers to the SpecialPage class declaration.
The problematic code section is:
class ExampleSpecialPage extends SpecialPage {
	function ExampleSpecialPage() {
		SpecialPage::SpecialPage( 'ExampleSpecialPage' );
		$this->includable( true );
	}
	...
}
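My guess, for what it's worth: the example on Meta may target MediaWiki 1.5, where SpecialPage::includable() exists, rather than 1.4.9. Below is a sketch of the kind of guard I could try; the method_exists() check is my own idea, not part of the example, and it doesn't explain the redeclare-class error (which usually points to SpecialPage.php being pulled in twice, e.g. with require instead of require_once):

class ExampleSpecialPage extends SpecialPage {
	function ExampleSpecialPage() {
		SpecialPage::SpecialPage( 'ExampleSpecialPage' );
		// includable() only exists in newer MediaWiki; skip it when absent.
		if ( method_exists( $this, 'includable' ) ) {
			$this->includable( true );
		}
	}
}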
Hi,
I am trying to enhance the MediaWiki syntax to include a simple, easy-to-use
citation format. This would add an automatically generated reference page
alongside the existing talk, discussion and history pages. In addition,
each reference on the reference page would have its own page and would function
the same as a wiki article, with its own talk page. The wiki could then be
used effectively by students writing papers.
I have just started going through the MediaWiki code and I wanted to know what
would be a good starting point for this kind of project. I am getting
very confused by the actual flow of the code. I will be creating a separate
database for the references, so I need to understand the flow.
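A rough sketch of one possible entry point, assuming a parser tag-hook extension registered from LocalSettings; the <cite> tag and the function names below are hypothetical, not existing MediaWiki code:

# Sketch only: a <cite> tag handled by a parser hook. wfCitationSetup and
# renderCitation are made-up names; a real implementation would also
# record each reference for the generated reference page.
$wgExtensionFunctions[] = 'wfCitationSetup';

function wfCitationSetup() {
	global $wgParser;
	$wgParser->setHook( 'cite', 'renderCitation' );
}

function renderCitation( $text, $params ) {
	# For now just return the reference text; later this could link to
	# the reference's own wiki page.
	return htmlspecialchars( $text );
}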
--
Thanks,
Amruta
Greetings everyone. My name is Kyle and I would like to introduce myself
to the group. I am the new hardware manager for the Tampa servers. My
handle is Solar or SolarKennedy; catch me on IRC, or email me at
kyle(a)xkyle.com if you need something done.
SolarKennedy
Hello,
Jump into #wikimedia-tech on Freenode and say "hi" :)
Then we'll show a list of things to do... ;-D
(IRC is somewhat the preferred medium for site operational issues.)
Domas
> I've often wondered this, so this is a great opportunity to jump in.
> Why not cache prerendered versions of all pages? It would
> seem that the majority of hits are reads. One approach I've
> seen elsewhere is to cache a page the first time it's loaded,
> and then have writes invalidate the cache. (That way you're
> not caching pages nobody looks at.)
We have multiple caches. First of all, all pages are cached
by the Squids, which achieve >75% hit rates for anonymous users.
Cached objects are invalidated by an HTCP CLR message sent
via multicast to our global Squid deployment.
Squid caching also gives us an easy way to
bring lots of content closer to users, reducing page load
times dramatically for anons (and a bit for logged-in users).
If we get the chance to deploy caches in Australia and China,
that'd be awesome. Right now we're searching for Chinese and
Australian locations, though; even three-server or single-server
deployments would cover those huge countries. :)
We cannot cache whole pages for logged-in users, as the output can
differ per user, though at some point we might achieve that.
There is also the parser cache, which caches parsed documents for
logged-in users as well; we're trying to increase its efficiency too.
> One tricky part is that writes to page B can affect page A
> if page A has a link to B. A reverse index of links would
> solve this, though I don't know how big it'd be.
We have a reverse index of links, and we use it to invalidate
both parser cache and Squid cache objects.
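Roughly, the idea is: when page B is saved, consult the reverse link table to find the pages linking to B and purge their cached copies. The sketch below is illustrative only; it assumes the 1.5-era pagelinks schema, and purgeCachedPage() is a made-up stand-in for the real purge path (HTCP CLR plus a parser-cache touch).

# Illustration, not actual MediaWiki code: invalidate cached pages that
# link to the page just edited, using the reverse link index.
function invalidateLinkingPages( $dbr, $ns, $dbkey ) {
	$res = $dbr->select( 'pagelinks', 'pl_from',
		array( 'pl_namespace' => $ns, 'pl_title' => $dbkey ) );
	while ( $row = $dbr->fetchObject( $res ) ) {
		$title = Title::newFromID( $row->pl_from );
		if ( $title ) {
			purgeCachedPage( $title );  // hypothetical helper
		}
	}
}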
Domas
Hi,
I know Kate's tool has a function counting "Distinct pages edited".
I was wondering if anyone has the time and willpower to improve it a bit.
What I need is a tool comparing the "Distinct pages edited" by user A and user B (if I could enter a date range, even better). This would be very useful for finding stalkers and other hazardous users on Wikipedia. ;)
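A rough sketch of the comparison I mean, assuming direct access to a 1.5-style revision table; distinctPagesEdited() below is a made-up helper, not part of Kate's tool, and the timestamps are the usual 14-digit MediaWiki format:

# Hypothetical sketch: distinct pages edited by a user within a date
# range, then the overlap between two users.
function distinctPagesEdited( $dbr, $userName, $start, $end ) {
	$res = $dbr->select( 'revision', 'DISTINCT rev_page',
		array( 'rev_user_text' => $userName,
			'rev_timestamp BETWEEN ' . $dbr->addQuotes( $start ) .
			' AND ' . $dbr->addQuotes( $end ) ) );
	$pages = array();
	while ( $row = $dbr->fetchObject( $res ) ) {
		$pages[] = $row->rev_page;
	}
	return $pages;
}

# Pages edited by both users in September 2005:
$common = array_intersect(
	distinctPagesEdited( $dbr, 'UserA', '20050901000000', '20050930235959' ),
	distinctPagesEdited( $dbr, 'UserB', '20050901000000', '20050930235959' ) );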
Thanks, that's all I've got for now.