Well, it looks like they don't work if there's more than one anonymous
user on the wiki. :)
The user_newtalk table has a unique key on user_id, and all entries for
anonymous users are keyed with user_id = 0. Once one gets in, no more
anonymous users can be added.
I've changed it to a regular (non-unique) key; it seems to work now.
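For reference, the schema change would have been along these lines (a sketch; the actual index name and table layout may differ):

```sql
-- Assumed user_newtalk layout. Drop the unique key on user_id and
-- replace it with a plain index, so multiple rows with user_id = 0
-- (one per anonymous user) can coexist.
ALTER TABLE user_newtalk DROP INDEX user_id;
ALTER TABLE user_newtalk ADD INDEX user_id (user_id);
```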
-- brion vibber (brion @ pobox.com)
On Sunday 02 February 2003 12:56 am, Brion Vibber wrote:
> The browser still must check with the server to see if the page has
> changed, but if it hasn't the browser's cached version can be shown,
> saving a lot of database sorting, wikitext parsing, link checking, and
> bandwidth. If it has changed, you get the new version.
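The check described above is an ordinary HTTP conditional request; roughly (a sketch, not the exact headers MediaWiki emits):

```http
GET /wiki/Main_Page HTTP/1.1
If-Modified-Since: Sat, 01 Feb 2003 12:00:00 GMT

HTTP/1.1 304 Not Modified
```

On a 304 response, the server sends no body at all, and the browser re-displays its cached copy.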
Everything is much faster now, and that is great! But one minor annoyance is
that (at least with Konqueror 3.0.3) the New Messages notice doesn't seem to
trigger the 'changed page' flag, so Konqi dutifully displays the cached
version of a page instead of updating it to display the new dynamic
content. No biggie, though.
BTW, would it be possible to have the top header area be a different HTML
frame so that each frame can be dealt with separately? Then the content of
the page could be cached /server side/ after it is requested by an anon (or a
user with default settings) and served up for anons and users with default
settings until a change is made to the page. Then the next anon/default user
accessing the page causes all the queries that are needed to render the new
page and then that page version is cached and made available to the next
anon/default user to view... The only thing that would be dynamic for anons
and default users would be the top frame (since it displays their IP/user
name and the 'new messages' link - but that could be cached client side so
long as the display of 'new messages' and changes in login status tells the
browser that that frame has changed). Just some thoughts - do with them as
you see fit.
Fixing an old bug, I went ahead and made the diff links in the "classic"
Recentchanges view include the oldid numbers, so they show specifically
the diff for the edited item. (The information is already provided to us
by our read of the recentchanges table.)
For extra points, this prevents the slowdown when someone selects the
last diff of a page with many revisions.
-- brion vibber (brion @ pobox.com)
There are two very good IRC channels on irc.freenode.net:
#php and #mysql
There are lots of knowledgeable people there who are eager to help open
source projects, and many have already heard of Wikipedia. I learned a lot
from just being there for half an hour or so.
Here are some things we need to look into:
1) Composite indexes and index use in general. I do not know which indexes
are currently used on the live Wikipedia. However, after running the
*default* database generation script, there is just a single composite
index, namely on the watchlist table. All other indexes are on a single
field. If I understand correctly, this means that when we do a query with
multiple comparisons, e.g. WHERE foo=bar AND baz=bat, only one index is
used. At least, that is what this article claims:
There are also a couple of tables with no indexes at all (including ARCHIVE,
which may cause Special:Undelete to create high server load) and some
inexplicable ones (site_stats has an index, but the table holds only one
row). We really need to clean up our indexes. I can't help much with this
without server access, because I don't know whether the table structures
have been altered.
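As a sketch of the difference (illustrative column names from the cur table; the live schema may differ):

```sql
-- With only single-column indexes, MySQL picks one index for this query
-- and still scans within it for the second comparison:
SELECT * FROM cur WHERE cur_namespace = 0 AND cur_title = 'Foo';

-- A composite index covers both comparisons at once:
ALTER TABLE cur ADD INDEX name_title (cur_namespace, cur_title);

-- EXPLAIN should then show name_title in its "key" column:
EXPLAIN SELECT * FROM cur WHERE cur_namespace = 0 AND cur_title = 'Foo';
```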
2) The HANDLER command. HANDLER, which works with both InnoDB and MyISAM,
provides low-level access to tables *without locking*. You get reads, but
you don't get consistency, which may cause problems when you're dealing
with critical, heavily modified data. But it seems perfect for our archive
stuff. The HANDLER syntax is explained here:
We definitely should look into this.
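The basic HANDLER usage looks roughly like this (index name assumed; see the MySQL manual for the full syntax):

```sql
-- Open a low-level handle on the archive table: no locking, but also
-- no consistency guarantees between successive reads.
HANDLER archive OPEN;
-- Read the first row via an (assumed) ar_title index, then walk forward.
HANDLER archive READ ar_title FIRST;
HANDLER archive READ ar_title NEXT;
HANDLER archive CLOSE;
```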
3) Upgrading to MySQL4. MySQL4 supports query caching (also subqueries,
but I haven't looked into that), which means that heavily queried pages
will load faster. When someone mentioned query caching, several people
chimed in with comments like "query caching rocks", so I assume it provides
substantial performance benefits in practice.
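Enabling the query cache would be a my.cnf change along these lines (sizes are placeholders, not tuned recommendations):

```ini
# my.cnf sketch -- illustrative values only
[mysqld]
query_cache_type = 1        # cache all cacheable SELECT results
query_cache_size = 32M      # memory reserved for cached result sets
```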
4) Tuning my.cnf. According to one IRC resident, upping table_cache can
greatly increase performance with lots of UPDATEs. If we create new
indexes, we may also need to raise index_cache (there's a formula to
calculate your index cache efficiency, currently we are at 99.75%, which
is pretty good).
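The tuning mentioned here would be a my.cnf change roughly like this (the index cache corresponds to MySQL's key_buffer_size; values are placeholders, not recommendations):

```ini
# my.cnf sketch -- illustrative values only
[mysqld]
table_cache = 512           # table file descriptors kept open
key_buffer_size = 256M      # MyISAM index (key) cache
```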
5) Caching our article index. I have mentioned this idea before, I think:
We could just keep our entire article index in memory *persistently*
(currently the linkcache is filled for each request), which should make
link lookup almost instant. There are several ways to accomplish this:
- We could ask one of our resident C programmers to help. There's a
specified interface to access persistent resources in MySQL, which is used
by functions such as mysql_pconnect. This page describes how it works:
- We could use session variables for the purpose. There is supposed to be
an option to keep session stuff in shared memory. This might help with
other caching problems as well.
- We could just put a database on a ramdisk for this purpose.
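A related option within MySQL itself is a HEAP (in-memory) table built from the article index; a sketch, with assumed table and column names:

```sql
-- HEAP is MySQL 3.23/4.0's in-memory table type. Build an in-memory
-- copy of the title index, then index it for fast link lookups.
CREATE TABLE linkindex TYPE=HEAP
  SELECT cur_id, cur_namespace, cur_title FROM cur;
ALTER TABLE linkindex ADD INDEX (cur_namespace, cur_title);
```

Like a ramdisk, this loses its contents on restart, so it would need to be rebuilt and kept in sync with the real tables.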
The order of this list reflects the priority in which I think these
questions need to be addressed. Getting our indexes to work properly
should, IMHO, be of the greatest importance. Even if EXPLAIN shows the
use of an index, we may still frequently require large table scans because
we do not use composite indexes.
On Saturday 01 February 2003 04:00 am, Erik Moeller wrote:
> Um .. how about just getting rid of it? Why is it within Wikipedia's
> mission to somehow provide storage space for personal essays? We're an
> encyclopedia, not a hosting provider.
Let me repeat:
If POV material isn't allowed on meta, then where should it go?
Would it be better for Spartacus to still be on en.wiki's Recent Changes?
Again, create another namespace, and then you can filter your view of meta's
Recent Changes so that you would only see changes made to entries in those
namespaces.
I don't aim to discuss which is better, CVS or wiki.
First of all, again, I am not so sure that developing in a wiki
works. Some think it won't work, and some (including me; only
me?) think it might.
But think of the reality. Don't we need a better mechanism for
development? Wiki might not work, I know that. But I am not
sure. Why don't we try? Is there any technical obstacle to
publishing the Wikipedia software source code? It seems to me
that it is possible to publish the source code and have some
sysops apply the changes regularly. If it doesn't yield good
results, it doesn't hurt anything.
Oh, maybe I am the only one who believes development in a wiki
might work? If so, I should do it on my own.
>> Probably. If meta is the place only for those who are
>> interested in development and administration, I would
>> like to participate in meta.
>The best place for discussion of development and
>technical aspects is right here on wikitech-l.
What about the rest of the stuff? Like documentation, testing,
bug reports, and so on. SourceForge?
I understand CVS seems better than wiki. But has anyone given
the opinion that SourceForge's bug report system is better
than meta-wikipedia's? As far as I remember, no one has. I
would like to move the bug reports on SourceForge to the meta-wiki.
I will probably post more detailed documents to meta-
wikipedia (and hopefully more people will help).
I know my proposal is not good enough; what else can we do
to encourage more people to participate in development?