We use MySQL 4.1 at Wikicities without any problems. We did have a
problem transitioning between versions, though. Just be careful to
include "--default-character-set=latin1" when you run your mysqldumps
(if that applies to you, of course).
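For instance, a sketch of that dump step (the database name, user, and output file below are invented for illustration; only the --default-character-set=latin1 flag comes from the advice above):

```python
# Build the mysqldump invocation with the charset flag that keeps latin1
# bytes intact across a 4.0 -> 4.1 move. Database name and user are
# hypothetical; substitute your own.
dump_cmd = [
    "mysqldump",
    "--default-character-set=latin1",  # avoid charset mangling in the dump
    "--user=wikiuser",
    "wikidb",
]
print(" ".join(dump_cmd))
# On a real host you would pipe this to a file, e.g. with
# subprocess.run(dump_cmd, stdout=open("wikidb.sql", "wb"), check=True)
```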
Jason Richey
Kevin Carillo wrote:
Hi.
I'd like to know whether it is possible to run Wikipedia with MySQL 4.1.
Before installing the whole environment with MySQL 4.0.20a, I tried to set
it up with MySQL 4.1, and I remember having problems installing MediaWiki,
which did not accept this version of MySQL.
However, my research project requires a lot of queries that contain
subqueries, and it is almost impossible to rewrite them all as JOINs (one of
the reasons is that I use not only SELECT queries but also UPDATE and
DELETE). Using MySQL 4.1 would help me considerably.
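To illustrate the kind of statement at issue: MySQL 4.1 accepts subqueries inside DELETE and UPDATE, which 4.0 rejects. The toy below runs the same shape of query against SQLite (which also supports it); the table and column names are invented, not Wikipedia's schema:

```python
import sqlite3

# In-memory toy schema: pages, and links between them.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE page (page_id INTEGER, title TEXT)")
cur.execute("CREATE TABLE links (l_from INTEGER, l_to INTEGER)")
cur.executemany("INSERT INTO page VALUES (?, ?)",
                [(1, "A"), (2, "B"), (3, "C")])
cur.executemany("INSERT INTO links VALUES (?, ?)", [(1, 2), (2, 3)])

# DELETE driven by a subquery: remove pages nothing links to.
# MySQL 4.0 cannot run this form at all; 4.1 can. An UPDATE with a
# subquery in its WHERE clause works the same way.
cur.execute("DELETE FROM page WHERE page_id NOT IN (SELECT l_to FROM links)")

remaining = [r[0] for r in cur.execute("SELECT title FROM page ORDER BY title")]
print(remaining)
```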
In case it is possible to use MySQL 4.1, is there any way to upgrade
MySQL while keeping my current Wikipedia database?
Thank you.
Kevin Carillo
_____
From: wikitech-l-bounces(a)wikimedia.org
[mailto:wikitech-l-bounces@wikimedia.org] On Behalf Of
wikitech-l(a)wikimedia.org
Sent: May 10, 2005 4:34 AM
To: wikitech-l(a)wikimedia.org
Subject: Wikitech-l Digest, Vol 22, Issue 26
Importance: Low
Send Wikitech-l mailing list submissions to
wikitech-l(a)wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit
http://mail.wikipedia.org/mailman/listinfo/wikitech-l
or, via email, send a message with subject or body 'help' to
wikitech-l-request(a)wikimedia.org
You can reach the person managing the list at
wikitech-l-owner(a)wikimedia.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of Wikitech-l digest..."
Today's Topics:
1. Re: Parser (was Re: Longterm hosting strategy) (Tim Starling)
2. Re: Parser (was Re: Longterm hosting strategy)
(Lee Daniel Crocker)
3. Re: Longterm software strategy (Tim Starling)
4. Re: Parser (was Re: Longterm hosting strategy)
(David A. Desrosiers)
5. Re: Parser (was Re: Longterm hosting strategy)
(Ævar Arnfjörð Bjarmason)
6. New machines installed, killed in record 9.5 hours (Brion Vibber)
7. Re: Longterm software strategy (Brion Vibber)
8. Link table updates (was Re: Longterm software strategy)
(Tim Starling)
----------------------------------------------------------------------
Message: 1
Date: Tue, 10 May 2005 13:55:48 +1000
From: Tim Starling <t.starling(a)physics.unimelb.edu.au>
Subject: [Wikitech-l] Re: Parser (was Re: Longterm hosting strategy)
To: wikitech-l(a)wikimedia.org
Message-ID: <d5pau9$7q4$1(a)sea.gmane.org>
Content-Type: text/plain; charset=ISO-8859-1
Lee Daniel Crocker wrote:
I agree, I don't think the parser's a big
issue, although it would be
nice for a bit snappier response. In hindsight, storing the wikitext in
a database was a mistake. There's already a wonderful piece of software
highly optimized and scalable for storing randomly accessed variable-
sized chunks of text with lots of tools for backup, replication, and so
on; it's called a file system. Storing the wikitext itself in something
like Reiserfs would probably speed it up, and also speed up access to
the rest of the metadata in the database which would become much
smaller.
That's what ExternalStore is for. Moving the bulk out of the database,
or at least to a different database, is a pressing need. We need to
separate bulk, rarely accessed data from hot data, so that we can save
the highly redundant storage on the DB master for hot data. Domas has
been working on it. We're running out of disk space on Ariel again, and
another compression round is obviously only a stopgap solution.
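A minimal sketch of that hot/bulk split, assuming nothing about MediaWiki's actual ExternalStore code beyond the idea that the hot database keeps only a small pointer while the bulky wikitext lives in a separate store (class names and the pointer format below are illustrative):

```python
class BlobStore:
    """Stands in for a separate, cheap bulk-storage backend."""

    def __init__(self):
        self.blobs = {}
        self.next_id = 1

    def put(self, text):
        """Store one revision's text; return an integer blob id."""
        blob_id = self.next_id
        self.next_id += 1
        self.blobs[blob_id] = text
        return blob_id

    def get(self, blob_id):
        return self.blobs[blob_id]

store = BlobStore()

# The hot DB row holds only this short pointer string, not the wikitext,
# so the master's expensive storage is reserved for hot metadata.
pointer = "store://cluster1/%d" % store.put("== Heading ==\nLots of wikitext ...")

# Resolving the pointer fetches the bulk text from the external store.
blob_id = int(pointer.rsplit("/", 1)[1])
text = store.get(blob_id)
```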
-- Tim Starling
------------------------------
Message: 2
Date: Mon, 09 May 2005 21:04:55 -0700
From: Lee Daniel Crocker <lee(a)piclab.com>
Subject: Re: [Wikitech-l] Parser (was Re: Longterm hosting strategy)
To: Ævar Arnfjörð Bjarmason <avarab(a)gmail.com>, Wikimedia developers
<wikitech-l(a)wikimedia.org>
Message-ID: <1115697895.5779.25.camel(a)shuttle.piclab.com>
Content-Type: text/plain; charset=utf-8
On Tue, 2005-05-10 at 03:44 +0000, Ævar Arnfjörð Bjarmason wrote:
like Reiserfs would probably speed it up, and also speed up access to
the rest of the metadata in the database which would become much
smaller.
How about something like a version control system, Subversion for
example? I don't know how it would do speed-wise for something like
this, but with that you'd get
Waaaay too slow (have you ever used Subversion?). But it might not be
a bad idea to put a WebDAV/DeltaV front end on whatever we create to
make it possible for third-party tools to access it.