Tony Sidaway wrote:
>Brion Vibber said:
>
>
>>Brion Vibber wrote:
>>
>>
>>>Update logs are still replaying, but we're up to 42 minutes prior to
>>>the crash on one machine and still going. I don't expect problems.
>>>
>>>
>>With two servers fully recovered we've got the wikis up for read-write
>>access; editing is open. Total time from crash to restoring edit
>>service was about 24 hours, 10 minutes. Sigh.
>>
>>Some special pages (including contribs and watchlist) are off for the
>>moment to reduce server load until we have more machines up. Some
>>things remain a little wonky.
>>
>>
>>
>Interesting discussion on Slashdot about the relative recoverability of
>Postgresql. If we stay with open source DBMS, perhaps at least some of
>the database servers should be running alternative software.
>
>
>
>
Kudos to the developers for their heroic efforts in bringing everything
back from what threatened to be a serious data loss.
Regarding using different databases: I agree, diversity is good.
However, I should point out that I have destroyed a PostgreSQL database
on one occasion by power-cycling a machine (it was running VACUUM at the
time). This makes me sceptical about relying on software diversity
alone, particularly in the face of crude threats such as power loss,
fire, tornadoes and flood.
The value of the Wikipedia data is now great enough that it is worth putting
a formal disaster recovery plan in place.
A good idea in the short term might be to keep a slave database or two
offsite, so that they are unlikely to crash at the same time as the
central site. Note that online slaves are not a 100% solution to data
corruption, as they will faithfully mirror any corruption which
accumulates from causes other than database failure.
This emphasizes the importance of taking and saving snapshot dumps. At
the moment, keeping off-site dumps is done on an ad hoc basis by
volunteers. This should certainly be formalized to include the
automatic creation and archiving of dumps off-site, in addition to
running off-site slave databases. At a data rate of only 10 Mbit/s,
moving a 170 GB backup off-site would take only about 38 hours. At these
sorts of rates, monthly backups could be lodged with any of a number of
mirror services; perhaps organizations such as universities, the UK
Mirror Service and the Internet Archive would be interested in doing this?
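The 38-hour figure above holds up to a quick back-of-the-envelope check
(this assumes a sustained 10 Mbit/s link and decimal units, i.e.
1 GB = 10**9 bytes and 1 Mbit = 10**6 bits):

```python
# Back-of-the-envelope check of the transfer-time estimate above.
size_bits = 170 * 10**9 * 8   # 170 GB expressed in bits
rate_bps = 10 * 10**6         # 10 Mbit/s sustained
hours = size_bits / rate_bps / 3600
print(f"{hours:.1f} hours")   # ~37.8 hours, i.e. about 38
```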
The current worst case would be physical destruction of the servers at
the Florida colo; the data is both priceless and uninsurable, but is the
server farm insured against this sort of event?
-- Neil
Brion Vibber wrote:
> James R. Johnson wrote:
>> Is there something wrong with the wikis? I was trying to do
some writing on ang.wikibooks.org and ang.wiktionary.org, and they don't
>> work. Are they down right now, or did something else happen?
>
> There was some sort of power failure at the colocation facility. We're
> in the process of rebooting and recovering machines.
The power failure was due to circuit breakers being tripped within the
colocation facility; some of our servers have redundant power supplies
but *both* circuits failed, causing all our machines and the network
switch to unceremoniously shut down.
Whether due to a problem in MySQL, our server configurations, or the
hardware (or some combination thereof), most of our database servers
managed to glitch the data on disk when they went down. (Yes, we use
InnoDB tables. This ain't good enough, apparently.)
The good news: one server maintained a good copy, which we've been
copying to the others to get things back on track. We're now serving all
wikis read-only.
The bad news: that copy was a bit over a day behind synchronization (it
was stopped to run maintenance jobs), so in addition to slogging around
170 GB of data to each DB server, we have to apply the last day's update
logs before we can restore read/write service.
I don't know when exactly we'll have everything editable again, but it
should be within 12 hours.
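For anyone unfamiliar with the replay step described above: applying a
day's worth of update logs onto a restored copy typically looks something
like the following. This is a sketch only — the log file names, start
time, credentials and database name are all hypothetical, and the exact
invocation depends on how logging was configured on the servers.

```shell
# Replay the binary/update logs recorded since the snapshot was taken
# onto the restored database (hypothetical file and database names).
mysqlbinlog --start-datetime="2005-02-21 04:00:00" \
    update-log.000117 update-log.000118 \
  | mysql --user=wikiadmin --password wikidb
```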
-- brion vibber (brion @ pobox.com)
So, there's a bug in the hooks code for 1.4. I've got a fix working on
my machine, but since the problem requires a change in the calling
semantics, I have to test all the code that causes events. This is
taking a while.
I hope to have something checked in in the next couple of days.
~Evan
--
Evan Prodromou
evan(a)bad.dynu.ca
Someone or something has been making new wikis with the 1.3 schema.
These appear to work on 1.4 until someone tries to create a login
account, at which point changes to the user tables cause a failure.
One recent example is bm.wiktionary.org.
If some person is setting these up manually on the server, I need to
know what procedure you're following so it can be fixed. If there is
still some sort of automated wiki creation script running, I need to
know about it so it can be tracked down and fixed.
-- brion vibber (brion @ pobox.com)
MediaWiki 1.3.9 and 1.4beta4 are very slow and use too many CPU cycles and
memory compared to other wikis. The database schema, as I see it, seems to be
inefficient. Are there any changes planned for 1.4 final to increase
performance?
--
NSK
http://portal.wikinerds.org
At 2/22/2005 09:30 AM, Lane, Ryan wrote:
>I like the bread crumbs idea a lot.
Thanks. I borrowed that idea from DokuWiki (see the "Thanks" section in
the doc), where this option is built in.
>It would be nice if this feature were
>available without having to add the options into an article. For instance,
>adding the options you want to use to LocalSettings.php, so that the
>breadcrumbs trace from wherever you start.
OK, sounds good. You can already use the extension anywhere in the
layout/template, as mentioned in the doc, but you are right: it does not
take parameters anymore in that form, since PHP's TAL requires a
parameter-less callback. So I can either read global parameters from
LocalSettings.php (I'll keep that in mind), or provide decent defaults and
make those options part of the user options; I'll think about it...
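The pattern under discussion — a parameter-less callback that pulls its
options from global configuration instead of arguments — can be sketched
as follows. Python is used purely for illustration; all names here are
hypothetical and are not the extension's actual API.

```python
# Hypothetical sketch: the templating layer can only invoke a
# parameter-less callback, so the callback reads its options from a
# global configuration (analogous to settings in LocalSettings.php).
BREADCRUMB_SETTINGS = {"max_depth": 5, "separator": " > "}

_trail = []

def record_visit(page):
    """Track visited pages, keeping only the configured depth."""
    if page in _trail:
        _trail.remove(page)  # avoid duplicate trail entries
    _trail.append(page)
    del _trail[:-BREADCRUMB_SETTINGS["max_depth"]]

def render_breadcrumbs():
    """Parameter-less callback: all options come from globals."""
    return BREADCRUMB_SETTINGS["separator"].join(_trail)
```

Usage: after `record_visit("Main_Page")` and `record_visit("Help")`,
`render_breadcrumbs()` returns `"Main_Page > Help"`.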
--
Sebastien Barre
I like the bread crumbs idea a lot. It would be nice if this feature were
available without having to add the options into an article. For instance,
adding the options you want to use to LocalSettings.php, so that the
breadcrumbs trace from wherever you start.
Something like this is much better than having to use the back button (which
is a serious pita when editing a lot).
Ryan Lane
Naval Oceanographic Office
> -----Original Message-----
> From: wikitech-l-bounces(a)wikimedia.org
> [SMTP:wikitech-l-bounces@wikimedia.org] On Behalf Of Sebastien BARRE
> Sent: Monday, February 21, 2005 3:23 PM
> To: wikitech-l(a)wikimedia.org; mediawiki-l(a)wikimedia.org
> Subject: [Wikitech-l] Some extensions...
>
> Hi,
>
> I've created some extensions to make my life easier documenting some of
> our
> projects.
> Feel free to give them a shot, and use them at will:
> http://public.kitware.com/Wiki/User:Barre/Extensions
>
> Suggestions are welcome.
> Bear in mind that I started fiddling with the MediaWiki API a little more
> than one week ago, so please hold off on the flaming.
>
> --
> Sebastien Barre
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l(a)wikimedia.org
> http://mail.wikipedia.org/mailman/listinfo/wikitech-l
[crossposted to wikipedia-l, wikien-l, wikitech-l]
Daniel Mayer (maveric149(a)yahoo.com) [050222 20:25]:
> "A small price to pay for a project like this. But get a move on with all those
> ambitious plans for paper versions. Most of the world doesn't have computers."
> by Anonymous
The closest we appear to have to an active plan for this is ... mine!
http://en.wikipedia.org/wiki/User:David_Gerard/1.0
This relies on rating code (so as to let the Wiki do the work - editorial
committees don't scale, editors with opinions will).
Jimbo's idea - which passes the "simple brilliant elegance" test - is to
set up ratings on a large Wikipedia (e.g. en:!) and just gather data for a
month or whatever. Then release the data for everyone to look at and make
sense of.
This relies on someone who knows PHP writing rating code, or better yet
beating Magnus Manske's existing rating code into production quality ...
I could install MediaWiki at home (it runs on FreeBSD, right?) and hack on
it here. And, ahahaha, learn PHP, of which I know not a jot or tittle. And
I haven't written anything longer than a quickie shell script since 1993.
"Rusty" isn't in it.
So if SOMEONE ELSE who is interested and KNOWS PHP could come forward, that
would be *really good*!
- d.
Hi,
I've created some extensions to make my life easier documenting some of our
projects.
Feel free to give them a shot, and use them at will:
http://public.kitware.com/Wiki/User:Barre/Extensions
Suggestions are welcome.
Bear in mind that I started fiddling with the MediaWiki API a little more
than one week ago, so please hold off on the flaming.
--
Sebastien Barre
***Intentional Top-post***
Gmail placed this message into my SPAM folder (I'm guessing it looked
like an on-line pharmacy ad). Those using Gmail may want to retrieve
it.
-Rich Holton
On Mon, 21 Feb 2005 05:00:19 +0000 (UTC),
bugzilla-daemon(a)mail.wikimedia.org
<bugzilla-daemon(a)mail.wikimedia.org> wrote:
> http://bugzilla.wikimedia.org/show_bug.cgi?id=1569
>
> Summary: HTML comments in wikitext can cause extra spaces in
> rendered HTML
> Product: MediaWiki
> Version: 1.4beta6
> Platform: All
> URL: http://en.wikipedia.org/wiki/Madge_Oberholtzer
> OS/Version: All
> Status: NEW
> Severity: minor
> Priority: Low
> Component: Page rendering
> AssignedTo: wikibugs-l(a)wikipedia.org
> ReportedBy: jbonham(a)mail.utm.utoronto.ca
>
> Wikitext such as:
>
> Madge then purchased mercuric chloride tablets under the guise of shopping for
> something else, <!-- my sources disagree whether it was a hat or makeup --silsor -->
> and consumed them in another attempt at [[suicide]].
>
> can result in two spaces being visibly rendered in the HTML, one of which is a
> &nbsp;, as follows:
>
> Madge then purchased mercuric chloride tablets under the guise of shopping for
> something else,  and consumed them
>
> This is a bug because the default behaviour of HTML rendering agents is to ignore
> extra whitespace, but the presence of the &nbsp; prevents this.
>
> --
> Configure bugmail: http://bugzilla.wikimedia.org/userprefs.cgi?tab=email
> ------- You are receiving this mail because: -------
> You are the assignee for the bug, or are watching the assignee.
> _______________________________________________
> Wikibugs-l mailing list
> Wikibugs-l(a)Wikipedia.org
> http://mail.wikipedia.org/mailman/listinfo/wikibugs-l
>
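The behaviour in the bug report above can be reproduced outside MediaWiki.
The sketch below (Python, for illustration only — this is not the
MediaWiki parser's actual code) shows how naive comment removal leaves a
doubled space, and one way a parser might avoid it:

```python
import re

text = ("shopping for something else, "
        "<!-- my sources disagree -->"
        " and consumed them")

# Naive stripping removes only the comment, leaving both the space
# before it and the space after it — a visible double space:
naive = re.sub(r'<!--.*?-->', '', text, flags=re.DOTALL)

# One possible fix: swallow the surrounding whitespace along with the
# comment and emit a single space in its place:
fixed = re.sub(r'\s*<!--.*?-->\s*', ' ', text, flags=re.DOTALL)

print(naive)  # "...something else,  and consumed them" (two spaces)
print(fixed)  # "...something else, and consumed them"
```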
--
en.wikipedia:User:Rholton