Juliano Ravasi Ferraz wrote:
> Brion suggests using "DELETE FROM archive;", but I wonder if this
> procedure is still up-to-date for the latest development releases. It
> looks like the real data are stored in `text`, and there are no
> database constraints to propagate deletions. Issuing this command seems
> to just cause a "database" leak (analogous to a memory leak). Am I right?
Right.
> If so, is it correct to issue something like "DELETE FROM archive, text
> WHERE ar_text_id = old_id;" for current MediaWiki versions? Is this
> secure? Or will this somehow harm database consistency?
That sounds like it will probably work, on MySQL 4 or later. (But back
everything up first!)
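For anyone trying this: on MySQL 4 or later, the multi-table DELETE would be written roughly like the sketch below. The table and column names (archive.ar_text_id, text.old_id) are the ones from the question; verify them against your own schema, and back everything up first.

```sql
-- Sketch only: remove archived revisions together with their text rows.
-- Assumes the MediaWiki 1.5 schema (archive.ar_text_id -> text.old_id);
-- take a full backup before running anything like this.
DELETE archive, text
FROM archive
INNER JOIN text ON archive.ar_text_id = text.old_id;
```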
There's not really any built-in management for discarding old data to
save space; MediaWiki assumes you have a buttload of space and will get
more or manually rearrange data when you need it. :)
> Another question: I ran for some time with $wgCompressRevisions off, and
> now I want to turn it on. Ok, it will compress pages from now on, but is
> there a procedure to compress all old revisions already created before
> setting the option?
There is compressOld.php, but IIRC this code is broken on 1.5 at the moment.
-- brion vibber (brion @ pobox.com)
Dear Tim and Brion,
I have not seen you on IRC for a couple of days.
I'd like to ask you for CVS access, as Hashar recommended to me today,
so that I can commit the revisited patchlet1 from
http://bugzilla.wikipedia.org/show_bug.cgi?id=2014
for Enotif, for REL1_5 and HEAD.
The bug 2014 patch fixes some problems and cleans up the Enotif
interface, so UserTalkUpdate.php can be dropped.
It also introduces a new-message marker for changes on the user page,
which quite a lot of users have asked for.
It relies on the existing memcached-only bit for "usertalk newmessage".
You can judge my (acquired) competence with CVS at
http://cvs.sourceforge.net/viewcvs.py/enotifwiki and
http://cvs.berlios.de/cgi-bin/viewcvs.cgi/enotifwiki/enotifwiki/
(BerliOS also offers Subversion; currently I use CVS).
Kind regards,
Tom (Wikinaut)
A few people have asked me for the calendar I use on
http://krass.com/wiki/Calendar
I couldn't release it earlier, because the CalendarClass I used didn't
have a GPL-friendly license (in fact, it didn't have any license, and I
wasn't able to contact the author). Now I have written my own
primitive class and can release the whole thing under the GPL.
It has only been tested on http://krass.com/ and not in any other
environment.
You can find the code here: http://krass.com/wikicalendar.tgz
I put this info also on: http://meta.wikimedia.org/wiki/User:Cdamian/calendar
cheers,
christof
--
Christof Damian
christof(a)damian.net
hi all
I see that the XML export of Wikipedia content,
http://en.wikipedia.org/wiki/Special:Export/,
gives you the content in XML format;
for example, for 'perl' you have
http://en.wikipedia.org/wiki/Special:Export/perl
But where can I find something about using it?
Any doc, how-to or anything else would be
useful
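For what it's worth, the export output is plain XML and easy to consume with any XML library. Here is a minimal Python sketch; the element names and the namespace in the sample are assumptions about the export schema, so check them against a real dump:

```python
import xml.etree.ElementTree as ET

# A hand-made sample in the shape the export feed appears to use;
# the namespace URI and element layout are assumptions, not gospel.
SAMPLE = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.3/">
  <page>
    <title>Perl</title>
    <revision>
      <text>Perl is a programming language...</text>
    </revision>
  </page>
</mediawiki>"""

def pages(xml_text):
    """Yield (title, wikitext) pairs from a Special:Export dump."""
    root = ET.fromstring(xml_text)
    # The export schema is namespaced; recover the "{uri}" prefix from
    # the root tag so lookups below work regardless of schema version.
    ns = root.tag.split('}')[0] + '}' if root.tag.startswith('{') else ''
    for page in root.findall(ns + 'page'):
        title = page.findtext(ns + 'title')
        text = page.findtext('%srevision/%stext' % (ns, ns))
        yield title, text

for title, text in pages(SAMPLE):
    print(title)
```

Fetching http://en.wikipedia.org/wiki/Special:Export/perl and feeding the response body to `pages()` should work the same way.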
regards
mohsen
Hi,
I am attempting, as an experiment for work, to get Mediawiki up and running
using widgEditor (http://www.themaninblue.com/writing/perspective/2005/01/27/)
as a WYSIWYG interface. As I work in a very un-geek office, trying to get people
to learn wiki markup would be much too much to ask (many have difficulty turning
the computers on - seriously). However, my co-workers have expressed a real
interest in using a wiki for collaboration. Hence the WYSIWYG interface.
So far so good, everything works as it should (with a few tweaks here and
there). However, I am stuck attempting to get the TOC and section editing
working. As widgEditor sends all the code as raw HTML I need to get mediawiki
to treat hN's as it would '=' for section headings. Looking high and low I
cannot find the right place to make the changes. Any help would be appreciated.
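I don't know the right place in MediaWiki for this either, but the mapping itself is mechanical - each <hN> heading corresponds to N '=' signs in wiki markup. A rough sketch of the transformation (my own illustration, not actual MediaWiki code; the function name is made up):

```python
import re

def html_headings_to_wikitext(html):
    """Rewrite <h1>..</h1> through <h6>..</h6> into = .. = wiki headings."""
    def repl(match):
        level = int(match.group(1))          # heading level N
        marker = '=' * level                 # N '=' signs on each side
        return '%s %s %s' % (marker, match.group(2).strip(), marker)
    # Match an opening <hN ...>, capture its body, and require the
    # matching </hN> via a backreference to the captured level.
    return re.sub(r'<h([1-6])[^>]*>(.*?)</h\1>', repl, html,
                  flags=re.IGNORECASE | re.DOTALL)

print(html_headings_to_wikitext('<h2>Overview</h2>'))
```

Running a pass like this over the submitted HTML before it is saved would let the normal parser build the TOC and section-edit links from the '=' headings as usual.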
P.S. I have also noticed that while the HTML is stored in the database just fine,
the text that is displayed has spaces before the closing brackets. I would like
to clean this up - however I cannot find where this extra space is coming from.
Cheers
Andrew.
When including page_B in page_A with {{:page_B}}, page_A does not get updated
after editing page_B. I think this is due to cached pages. Is that right?
And how can I get page_A updated without making a minor edit on that
page?
Heinz
Thanks Angela, that was a very good hint.
But when including a page that way, the page title does not get included.
I either have to include the =title= within the included page again, which
doesn't look nice there, or set the title manually on the page that
includes the subpages.
Is there already a working way to get a page included with its page
title displayed too?
Heinz
"Angela" <beesley(a)gmail.com> schrieb im Newsbeitrag
news:<8b722b80050811025860676dde(a)mail.gmail.com>...
On 8/11/05, Thomas Koll <tomk32(a)gmx.de> wrote:
> How about extending the template/vorlagen code to allow something like
> {{0:Artikel im normalen Namespace}} or {{Wikipedia:some article in
> wikipedia ns}} ?
This already works. You can include pages from the main namespace by
starting with a colon.
{{:Main Page}} will include the main page.
Angela.
----------
Finlay McWalter wrote:
> Folks,
> Algebra.com (which also goes by the name cooldictionary.com)
> appears to be leeching images directly from the wikimedia upload server.
What is the accepted means of copying images from Wikipedia? My
impression was that there is some sort of copyright problem, because,
unlike the text content, the image content of Wikipedia is not
_reliably_ under one of the free licenses such as the GFDL.
Hi,
Earlier this evening, I ended up posting the same message three times
to mediawiki-l, having received the following bounce notice on all
three attempts, which I took to mean it hadn't reached the list.
Looking at the archives[1], it seems that all three copies got
through. Anybody have any ideas what actually happened, and why?
[1] http://mail.wikipedia.org/pipermail/mediawiki-l/2005-August/thread.html#6321
---------- Forwarded message ----------
From: Mail Delivery System <MAILER-DAEMON(a)zwinger.wikimedia.org>
Date: 15-Aug-2005 21:26
Subject: Undelivered Mail Returned to Sender
To: rowan.collins(a)gmail.com
This is the Postfix program at host zwinger.wikimedia.org.
I'm sorry to have to inform you that your message could not
be delivered to one or more recipients. It's attached below.
For further assistance, please send mail to <postmaster>
If you do so, please include this problem report. You can
delete your own text from the attached returned message.
The Postfix program
<quagga(a)zwinger.wikimedia.org>: cannot access mailbox /var/mail/quagga for user
quagga. error writing message: File too large
Final-Recipient: rfc822; quagga(a)zwinger.wikimedia.org
Action: failed
Status: 5.0.0
Diagnostic-Code: X-Postfix; cannot access mailbox /var/mail/quagga for user
quagga. error writing message: File too large
This is a page started by +sj+ for the purpose of making life easier for
those wanting to take snapshots of Wikipedia - for a DVD, Palm, a mirror
site or whatever. It should probably actually be on meta, but this'll do
for a start ;-)
(I've also placed it in Category:Wikipedia 1.0 in anticipation of future
instructions on how to pull all articles with good ratings in a given area,
etc.)
Additions, expansions and comments are most welcomed.
- d.