Hi,
I'm having a problem when trying to clone a 1.6.3 MediaWiki (the plan
being to clone it to another machine then upgrade to 1.9.x).
This wiki is running on Windows XP (sorry). It started off as 1.4, and
was upgraded to 1.5 and then 1.6. The upgrade process in both cases was
to install the new MediaWiki software, run the install script giving it
the parameters to access the existing database, and then customize it.
What I'm trying to do this time is a bit different - both upgrade and
move. The idea is to copy the mediawiki directory over, then use
mysqldump to dump the DB, then load it into a new MySQL db (with the
same user IDs, passwords, etc.).
The process seems to work, until I edit and save a page on the clone.
At that point I get this:
A database query syntax error has occurred. This may indicate a bug
in the software. The last attempted database query was:
INSERT INTO `kahlua_text` (old_id,old_text,old_flags) VALUES (NULL,'Hi,
I\'m Ian Smith...','utf-8')
from within function "Revision::insertOn". MySQL returned error
"1364: Field 'old_comment' doesn't have a default value (localhost)".
Now, this error seems reasonable -- Revision::insertOn is doing an
insert into kahlua_text (kahlua_ is my DB prefix), and it's only
supplying values for old_id,old_text,old_flags; and sure enough,
old_comment is there, and doesn't have a default value. SHOW CREATE on
either the original or the cloned DB shows:
kahlua_text | CREATE TABLE `kahlua_text` (
  `old_id` int(8) unsigned NOT NULL auto_increment,
  `old_namespace` tinyint(2) unsigned NOT NULL default '0',
  `old_title` varchar(255) character set latin1 collate latin1_bin NOT NULL default '',
  `old_text` mediumtext NOT NULL,
  `old_comment` tinyblob NOT NULL,
  `old_user` int(5) unsigned NOT NULL default '0',
  `old_user_text` varchar(255) character set latin1 collate latin1_bin NOT NULL default '',
  `old_timestamp` varchar(14) character set latin1 collate latin1_bin NOT NULL default '',
  `old_minor_edit` tinyint(1) NOT NULL default '0',
  `old_flags` tinyblob NOT NULL,
  `inverse_timestamp` varchar(14) character set latin1 collate latin1_bin NOT NULL default '',
  PRIMARY KEY (`old_id`),
  KEY `old_timestamp` (`old_timestamp`),
  KEY `name_title_timestamp` (`old_namespace`,`old_title`,`inverse_timestamp`),
  KEY `user_timestamp` (`old_user`,`inverse_timestamp`),
  KEY `usertext_timestamp` (`old_user_text`,`inverse_timestamp`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1 |
So I don't get why this error doesn't pop up on the original wiki.
But more to the point, how can I clone this Wiki? Is there some upgrade
script I can run to fix the table? I would rather do the
clone-then-upgrade process; but if I upgrade-then-clone, will that fix
it?
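For what it's worth, one likely explanation: MySQL raises error 1364 only when running in strict SQL mode, so the new server is probably running with STRICT_TRANS_TABLES (or STRICT_ALL_TABLES) enabled while the original is not; in non-strict mode MySQL silently inserts an empty value for a missing NOT NULL column. A quick way to check, and a stopgap to align the two servers (assumes MySQL 5.x and sufficient privileges):

```sql
-- Compare this value on the original and the cloned server.
SELECT @@GLOBAL.sql_mode;

-- Stopgap: drop strict mode on the clone so the old-style inserts succeed again.
-- (Or remove STRICT_TRANS_TABLES from the sql-mode line in my.ini and restart MySQL.)
SET GLOBAL sql_mode = '';
```

The MySQL 5.0 Windows installer enables strict mode by default in my.ini, which would explain why a freshly installed clone behaves differently from the older original.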
Thanks,
Ian Smith
Motorola | Good Technology Group
ismith(a)motorola.com
408-352-7467
4250 Burton Drive, Santa Clara, CA 95054
www.motorola.com/good
I will summarize here some theses I have defended in articles and
lectures. They are not loose opinions or emotional expressions of
political preference. They are the result of more than twenty years of
study on the distribution of power in the world and on the chances our
country has of preserving its sovereignty in the coming decades.
1. The forces aiming at the creation of a world government are the same
ones trying to dilute the sovereignty of the USA into a "North American
Commonwealth", merging the USA, Mexico, and Canada. They are the same
ones interfering in the Amazon, violating our territorial sovereignty.
They are the same ones subsidizing and supporting the Third World left,
political correctness, and the destruction of Judeo-Christian
civilization. They are the same ones that, dominating the major media of
New York and Washington (though still at a disadvantage on radio and the
internet), strive to stop American military action against international
terrorism.
2. In the USA a fierce struggle is being waged between this globalist
scheme and the conservative resistance, committed to preserving not only
American sovereignty but an international order made up of independent
nations.
3. The fate of the world depends on the outcome of this struggle. If the
greatest of nations does not preserve its sovereignty, the rest will be
dissolved with a simple memorandum from the UN Secretary-General.
4. Attacks on the USA do not hurt the global power scheme at all, only
the American nation. Consequently, they strengthen that scheme.
5. The American conservatives' fight is our fight. If they lose, Brazil
will lose much more. In Brazil nobody knows this, because nothing they
say or do reaches you there. Our media takes its cues from the New York
Times and the Washington Post, house organs of the globalist elite.
Practically everything you read about the USA comes pre-molded by the
international radical-chic left.
6. Anyone who thinks that globalist ambitions and the national interest
of the USA are the same thing is crazy, ignorant of the subject, or
lying. Or else it is the American conservatives who do not know
themselves and need to take classes at the Escola Superior de Guerra.
http://www.olavodecarvalho.org/semana/0702122jb.html
Anyone have experience with an install of MediaWiki on Debian/Sarge
with Apache 1.3.34-4, and php4 4.3.10-18?
What version of MediaWiki did you use? Did you use apt-get install
from the Etch repository? Any gotchas to be aware of?
thanks
Theresa
I was diagnosing a problem where the wiki was showing an error page
that suggested putting "$wgShowExceptionDetails = true;" in
LocalSettings.php to see the error message.
I did this, saved, and repeated several times, reloading and so on,
but to no effect. The message still just said "Internal error" and that
I should use "$wgShowExceptionDetails = true;".
After much debugging and using log files I traced the error to this
line in User.php:
throw new PasswordError( wfMsg( 'password-change-forbidden' ) );
I'm not asking for help about the exception, because it was easy to
figure out after I traced it to this line. But I'd like to know if
there was something more I should have done to see the actual
exception message.
That can save me some time when such a thing happens again...
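In case it helps someone else, here is a fuller debugging setup to try next time (a sketch only: the variable names are standard MediaWiki settings as far as I know, but the log path is just an example). Putting these near the top of LocalSettings.php makes PHP itself report problems even when MediaWiki's own exception page stays terse:

```php
<?php
// Debugging sketch for LocalSettings.php -- remove again on a production wiki.
error_reporting( E_ALL );               // report PHP notices and warnings too
ini_set( 'display_errors', 1 );         // print PHP errors to the browser
$wgShowExceptionDetails = true;         // show the exception message and backtrace
$wgDebugLogFile = '/tmp/mw-debug.log';  // example path; MediaWiki appends a verbose log here
```

The debug log in particular usually captures the exception text even when the browser page does not, for example because the setting was placed after an early `return` or the page was served from a cache.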
Thanks!
SplitCategoryPage.php has been placed as an experimental extension on
mediawiki.org: http://www.mediawiki.org/wiki/Extension:SplitCategoryPage
It hooks the regular Category page and changes the display so that:
1) All subcategories are displayed and counted
2) The correct number of articles is displayed if the number is > 200
This is experimental because
a) it requires a minor hack of CategoryPage.php
b) it makes category pages slower
c) there has to be a better way to do this.
d) I'm not that good at coding (see c, above)
But I couldn't find anything that worked for me, and I needed it, so
I'm sharing with anyone who wants to try it.
Hoping it will be improved through the miracle of open source!
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Hello,
A few weeks ago I was looking for a way to upload multiple files to a
wiki at once. That is how I came across a link to an extension called
SpecialUploadLocal
(http://meta.wikimedia.org/wiki/SpecialUploadLocal), but normal users
(without the necessary server rights) are not able to use it. Another
extension is SpecialMultiUploadViaZip
(http://meta.wikimedia.org/wiki/SpecialMultiUploadViaZip), which
works perfectly for normal users: they zip the files they want to
upload and upload the zip file; finally the zip file gets unpacked and
the separate files are on the wiki.
My question now is: why so complicated? Why do users first have to zip
their files? I am curious why nobody ever had the idea (or why
nobody ever mentioned it on this list) of providing multiple
browse buttons on the upload page, so users can directly upload
multiple files just by browsing instead of zipping them first. Or is
there some problem with that?
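For what it's worth, the browser side of this has been possible for a long time: plain HTML allows several file fields in one multipart form. A sketch (the action URL and field names here are made up; a real extension would need server-side code to loop over the posted files):

```html
<!-- Sketch: three file fields posted in one request; names and action are hypothetical -->
<form action="index.php?title=Special:Upload" method="post"
      enctype="multipart/form-data">
  <input type="file" name="wpUploadFile1" />
  <input type="file" name="wpUploadFile2" />
  <input type="file" name="wpUploadFile3" />
  <input type="submit" value="Upload all" />
</form>
```

So the complication is presumably on the server side: the standard upload special page processes exactly one file per request, and each file still needs its own description, license, and duplicate check.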
greetings,
B.
Hi,
I have finally managed to install MediaWiki and am now trying to
upload content into it. The problem I'm facing is that I'm not able
to edit some of the default content that comes with the software.
For example:
Main Page
Contents [hide]
1 WELCOME TO RADWIKI
1.1 Binghamton University research wikipedia
1.1.1 Please use the Search to the left to find a specific topic, or
choose a category below to browse to an article.
2 TOPICS
2.1 TECHNOLOGY
2.1.1 HARDWARE
2.1.2 SOFTWARE
2.1.3 SUPPORT
3 BU RESERACH NEWS
3.1 Expert on measuring stress creates guide for researchers
3.2 Isbell named distinguished professor
3.3 Binghamton University researchers measure holiday spirit
WELCOME TO XXXXXX
Only from "WELCOME TO XXXXXX" onward is the content I'm uploading. How
do I get rid of the parts above it? The edit option does not allow
this. How can I customize the look of the wiki?
Can someone please help me?
Thanks.
On 2/22/07, mediawiki-l-request(a)lists.wikimedia.org
<mediawiki-l-request(a)lists.wikimedia.org> wrote:
>
> Today's Topics:
>
> 1. Re: Flash extensions for MW 1.6.9? (Fernando Correia)
> 2. Fwd: Who can fix an option in xml2sql.exe? (Rolf Lampa)
> 3. Who can fix an option in xml2sql.exe? (Rolf Lampa)
> 4. Re: Who can fix an option in xml2sql.exe? (Brion Vibber)
> 5. Re: Who can fix an option in xml2sql.exe? (Rolf Lampa)
> 6. Re: Who can fix an option in xml2sql.exe? (Samuel Lampa)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Thu, 22 Feb 2007 07:02:57 -0300
> From: "Fernando Correia" <fernandoacorreia(a)gmail.com>
> Subject: Re: [Mediawiki-l] Flash extensions for MW 1.6.9?
> To: "MediaWiki announcements and site admin list"
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID:
> <8140eff40702220202i18d16b48v6479d7a9cc83db2c(a)mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> 2007/2/21, Ken McDonald <ken(a)pixologic.com>:
> > Which flash plugins have people had success with in MW 1.6.9? And, in
> > general, which ones do people like best (there seem to be a lot more of
> > them coming out).
>
> I've written a Flash extension named LinkSwf because the existing ones
> didn't quite satisfy the requirements of the users of our corporate
> wiki.
>
> Features
>
> * Plays the video on a maximized window.
> * Works on Internet Explorer and Firefox.
> * Can stretch the video to fill the window or play it in a predefined size.
> * Supports detection and update of the Flash Video player.
> * Can be used inside a template.
>
>
> I'm not sure if it would work with version 1.6, though.
>
> http://www.mediawiki.org/wiki/Extension:LinkSwf/LinkSwf.php
>
>
>
> ------------------------------
>
> Message: 2
> Date: Thu, 22 Feb 2007 11:38:54 +0100
> From: Rolf Lampa <rolf.lampa(a)rilnet.com>
> Subject: [Mediawiki-l] Fwd: Who can fix an option in xml2sql.exe?
> To: mediawiki-l(a)lists.wikimedia.org
> Message-ID: <45DD72BE.3050305(a)rilnet.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
>
>
> ------------------------------
>
> Message: 3
> Date: Thu, 22 Feb 2007 11:44:33 +0100
> From: Rolf Lampa <rolf.lampa(a)rilnet.com>
> Subject: [Mediawiki-l] Who can fix an option in xml2sql.exe?
> To: mediawiki-l(a)lists.wikimedia.org
> Message-ID: <45DD7411.1050704(a)rilnet.com>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Hi all,
>
> Unfortunately the xml2sql.exe makes extended inserts (by n rows) when
> producing MySQL's INSERT format. Problem is that when there's HUGE pages
> in the xml dump (like some pages in Wikipedia) then one have to reduce
> the number of rows per "batch" when uploading with BigDump.php,
> otherwise the upload will surely crash.
>
> But reducing the amount of rows per batch makes upload very slow...
>
> If the produced output would (optionally) be NOT extended, I could more
> easily* fix the INSERTS on my own (reshuffling the sql by total *size* -
> instead of "extending" the inserts by a certain number of rows).
>
> Suggested solution:
>
> #1: Option: --noextended
>
> or (example, make extended inserts by size (kb) instead:
>
> #2: Option: --extendedbykb 512
>
> But the developer Toetew
> (http://meta.wikimedia.org/wiki/User_talk:Tietew#Hyper_Estraider_extension)
> doesn't seem to be very active. Can the source code be modified, and if
> so, who's expert on ANSI C...? Or perhaps someone knows how to get in
> contact with Tietew?
>
> Regards,
>
> // Rolf Lampa
>
> [*] - more easy: In order to "tear apart" the extended INSERTS into
> separate rows again (in order to pack them again later, but by total
> size instead) I need to use three different Regex for finding where to
> split the INSERT for page.sql, revision.sql and text.sql respectively.
> But sometimes the text-table gets messed up anyway (it tends to do so
> when trying to split articles containg like sql code examples and the
> alike... ).
>
>
> ------------------------------
>
> Message: 4
> Date: Thu, 22 Feb 2007 03:02:56 -0800
> From: Brion Vibber <brion(a)pobox.com>
> Subject: Re: [Mediawiki-l] Who can fix an option in xml2sql.exe?
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID: <45DD7860.7060109(a)pobox.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Rolf Lampa wrote:
> > Unfortunately the xml2sql.exe makes extended inserts (by n rows) when
> > producing MySQL's INSERT format. Problem is that when there's HUGE pages
> > in the xml dump (like some pages in Wikipedia) then one have to reduce
> > the number of rows per "batch" when uploading with BigDump.php,
> > otherwise the upload will surely crash.
>
> Don't know what xml2sql.exe is or where it comes from; Google search
> turns up something that looks unrelated to wikis and is documented only
> in German.
>
> Have you tried mwdumper, which we maintain?
>
> - -- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
>
>
>
> ------------------------------
>
> Message: 5
> Date: Thu, 22 Feb 2007 12:18:15 +0100
> From: Rolf Lampa <rolf.lampa(a)rilnet.com>
> Subject: Re: [Mediawiki-l] Who can fix an option in xml2sql.exe?
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID: <45DD7BF7.8020002(a)rilnet.com>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Brion Vibber wrote:
> > Rolf Lampa wrote:
> >
> >> Unfortunately the xml2sql.exe makes extended inserts (by n rows) when
> >> producing MySQL's INSERT format. Problem is that when there's HUGE pages
> >> in the xml dump (like some pages in Wikipedia) then one have to reduce
> >> the number of rows per "batch" when uploading with BigDump.php,
> >> otherwise the upload will surely crash.
> >>
> >
> > Don't know what xml2sql.exe is or where it comes from; <...>
>
> http://meta.wikimedia.org/wiki/Xml2sql
>
> > Have you tried mwdumper, which we maintain?
> >
>
> No I have not tried that one. I read about it today though (only
> briefly, it was only mentioned in short amongst other tools) and I
> didn't "get hooked". Where is the best reading on mwdumper? (or do I
> really have to try it out... =).
>
> I use WinXP & xampplite and I produce data and also process existing xml
> data locally (fast Delphi-app) before upload.
>
> Regards,
>
> // Rolf Lampa
>
>
>
>
> ------------------------------
>
> Message: 6
> Date: Thu, 22 Feb 2007 12:24:01 +0100
> From: Samuel Lampa <shl(a)rilpedia.org>
> Subject: Re: [Mediawiki-l] Who can fix an option in xml2sql.exe?
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID: <45DD7D51.3030000(a)rilpedia.org>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Brion Vibber skrev:
> > Rolf Lampa wrote:
> >
> >> Unfortunately the xml2sql.exe makes extended inserts (by n rows) when
> >> producing MySQL's INSERT format. Problem is that when there's HUGE pages
> >> in the xml dump (like some pages in Wikipedia) then one have to reduce
> >> the number of rows per "batch" when uploading with BigDump.php,
> >> otherwise the upload will surely crash.
> >>
> >
> > Don't know what xml2sql.exe is or where it comes from; Google search
> > turns up something that looks unrelated to wikis and is documented only
> > in German.
> >
> Try http://meta.wikimedia.org/wiki/Xml2sql
>
> --
> Samuel Lampa
> http://rilnet.com
>
>
>
>
> ------------------------------
>
> _______________________________________________
> MediaWiki-l mailing list
> MediaWiki-l(a)lists.wikimedia.org
> http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>
>
> End of MediaWiki-l Digest, Vol 41, Issue 61
> *******************************************
>
Hi,
I have taken over as WikiSysop of an already-running wiki at my company.
Only the WikiSysop user is allowed to create new accounts.
* MediaWiki: 1.9.0
* PHP: 5.2.0 (apache2handler)
* MySQL: 5.0.27-standard-log
My email address is registered for the WikiSysop user. I am creating
accounts and changing passwords as described in the FAQ:
Go to Special:Userlogin while logged in as a sysop. (The link to this
page differs by language.) Enter a username and an email address, and
click the "by email" button. The account will be created with a random
password, which is then emailed to the given address.
When the user gets the new password via email, the message still
mentions a different email address than mine. How can I fix this, so
that users are given my email address if they have any kind of problem?
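If I understand the 1.9 configuration correctly, the sender address on those password mails comes from a setting in LocalSettings.php rather than from the sysop's own account, so a sketch of the fix would be (the addresses below are placeholders, substitute your own):

```php
<?php
// LocalSettings.php -- both addresses are placeholders.
$wgPasswordSender   = 'wikisysop@example.com';  // From: address on password emails
$wgEmergencyContact = 'wikisysop@example.com';  // contact address used in error/notification mail
```

If neither is set, MediaWiki falls back to a default built from the server name, which would explain the unfamiliar address your users are seeing.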
Thanks,
Achim
Just a tip on how to ride out a PHP 5.1 to PHP 5.2 upgrade on your
server.
The recent upgrade from PHP 5.1 to PHP 5.2 on my server caused trouble
for my MediaWiki 1.8.2 (and might affect any MediaWiki version older
than 1.9.1).
I got my MediaWiki working again like this:
* I upgraded to MediaWiki version 1.9.2 (most recent version when I did
this; 1.9.3 has since been released)
* I also had to uncomment the following line in MediaWiki's
LocalSettings.php to keep it running okay in this upgraded environment:
ini_set( 'memory_limit', '20M' );
Not sure why that became necessary but it did.
My MediaWiki runs on a commercial virtual web hosting account at
Dathorn.com. I moved it there to escape Dreamhost.com's oversold
virtual web hosting servers. I don't like the cPanel environment I'm
stuck in now but the server does seem to be snappy. So far so good.
http://wikigogy.org/Special:Version
--
Roger Chrisman :-) http://Wikigogy.org - free resources
for teachers of English as a second or foreign language
Hi all,
Unfortunately xml2sql.exe produces extended INSERTs (by n rows) when
generating MySQL's INSERT format. The problem is that when there are
huge pages in the XML dump (like some pages on Wikipedia), one has to
reduce the number of rows per "batch" when uploading with BigDump.php;
otherwise the upload will surely crash.
But reducing the number of rows per batch makes the upload very slow...
If the produced output could (optionally) be non-extended, I could more
easily* fix the INSERTs on my own (reshuffling the SQL by total *size*
instead of "extending" the inserts by a certain number of rows).
Suggested solution:
#1: Option: --noextended
or, for example, make extended inserts by size (KB) instead:
#2: Option: --extendedbykb 512
But the developer Tietew
(http://meta.wikimedia.org/wiki/User_talk:Tietew#Hyper_Estraider_extension)
doesn't seem to be very active. Can the source code be modified, and if
so, who is an expert in ANSI C? Or perhaps someone knows how to get in
contact with Tietew?
Regards,
// Rolf Lampa
[*] - more easily: In order to "tear apart" the extended INSERTs into
separate rows again (in order to pack them again later, but by total
size instead), I need to use three different regexes to find where to
split the INSERT for page.sql, revision.sql, and text.sql respectively.
But sometimes the text table gets messed up anyway (it tends to do so
when trying to split articles containing things like SQL code examples
and the like...).
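The size-based regrouping described above can be sketched in a few lines. This is an illustration only: it naively splits rows on "),(", which, as the footnote explains, breaks on article text that itself contains that character sequence; a robust version would need a real SQL tokenizer.

```python
# Sketch: regroup the rows of one extended INSERT by total size instead of
# by row count. Naive row split on "),(" -- breaks if a quoted value itself
# contains that sequence (the failure mode described in the footnote).

def regroup_by_size(insert_sql, max_bytes):
    """Split one extended INSERT into several, each roughly under max_bytes."""
    head, _, tail = insert_sql.partition(" VALUES ")
    rows = tail.rstrip(";").strip("()").split("),(")
    batches, current, size = [], [], 0
    for row in rows:
        row_sql = "(" + row + ")"
        if current and size + len(row_sql) > max_bytes:
            # Current batch would exceed the budget: flush it first.
            batches.append(head + " VALUES " + ",".join(current) + ";")
            current, size = [], 0
        current.append(row_sql)
        size += len(row_sql) + 1  # +1 for the joining comma
    if current:
        batches.append(head + " VALUES " + ",".join(current) + ";")
    return batches
```

For example, `regroup_by_size("INSERT INTO page (a,b) VALUES (1,'x'),(2,'y'),(3,'z');", 20)` yields two INSERT statements, the first carrying the first two rows and the second the remaining row.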