Hi,
I'm desperately trying to upgrade from 1.4rc1 to 1.6.8. I followed all
the information I could find on the web but am still having no success.
I managed to upgrade within the 1.4 series easily, up to 1.4.15,
but upgrading to anything >= 1.5 fails with:
...
Initialising "MediaWiki" namespace...
A database error has occurred
Query: SELECT old_text,old_flags FROM `text` WHERE old_id = '48' LIMIT 1
Function: Revision::loadText
Error: 1146 Table 'wikidb.text' doesn't exist (localhost)
Backtrace:
GlobalFunctions.php line 602 calls wfbacktrace()
Database.php line 473 calls wfdebugdiebacktrace()
Database.php line 419 calls databasemysql::reportqueryerror()
Database.php line 806 calls databasemysql::query()
Database.php line 825 calls databasemysql::select()
Revision.php line 667 calls databasemysql::selectrow()
Revision.php line 435 calls revision::loadtext()
Revision.php line 424 calls revision::getrawtext()
InitialiseMessages.inc line 197 calls revision::gettext()
InitialiseMessages.inc line 72 calls initialisemessagesreal()
updaters.inc line 822 calls initialisemessages()
update.php line 67 calls do_all_updates()
Apparently, there's no 'text' table in older databases.
How can I upgrade my database? I tried the scripts upgrade1_5.php and
update.php (in that order). The old database is already UTF-8 encoded.
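For reference, this is roughly how I ran the two scripts, from the
maintenance directory of the new tree with AdminSettings.php set up
(exact paths may differ on your installation):

  cd maintenance
  php upgrade1_5.php
  php update.php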
Any help much appreciated!
--
Johannes
Hi,
I have a weird problem on one wiki installation: Special:Recentchanges
isn't reflecting updates. If I run the rebuildrecentchanges.php
script I see them, but no subsequent changes appear until I run
rebuildrecentchanges.php again.
Does this look familiar to anybody?
MediaWiki 1.8.2
PHP 5.2.0
MySQL 4.1.22
Extensions:
DynamicPageList2 0.5.1
Inputbox
Thanks!
--
Bert van de Grift
http://www.vdgrift.org
GPG Key: http://www.vdgrift.org/0x306DE560.asc
Fingerprint: 3E79 1F71 6699 619E 8BCC B21A E1ED 76E0 306D E560
I tried to install the SemanticMediaWiki extension v0.6 (current) on
1.9.0 and got the same symptom as I reported with a vanilla
installation of 1.8.2 in the thread "Empty pages...", i.e. just HTML head
and body tags with no content.
Rob Church advised in that thread to try setting $wgDisableCounters =
true;, so I tried that here as well, despite the fact that the issue it was
meant to work around was supposedly fixed in 1.9.0. It did produce a
result, though: with that setting in place I got this error message:
Fatal error: Class 'SMW_LanguageEn' not found in
C:\Domains\intersp.org\wwwroot\extensions\SemanticMediaWiki\includes\SMW_GlobalFunctions.php on line 136
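For what it's worth, the lines I added to LocalSettings.php per those
instructions look roughly like the following; treat the path and the
argument to enableSemantics() as approximations rather than a verbatim
copy of my file:

  include_once( $IP . '/extensions/SemanticMediaWiki/includes/SMW_Settings.php' );
  enableSemantics( 'intersp.org' );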
I've gone over the semediawiki installation instructions several times
and I'm sure I've done everything correctly. I haven't yet looked for
any more appropriate forum to report this problem, but I thought that
people here might be interested in what it took to get that error
message. Here's the MW installation report again (for 1.8.2), let me
know if any other info would help:
PHP 5.1.6 installed
Found database drivers for: MySQL
Warning: PHP's register_globals option is enabled. Disable it if you can.
MediaWiki will work, but your server is more exposed to PHP-based security vulnerabilities.
PHP server API is cgi-fcgi; using ugly URLs (index.php?title=Page_Title)
Have XML / Latin1-UTF-8 conversion support.
PHP is configured with no memory_limit.
Have zlib support; enabling output compression.
Couldn't find Turck MMCache, eAccelerator or APC. Object caching functions cannot be used.
GNU diff3 not found.
Found GD graphics library built-in, image thumbnailing will be enabled if you enable uploads.
Installation directory: C:\Domains\****
Script URI path:
Environment checked. You can install MediaWiki.
Warning: $wgSecretKey key is insecure, generated with mt_rand(). Consider changing it manually.
Generating configuration file...
Database type: MySQL
Loading class: DatabaseMysql
Attempting to connect to database server as ****...success.
Connected to 4.1.21-community-max-nt-log
Database **** exists
Creating tables... using MySQL 4 table defs... done.
Initializing data...
Created sysop account ****.
Initialising "MediaWiki" namespace for language code en...
Done. Updated: 0, inserted: 1495, kept: 0.
Creating LocalSettings.php...
Installation successful! Move the config/LocalSettings.php file into the parent directory, then
follow this link to your wiki.
--
Robin Faichney
The Invisible Eye on Consciousness
<http://www.robinfaichney.org/>
Hello guys,
I have installed MediaWiki 1.6.5. For some time now, email notification
has not been working any more on user registration, password change or page
watch. How can I debug MediaWiki to see what's wrong?
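For reference, the email-related settings I have (values changed), plus
the debug log I intend to enable; these names are just my reading of 1.6's
DefaultSettings.php, so correct me if any of them is wrong:

  # email notification switches that used to work for me
  $wgEnableEmail     = true;
  $wgEnotifUserTalk  = true;
  $wgEnotifWatchlist = true;
  # dump MediaWiki's debug output to a file (path is only an example)
  $wgDebugLogFile    = '/tmp/mediawiki-debug.log';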
Greets,
Serguei
I'm running MW v1.8.2 and use this on the home page:
Special:Newimages
How do you set the number of images to display? I googled around and
found this:
{{Special:Newimages[/int]}}
where "int" is the number of new images, but it didn't work; it just
displayed the code.
Thanks.
Tim
> From: "Rob Church" <robchur(a)gmail.com>
> Date: January 2, 2007 6:33:39 AM EST
> To: "MediaWiki announcements and site admin list" <mediawiki-
> l(a)wikimedia.org>
> Subject: Re: [Mediawiki-l] Template to list 10 most recent created
> pages
> Reply-To: MediaWiki announcements and site admin list <mediawiki-
> l(a)Wikimedia.org>
>
>
> On 02/01/07, Vernon Thommeret <synotic(a)gmail.com> wrote:
>> The extension works well enough for me, but I just have a minor quip:
>> addHTML() seems to be ignoring any newlines I put in. Is there a way
>> to get around this? I also can't figure out how to remove the
>
> I'm not quite sure what you mean. Where are you adding newlines? A
> newline character is totally meaningless within HTML, remember - we
> have paragraph markup, and line breaks for that.
I'm just trying to present my markup a certain way. For instance,
instead of "<ol><li>", I'd format my code as "<ol>\n\t<li>". Using
$wgOut->addHTML, this generally works (and I've tested it in other
extensions), but it seems to have a really curious behavior in the
Newestpages extension. If I write $wgOut->addHTML("coca\ncola") in
the other extensions I've tried, I get the expected "coca" followed
by a newline, followed by "cola". However, with Newestpages I get:
"<p>coca</p>\ncola". It almost seems as if it is using addWikiText
instead of addHTML, but in my tests, addWikiText adds simply
"<p>cocacola</p>" given "coca\ncola".
All of this behavior is even stranger when I look at the only
definition for the addHTML function:
function addHTML( $text ) { $this->mBodytext .= $text; }
There seems to be nothing that would cause it to do anything but
append the text to the main text. Could the text be somehow filtered
before it gets sent? Again, I realize that this is a minor point, but
I like to keep clean, functional markup, and a lot of extraneous
"<p></p>"s and confusing markup display defeats that.
Thanks.
My "elevator speech" to executives and leaders in companies about the use of wikis tends to focus on the pain they feel about knowledge sharing, knowledge transfer and problem solving. I'm an applied sociologist (not a technical person:) who helps people adopt/integrate technology solutions into their daily work life.
Example... I have a media research client with a core group of baby boomers who have been with the company for 30-40 years. These folks are the brain trust of the company and they're ready to retire. They are launching a wiki to try to capture all that knowledge and not let it walk out the door. It's a low-cost option to pilot a new way of solving a complex problem. If it doesn't work, they can always go back to MS SharePoint (which hasn't worked really well for them in the past).
Your sales pitch should aim to 1) solve a problem executives see as a priority; 2) show how the return exceeds the perceived and real risks; 3) show how it can help the corporate culture, not hurt it; and 4) most importantly, show how it will help the bottom line.
Hope this helps. I learn a lot about wikis from reading the emails. Thank you!
Nancy Dailey
----- Original Message -----
From: mediawiki-l-request(a)lists.wikimedia.org
To: mediawiki-l(a)lists.wikimedia.org
Sent: Friday, January 12, 2007 7:00 AM
Subject: MediaWiki-l Digest, Vol 40, Issue 39
I need someone's help on this one.
I've pretty much locked down anonymous users on our site. They can view
any page, but the tabs are gone and editing is disabled. The one thing I
do want to allow is for anonymous users to send email via
Special:Emailuser. How do I open just that page/function to anonymous
users?
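For context, the lockdown part of my LocalSettings.php looks roughly
like this (simplified):

  # anonymous users may read, but not edit or create accounts
  $wgGroupPermissions['*']['read']          = true;
  $wgGroupPermissions['*']['edit']          = false;
  $wgGroupPermissions['*']['createaccount'] = false;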
Thanks!
Last night I asked Brion whether he could turn on Rob's
Special:Contributors extension on Wikimedia sites, and Domas colorfully
indicated that it would be a Very Bad Idea(TM). The data for the extension
can be retrieved from the revision table, but due to the existing topology
of that table and the current load on it, doing so would require a very
expensive query. He indicated that another option would be to
create a new database table with page, user and edit count data. I've kept
thinking about the idea, and have modified Domas's suggestion slightly:
mysql> describe contributors;
+--------------------+-----------------+------+-----+---------+----------------+
| Field              | Type            | Null | Key | Default | Extra          |
+--------------------+-----------------+------+-----+---------+----------------+
| co_id              | int(8) unsigned | NO   | PRI | NULL    | auto_increment |
| co_page            | int(8) unsigned | NO   | MUL | NULL    |                |
| co_user            | int(5) unsigned | NO   | MUL | 0       |                |
| co_user_text       | varchar(255)    | NO   |     | NULL    |                |
| co_editcount       | int(11)         | NO   |     | NULL    |                |
| co_editcount_minor | int(11)         | NO   |     | NULL    |                |
| co_timestamp       | char(14)        | NO   |     | NULL    |                |
+--------------------+-----------------+------+-----+---------+----------------+
Each row in the table would correspond to a row in the output of
http://vs.aka-online.de/cgi-bin/wppagehiststat.pl, to pick an example.
co_id is the primary key of the table, and co_page links to the usual
suspect, page.page_id. co_user and co_user_text likewise link to user_id and
user_name. co_editcount is the number of edits the user has made to the
page, and co_editcount_minor is the number of minor edits they made to the
page (added because someone is going to ask for it sooner or later).
co_timestamp is the timestamp at which the row was generated, added in
case a fancy algorithm to detect stale rows is built in the future.
As to how to initially populate the table: well, a bulk initialization
script is one option, but Domas didn't like that either. ;) :P What I
suggested instead is to populate the table on demand and maintain it from
then on; after a night's sleep, here is how I would refine it (a rough code
sketch of the page-edit case follows the list):
* On page creation, generate a row for the page and the user.
* On page edit, one of three cases:
** If a row for the page and user already exists, UPDATE the
co_editcount and co_editcount_minor fields as necessary.
** If rows for the page exist, but none points to the user, generate a new
row.
** If there are no rows pointing to the page, create the necessary rows from
the data in the revision table.
* On a query from [[Special:Contributors]], if there are no rows pointing to
the page, generate them from revision, then return the expected results.
Otherwise, just return the results.
* On page deletion/undeletion/oversight, SELECT all the rows pointing to the
particular page and DELETE them from the table, essentially invalidating the
cache for that page.
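Here is the promised sketch of the page-edit case, written against
MediaWiki's database abstraction layer; the function itself and the way it
would be hooked in are illustrative only, not an actual patch:

  function updateContributorsOnEdit( $pageId, $userId, $userText, $isMinor ) {
      $dbw = wfGetDB( DB_MASTER );
      // look for an existing row for this page/user pair
      $row = $dbw->selectRow( 'contributors',
          array( 'co_id', 'co_editcount', 'co_editcount_minor' ),
          array( 'co_page' => $pageId, 'co_user' => $userId,
                 'co_user_text' => $userText ), __METHOD__ );
      if ( $row ) {
          // row exists: bump the counters
          $dbw->update( 'contributors',
              array(
                  'co_editcount'       => $row->co_editcount + 1,
                  'co_editcount_minor' => $row->co_editcount_minor + ( $isMinor ? 1 : 0 ),
                  'co_timestamp'       => $dbw->timestamp(),
              ),
              array( 'co_id' => $row->co_id ), __METHOD__ );
      } else {
          // no row yet for this page/user: create one
          // (the "no rows at all for the page" case would instead backfill
          // from the revision table, omitted here)
          $dbw->insert( 'contributors', array(
              'co_page'            => $pageId,
              'co_user'            => $userId,
              'co_user_text'       => $userText,
              'co_editcount'       => 1,
              'co_editcount_minor' => $isMinor ? 1 : 0,
              'co_timestamp'       => $dbw->timestamp(),
          ), __METHOD__ );
      }
  }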
As to why the change is desired: it is described in detail at
http://bugzilla.wikimedia.org/show_bug.cgi?id=7988. This could also help to
implement some long-standing suggestions to enhance Wikimedia projects'
credibility and citability. For example, Roy Rosenzweig of George Mason
University made a few recommendations in his article "Can History be Open
Source? Wikipedia and the Future of the Past"
(http://chnm.gmu.edu/resources/essays/d/42, and a good read, by the way),
such as putting the total number of edits made to a page into every page's
footer (next to [[MediaWiki:Lastmodifiedat]]). Also, with a JOIN with the
user table, a percentage of edits made by active editors can be displayed to
the viewer, which is another suggestion by Rosenzweig.
There are probably more efficient ways to do this, but I just wanted to get
the ball rolling... any comments? Should it be done?
Titoxd.
It's "easy" to mirror a MediaWiki from one primary server to a number
of secondary servers, but is it possible to have multiple primary servers?
Example: 10 servers, and users can make changes on ANY of the 10
servers. Every night, the servers rsync to each other as follows (a rough
pseudo-code sketch follows the list):
1. If server X's version hasn't changed all day and server Y's version
HAS changed, server X accepts server Y's version.
2. If both server X's and server Y's versions have changed, automatic
CVS-style merging is used to resolve the changes.
3. If CVS-style merging yields a conflict, the site maintainer is
notified and must merge the two files manually (I'm thinking of
creating a small site, so this shouldn't be too painful).
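Sketched as pseudo-code, the per-page rule above would look something
like the following; every helper function here is hypothetical and only
names a step from the list:

  // pseudo-PHP sketch of the nightly two-server reconciliation
  function nightlySync( $page, $serverX, $serverY ) {
      if ( !changedToday( $serverX, $page ) && changedToday( $serverY, $page ) ) {
          copyPage( $serverY, $serverX, $page );                 // rule 1
      } elseif ( changedToday( $serverX, $page ) && changedToday( $serverY, $page ) ) {
          $merged = threeWayMerge( yesterdaysVersion( $page ),
              currentVersion( $serverX, $page ),
              currentVersion( $serverY, $page ) );               // rule 2
          if ( $merged === false ) {
              notifyMaintainer( $page );                         // rule 3: manual merge
          } else {
              storeOnBoth( $merged, $serverX, $serverY, $page );
          }
      }
  }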
I realize the rules above only work for 2 servers -- is there a clever
version of this for n servers (n>2)?
--
We're just a Bunch Of Regular Guys, a collective group that's trying
to understand and assimilate technology. We feel that resistance to
new ideas and technology is unwise and ultimately futile.