Hi,
I upgraded a running MediaWiki 1.2.4 installation with a backed-up
database, using the config/index.php web installer. The installation
finished smoothly, but the resulting wiki shows a problem: the start
page is blank, and the log shows this:
[18-Jun-2004 18:54:19] PHP Fatal error: Call to a member function on a
non-object in /usr/local/ftp/gpintra2/wiki2/includes/Skin.php on line 1693
The mentioned line is:
/*static*/ function makeArticleUrl ( $name, $urlaction='' ) {
    $title = Title::newFromText( $name );
    $title = $title->getSubjectPage();
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    $this->checkTitle($title, $name);
    return $title->getLocalURL( $urlaction );
}
I could reproduce this with a second installation using a copy of the
same 1.2.4 database. Meanwhile, a fresh installation of 1.3.0beta3 on
the same machine works fine.
Any ideas?
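[Editorial note: the crash means Title::newFromText() returned a non-object (null) for whatever name was passed in, likely because the upgraded database handed the skin an invalid title, and the very next line calls a method on it. A minimal sketch of the missing guard, in Python for illustration; the names new_from_text and make_article_url mirror the PHP but are hypothetical stand-ins:]

```python
def new_from_text(name):
    """Stand-in for Title::newFromText(): returns None for invalid input."""
    return name.strip() or None

def make_article_url(name, urlaction=""):
    title = new_from_text(name)
    if title is None:  # the guard Skin.php's makeArticleUrl lacks
        raise ValueError("invalid title: %r" % name)
    return "/wiki/index.php?title=%s&%s" % (title, urlaction)
```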
Thanks, Thommie
--
---------------------------------------------------------------------
THOMAS M. ROTHER * n e t z w i s s e n * D-73728 Esslingen
F.R. Germany, European Union * mailto:t.rother@netzwissen.de
http://www.netzwissen.de * GPG Key from http://wwwkeys.de.pgp.net
Fingerprint B208 E204 4249 4635 19B9 B691 3E73 C8B9 1229 DE4C
---------------------------------------------------------------------
Dear listmembers,
I've been using MediaWiki for a few months now and it's working pretty
well. Now I would like to build a LaTeX source file from several wiki pages.
I have several different articles (such as "tux", "linux", "gpl", "oml")
and want to combine them into one LaTeX file, and later into one PDF file
with graphics. The PDF file should have a \tableofcontents.
I would also like to regenerate this automatically and often, to keep the
manual up to date.
Does anyone have a good idea?
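[Editorial note: one possible approach, sketched in Python. All names here are hypothetical, and the wikitext-to-LaTeX conversion itself is assumed to be handled by a separate external tool that leaves one <page>.tex file per article on disk; this sketch only generates the master file with the \tableofcontents, which a cron job could regenerate and run through pdflatex:]

```python
def build_master(pages, title="Manual"):
    """Build a master LaTeX document that \input's one file per article."""
    lines = [
        r"\documentclass{article}",
        r"\usepackage{graphicx}",  # for the inline graphics
        r"\begin{document}",
        r"\title{%s}" % title,
        r"\maketitle",
        r"\tableofcontents",
    ]
    for page in pages:
        lines.append(r"\section{%s}" % page)
        lines.append(r"\input{%s}" % page)  # expects <page>.tex on disk
    lines.append(r"\end{document}")
    return "\n".join(lines)

master = build_master(["tux", "linux", "gpl", "oml"])
```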
Greets,
Joachim
--
No "software patents" in Europe
We are protesting against "software patents" in Europe.
The EU Commission and the Council of Ministers are working behind the
scenes toward unlimited patentability of software, as demanded by
international corporations and patent attorneys. In doing so they ignore
the democratic decision of the European Parliament of 24 September 2003,
which enjoys the support of more than 300,000 citizens, 2,000,000 small
and medium-sized businesses, and dozens of economists and scientists.
You can also support us by signing the FFII e.V.'s Call to Action II.
http://www.ffii.org/ffii-cgi/aktiv?f=euparl&l=de
Many thanks,
Joachim Schiele
Hello,
I was very pleased to see another beta today, as I couldn't get beta 3
to work due to problems with PEAR: it always complained right at
installation about the PEAR directories and so on.
Now, the installation script said everything was fine (except for
"set_time_limit(): Cannot set time limit in safe mode...", but that
didn't stop me from installing 1.2.6 without problems).
So I moved the LocalSettings.php to the httpdoc-dir and clicked on the
link to go to my main page.
Sadly, I got this error message:
[pear_error: message="failed to open stream: No such file or directory"
code=0 mode=return level=notice prefix="" info=""]
Google revealed that I am not the first to ask about this message with
MediaWiki, but that was the only reference to this particular error
message.
Does anyone have an idea how to solve this on a hosted server?
--
Kind Regards,
Marko Faas
This is not exactly MediaWiki-l related per se, so please ignore the
reply-to and answer me off-list if you can.
The PNG logo on my MonoBook installation, like the one on the Wikipedia
sites (the jigsaw globe), has an alpha channel and works equally nicely
in Gecko-based browsers and in MS Internet Explorer. I can't seem to
create one for my own site: the best I get is a PNG that is transparent
in Mozilla but has a weird grey background in MSIE. Can anyone clue me
in on where I went wrong, and what the correct procedure is to get a
cross-browser compatible transparent PNG? I have both Gimp and Adobe
tools available.
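[Editorial note: the grey background in MSIE comes from the browser ignoring the PNG's alpha channel. One common workaround is to flatten the transparency onto the page's background colour before export, so the image carries no alpha at all. The per-pixel math that "flatten" performs is sketched below in Python; the colour values are purely illustrative:]

```python
def flatten_pixel(fg, alpha, bg):
    """Composite one RGB pixel with alpha in [0.0, 1.0] onto an opaque
    background colour -- what flattening onto the page colour does."""
    return tuple(round(f * alpha + b * (1 - alpha)) for f, b in zip(fg, bg))

# A 50%-transparent black pixel over a MonoBook-ish light grey (#f9f9f9):
flattened = flatten_pixel((0, 0, 0), 0.5, (249, 249, 249))
```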
Thanks,
Ira.
--
Lucky dog
Ira Abramov
http://ira.abramov.org/email/
Some compatibility fixes for PHP 4.1.2 and 4.2.x; installer checks for
missing MySQL support; and many various things fixed. Anyone running a
public server on 1.3.0beta is strongly recommended to upgrade to this
release, as a potential JavaScript injection attack in earlier betas has
been fixed. (1.2.x is not vulnerable.)
Release notes:
https://sourceforge.net/project/shownotes.php?release_id=248913
Download:
http://prdownloads.sourceforge.net/wikipedia/mediawiki-1.3.0beta4.tar.gz?do…
Report bugs at:
https://sourceforge.net/tracker/?group_id=34373&atid=411192
Play "Stump the Developers" live on IRC:
#mediawiki on irc.freenode.net
Help mailing list:
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
(CVS note: this is tagged as REL1_3_0beta4a. We tagged a REL1_3_0beta4
last week but didn't quite release it.)
Changes from beta3 include but are not limited to:
* Install checks for missing PHP MySQL support
* Fixed broken PHP 4.1.2 compatibility functions
* PHP 4.2 vs MonoBook skin title encoding
* Offsets
* Mysterious install fixes
* Fix for use with html-tidy
* (optional) experimental load balancer
* Fixes for some broken queries from last beta
* Improved error reporting in command line scripts and elsewhere
* Account creation per IP throttle (requires memcached)
* Copyright info on uploads (optional)
* Hook for custom spam check filter function
* Improved HTML escaping of user names in some places
* Removed < and > from legal title chars (shouldn't have been added)
* Recentchanges reports move-over-redirect more clearly
* Search index updates no longer use REPLACE DELAYED, as it causes more
problems than it solves
* Copyright notice was missing on old revisions, fixed
* More bad title checks in skin
* Unused skins no longer loaded (reduces memory usage)
* "newbies" mode on Special:Contributions
* Bureaucrat flag put back into Special:Makesysop
* Special:Undelete fix for titles with apostrophes
* Upload warning on existing filename
* $wgWhitelistRead fix
* Titles with "./" are allowed again as long as it's not at the
beginning or following a "/"
* Language file updates for Hungarian, Japanese, Korean, Portuguese
* Spelling corrections in language names
* rebuildtextindex.php updated
* SQL fix to blobs table
* TeX: suppress dvips output from logs
* Style fixes
-- brion vibber (brion @ pobox.com)
Hi all,
On Win2k, PHP 4.3.7 and IIS I have successfully set up a development
version of MediaWiki 1.3.02 beta with full-text search automatically
enabled.
On our server, with the same configuration, full-text search does not work.
If I run rebuildall.php on the command line I get the following hint:
"rebuildlinks.inc needs to be updated to the new schema"
Any hint about what to do would be appreciated.
Best regards
Andreas
Hello,
I found a bug in Special:Randompage. It goes like this: the algorithm
picks a random number, then checks the database for a page with a higher
cur_random (a random value assigned to each page), and takes the first
page that matches. The problem is that I only have 2 pages, both with a
very low cur_random value, so I often get redirected to an empty page
(404). I don't know if there's a way to fix this, or a workaround (other
than "make more pages"). Maybe, instead of doing '$rt = "" ' in the else
branch, you could do another query, but this time with "AND cur_random <
$randstr AND cur_random > 0" (to avoid the homepage).
Final code: (in includes/SpecialRandompage.php)
...
} else {
    $sqlget = "SELECT cur_id,cur_title
        FROM cur USE INDEX (cur_random)
        WHERE cur_namespace=0 AND cur_is_redirect=0
        AND cur_random < $randstr
        AND cur_random > 0
        ORDER BY cur_random DESC
        LIMIT 1";
    $res = wfQuery( $sqlget, DB_READ, $fname );
    if( $s = wfFetchObject( $res ) ) {
        $rt = wfUrlEncode( $s->cur_title );
    } else {
        # No articles?!
        $rt = "";
    }
}
...
This solution isn't good either; it nearly always picks the same page.
There should be a better way :-)
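[Editorial note: one way to avoid both the empty result and the always-the-same-page bias is to wrap around: if no page has cur_random at or above the random number, take the page with the lowest cur_random instead of returning nothing. A sketch of that selection logic, in Python rather than SQL, with hypothetical names:]

```python
import bisect

def pick_random_page(pages, r):
    """pages: list of (cur_random, title) pairs; r: random float in [0, 1).

    Treats cur_random values as points on a circle, so a draw that lands
    above every page's value wraps to the lowest one instead of failing.
    """
    ordered = sorted(pages)
    values = [v for v, _ in ordered]
    i = bisect.bisect_left(values, r)  # first page with cur_random >= r
    if i == len(ordered):              # none above r: wrap around
        i = 0
    return ordered[i][1]
```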
I also have a problem when I edit a page and then press Save: I'm told
there was an edit conflict, but when I return to the page, the update
has been applied. I think it's some problem with my session handling
(I'm using a Windows web server, not Apache).
Any comments?
Tom
Hello! I'm new to MediaWiki and just installed it. It works fine, but
now I'd like to have the same beautiful skins as on Wikipedia, i.e. the
MonoBook skin.
I looked at this page, http://meta.wikipedia.org/wiki/Skins, then
downloaded Smarty-2.6.2 and put it in the right place; then I made the
LocalSettings.php changes (albeit only the Smarty part -- do I need to
get PHPTal too?). So here's what I have now in my LocalSettings.php:
set_include_path(get_include_path() . ":" . $IP.'/Smarty-2.6.2/libs');
$wgUseSmarty = true;
OK... and that didn't change anything :) What am I supposed to do now to
be able to choose the MonoBook skin?
Thank you very much for your help.
BoD
In a message dated 6/21/2004 5:02:01 PM Eastern Standard Time,
brion(a)pobox.com writes:
> Note that currently we don't have diff-based storage; when you make a
> change to a page the entire previous revision is stored in whole.
> (Consider enabling $wgCompressOld if you have zlib support in PHP; this
> will reduce old text requirements by roughly half.)
>
> -- brion vibber (brion @ pobox.com)
Currently, my group's wiki is small. There are a few of us actively
contributing right now, but that will probably change soon. The handful
of volunteers who have been putting in content have also been learning
the way of the wiki as they do so, making multiple edits on some rather
lengthy articles and innocently eating up storage space.
At the moment, our wiki is restricted to only registered users being able to
contribute and only the sysop can create a registered user account.
We had attempted to research the wiki's overhead requirements when
deciding whether or not to buy more disk space from our provider. During
that investigation we used the Wikipedia statistics and charts on space.
Given the 590MB May 22, 2004 number and the high number of articles the
db had, it never occurred to us that Wikipedia was storing full copies
of all versions of each article. We must have been reading the wrong
statistics.
Do the Wikipedia administrators remove history from their wiki in order
to preserve space? If so, how is this done? Is there some sort of
'export only the latest version of each article' option: clear the db,
and then import the latest versions back?
Our administrator has set "$wgCompressRevisions = true;" since your
message (above) -- will that compress only the revisions made since the
flag was turned on, or the previous revisions as well?
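[Editorial note: a rough sanity check of the "roughly half" figure Brion quotes. Since each old revision is stored as full text, compression applies zlib to each revision body independently. The sample text below is made up for the demonstration, and repetitive text compresses far better than typical article prose, so real savings will be smaller:]

```python
import zlib

# Each old revision is stored in full; gzip/zlib-compressing that text
# is what the compress-old-revisions option effectively does.
revision = ("The wiki stores every previous revision as full text. " * 40).encode()
compressed = zlib.compress(revision)
ratio = len(compressed) / len(revision)
```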
I appreciate everyone's patience in this; I'm sort of the go-between
right now. Hopefully our administrator will join this list herself and
can pose the questions more 'technically'. :)
Our versions:
MediaWiki: 1.3.0beta2
PHP: 4.3.4 (apache)
MySQL: 4.0.18
Take care,
Debi
I tried putting $wgUseDatabaseMessages = false; in LocalSettings.php,
but it didn't seem to have an effect.
Reloading the front page still takes about 15 seconds; the best result
I've had is 8 seconds. Opening other pages takes about 4 seconds, and
using the back button to return to the main page is quick (less than 1
second).
I went to the test server and it's much quicker. Is a Squid cache in use
on the test server?
The wiki: http://www.bergenkitesurfingklubb.no/wiki
Rune
LocalSettings.php:
<?php
# This file was automatically generated by the MediaWiki installer.
# If you make manual changes, please keep track in case you need to
# recreate them later.
$IP = "/home/BergenKi/www/wiki";
ini_set( "include_path", ".:$IP:$IP/includes:$IP/languages" );
include_once( "DefaultSettings.php" );
# If PHP's memory limit is very low, some operations may fail.
# ini_set( 'memory_limit', '20M' );
if ( $wgCommandLineMode ) {
    if ( isset( $_SERVER ) && array_key_exists( 'REQUEST_METHOD', $_SERVER ) ) {
        die( "This script must be run from the command line\n" );
    }
} else {
    ## Compress output if the browser supports it
    if( !ini_get( 'zlib.output_compression' ) ) ob_start( 'ob_gzhandler' );
}
$wgSitename = "Wiki";
$wgScriptPath = "/wiki";
$wgScript = "$wgScriptPath/index.php";
$wgRedirectScript = "$wgScriptPath/redirect.php";
## If using PHP as a CGI module, use the ugly URLs
$wgArticlePath = "$wgScript/$1";
# $wgArticlePath = "$wgScript?title=$1";
$wgStylePath = "$wgScriptPath/stylesheets";
$wgStyleDirectory = "$IP/stylesheets";
$wgLogo = "$wgStylePath/images/wiki.png";
$wgUploadPath = "$wgScriptPath/images";
$wgUploadDirectory = "$IP/images";
$wgEmergencyContact = "styre(a)BergenKitesurfingklubb.no";
$wgPasswordSender = "styre(a)BergenKitesurfingklubb.no";
$wgDBserver = "xxxxx.no";
$wgDBname = "xxxxx";
$wgDBuser = "xxxxx";
$wgDBpassword = "xxxxx";
## To allow SQL queries through the wiki's Special:Asksql page,
## uncomment the next lines. THIS IS VERY INSECURE. If you want
## to allow semipublic read-only SQL access for your sysops,
## you should define a MySQL user with limited privileges.
## See MySQL docs: http://www.mysql.com/doc/en/GRANT.html
#
# $wgAllowSysopQueries = true;
# $wgDBsqluser = "sqluser";
# $wgDBsqlpassword = "sqlpass";
$wgDBmysql4 = $wgEnablePersistentLC = true;
## To enable image uploads, make sure the 'images' directory
## is writable, then uncomment this:
$wgDisableUploads = false;
$wgUseImageResize = true;
$wgUseImageMagick = true;
$wgImageMagickConvertCommand = "/usr/bin/convert";
## If you have the appropriate support software installed
## you can enable inline LaTeX equations:
# $wgUseTeX = true;
$wgMathPath = "{$wgUploadPath}/math";
$wgMathDirectory = "{$wgUploadDirectory}/math";
$wgTmpDirectory = "{$wgUploadDirectory}/tmp";
$wgLocalInterwiki = $wgSitename;
$wgLanguageCode = "en";
$wgUseLatin1 = false;
$wgProxyKey =
"80e02647ccf74a9c7e0a63e14c48928e9f7647e4147f15ea2751726c26b43dfa";
## Default skin: you can change the default skin. Use the internal
## symbolic names, ie 'standard', 'nostalgia', 'cologneblue', 'monobook':
# $wgDefaultSkin = 'monobook';
## For attaching licensing metadata to pages, and displaying an
## appropriate copyright notice / icon. GNU Free Documentation
## License and Creative Commons licenses are supported so far.
# $wgEnableCreativeCommonsRdf = true;
$wgRightsPage = ""; # Set to the title of a wiki page that describes your license/copyright
$wgRightsUrl = "";
$wgRightsText = "";
$wgRightsIcon = "";
# $wgRightsCode = ""; # Not yet used
$wgUseDatabaseMessages = false;
?>
-----Original message-----
From: Brion Vibber [mailto:brion@pobox.com]
Sent: 22 June 2004 18:27
To: MediaWiki announcements and site admin list
Subject: Re: [Mediawiki-l] Speed issues
Rune Tomren wrote:
> However, i find the wiki very slow most of the times. sometimes i have
> to wait 25 seconds for a page to load.
> the worst is when entering the wiki, sometimes i have to wait about 20
> seconds.
> I have a static web on the same server and that is very fast indeed.
> File transfers is quick as well.
> When using the official wiki beta testserver sometimes the delay is
> equally bad there.
This might be the recaching of the MediaWiki message namespace, in
particular the caching stage which may take a while initially. Try
disabling database messages: $wgUseDatabaseMessages = false;
> I know this is beta but I'm curious to know if this is the expected
> speed or if one can expect an increase in speed when the wiki is out
> of beta, or maybe my server space has too little brain allocated?
Is it *frequently* that slow or once a day?
-- brion vibber (brion @ pobox.com)