We have upgraded MediaWiki to:
MediaWiki 1.15.3
PHP 5.3.2
MySQL 5.0.90-community
We now get this error message as the first line of every page.
Searching the web suggested that upgrading beyond PHP 5.3.1 and
MediaWiki 1.15.0 should solve the problem, but it does not. Anyone got
any clues?
bobj
-----------------------------------
Dr Bob Jansen
Turtle Lane Studios Pty Ltd
PO Box 26, Erskineville NSW 2043, Australia
Ph: +61-414 297 448
Skype: bobjtls
http://www.turtlelane.com.au
In line with the Australian anti-spam legislation, if you wish to
receive no further email from me, please send me an email with the
subject "No Spam"
In an extension, I'd like to detect when the "Go" button has been used to jump directly to an article that exists. Any ideas how to do this? Here is what I have tried so far:
- The Special:Search page (file specials/SpecialSearch.php) does an immediate redirect to the target, with no hook provided.
- There is a hook SpecialSearchNogomatch that detects the use of the "Go" button but only when the target doesn't exist. That's the opposite of what I need: in my case, the target does exist. This hook is called too late anyway (after the redirect).
- The hooks SpecialSearchResults and SpecialSearchNoResults do not fire in this situation.
- Maybe use a hook in the Title or Article class, but how to detect only the Go button? I don't want to include articles that have been visited by URL or by clicking a wiki link.
By brute force, I could probably make my extension run on every page load (ugh), tracking if I'm on Special:Search with a query string parameter "Go=go", and then use the SpecialSearchNogomatch hook to exclude searches that fail. But that feels like an inefficient hack.
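[For what it's worth: newer MediaWiki releases (1.16+, if I remember right) add a SpecialSearchGomatch hook that fires exactly at this point, just before Special:Search redirects a successful "Go" match; on 1.15 the equivalent hook call would have to be patched into specials/SpecialSearch.php by hand. A LocalSettings.php sketch under that assumption -- the function name efTrackGoButton is just an example:]

```php
<?php
// LocalSettings.php fragment -- a sketch, assuming the SpecialSearchGomatch
// hook is available (1.16+; on 1.15 a similar wfRunHooks() call would need
// to be added to the goResult path in specials/SpecialSearch.php).
$wgHooks['SpecialSearchGomatch'][] = 'efTrackGoButton';

/**
 * Called just before Special:Search redirects a "Go" match to an
 * existing page, and only in that case -- not for plain links or URLs.
 * @param $title Title  the existing target page
 * @return bool  true to let the redirect proceed normally
 */
function efTrackGoButton( &$title ) {
	// Replace this with whatever tracking your extension needs.
	wfDebugLog( 'gobutton', 'Go button hit: ' . $title->getPrefixedText() );
	return true;
}
```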
Thanks for any ideas....
DanB
Welcome to mediawiki-l. This mailing list exists for discussion and questions
about the MediaWiki software[0]. Important MediaWiki-related announcements
(such as new versions) are also posted to this list.
Other resources.
If you only wish to receive announcements, you should subscribe to
mediawiki-announce[1] instead.
MediaWiki development discussion, and all Wikimedia technical questions, should
be directed to the wikitech-l[2] mailing list.
Several other MediaWiki-related lists exist:
- mediawiki-api[5] for API discussions,
- mediawiki-enterprise[6] for discussion of MediaWiki in the enterprise,
- mediawiki-cvs[7] for notification of commits to the Subversion repository,
- mediawiki-i18n[8] for discussion of MediaWiki internationalisation support,
- wikibugs-l[9] for notification of changes to the bug tracker.
List administrivia (unsubscribing, list archives).
To unsubscribe from this mailing list, visit [12]. Archives of previous postings
can be found at [3].
This list is also gatewayed to the Gmane NNTP server[4], which you can use to
read and post to the list.
Posting to the list.
Before posting to this list, please read the MediaWiki FAQ[10]. Many common
questions are answered here. You may also search the list archives to see if
your question has been asked before.
Please try to ask your question in a way that enables people to answer you.
Provide all relevant details, explain your problem clearly, etc. You may
wish to read [13], which explains how to ask questions well.
To post to the list, send mail to <mediawiki-l(a)lists.wikimedia.org>. This is a
public list, so you should not include confidential information in mails you
send.
When replying to an existing thread, use the "Reply" or "Followup" feature of
your mail client, so that clients that understand threading can sort your
message properly. When quoting other messages, please use the "inline" quoting
style[11], for clarity.
When creating a new thread, do not reply to an existing message and change the
subject. This will confuse people's mail readers, and will result in fewer
people reading your mail. Instead, compose a new message for your post.
Messages posted to the list have the "Reply-To" header set to the mailing list,
which means that by default, replies will go to the entire list. If you are
posting a reply which is only interesting to the original poster, and not the
list in general, you should change the reply to only go to that person. This
avoids cluttering the list with irrelevant traffic.
About this message.
This message is posted to the list once per week by <river(a)wikimedia.org>.
Please contact me if you have any questions or concerns about this mailing.
References.
[0] http://www.mediawiki.org/
[1] http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[2] http://lists.wikimedia.org/mailman/listinfo/wikitech-l
[3] http://lists.wikimedia.org/pipermail/mediawiki-l/
[4] http://dir.gmane.org/gmane.org.wikimedia.mediawiki
[5] http://lists.wikimedia.org/mailman/listinfo/mediawiki-api
[6] http://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise
[7] http://lists.wikimedia.org/mailman/listinfo/mediawiki-cvs
[8] http://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
[9] http://lists.wikimedia.org/mailman/listinfo/wikibugs-l
[10] http://www.mediawiki.org/wiki/FAQ
[11] http://en.wikipedia.org/wiki/Posting_style#Inline_replying
[12] http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
[13] http://www.catb.org/~esr/faqs/smart-questions.html
Hello, I'd like to insert the code provided by Google Analytics on the main
page of the wiki, for monitoring. In which file should I insert the
code? Thanks
--
{+}Nevinho
Join the Collaborative Movement http://sextapoetica.com.br !!
> Hello, I'd like to insert the code provided by Google Analytics on the main
> page of the wiki, for monitoring. In which file should I insert the
> code? Thanks
>
>
Add your Google Analytics code to your skin template file (for example
MonoBook.php), just before the closing </body></html> tags.
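[Concretely, it would look something like the sketch below at the very end of the skin template's output. The script itself should be copied from your own Google Analytics admin page; the UA-XXXXXXX-X account ID here is only a placeholder.]

```php
<?php /* End of skins/MonoBook.php output -- a sketch; paste your own
         snippet from Google Analytics here, the account ID below is a
         placeholder */ ?>
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXX-X']); // placeholder ID
  _gaq.push(['_trackPageview']);
  (function() {
    // Standard asynchronous ga.js loader from Google Analytics.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ?
              'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
</body></html>
```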
Geo
--
George Alexandru Dudău
CORE IT MEX - Grupul de firme MOBEXPERT
Network administrator | proiecte web | proiecte speciale
Tel : +40 21 2421040-1139
http://betha.lx.ro
http://www.itmex.ro
Are there any extensions to the MediaWiki API to do semantic property
queries, e.g. to return a list of pages where ?prop=value or some such?
--
MK <halfcountplus(a)intergate.com>
Today I tried Google's "Page Speed" tool. My server is running Apache with mod_deflate, but the tool reports:
* Compressing /skins/common/wikibits.js?269 could save 20.6KiB (68% reduction)
* Compressing /skins/common/ajaxwatch.js?269 could save 3.8KiB (67% reduction)
* Compressing /skins/common/ajax.js?269 could save 2.8KiB (60% reduction).
even though I have:
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/plain text/xml application/xml application/xhtml+xml text/javascript text/css application/x-javascript image/gif
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch bMSIE !no-gzip !gzip-only-text/html
DeflateCompressionLevel 5
SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown downgrade-1.0 force-response-1.0
</IfModule>
As soon as I use the URL without the query parameter, I get it compressed:
/skins/common/ajax.js?269 - not compressed
/skins/common/ajax.js - compressed
Does anyone have a tip on how to get these files compressed even with the parameter?
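[mod_deflate does not normally look at the query string at all, so this behavior is odd; something else in the chain may be selecting content by the full request URI. One workaround worth trying is to force the filter by filename rather than by MIME type, since FilesMatch tests only the path and ignores any query string. A sketch, to be adapted to your setup:]

```apache
# Force DEFLATE for .js and .css by filename; FilesMatch matches the
# file path only, so "?269"-style query parameters cannot affect it.
# A sketch -- adjust the extensions and merge with your existing config.
<IfModule mod_deflate.c>
    <FilesMatch "\.(js|css)$">
        SetOutputFilter DEFLATE
    </FilesMatch>
</IfModule>
```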
Thanks
Jan
Hi,
I would like to use separate style sheets for different namespaces. I found this in MonoBook.php:
<?php if($this->data['pagecss']) { ?><style type="text/css"><?php $this->html('pagecss') ?></style> (line 96 in 1.15)
Could there be a similar function? Something like: "If there is a MediaWiki page called {{NAMESPACE}}.css, use it"...?
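[As far as I know there is no built-in per-namespace stylesheet in 1.15, but a small hook can emulate the "MediaWiki:{{NAMESPACE}}.css" idea without touching MonoBook.php. A LocalSettings.php sketch -- the function name efNamespaceCss is just an example, and the hook signature should be verified against your version:]

```php
<?php
// LocalSettings.php fragment -- a sketch: if a page named
// MediaWiki:<Namespace>.css exists, load it as an extra stylesheet,
// the same way MediaWiki serves its own on-wiki CSS (action=raw).
$wgHooks['BeforePageDisplay'][] = 'efNamespaceCss';

function efNamespaceCss( &$out, &$sk ) {
	global $wgTitle;
	// The main namespace has an empty name; skip it to avoid
	// looking for a page literally called "MediaWiki:.css".
	if ( $wgTitle->getNamespace() === NS_MAIN ) {
		return true;
	}
	$cssPage = Title::makeTitle( NS_MEDIAWIKI,
		$wgTitle->getNsText() . '.css' );
	if ( $cssPage && $cssPage->exists() ) {
		$out->addLink( array(
			'rel'  => 'stylesheet',
			'type' => 'text/css',
			'href' => $cssPage->getLocalURL( 'action=raw&ctype=text/css' ),
		) );
	}
	return true;
}
```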
thanks
Bernhard
I ran into a strange behavior of the parser (MediaWiki 1.15.0):
In our wiki a page is built by calling two nested templates, which call
several semantic inline queries, which in turn call a format template to
show the results. One of the format template's arguments is a list of
names (of people), for which another template looks up the affiliation
corresponding to the current year.
With few results (from the inline queries) this works fine.
But with many results, only a completely blank page is shown. The strange
thing is that the preview works fine.
I tried to increase the following parameters
,----
|
| ini_set( 'memory_limit', '100M' );
| $wgMaxPPExpandDepth=100;
| $wgMaxPPNodeCount=10000000;
`----
But none of them solved the problem. Also, the HTML source of the
preview of the problematic page says:
,----
|
| Preprocessor node count: 13901/1000000
| Post-expand include size: 439559/2097152 bytes
| Template argument size: 37595/2097152 bytes
| Expensive parser function count: 0/100
`----
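[Those counters are all well under their limits, so the parser limits are probably not the culprit. A blank page with a working preview usually means a fatal PHP error (often memory exhaustion) that is being swallowed rather than displayed. Before tuning limits further, it may help to surface the actual error with a temporary LocalSettings.php fragment like this sketch:]

```php
<?php
// LocalSettings.php fragment -- temporary debugging aids, to be removed
// once the cause is found. A blank page often hides a fatal PHP error.
error_reporting( E_ALL );
ini_set( 'display_errors', 1 );
ini_set( 'memory_limit', '256M' );   // try higher than the 100M above
$wgShowExceptionDetails = true;      // show MediaWiki exception traces
```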
Any ideas/help would be very much appreciated!
Tomy