The local settings file only defines the default skin variable once, and it is set to
cancervoicesnew. Also, after I log in, my preferences show the skin as cancervoicesnew.
That still doesn't answer the question of why, when I initially connect, the skin
displayed is Vector rather than the one defined in the local settings file.
It sounds as if I am the only one having this issue.
Bobj
Dr Bob Jansen
Turtle Lane Studios
PO Box 26 Erskineville NSW 2043 Australia
Ph: +61 414 297 448
Skype: bobjtls
Send MediaWiki-l mailing list submissions to
mediawiki-l(a)lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
or, via email, send a message with subject or body 'help' to
mediawiki-l-request(a)lists.wikimedia.org
You can reach the person managing the list at
mediawiki-l-owner(a)lists.wikimedia.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of MediaWiki-l digest..."
Today's Topics:
1. MW 1.16.0 usability extension default skin not working
(Dr Bob Jansen)
2. Re: How can I get the content of a wiki page (Edward Swing)
3. Re: How can I get the content of a wiki page (yanick bajazet)
4. Re: How can I get the content of a wiki page (Edward Swing)
5. Re: MW 1.16.0 usability extension default skin not working
(Trevor Parscal)
6. Re: PHP 5.2.x and ini_set( 'pcre.backtrack_limit', '2M' ); in
LocalSettings.php (Benjamin Lees)
7. Re: MW 1.16.0 Usability Extension - Default Skin not working.
(roger(a)rogerchrisman.com)
8. Re: PHP 5.2.x and ini_set( 'pcre.backtrack_limit', '2M' ); in
LocalSettings.php (roger(a)rogerchrisman.com)
----------------------------------------------------------------------
Message: 1
Date: Tue, 9 Nov 2010 23:18:25 +1100
From: Dr Bob Jansen <bob.jansen(a)turtlelane.com.au>
Subject: [Mediawiki-l] MW 1.16.0 usability extension default skin not
working
To: "mediawiki-l(a)lists.wikimedia.org"
<mediawiki-l(a)lists.wikimedia.org>
Message-ID: <E727BB82-962D-402E-9C51-EC9E8AC63062(a)turtlelane.com.au>
Content-Type: text/plain; charset=us-ascii
Trevor writes:
Vector does away with the book background image in favor of a gradient.
But the question is: why is the Vector skin being displayed at all when the local settings
file is configured to use my skin? And why does logging in correctly show my skin, while
logging out reverts back to Vector?
Bobj
Dr Bob Jansen
Turtle Lane Studios
PO Box 26 Erskineville NSW 2043 Australia
Ph: +61 414 297 448
Skype: bobjtls
http://www.turtlelane.com.au
------------------------------
Message: 2
Date: Tue, 9 Nov 2010 07:24:47 -0500
From: "Edward Swing" <deswing(a)vsticorp.com>
Subject: Re: [Mediawiki-l] How can I get the content of a wiki page
To: "MediaWiki announcements and site admin list"
<mediawiki-l(a)lists.wikimedia.org>
Message-ID:
<C505792E7472474DAC90E6C2844BBD967D6710(a)destroyer.VSTI.LOCAL>
Content-Type: text/plain; charset="us-ascii"
Are the user comments included as part of the wiki text? Or do you mean
the comments that a user supplies when editing a page?
-----Original Message-----
From: mediawiki-l-bounces(a)lists.wikimedia.org
[mailto:mediawiki-l-bounces@lists.wikimedia.org] On Behalf Of yanick
bajazet
Sent: Tuesday, November 09, 2010 5:58 AM
To: mediawiki-l(a)lists.wikimedia.org
Subject: [Mediawiki-l] How can I get the content of a wiki page
Hello,
I'm creating wiki pages automatically using the ImportTextFile.php script
(thank you, Sam :-) ). But I need to be able to update these pages without
deleting any user comments that may have been added.
To do that, the only solution I found on the web is to get the content of
my wiki page with the PHP script getText.php (from maintenance) and do the
merge myself, but this script seems to have been removed in the latest
version of MediaWiki.
Is there a new script or another way to get the content of my wiki page?
What other solutions are there for updating wiki pages, or parts of them?
Sorry for my English, I'm French.
Yanick
------------------------------
Message: 3
Date: Tue, 9 Nov 2010 12:35:32 +0000
From: yanick bajazet <yanick1bjazet(a)hotmail.com>
Subject: Re: [Mediawiki-l] How can I get the content of a wiki page
To: <mediawiki-l(a)lists.wikimedia.org>
Message-ID: <SNT125-W29B4E1EFB8A7D7E078C283EB300(a)phx.gbl>
Content-Type: text/plain; charset="iso-8859-1"
I mean the user comments included as part of the wiki text, not the comments
that a user supplies when editing a page.
------------------------------
Message: 4
Date: Tue, 9 Nov 2010 08:00:39 -0500
From: "Edward Swing" <deswing(a)vsticorp.com>
Subject: Re: [Mediawiki-l] How can I get the content of a wiki page
To: "MediaWiki announcements and site admin list"
<mediawiki-l(a)lists.wikimedia.org>
Message-ID:
<C505792E7472474DAC90E6C2844BBD967D6712(a)destroyer.VSTI.LOCAL>
Content-Type: text/plain; charset="us-ascii"
If your pages have a consistent area where the users provide their
comments, you may need to write something that parses the current
contents of the page, and then appends the comments section to the new
page contents. The more structured the page is, the easier it would be
to parse.
You can use api.php to retrieve the current page contents (as well as
load new contents).
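As a sketch of the retrieval half (PHP; the wiki URL and page title below
are placeholders, and this assumes api.php is reachable and allow_url_fopen
is enabled):

<?php
# Fetch the current wikitext of one page via api.php.
$api   = 'http://example.org/w/api.php';  # placeholder wiki URL
$title = 'Some_Page';                     # placeholder page title
$url   = $api . '?action=query&prop=revisions&rvprop=content'
       . '&format=json&titles=' . urlencode( $title );
$data  = json_decode( file_get_contents( $url ), true );
$pages = $data['query']['pages'];         # result is keyed by page id
$page  = array_shift( $pages );
$wikitext = $page['revisions'][0]['*'];   # raw wikitext of the latest revision

From there you can locate the comments section, splice it into your
regenerated text, and post the result back with action=edit.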
------------------------------
Message: 5
Date: Tue, 09 Nov 2010 08:02:02 -0800
From: Trevor Parscal <tparscal(a)wikimedia.org>
Subject: Re: [Mediawiki-l] MW 1.16.0 usability extension default skin
not working
To: mediawiki-l(a)lists.wikimedia.org
Message-ID: <4CD9707A.4060809(a)wikimedia.org>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
Your user account is set to vector?
- Trevor
------------------------------
Message: 6
Date: Tue, 9 Nov 2010 13:38:38 -0500
From: Benjamin Lees <emufarmers(a)gmail.com>
Subject: Re: [Mediawiki-l] PHP 5.2.x and ini_set(
'pcre.backtrack_limit', '2M' ); in LocalSettings.php
To: MediaWiki announcements and site admin list
<mediawiki-l(a)lists.wikimedia.org>
Message-ID:
<AANLkTi=K8aZgSyguXw4vorjpoO24vZgwok9t0bRovgv=(a)mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
Current versions of MediaWiki (1.16+) will actually check the memory
limit on every pageview and raise it if it's below 50MB, by default.
(The installer used to add an ini_set line in LocalSettings.php; see
http://www.mediawiki.org/wiki/Manual:$wgMemoryLimit ). There
shouldn't be any need to futz with it.
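(If you ever do need a higher ceiling, the cleaner knob is the MediaWiki
setting itself rather than a raw ini_set; for example, in LocalSettings.php,
where the value shown is just the 1.16 default:

$wgMemoryLimit = '50M';  # MediaWiki raises PHP's memory_limit to this on each request

See the manual page above for details.)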
What exactly leads you to suspect pcre.backtrack_limit is causing
problems? Do you have a huge regex that's being silently ignored, or
are you getting an error?
------------------------------
Message: 7
Date: Tue, 9 Nov 2010 15:12:07 -0800
From: roger(a)rogerchrisman.com
Subject: Re: [Mediawiki-l] MW 1.16.0 Usability Extension - Default
Skin not working.
To: MediaWiki announcements and site admin list
<mediawiki-l(a)lists.wikimedia.org>
Message-ID:
<AANLkTimziRoAV9Ru5KHuQT3zq_pQ7mcTBK_W3sp2FwzH(a)mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
$wgDefaultSkin = 'cancervoicesnsw';
Is that variable reset to 'vector' farther down in your LocalSettings.php?
If the variable is set in more than one place in LocalSettings.php,
the last value set is the value PHP uses. PHP reads LocalSettings.php
from top to bottom.
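For example, in a hypothetical LocalSettings.php (the skin names are only
illustrative):

$wgDefaultSkin = 'cancervoicesnsw';  # set near the top
# ... many lines of other configuration ...
$wgDefaultSkin = 'vector';           # set again later: this is the value that sticks

and the wiki would show Vector to anonymous visitors even though the first
line looks correct.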
Roger
------------------------------
Message: 8
Date: Tue, 9 Nov 2010 17:41:10 -0800
From: roger(a)rogerchrisman.com
Subject: Re: [Mediawiki-l] PHP 5.2.x and ini_set(
'pcre.backtrack_limit', '2M' ); in LocalSettings.php
To: MediaWiki announcements and site admin list
<mediawiki-l(a)lists.wikimedia.org>
Message-ID:
<AANLkTim--DXa1N5B6-4rQhHEEyj0CBigL9JYFu4GX_e2(a)mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
On Tue, Nov 9, 2010 at 10:38 AM, Benjamin Lees <emufarmers(a)gmail.com> wrote:
Current versions of MediaWiki (1.16+) will actually check the memory
limit on every pageview and raise it if it's below 50MB, by default.
(The installer used to add an ini_set line in LocalSettings.php; see
http://www.mediawiki.org/wiki/Manual:$wgMemoryLimit ). There
shouldn't be any need to futz with it.
What exactly leads you to suspect pcre.backtrack_limit is causing
problems? Do you have a huge regex that's being silently ignored, or
are you getting an error?
I don't know where to look for errors about this. Is there a log,
maybe a PHP log, that would record when a $wgSpamRegex runs out of
memory?
Here's the evidence I have that it might be running out of memory:
With 'pcre.backtrack_limit' at its default (not set in LocalSettings.php,
and 'memory_limit' not set either), a clumsy regex like this one (meant
to match external links not separated by at least two words),
$wgSpamRegex = "/http:\/\/\S*\s*\S*\s*\S*\s*\S*http:\/\/\S*/i";
...fails to match any of the following wiki edit field text:
==Banana==
[
http://link.link.banababot.com/linkety-link:Linkety-plonk-1 Link-1]
b a n a n a
[
http://link.link.banababot.com/linkety-link:Linkety-plonk-2 Link-2]
[
http://link.link.banababot.com/linkety-link:Linkety-plonk-3 Link-3]
...However, it matches links 2 and 3 when they are on top, as in this edit text:
==Banana==
[
http://link.link.banababot.com/linkety-link:Linkety-plonk-2 Link-2]
[
http://link.link.banababot.com/linkety-link:Linkety-plonk-3 Link-3]
[
http://link.link.banababot.com/linkety-link:Linkety-plonk-1 Link-1]
b a n a n a
Links 2 and 3 should match whether or not link 1 is above them. If I
set 'pcre.backtrack_limit' to 8M, or even 2M, then the regex
successfully matches links 1 and 2 in both texts above, as intended.
With 'pcre.backtrack_limit' at its default setting, not set in
LocalSettings.php, the regex fails to match the first text above. I
think it runs out of memory because it is a _clumsy_ regex. Where
might I look for a log of it failing or running out of memory?
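For what it's worth, PHP can report this condition directly, no log needed:
after a match aborts for this reason, preg_last_error() returns
PREG_BACKTRACK_LIMIT_ERROR (available since PHP 5.2.0). A minimal standalone
sketch, independent of MediaWiki (the tiny limit and toy pattern are mine,
chosen deliberately to force the failure):

<?php
# Force a backtrack-limit failure with a deliberately tiny limit.
ini_set( 'pcre.backtrack_limit', '100' );
# A classic catastrophic-backtracking pattern against a string it can never match.
$result = preg_match( '/(a+)+$/', str_repeat( 'a', 50 ) . 'b' );
if ( $result === false && preg_last_error() === PREG_BACKTRACK_LIMIT_ERROR ) {
    echo "regex gave up: backtrack limit exceeded\n";
}

That at least distinguishes "no match" from "gave up", which is what your
two test texts seem to be showing.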
The following regex is _better formed_, I think, and successfully
matches both of the above texts with or without a 'pcre.backtrack_limit'
setting in LocalSettings.php:
$wgSpamRegex = "/http:\/\/(\S+\s+){1,3}\S*http:\/\/\S+/i"; #
Matches 2 external links with less than " x.. x.. " between them
I did the above tests by editing LocalSettings.php and then trying
test edits in my wiki.
What is the relationship and precedence between PHP's
'pcre.backtrack_limit' and 'memory_limit' settings?
Thanks for your help!
Roger --Wikigogy.org
MediaWiki 1.16.0
PHP 5.2.14 (litespeed)
MySQL 5.0.91-community-log
------------------------------
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
End of MediaWiki-l Digest, Vol 86, Issue 8
******************************************