Welcome to mediawiki-l. This mailing list exists for discussion and questions
about the MediaWiki software[0]. Important MediaWiki-related announcements
(such as new versions) are also posted to this list.
Other resources.
If you only wish to receive announcements, you should subscribe to
mediawiki-announce[1] instead.
MediaWiki development discussion, and all Wikimedia technical questions, should
be directed to the wikitech-l[2] mailing list.
Several other MediaWiki-related lists exist:
- mediawiki-api[5] for API discussions,
- mediawiki-enterprise[6] for discussion of MediaWiki in the enterprise,
- mediawiki-cvs[7] for notification of commits to the Subversion repository,
- mediawiki-i18n[8] for discussion of MediaWiki internationalisation support,
- wikibugs-l[9] for notification of changes to the bug tracker.
List administrivia (unsubscribing, list archives).
To unsubscribe from this mailing list, visit [12]. Archives of previous postings
can be found at [3].
This list is also gatewayed to the Gmane NNTP server[4], which you can use to
read and post to the list.
Posting to the list.
Before posting to this list, please read the MediaWiki FAQ[10]; many common
questions are answered there. You may also search the list archives to see if
your question has been asked before.
Please try to ask your question in a way that enables people to answer you.
Provide all relevant details, explain your problem clearly, etc. You may
wish to read [13], which explains how to ask questions well.
To post to the list, send mail to <mediawiki-l(a)lists.wikimedia.org>. This is a
public list, so you should not include confidential information in mails you
send.
When replying to an existing thread, use the "Reply" or "Followup" feature of
your mail client, so that clients that understand threading can sort your
message properly. When quoting other messages, please use the "inline" quoting
style[11], for clarity.
When creating a new thread, do not reply to an existing message and change the
subject. This will confuse people's mail readers and result in fewer people
reading your mail. Instead, compose a new message for your post.
Messages posted to the list have the "Reply-To" header set to the mailing list,
which means that by default, replies will go to the entire list. If you are
posting a reply which is only interesting to the original poster, and not the
list in general, you should change the reply to only go to that person. This
avoids cluttering the list with irrelevant traffic.
About this message.
This message is posted to the list once per week by <river(a)wikimedia.org>.
Please contact me if you have any questions or concerns about this mailing.
References.
[0] http://www.mediawiki.org/
[1] http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[2] http://lists.wikimedia.org/mailman/listinfo/wikitech-l
[3] http://lists.wikimedia.org/pipermail/mediawiki-l/
[4] http://dir.gmane.org/gmane.org.wikimedia.mediawiki
[5] http://lists.wikimedia.org/mailman/listinfo/mediawiki-api
[6] http://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise
[7] http://lists.wikimedia.org/mailman/listinfo/mediawiki-cvs
[8] http://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
[9] http://lists.wikimedia.org/mailman/listinfo/wikibugs-l
[10] http://www.mediawiki.org/wiki/FAQ
[11] http://en.wikipedia.org/wiki/Posting_style#Inline_replying
[12] http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
[13] http://www.catb.org/~esr/faqs/smart-questions.html
Hi all,
I'm trying to modify the GuMax skin for my site. And it is, I
think, *almost* working. The bug I've found is on a file
description page, e.g.
http://www.parts-unknown.org/mediawiki/index.php?title=File:Greater_hindust…
As you scroll down past the image (a map of Hindutva territorial
ambitions as indicated by Anjana Chatterji that's actually rather
scary), you get to the File History. The table background should be
transparent. Instead, it's white (or 'none', which seems to produce the
same result).
I've gone through all my .css files trying to find out why it isn't,
and I'm failing. I've made a copy of the skin available at
http://www.parts-unknown.org/mediawiki/files/earthwiki2.tgz if that
helps.
What do I need to fix?
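(For what it's worth, I wondered whether the culprit is MediaWiki's
generic "wikitable" styling, since the file history table seems to
carry class="wikitable filehistory", and whether a rule along these
lines in the skin's CSS would be enough to override it:

  table.filehistory,
  table.filehistory th,
  table.filehistory td { background: transparent; }

But I may be looking in the wrong place entirely.)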
Thanks!
--
David Benfell <benfell(a)parts-unknown.org>
http://www.parts-unknown.org/
Hi,
Two weeks ago I upgraded my wiki, http://wikigogy.org, from MediaWiki
1.9.2 to 1.16.0.
maintenance/update.php ran fine (after I removed the leading './' from
the path to Bad Behavior in LocalSettings.php, which had thrown a
warning the first time through).
I now see that _search_ is not finding all the pages it should: I _think_
search is finding only pages that have been edited post-upgrade!
Example:
First this search found only one page:
http://wikigogy.org/index.php?title=Special%3ASearch&redirs=1&search=approa…
[[Natural approach]]
Then I did a trivial edit at the end of another page that I knew
search should have found, [[Method, approach and strategy]], repeated
the same search and it found both:
[[Natural approach]]
[[Method, approach and strategy]]
Any idea why this is happening?
Is there a maintenance script I can run that will "touch" every page
on my wiki so that search will search them all again as usual?
A better way to solve this?
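(The closest thing I have spotted in the maintenance directory is
rebuildtextindex.php; would running

  php maintenance/rebuildtextindex.php

from the wiki's base directory rebuild the searchindex table so that
all pages become searchable again?)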
I have shell access to my wiki server and will happily provide any
additional info that will help solve this puzzle.
My LocalSettings.php has nothing about "Search" in it, but it contains
the word "Cache" here:
## Shared memory settings
$wgMainCacheType = CACHE_NONE;
$wgMemCachedServers = array();
and there:
# When you make changes to this configuration file, this will make
# sure that cached pages are cleared.
$configdate = gmdate( 'YmdHis', @filemtime( __FILE__ ) );
$wgCacheEpoch = max( $wgCacheEpoch, $configdate );
Is that relevant in any way?
Thank you,
Roger Chrisman --Wikigogy.org
MediaWiki 1.16.0
PHP 5.2.14 (litespeed)
MySQL 5.0.91-community-log
Hi guys,
I was trying to upgrade my installation of MediaWiki from 1.15.4 to 1.16,
following the procedure listed on the MediaWiki upgrade page, and I ran
into a database error along the way. I'm using Debian Lenny with PHP
5.2.6-1+lenny9 (apache2handler) and Postgres 8.3.12. During the upgrade
procedure I got this error:
Query: CREATE TABLE user_properties(
up_user INTEGER NULL REFERENCES mwuser(user_id) ON DELETE CASCADE,
up_property TEXT NOT NULL,
up_value TEXT
)
Function: DatabaseBase::sourceStream
Error: 1 ERROR: there is no unique constraint matching given keys for
referenced table "mwuser"
And now the wiki is down. Can anyone help me figure out what to do from this
point? Luckily I have daily backups of the database so that's not a huge
problem, but I'd like to get this up and running again ASAP. I actually
restored a backup over the database and tried to install MediaWiki 1.15.5
(from Debian's backports repository, since MediaWiki's archives are down)
but now anything related to editing is broken. I did not try deleting the
database and then restoring the backup though, which I probably should when
I get the time. Unfortunately it's a private internal wiki but if any more
information is needed I'd be happy to provide it.
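(My uneducated guess, going by the error text, is that the upgrade
expects a unique constraint on mwuser.user_id that my old schema is
missing, and that something like

  ALTER TABLE mwuser ADD CONSTRAINT mwuser_pkey PRIMARY KEY (user_id);

would let the CREATE TABLE proceed. But I don't want to poke at the
schema blindly.)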
Thanks!
-Ibrahim Awwal
Hello
I know MediaWiki 1.9 is ancient, but I have a running
installation with quite a few extensions which I would like to
migrate to a different machine which unfortunately must run
Kubuntu 10.04.
When I try to run 1.9 it does not work, and when I try to
install it from scratch I see the following error:
Parse error: syntax error, unexpected T_NAMESPACE, expecting
T_STRING in /var/www/wiki/includes/Namespace.php on line 46
(I gather Kubuntu 10.04 ships PHP 5.3, in which "namespace" became a
reserved word; that would explain why includes/Namespace.php no longer
parses.)
I presume this is hopeless?!
Any advice is greatly appreciated
Uwe Brauer
Hello everyone,
After an upgrade from 1.12.0 to 1.16.0 (apparently successful), an error
message appears when uploading photos to the wiki:
"Warning!
The upload directory (public) is not writable from the web server."
Rather than blindly changing permissions, are there specific instructions?
Another concern: some images are not displayed, and the error message is:
'Unable to run external Programs in safe mode'
In addition, the following appears at the top of some pages, for example:
Warning: mkdir() [function.mkdir]: Permission denied in
/home/www/f8d56dcb8a058b382b573d2a8274e93a/web/wiki/includes/GlobalFunctions.php
on line 2176
Warning: wfMkdirParents: failed to mkdir
"/home/www/f8d56dcb8a058b382b573d2a8274e93a/web/wiki/images/thumb/5/5c/Eglise-Corancy.jpg"
mode 511 in
/home/www/f8d56dcb8a058b382b573d2a8274e93a/web/wiki/includes/GlobalFunctions.php
on line 2179
Thank you in advance for your help.
--
Martine
martine.bonissol at gennievre.net
Apologies for any duplicate postings
====
Joint NETTAB 2010 and BBCC 2010 workshops focused on Biological Wikis
November 29 - December 1, 2010, Naples, Italy
http://www.nettab.org/2010/
LAST CALL FOR PARTICIPATION
The joint NETTAB and BBCC 2010 workshop on
"Biological Wikis" promises to be a great meeting
for all researchers involved in the exploitation of wikis in biology.
Come and discuss your ideas and doubts with such
scientists as Alex Bateman, Alexander Pico,
Andrew Su, Dan Bolser, Robert Hoffmann, Thomas
Kelder, Mike Cariaso, Adam Godzik, Luca Toldo,
Wyeth Wasserman, Daniel Renfro and others who, we hope, will join the workshop.
It's a great chance to follow tutorials and
lectures on WikiPathways, WikiGenes, PDBWiki,
Gene Wiki, TOPSAN, and the proficient use of
Wikipedia and Semantic MediaWiki. See below for a list of
keynote speakers and tutorials. And, of course, enjoy the Italian lifestyle...
Registration is open at http://www.nettab.org/2010/rform.html .
A 20-euro discount applies to all fees for members of ISCB and BITS.
SCIENTIFIC PROGRAMME
In the style of previous editions, NETTAB 2010
will include: keynote lectures given by leading
experts in the field, oral communications from
selected contributions, open discussion, selected
software demonstrations and posters, as well as tutorials.
Provisional programme ( see http://www.nettab.org/2010/progr.html )
Monday, November 29, 2010 (Tutorial day, open to all interested participants)
The following four tutorials will be given starting at 11.30.
Mining biological pathways using WikiPathways web services and more...
Thomas Kelder, Department of Bioinformatics
(BiGCaT), Maastricht University, the Netherlands
How to create your own collaborative publishing project with WikiGenes
Robert Hoffmann, Computational Biology Center,
cBIO, Memorial Sloan-Kettering Cancer Center, MSKCC, New York, USA
Everything you wanted to know about Wikipedia but were too afraid to ask
Alex Bateman, Wellcome Trust Sanger Institute,
Hinxton, Cambridge, United Kingdom
Andrew Su, Bioinformatics and Computational
Biology, Genomics Institute of the Novartis
Research Foundation (GNF), San Diego, USA
Semantic MediaWiki: a community database and more.
Dan Bolser, College of Life Sciences, University
of Dundee, Scotland, United Kingdom
Tuesday, November 30, 2010 (NETTAB workshop day)
A rich scientific programme is planned, starting
at 9.00. The following five invited talks will be given:
The Pros and Cons of Wikipedia for Scientists
Alex Bateman, Wellcome Trust Sanger Institute,
Hinxton, Cambridge, United Kingdom
Collaborative publishing with authorship tracking
and reputation system - WikiGenes
Robert Hoffmann, Computational Biology Center,
cBIO, Memorial Sloan-Kettering Cancer Center, MSKCC, New York, USA
WikiPathways, community-based curation for biological pathways
Alexander Pico, Gladstone Institute of
Cardiovascular Disease, San Francisco, USA
The Gene Wiki: Achieving critical mass and mining for novel annotations
Andrew Su, Bioinformatics and Computational
Biology, Genomics Institute of the Novartis
Research Foundation (GNF), San Diego, USA
PDBWiki : Success or failure?
Dan Bolser, College of Life Sciences, University
of Dundee, Scotland, United Kingdom
Many oral communications and software
demonstrations will be presented, including:
SNPedia
Mike Cariaso, Keygene, The Netherlands
TOPSAN: a collaborative annotation environment
for structural genomics and beyond
Adam Godzik, Sanford-Burnham Medical Research Institute, La Jolla, CA, USA
SBML2SMW: bridging System Biology with semantic
web technologies for biomedical knowledge
acquisition and hypothesis elicitation
Tobias Mathäß, Peter Haase, Hiroaki Kitano and Luca Toldo
Towards A General-Purpose Database Wiki for Biological Database Curation
Heiko Mueller, Sam Lindley, Joanna Sharman, Tony
Harmar, James Cheney and Peter Buneman
Demonstration of a citation-enabled workflow
using the ConceptWiki triple based approach
Christine Chichester, Hailiang Mei, Kees Burger and Barend Mons
Extending Mediawiki for community annotation
Daniel Renfro, Deborah A. Siegele and James C. Hu
The wiki-based Transcription Factor Encyclopedia
and a model for robust community participation
Wyeth Wasserman and Dimas Yusuf
A panel discussion with all speakers on the
Future of Biological Wikis will close the day. A
poster session is also planned.
Wednesday, December 1, 2010 (BBCC workshop day)
This session will include both oral
communications on various bioinformatics topics
and the announcement of the next NETTAB workshop.
This will be introduced by a lecture on "Clinical
bioinformatics: a research agenda to support
health care transformation" given by
Riccardo Bellazzi, University of Pavia, Italy
BBCC Oral communications
Repository for statistical analysis of mass spectrometry data
Federica Viti, Ivan Merelli, et al
Mathematical Models for Feature Selection and
their Application to Bioinformatics
Paola Festa, Paola Bertolazzi and Giovanni Felici
Toward an Improved Combinatoric Biclustering Algorithm
Ekaterina Nosova, Roberto Tagliaferri and Rino Miele
Massive-scale RNA-Seq analysis of the non-ribosomal
transcriptome in human trisomy 21
Valerio Costa, Claudia Angelini, et al
Eliciting Fuzzy Knowledge from the PIMA Dataset
Antonio d'Acierno, Giuseppe De Pietro and Massimo Esposito
Genome duplication and gene annotation: an
example for a reference plant species
Alessandra Vigilante, Mara Sangiovanni, et al
For previous editions of the BBCC workshop,
please visit http://bioinformatica.isa.cnr.it/BBCC/ .
For any further information or clarification:
Web site: http://www.nettab.org/2010/
Email: info @ nettab . org
Many thanks.
Paolo Romano
Paolo Romano (paolo.romano(a)istge.it)
Bioinformatics
National Cancer Research Institute (IST)
Largo Rosanna Benzi, 10, I-16132, Genova, Italy
Tel: +39-010-5737-288 Fax: +39-010-5737-295 Skype: p.romano
Web: http://www.nettab.org/promano/
_______________________________________________
Nettab-Announce mailing list
Nettab-Announce(a)istge.it
http://webmail.istge.it/mailman/listinfo/nettab-announce
Hey everybody,
I was wondering whether I can have two discrete wikis (semantic wikis, to
be more accurate) sharing the same database on the same server. My
intention is to launch a simpler copy of the existing wiki (so they are
going to have the same database schema). I want both of them to be
up to date: if someone edits one, the changes should appear
on the other as well. Is this possible?
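(To clarify what I imagined: both LocalSettings.php files would simply
point at the same tables, e.g.

  $wgDBserver = 'localhost';  # the same (hypothetical) values
  $wgDBname   = 'wikidb';     # in both wikis
  $wgDBprefix = 'mw_';

with only the rest of the configuration differing. But I don't know
whether MediaWiki tolerates that.)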
Thank you,
Nata
The local settings file only defines the default skin variable once, and it is set to cancervoicesnew. Also, after I log in, my preferences show the skin to be cancervoicesnew. That still doesn't answer the question of why, when I initially connect, the skin displayed is Vector and not the one defined in the local settings file.
Sounds as if I am the only one having this issue.
Bobj
Dr Bob Jansen
Turtle Lane Studios
PO Box 26 Erskineville NSW 2043 Australia
Ph: +61 414 297 448
Skype: bobjtls
http://www.turtlelane.com.au
On 10/11/2010, at 12:41, mediawiki-l-request(a)lists.wikimedia.org wrote:
> Send MediaWiki-l mailing list submissions to
> mediawiki-l(a)lists.wikimedia.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
> or, via email, send a message with subject or body 'help' to
> mediawiki-l-request(a)lists.wikimedia.org
>
> You can reach the person managing the list at
> mediawiki-l-owner(a)lists.wikimedia.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of MediaWiki-l digest..."
>
>
> Today's Topics:
>
> 1. MW 1.16.0 usability extension default skin not working
> (Dr Bob JAnsen)
> 2. Re: How can I get the content of a wiki page (Edward Swing)
> 3. Re: How can I get the content of a wiki page (yanick bajazet)
> 4. Re: How can I get the content of a wiki page (Edward Swing)
> 5. Re: MW 1.16.0 usability extension default skin not working
> (Trevor Parscal)
> 6. Re: PHP 5.2.x and ini_set( 'pcre.backtrack_limit', '2M' ); in
> LocalSettings.php (Benjamin Lees)
> 7. Re: MW 1.16.0 Usability Extension - Default Skin not working.
> (roger(a)rogerchrisman.com)
> 8. Re: PHP 5.2.x and ini_set( 'pcre.backtrack_limit', '2M' ); in
> LocalSettings.php (roger(a)rogerchrisman.com)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 9 Nov 2010 23:18:25 +1100
> From: Dr Bob JAnsen <bob.jansen(a)turtlelane.com.au>
> Subject: [Mediawiki-l] MW 1.16.0 usability extension default skin not
> working
> To: "mediawiki-l(a)lists.wikimedia.org"
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID: <E727BB82-962D-402E-9C51-EC9E8AC63062(a)turtlelane.com.au>
> Content-Type: text/plain; charset=us-ascii
>
> Trevor writes
>
> Vector does away with the book background image in favor of a gradient.
>
> But the question is, why is the Vector skin being displayed at all when the local settings file is configured to use my skin, and why does logging in correctly show my skin while logging out reverts to Vector?
>
> Bobj
>
> Dr Bob Jansen
> Turtle Lane Studios
> PO Box 26 Erskineville NSW 2043 Australia
> Ph: +61 414 297 448
> Skype: bobjtls
> http://www.turtlelane.com.au
>
>
> On 09/11/2010, at 23:00, mediawiki-l-request(a)lists.wikimedia.org wrote:
>
>> Vector does away with the book background image in favor of a gradient.
>
>
> ------------------------------
>
> Message: 2
> Date: Tue, 9 Nov 2010 07:24:47 -0500
> From: "Edward Swing" <deswing(a)vsticorp.com>
> Subject: Re: [Mediawiki-l] How can I get the content of a wiki page
> To: "MediaWiki announcements and site admin list"
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID:
> <C505792E7472474DAC90E6C2844BBD967D6710(a)destroyer.VSTI.LOCAL>
> Content-Type: text/plain; charset="us-ascii"
>
> Are the user comments included as part of the wiki text? Or do you mean
> the comments that a user supplies when editing a page?
>
> -----Original Message-----
> From: mediawiki-l-bounces(a)lists.wikimedia.org
> [mailto:mediawiki-l-bounces@lists.wikimedia.org] On Behalf Of yanick
> bajazet
> Sent: Tuesday, November 09, 2010 5:58 AM
> To: mediawiki-l(a)lists.wikimedia.org
> Subject: [Mediawiki-l] How can I get the content of a wiki page
>
>
> Hello,
>
> I'm creating wiki pages automatically using the script
> ImportTextFile.php (thank you Sam :-) ). But I need to be able to
> update these pages without deleting any user comments.
>
> To do that, the only solution that I found on the web is to get the
> content of my wiki page with the PHP script "getText.php" (from
> maintenance) and do the merge myself, but this script seems to have
> been removed in the latest version of MediaWiki.
>
> Is there a new script or another way to get the content of my wiki
> page? What other solutions are there for updating wiki pages or parts
> of them?
>
> Sorry for my English, I'm French,
>
> Yanick
>
>
> ------------------------------
>
> Message: 3
> Date: Tue, 9 Nov 2010 12:35:32 +0000
> From: yanick bajazet <yanick1bjazet(a)hotmail.com>
> Subject: Re: [Mediawiki-l] How can I get the content of a wiki page
> To: <mediawiki-l(a)lists.wikimedia.org>
> Message-ID: <SNT125-W29B4E1EFB8A7D7E078C283EB300(a)phx.gbl>
> Content-Type: text/plain; charset="iso-8859-1"
>
>
> I mean the user comments included as part of the wiki text, not the
> comments that a user supplies when editing a page.
>
>
>> Date: Tue, 9 Nov 2010 07:24:47 -0500
>> From: deswing(a)vsticorp.com
>> To: mediawiki-l(a)lists.wikimedia.org
>> Subject: Re: [Mediawiki-l] How can I get the content of a wiki page
>>
>> Are the user comments included as part of the wiki text? Or do you mean
>> the comments that a user supplies when editing a page?
>
>
> ------------------------------
>
> Message: 4
> Date: Tue, 9 Nov 2010 08:00:39 -0500
> From: "Edward Swing" <deswing(a)vsticorp.com>
> Subject: Re: [Mediawiki-l] How can I get the content of a wiki page
> To: "MediaWiki announcements and site admin list"
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID:
> <C505792E7472474DAC90E6C2844BBD967D6712(a)destroyer.VSTI.LOCAL>
> Content-Type: text/plain; charset="us-ascii"
>
> If your pages have a consistent area where the users provide their
> comments, you may need to write something that parses the current
> contents of the page, and then appends the comments section to the new
> page contents. The more structured the page is, the easier it would be
> to parse.
>
> You can use api.php to retrieve the current page contents (as well as
> load new contents).
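>
> For example, a minimal sketch (untested; substitute your own wiki URL
> and a real page title):
>
>   $url = 'http://example.com/w/api.php?action=query&prop=revisions'
>        . '&titles=Some_Page&rvprop=content&format=php';
>   $data = unserialize( file_get_contents( $url ) );
>
> and, if the write API is enabled on your wiki, action=edit can save
> the merged text back.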
>
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 09 Nov 2010 08:02:02 -0800
> From: Trevor Parscal <tparscal(a)wikimedia.org>
> Subject: Re: [Mediawiki-l] MW 1.16.0 usability extension default skin
> not working
> To: mediawiki-l(a)lists.wikimedia.org
> Message-ID: <4CD9707A.4060809(a)wikimedia.org>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Your user account is set to vector?
>
> - Trevor
>
>
>
>
> ------------------------------
>
> Message: 6
> Date: Tue, 9 Nov 2010 13:38:38 -0500
> From: Benjamin Lees <emufarmers(a)gmail.com>
> Subject: Re: [Mediawiki-l] PHP 5.2.x and ini_set(
> 'pcre.backtrack_limit', '2M' ); in LocalSettings.php
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID:
> <AANLkTi=K8aZgSyguXw4vorjpoO24vZgwok9t0bRovgv=(a)mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Current versions of MediaWiki (1.16+) will actually check the memory
> limit on every pageview and raise it if it's below 50MB, by default.
> (The installer used to add an ini_set line in LocalSettings.php; see
> http://www.mediawiki.org/wiki/Manual:$wgMemoryLimit ). There
> shouldn't be any need to futz with it.
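>
> (If you did want to change it, the setting to adjust in
> LocalSettings.php would be, e.g.:
>
>   $wgMemoryLimit = '64M'; # hypothetical value; the default is 50M
>
> but as noted, the default should be fine.)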
>
> What exactly leads you to suspect pcre.backtrack_limit is causing
> problems? Do you have a huge regex that's being silently ignored, or
> are you getting an error?
>
>
>
> ------------------------------
>
> Message: 7
> Date: Tue, 9 Nov 2010 15:12:07 -0800
> From: roger(a)rogerchrisman.com
> Subject: Re: [Mediawiki-l] MW 1.16.0 Usability Extension - Default
> Skin not working.
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID:
> <AANLkTimziRoAV9Ru5KHuQT3zq_pQ7mcTBK_W3sp2FwzH(a)mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
>> $wgDefaultSkin = 'cancervoicesnsw';
>
> Is that variable reset to 'vector' farther down in your LocalSettings.php?
>
> If the variable is set in more than one place in LocalSettings.php,
> the last value set is the value PHP uses. PHP reads LocalSettings.php
> from top to bottom.
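>
> For example (a made-up illustration):
>
>   $wgDefaultSkin = 'cancervoicesnsw'; # near the top
>   # ...hundreds of lines later, perhaps pasted in with an
>   # extension's sample configuration:
>   $wgDefaultSkin = 'vector';          # this later assignment wins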
>
> Roger
>
>
>
> ------------------------------
>
> Message: 8
> Date: Tue, 9 Nov 2010 17:41:10 -0800
> From: roger(a)rogerchrisman.com
> Subject: Re: [Mediawiki-l] PHP 5.2.x and ini_set(
> 'pcre.backtrack_limit', '2M' ); in LocalSettings.php
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID:
> <AANLkTim--DXa1N5B6-4rQhHEEyj0CBigL9JYFu4GX_e2(a)mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> On Tue, Nov 9, 2010 at 10:38 AM, Benjamin Lees <emufarmers(a)gmail.com> wrote:
>> Current versions of MediaWiki (1.16+) will actually check the memory
>> limit on every pageview and raise it if it's below 50MB, by default.
>> (The installer used to add an ini_set line in LocalSettings.php; see
>> http://www.mediawiki.org/wiki/Manual:$wgMemoryLimit ). There
>> shouldn't be any need to futz with it.
>>
>> What exactly leads you to suspect pcre.backtrack_limit is causing
>> problems? Do you have a huge regex that's being silently ignored, or
>> are you getting an error?
>
> I don't know where to look for errors about this. Is there a log,
> maybe a PHP log, that would record when a $wgSpamRegex runs out of
> memory?
>
> Here's the evidence I have that it might be running out of memory:
>
> With 'pcre.backtrack_limit' at default (not set in LocalSettings.php,
> and 'memory_limit' not set either) a clumsy regex like this one (meant
> to match external links not separated by at least two words),
> $wgSpamRegex = "/http:\/\/\S*\s*\S*\s*\S*\s*\S*http:\/\/\S*/i";
>
> ...fails to match any of the following wiki edit field text:
>
> ==Banana==
> [http://link.link.banababot.com/linkety-link:Linkety-plonk-1 Link-1]
> b a n a n a
> [http://link.link.banababot.com/linkety-link:Linkety-plonk-2 Link-2]
> [http://link.link.banababot.com/linkety-link:Linkety-plonk-3 Link-3]
>
>
> ...However, it matches links 2 and 3 when they are on top, as in this edit text:
> ==Banana==
> [http://link.link.banababot.com/linkety-link:Linkety-plonk-2 Link-2]
> [http://link.link.banababot.com/linkety-link:Linkety-plonk-3 Link-3]
> [http://link.link.banababot.com/linkety-link:Linkety-plonk-1 Link-1]
> b a n a n a
>
>
> Links 2 and 3 should match whether or not link 1 is above them. If I
> set 'pcre.backtrack_limit' to 8M, or even 2M, then the regex
> successfully matches links 1 and 2 in both texts above, as intended.
> With 'pcre.backtrack_limit' at its default setting, not set in
> LocalSettings.php, the regex fails to match the first text above. I
> think it runs out of memory because it is a _clumsy_ regex. Where
> might I look for a log of it failing or running out of memory?
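>
> (From skimming the PHP manual, I suspect nothing is logged at all:
> preg_match() apparently just returns false when the backtrack limit is
> exceeded, and the caller has to check for itself, along the lines of:
>
>   if ( preg_match( $re, $text ) === false
>        && preg_last_error() === PREG_BACKTRACK_LIMIT_ERROR ) {
>       error_log( 'regex exceeded pcre.backtrack_limit' );
>   }
>
> so unless MediaWiki does that check, the failure would be silent.)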
>
>
> The following regex is _better formed_, I think, and successfully
> matches both texts above, with or without a 'pcre.backtrack_limit'
> setting in LocalSettings.php:
>
> $wgSpamRegex = "/http:\/\/(\S+\s+){1,3}\S*http:\/\/\S+/i";
> # Matches 2 external links with less than " x.. x.. " between them
>
>
> I did the above tests by editing LocalSettings.php and then trying
> test edits in my wiki.
>
> What is the relationship and precedence between PHP's
> 'pcre.backtrack_limit' and 'memory_limit' settings?
>
> Thanks for your help!
>
> Roger --Wikigogy.org
> MediaWiki 1.16.0
> PHP 5.2.14 (litespeed)
> MySQL 5.0.91-community-log
>
>
>
> ------------------------------
>
> _______________________________________________
> MediaWiki-l mailing list
> MediaWiki-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>
>
> End of MediaWiki-l Digest, Vol 86, Issue 8
> ******************************************