Welcome to mediawiki-l. This mailing list exists for discussion and questions
about the MediaWiki software[0]. Important MediaWiki-related announcements
(such as new versions) are also posted to this list.
Other resources.
If you only wish to receive announcements, you should subscribe to
mediawiki-announce[1] instead.
MediaWiki development discussion, and all Wikimedia technical questions, should
be directed to the wikitech-l[2] mailing list.
Several other MediaWiki-related lists exist:
- mediawiki-api[5] for API discussions,
- mediawiki-enterprise[6] for discussion of MediaWiki in the enterprise,
- mediawiki-cvs[7] for notification of commits to the Subversion repository,
- mediawiki-i18n[8] for discussion of MediaWiki internationalisation support,
- wikibugs-l[9] for notification of changes to the bug tracker.
List administrivia (unsubscribing, list archives).
To unsubscribe from this mailing list, visit [12]. Archives of previous postings
can be found at [3].
This list is also gatewayed to the Gmane NNTP server[4], which you can use to
read and post to the list.
Posting to the list.
Before posting to this list, please read the MediaWiki FAQ[10]. Many common
questions are answered there. You may also search the list archives to see if
your question has been asked before.
To post to the list, send mail to <mediawiki-l(a)lists.wikimedia.org>. This is a
public list, so you should not include confidential information in mails you
send.
When replying to an existing thread, use the "Reply" or "Followup" feature of
your mail client, so that clients that understand threading can sort your
message properly. When quoting other messages, please use the "inline" quoting
style[11], for clarity.
When creating a new thread, do not reply to an existing message and change the
subject. This will confuse people's mail readers, and will result in fewer
people reading your mail. Instead, compose a new message for your post.
Messages posted to the list have the "Reply-To" header set to the mailing list,
which means that by default, replies will go to the entire list. If you are
posting a reply which is only interesting to the original poster, and not the
list in general, you should change the reply to only go to that person. This
avoids cluttering the list with irrelevant traffic.
About this message.
This message is posted to the list once per week by <river(a)wikimedia.org>.
Please contact me if you have any questions or concerns about this mailing.
References.
[0] http://www.mediawiki.org/
[1] http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[2] http://lists.wikimedia.org/mailman/listinfo/wikitech-l
[3] http://lists.wikimedia.org/pipermail/mediawiki-l/
[4] http://dir.gmane.org/gmane.org.wikimedia.mediawiki
[5] http://lists.wikimedia.org/mailman/listinfo/mediawiki-api
[6] http://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise
[7] http://lists.wikimedia.org/mailman/listinfo/mediawiki-cvs
[8] http://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
[9] http://lists.wikimedia.org/mailman/listinfo/wikibugs-l
[10] http://www.mediawiki.org/wiki/FAQ
[11] http://en.wikipedia.org/wiki/Posting_style#Inline_replying
[12] http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
I have devised a way to host a PEAR channel on SVN repositories on
GoogleCode project hosting.
The first 2 extensions available are:
* [[Extension:StubManager]]
* [[Extension:SecurePHP]]
...more to follow.
Install the channel using:
pear channel-discover http://mediawiki.googlecode.com/svn/
Consult my user page [[user:jldupont]] for pointers on how to set up your
PEAR channel on an SVN repository.
Have fun,
Jean-Lou.
Hi all,
I'm trying to implement the force-preview-before-save feature on my
1.10 install to stop unnecessary edits.
I found these instructions for 1.8.2 and 1.9.0
http://www.mediawiki.org/wiki/Manual:FAQ#How_can_I_force_users_to_preview_b…
but I don't think that will work for 1.10
I then found the 1.10 JavaScript, but it doesn't say where to place
it. Also, that version forces a preview before a save only for certain
groups, but I want it to apply to everyone.
http://www.mediawiki.org/wiki/Manual:Force_preview
Any idea which method I should use? And where to implement the 1.10 method?
Thanks,
-GT
Hi,
I also have the same problem. The default value for $wgMaxArticleSize is 2 MB, so why is it showing this warning message for articles smaller than that? And the message says the article is longer than 32 kB.
Regards,
Jack
----------------------------------------------------------------
"People forget how fast you did a job - but they remember how well you did it"
-----Original Message-----
From: mediawiki-l-bounces(a)lists.wikimedia.org [mailto:mediawiki-l-bounces@lists.wikimedia.org] On Behalf Of Dave Sigafoos
Sent: Tuesday, October 02, 2007 8:30 PM
To: MediaWiki announcements and site admin list
Subject: Re: [Mediawiki-l] Page Size
It was bugging me so this might help
http://lists.wikimedia.org/mailman/htdig/mediawiki-l/2006-March/010686.html
DSig
David Tod Sigafoos | SANMAR Corporation
PICK Guy
-----Original Message-----
From: mediawiki-l-bounces(a)lists.wikimedia.org [mailto:mediawiki-l-bounces@lists.wikimedia.org] On Behalf Of Paulo Ramos
Sent: Tuesday, October 02, 2007 7:16
To: mediawiki-l(a)lists.wikimedia.org
Subject: [Mediawiki-l] Page Size
Hi all,
When I create a big page on Wiki I receive a message: WARNING: This page is 76 kilobytes long; some browsers may have problems editing pages approaching or longer than 32kb. Please consider breaking the page into smaller sections.
If I continue inserting information on the page, it doesn't save the page and I lose all the content.
Does anybody know how to configure the wiki to support longer pages?
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Hi all,
When I create a big page on Wiki I receive a message: WARNING: This page is 76 kilobytes long; some browsers may have problems editing pages approaching or longer than 32kb. Please consider breaking the page into smaller sections.
If I continue inserting information on the page, it doesn't save the page and I lose all the content.
Does anybody know how to configure the wiki to support longer pages?
Tks.
Paulo.
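For context on the thread above: the 32 kB notice is only a browser-compatibility warning; the hard cap on article size is a separate setting. A hedged LocalSettings.php sketch (if the save is actually failing, the cause could instead be a PHP limit such as post_max_size, which this does not change):

```php
## LocalSettings.php -- raise the article size cap (a sketch; the value
## is measured in kilobytes, and the default in this era is 2048, i.e. 2 MB)
$wgMaxArticleSize = 4096;   // allow articles up to 4 MB
```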
I'm setting up MediaWiki on a CentOS 4.5 (RHEL 4 clone) box. The
default MySQL is 4.1. This is horrible, but is installable via yum
(the box is firewalled such that I can't hit the centosplus
repositories easily), which saves me the pain of RPMs.
But how much longer is MediaWiki likely to support MySQL 4?
(We try to run what MySQL we have here on the latest 5.0, which we
usually install from source or *maybe* RPMs. We prefer Postgres for
our internal stuff, but I'd rather MySQL for MediaWiki 'cos it's the
main platform, i.e. what Wikimedia runs. (Same reason I'm putting it
on our Linux app server instead of continuing to battle Solaris 9.))
- d.
When I use maintenance/dumpHTML.php to create a static copy of our wiki,
the static pages still point to the original "images" directory, using
paths like:
<a href="/w/images/a/a0/myimage.gif">myimage.gif</a>
This works fine when the static copy is viewed via HTTP, but if there's
no webserver and you want to view the pages on disk, no images are found
because of the absolute "/w" paths. I've tried using the
--image-snapshot and --force-copy options without affecting this
behavior.
$ php dumpHTML.php -d d:\mediawiki\static --image-snapshot --force-copy
I don't completely understand the image-snapshot option - I thought it
would copy all images into the upload folder, but it copies only a few.
Is there some way to make dumpHTML produce a folder hierarchy that can
be viewed on disk, with images? Or am I doing something wrong?
This is on Windows 2003 Server if that makes any difference.
Thanks,
DanB
Hello everyone,
Please excuse me if the question is really dumb... but I am a newbie here,
so I should be forgiven for that. :) The question is this: can PDF files
uploaded to a MediaWiki-powered site be made searchable, and if so, how does
one go about that?
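For what it's worth: MediaWiki's built-in search in this era indexes wiki text only, not the contents of uploaded files, so full-text search inside PDFs would need an external indexer (for example, extracting the text with a tool such as pdftotext and feeding it to a search engine). Permitting the uploads themselves is just configuration; a sketch:

```php
## LocalSettings.php -- permit PDF uploads (a sketch; this alone does not
## make the PDF text searchable, because the built-in search indexes
## wiki text only, not uploaded file contents)
$wgEnableUploads = true;
$wgFileExtensions[] = 'pdf';
```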
Thanks a lot for all your help.
Cheers,
--
Boris Epstein
http://www.dogandponny.org/
http://dikayasobaka.livejournal.com/ (Russian)
Hello,
I'm looking for someone who could make a clean mediawiki skin based on
an existing skin. This is of course a paid job; the skin will be used
for a Free Software project named Flouzo, and will be entirely copyleft.
Here is the description:
Create a MediaWiki (http://mediawiki.org/) version 1.10 skin named Flouzo.php,
with Flouzo.deps.php and graphics in the flouzo directory. The skin must be
identical, to the pixel, to the skin available at
http://warsow.flouzo.net/.
The implementation must be minimal. The Selenium tests must try each
component of the interface (menu, links, etc.).
The deliverable must be provided with a single page with a HOWTO
describing how to run the tests.
The test must run with the latest version of IE and Firefox.
The skin must not modify any MediaWiki functionality or the user
experience. It must only change the look of the page.
http://www.rentacoder.com/RentACoder/misc/BidRequests/ShowBidRequest.asp?ln…
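For bidders unfamiliar with MediaWiki skinning, a 1.10-era skin has roughly the following shape (a sketch only, modelled on the shipped MonoBook skin; the real deliverable must also provide the stylesheet, graphics, and the page markup itself):

```php
<?php
# Flouzo.php -- skeleton of a 1.10-era MediaWiki skin (a sketch of the
# usual SkinTemplate pattern; class and file names follow the request above)
if ( !defined( 'MEDIAWIKI' ) ) {
    die( 'Not an entry point.' );
}

class SkinFlouzo extends SkinTemplate {
    function initPage( &$out ) {
        SkinTemplate::initPage( $out );
        $this->skinname  = 'flouzo';
        $this->stylename = 'flouzo';
        $this->template  = 'FlouzoTemplate';
    }
}

class FlouzoTemplate extends QuickTemplate {
    function execute() {
        # Emit the page markup here, reading content and navigation
        # structures from $this->data (as MonoBook.php does).
    }
}
```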
--
Xavier.
Hi all,
I just upgraded my mediawiki installation from version 1.9.3 to 1.11.0.
The upgrade went smoothly (including the maintenance/update.php part) but
now page delivery from the server is incredibly slow (about 40s for each
page).
For testing purposes I also did a fresh install of version 1.11.0 using a
copy of the original database. This one is running smoothly (2s per page)
although the LocalSettings.php are identical (except the necessary
differences; checked with diff).
I turned on profiling and it seems that the server is waiting for a
memcached server (see below):
--------------------------------------------------------------------------------------------------------------------------
Start request
GET /wiki/index.php5/Hauptseite
Accept: */*
Referer: http://XXXXX/er_page/
Accept-Language: de
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET
CLR 1.1.4322; .NET CLR 2.0.50727)
Host: dephd0080.rockwellcollins.com
Connection: Keep-Alive
Cookie: wikidbUserID=fDSY6tzj5mRmRSbScc7kkfb5MVNtrdUpbtn-YKamQeI.;
wikidbUserName=BU6md5v7QoBjaPWgjiDFKQ6zUFA6qcnb5kDAN4bLZDo.;
wikidbToken=WCECG_8rs-DpCfE9biU2YMIQQNwjsrbzvdiSwJTLAaFHDMZ-wNU4NaARwKVGXgY-;
wikidb_session=Ly7ZvAQ1kMeToIurVQUAbzTyN6aG7WmzRCC7RkbP7VY9AztrT_pPCzTpmDsuTMwQ
Main cache: FakeMemCachedClient
Message cache: MediaWikiBagOStuff
Parser cache: MediaWikiBagOStuff
Unstubbing $wgParser on call of $wgParser->setHook from efSubpageList
Fully initialised
Unstubbing $wgContLang on call of $wgContLang->checkTitleEncoding from
WebRequest::getGPCVal
Language::loadLocalisation(): got localisation for de from source
Language::loadLocalisation(): got localisation for en from source
Unstubbing $wgUser on call of $wgUser->isAllowed from Title::userCanRead
Cache miss for user 14
Unstubbing $wgLoadBalancer on call of $wgLoadBalancer->getConnection from
wfGetDB
Logged in from session
Unstubbing $wgOut on call of $wgOut->setSquidMaxage from
MediaWiki::performAction
Unstubbing $wgLang on call of $wgLang->getCode from
User::getPageRenderingHash
OutputPage::checkLastModified: client did not send If-Modified-Since
header
Article::tryFileCache(): not cacheable
Article::view using parser cache: yes
Trying parser cache wikidb:pcache:idhash:1-0!1!0!!de!2
Parser cache miss.
Unstubbing $wgMessageCache on call of $wgMessageCache->getTransform from
wfMsgGetKey
MessageCache::load(): cache is empty
MessageCache::load(): loading all messages from DB
MemCached set error in MessageCache: restart memcached server!
Saved in parser cache with key wikidb:pcache:idhash:1-0!1!0!!de!2 and
timestamp 20071001120757
OutputPage::sendCacheControl: private caching; Mon, 01 Oct 2007 11:55:50
GMT **
20071001120757 39.736 /wiki/index.php5/Hauptseite
Request ended normally
------------------------------------------------------------------------------------------------------------------------------------
Debug output from the new and clean installation shows no MemCached
errors:
---------------------------------------------------------------------------------------------------------------------------------
Start request
GET /php/mediawiki-1.11.0rc1/index.php5/Proteus/WCA
Host: dephd0080
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.8.1.7)
Gecko/20070914 Firefox/2.0.0.7
Accept:
text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer:
http://XXX/php/mediawiki-1.11.0rc1/index.php5/E3000/6%2C8V_Z-Diode
Cookie:
wikidbToken=nDjM5UyIQUUrFVQDEvMVp378QyY9dVNkGXwLioUKkJ82A9PBsrJwRAZlYN_KkGCL;
wikidbUserName=i4l5AWOwFjyqUIzh1KW-WCmn0vNuECv_8dkZvn9lkWE.;
wikidbUserID=kbm8jn7GNRYpai9dEy3RqN6bZJcItgKdkOjHvMhKjsc.;
wikidb_session=R4qkwULU20-Ez1DiDK6vt7fkZ1jkl1zHlJcpJJ3IzIXb9UcGbKYUVBe67S8w5d40
Main cache: FakeMemCachedClient
Message cache: MediaWikiBagOStuff
Parser cache: MediaWikiBagOStuff
Unstubbing $wgParser on call of $wgParser->setHook from efSubpageList
Fully initialised
Unstubbing $wgContLang on call of $wgContLang->checkTitleEncoding from
WebRequest::getGPCVal
Language::loadLocalisation(): got localisation for de from source
Language::loadLocalisation(): got localisation for en from source
Unstubbing $wgUser on call of $wgUser->isAllowed from Title::userCanRead
Unstubbing $wgLoadBalancer on call of $wgLoadBalancer->getConnection from
wfGetDB
IP: XXX.XXX.XXX.XXX
Unstubbing $wgOut on call of $wgOut->setSquidMaxage from
MediaWiki::performAction
Unstubbing $wgLang on call of $wgLang->getCode from
User::getPageRenderingHash
Unstubbing $wgMessageCache on call of $wgMessageCache->loadAllMessages
from User::getGroupName
MessageCache::load(): got from global cache
OutputPage::checkLastModified: client did not send If-Modified-Since
header
Article::tryFileCache(): not cacheable
Article::view using parser cache: yes
Trying parser cache wikidb_TEST:pcache:idhash:1786-0!1!0!!de!2!edit=0
Parser cache miss.
Saved in parser cache with key
wikidb_TEST:pcache:idhash:1786-0!1!0!!de!2!edit=0 and timestamp
20071001105101
OutputPage::sendCacheControl: private caching; Mon, 01 Oct 2007 10:04:50
GMT **
Request ended normally
-------------------------------------------------------------------------------------------------------------------------------
I don't know why MemCache shows up in the debug output, since it is
disabled in my LocalSettings.php:
---------------------------------------------------------------------
## Shared memory settings
$wgMainCacheType = CACHE_NONE;
$wgMemCachedServers = array();
---------------------------------------------------------------------
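One thing the debug output above suggests (a hedged guess, not a confirmed diagnosis): $wgMainCacheType only sets the main cache, while the message and parser caches are controlled by separate settings that default to CACHE_ANYTHING and can fall back to a database-backed store, which is what the MediaWikiBagOStuff lines show. Pinning them explicitly would look like:

```php
## LocalSettings.php -- a sketch; these settings exist alongside
## $wgMainCacheType and default to CACHE_ANYTHING in this era
$wgMessageCacheType = CACHE_NONE;
$wgParserCacheType  = CACHE_NONE;
```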
Any advice on how to solve this?
Thanks in advance,
Arnd