$wgDBtransactions gets set to true if using InnoDB tables. Is there
an advantage to using InnoDB tables?
The disadvantage is that with MySQL there is a file, ibdata1, that
seems to grow endlessly if InnoDB tables are used. See
http://bugs.mysql.com/bug.php?id=1341
We're wondering if we should just convert everything to MyISAM. Any
thoughts?
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Hi all,
I've created some custom namespaces on one of my wikis, Botwiki
(previously known as pywikipedia).
I've put these lines in my LocalSettings.php file:
---
#Custom namespaces
$wgExtraNamespaces = array(
    100 => "Manual",
    101 => "Manual talk",
    102 => "Python",
    103 => "Python talk",
    104 => "Php",
    105 => "Php talk",
    106 => "Perl",
    107 => "Perl talk",
    108 => "AWB",
    109 => "AWB talk",
    110 => "IRC",
    111 => "IRC talk",
    112 => "Other",
    113 => "Other talk"
);
$wgContentNamespaces[] = 100;
$wgContentNamespaces[] = 102;
$wgContentNamespaces[] = 104;
$wgContentNamespaces[] = 106;
$wgContentNamespaces[] = 108;
$wgContentNamespaces[] = 110;
$wgContentNamespaces[] = 112;
---
However, I have a big problem: when I go to a page in one of these new
namespaces (not the talk namespaces, the main ones), for example
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot , I find a
red link to the discussion page. That's right, as there is no discussion
page for that article. But if you click on it, it takes you to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
correct, of course. But have a look at the article and discussion tabs:
they are both red! The first, "article", leads to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
when it should lead to
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot and the second,
"discussion", leads to
http://botwiki.sno.cc/w/index.php?title=Talk:Perl_talk:Copyright_Violation_…
, when it should lead to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
.
This is the first time I've dealt with custom namespaces :-( but I have
some ideas about what it could be. Could the problem be with the
$wgContentNamespaces settings, so that everything is detected as ns0?
(I don't think so.)
Or could it be that I haven't used underscores in the
$wgExtraNamespaces definition?
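For what it's worth, my understanding (unverified) is that namespace names in $wgExtraNamespaces must use underscores rather than spaces. A minimal sketch of that variant, assuming that is indeed the case:

```php
# Sketch only: same namespaces as above, but with underscores
# in the multi-word names, which MediaWiki may require internally
# (spaces are shown in the rendered titles either way).
$wgExtraNamespaces = array(
    100 => "Manual",
    101 => "Manual_talk",
    102 => "Python",
    103 => "Python_talk",
    104 => "Php",
    105 => "Php_talk",
    106 => "Perl",
    107 => "Perl_talk",
    108 => "AWB",
    109 => "AWB_talk",
    110 => "IRC",
    111 => "IRC_talk",
    112 => "Other",
    113 => "Other_talk"
);
```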
Snowolf
Hi,
I have been trying to create a maintainable way of grouping pages
together and allowing readers to page through them in sequence. Most
samples I've seen use templates that require you to supply the
'previous' and 'next' page. This however results in three page edits
to insert a new page between two existing pages and does not guarantee
that your prev-sequence is identical to the next-sequence...
Being a programmer, that is way too much duplicated and error-prone
work for me ;-) There must be a better way to do this, and I was hoping
to solve it with a plain MW installation (with ParserFunctions and
DynamicPageList at the moment).
Given an 'index' page that holds a list of all page titles in the
preferred order, isn't it possible to create a template that selects
the correct previous and next page title given the current page title?
I get stuck in getting the correct lines from the index page. DPL can
select based on section name (= page title), but then the contents of
that index section must be the prev/next links themselves:
=First Page=
Prev [[Second Page|Next]]
=Second Page=
[[First Page|Prev]] [[Third Page|Next]]
=Third Page=
.. etc.
This works, except that it is still a lot of duplication of page names
(but the edits are contained in a single page, big plus).
I hoped to simplify the index page by creating a template that writes
the section header and prev/next links, but then DPL no longer
recognizes the sections :-( Apparently DPL 'sees' the page text before
the templates are expanded ({{Page|Prev page|Page title|Next page}}):
{{Page||First Page|Second Page}}
{{Page|First Page|Second Page|Third Page}}
{{Page|Second Page|Third Page|Fourth Page}}
Basically my questions are:
1. Am I completely off track here?
2. Can DPL be coerced to evaluate templates before looking at the page?
3. Can DPL (or another extension) select the text from a section
before/after a matched section?
4. Is it possible to determine the section sequence number given the
section name (so 'Second Page' results in 2, allowing me to use DPL to
retrieve the names of sections 2 - 1 and 2 + 1 to create the prev/next
links)?
Apologies for the long post; hopefully someone can point me to some
good resources (I've been to Meta, Wikibooks and MediaWiki.org, but could
very well have overlooked something there, as the amount of info is a
bit overwhelming and it is difficult to judge how up to date it is).
--
Regards,
Jean-Marc
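On question 4: one possible approach with ParserFunctions alone is a pair of lookup templates, one mapping title to position and one mapping position to title, so prev/next can be computed instead of hand-maintained. This is a sketch only; Template:TitleToIndex and Template:IndexToTitle are names I made up, and it is untested:

```wikitext
<!-- Template:TitleToIndex — page title to sequence number -->
{{#switch: {{{1}}}
| First Page  = 1
| Second Page = 2
| Third Page  = 3
}}
<!-- Template:IndexToTitle — sequence number to page title -->
{{#switch: {{{1}}}
| 1 = First Page
| 2 = Second Page
| 3 = Third Page
}}
<!-- Template:Nav — placed on each page, computes its neighbours -->
[[{{IndexToTitle|{{#expr: {{TitleToIndex|{{PAGENAME}}}} - 1}}}}|Prev]]
[[{{IndexToTitle|{{#expr: {{TitleToIndex|{{PAGENAME}}}} + 1}}}}|Next]]
```

Each title still appears twice, but the edits stay within two templates, the prev and next sequences can never drift apart, and inserting a page means renumbering one list rather than touching three articles. The first and last pages would need an extra guard (the #expr would fall off either end of the map).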
I'm seeing this weird anomaly in wiki syntax. Suppose you have this
template, called A:
{{#switch: {{{1}}}
| foo = [[File:foo.png]]
| bar = [[File:bar.png]]
| baz = [[File:baz.png]]
| UNKNOWN
}}
Now, with something like
{{A|foo}} and {{A|bar}}.
I get
<p><img src="foo.png"/>
</p>
<pre>and <img src="bar.png"/>
</pre>
<p>.
</p>
That is, the " and {{A|bar}}" is treated as a new paragraph. However,
if I have
[[File:foo.png]] and [[File:bar.png]].
I get
<p><img src="foo.png"/> and <img src="bar.png"/>.
</p>
So it is something to do with the newlines at the end of each
case, I presume. Compacting it all onto one line does not appear to
help. How can I change Template:A so that I get the more desirable
second outcome?
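One thing that may be worth re-checking (a guess, since the one-line attempt reportedly didn't help): any whitespace between the #switch pipes, including a stray newline before the closing braces, ends up in the template's output. A version with every newline and space stripped, and the fallback named explicitly, would look like:

```wikitext
{{#switch:{{{1}}}|foo=[[File:foo.png]]|bar=[[File:bar.png]]|baz=[[File:baz.png]]|#default=UNKNOWN}}
```

Using #default rather than a bare trailing value also makes it harder for invisible whitespace to ride along with the fallback branch.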
MediaWiki version: 1.15.1
PHP version: 5.3.0
MySQL version: 5.1.37
URL: N/A (localhost)
I'm using XAMPP 1.7.2. When I'm editing an article, navigate to a
new page and then press Back, the content that I edited reverts to the
version before the edit. Also, when pressing Back after pressing Show
preview, I get a session timeout error.
This doesn't happen on the official English Wikipedia.
Can you help me fix this?
I am running MediaWiki on a Linux box.
Here is what I have noticed:
[file://G:\TEMP\linktest\test.pdf]
works fine
But when there are spaces in the folder/file name (for example: G:\TEMP\link test\test.pdf):
[file://G:\TEMP\link%20test\test.pdf]
does not work
However, if you add an extra "/" to the file:// prefix, like this:
[file:///G:\TEMP\link%20test\test.pdf]
it works fine.
Is this normal? Is there any setting (in MediaWiki) by which adding the extra "/" can be taken care of? It is easy for technical users, but non-technical users can get frustrated if they have to remember these details.
Kushal Koolwal
I do blog at http://9to5.koolwal.net/
MediaWiki 1.15.1 Installation
* Don't forget security updates! Keep an eye on the low-traffic
release announcements mailing list.
Checking environment...
Please include all of the lines below when reporting installation problems.
* PHP 5.2.6 installed
* Found database drivers for: MySQL PostgreSQL SQLite
* Warning: PHP's safe mode is active. You may have problems caused
by this, particularly if using image uploads.
* PHP server API is apache2handler; ok, using pretty URLs
(index.php/Page_Title)
* Have XML / Latin1-UTF-8 conversion support.
* Session save path
(/var/www/vhosts/opensurf.it/httpdocs/media/tmp) appears to be valid.
* PHP's memory_limit is 32M.
* Couldn't find Turck MMCache, eAccelerator, APC or XCache; cannot
use these for object caching.
* GNU diff3 not found.
-------------
I had to change
+++ mediawiki-1.15.1/install-utils.inc 2009-10-28 00:04:59.000000000 +0100
@@ -126,8 +126,9 @@
* @return string
*/
function mw_get_session_save_path() {
- $path = ini_get( 'session.save_path' );
- $path = substr( $path, strrpos( $path, ';' ) );
+// $path = ini_get( 'session.save_path' );
+// $path = substr( $path, strrpos( $path, ';' ) );
+ $path = '/var/www/vhosts/opensurf.it/httpdocs/media/tmp';
return $path;
}
@@ -144,4 +145,4 @@
&& is_callable( 'dl' )
&& wfIniGetBool( 'enable_dl' )
&& !wfIniGetBool( 'safe_mode' );
-}
\ No newline at end of file
+}
I guess I have path problems in config/index.php.
I'm installing on a remote public server running Plesk.
---------------
Any help?
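As an aside, a less invasive workaround than patching install-utils.inc might be to override the session save path from PHP configuration instead of editing core. A sketch, assuming the web server can write to that directory:

```php
// Sketch: point PHP at a writable session directory without
// modifying MediaWiki core. This could live in php.ini, in a
// per-vhost setting, or early in LocalSettings.php.
ini_set( 'session.save_path', '/var/www/vhosts/opensurf.it/httpdocs/media/tmp' );
```

That keeps the tree upgrade-safe, since the next MediaWiki release would otherwise silently undo the hand edit.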
According to the documentation, I should be able to reference a template
by doing this:
<noinclude>
{{NameOfTemplate}}
</noinclude>
But Mediawiki is transcluding the template into the page and I don't
want it there. I just want developers to be able to click the template
and grab the template text when they're creating a new article.
Is there a setting in LocalSettings.php that I need to change to get
<noinclude> to work? Someone else here installed the software, so I
haven't worked with that file yet.
Anyone else have this problem?
Thanks,
Maffy
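For comparison, my understanding is that <noinclude> goes inside the *template* to control what that template exports when transcluded, not around a call on the article side. To merely point readers at a template from an article without pulling its content in, a plain link is the usual sketch:

```wikitext
<!-- Links to the template page instead of transcluding it: -->
[[Template:NameOfTemplate]]
<!-- Or render the call syntax literally, braces and all: -->
{{[[Template:NameOfTemplate|NameOfTemplate]]}}
```

The second form works because the parser cannot treat {{ followed by a link as a template invocation, so the braces come through as ordinary text.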
Welcome to mediawiki-l. This mailing list exists for discussion and questions
about the MediaWiki software[0]. Important MediaWiki-related announcements
(such as new versions) are also posted to this list.
Other resources.
If you only wish to receive announcements, you should subscribe to
mediawiki-announce[1] instead.
MediaWiki development discussion, and all Wikimedia technical questions, should
be directed to the wikitech-l[2] mailing list.
Several other MediaWiki-related lists exist:
- mediawiki-api[5] for API discussions,
- mediawiki-enterprise[6] for discussion of MediaWiki in the enterprise,
- mediawiki-cvs[7] for notification of commits to the Subversion repository,
- mediawiki-i18n[8] for discussion of MediaWiki internationalisation support,
- wikibugs-l[9] for notification of changes to the bug tracker.
List administrivia (unsubscribing, list archives).
To unsubscribe from this mailing list, visit [12]. Archives of previous postings
can be found at [3].
This list is also gatewayed to the Gmane NNTP server[4], which you can use to
read and post to the list.
Posting to the list.
Before posting to this list, please read the MediaWiki FAQ[10]. Many common
questions are answered here. You may also search the list archives to see if
your question has been asked before.
Please try to ask your question in a way that enables people to answer you.
Provide all relevant details, explain your problem clearly, etc. You may
wish to read [13], which explains how to ask questions well.
To post to the list, send mail to <mediawiki-l(a)lists.wikimedia.org>. This is a
public list, so you should not include confidential information in mails you
send.
When replying to an existing thread, use the "Reply" or "Followup" feature of
your mail client, so that clients that understand threading can sort your
message properly. When quoting other messages, please use the "inline" quoting
style[11], for clarity.
When creating a new thread, do not reply to an existing message and change the
subject. This will confuse people's mail readers and result in fewer
people reading your mail. Instead, compose a new message for your post.
Messages posted to the list have the "Reply-To" header set to the mailing list,
which means that by default, replies will go to the entire list. If you are
posting a reply which is only interesting to the original poster, and not the
list in general, you should change the reply to only go to that person. This
avoids cluttering the list with irrelevant traffic.
About this message.
This message is posted to the list once per week by <river(a)wikimedia.org>.
Please contact me if you have any questions or concerns about this mailing.
References.
[0] http://www.mediawiki.org/
[1] http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[2] http://lists.wikimedia.org/mailman/listinfo/wikitech-l
[3] http://lists.wikimedia.org/pipermail/mediawiki-l/
[4] http://dir.gmane.org/gmane.org.wikimedia.mediawiki
[5] http://lists.wikimedia.org/mailman/listinfo/mediawiki-api
[6] http://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise
[7] http://lists.wikimedia.org/mailman/listinfo/mediawiki-cvs
[8] http://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
[9] http://lists.wikimedia.org/mailman/listinfo/wikibugs-l
[10] http://www.mediawiki.org/wiki/FAQ
[11] http://en.wikipedia.org/wiki/Posting_style#Inline_replying
[12] http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
[13] http://www.catb.org/~esr/faqs/smart-questions.html
> From: "Sandy Rozhon" <srozhon(a)oh.rr.com>
>
> I used this site with great success....very easy. All you do is copy
> all the cells you want and paste them into their form, then it
> regurgitates the code for your wiki.
>
> http://excel2wiki.net/
Lovely! It properly converts Apple Numbers stuff, as well.
I guess the incorrect handling of other delimiters is not so
important as long as you only copy and paste from Excel or Numbers,
which default to tab delimiters.
But be aware that this extension probably will not work properly with
arbitrary exported CSV data.
:::: MacOS X -- when you need today what Bill Gates has promised you
tomorrow. ::::
:::: Jan Steinman http://www.VeggieVanGogh.com ::::