$wgDBtransactions gets set to true if using InnoDB tables. Is there
an advantage to using InnoDB tables?
The disadvantage is that with MySQL there is a file, ibdata1, that
seems to grow endlessly if InnoDB tables are used. See
http://bugs.mysql.com/bug.php?id=1341
We're wondering if we should just convert everything to MyISAM. Any
thoughts?
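For what it's worth, a commonly suggested mitigation (rather than abandoning InnoDB, whose transaction support is why $wgDBtransactions gets enabled) is MySQL's innodb_file_per_table option, sketched below; note it only affects tables created after it is set, so an existing ibdata1 will not shrink until you dump, remove the ibdata files, and reimport.

```ini
# my.cnf sketch -- only affects tables created AFTER this is set.
# Each InnoDB table then gets its own .ibd file, whose space is
# reclaimed on DROP/OPTIMIZE instead of accumulating in ibdata1.
[mysqld]
innodb_file_per_table
```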
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Hi all,
I've created some custom namespaces on one of my wikis, Botwiki
(previously known as pywikipedia).
I've put these lines in my LocalSettings.php file:
---
#Custom namespaces
$wgExtraNamespaces =
array(100 => "Manual",
101 => "Manual talk",
102 => "Python",
103 => "Python talk",
104 => "Php",
105 => "Php talk",
106 => "Perl",
107 => "Perl talk",
108 => "AWB",
109 => "AWB talk",
110 => "IRC",
111 => "IRC talk",
112 => "Other",
113 => "Other talk"
);
$wgContentNamespaces[] = 100;
$wgContentNamespaces[] = 102;
$wgContentNamespaces[] = 104;
$wgContentNamespaces[] = 106;
$wgContentNamespaces[] = 108;
$wgContentNamespaces[] = 110;
$wgContentNamespaces[] = 112;
---
However, I have a big problem: when I go to a page in one of these new
namespaces (not the discussion, the main ones), for example
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot , I find a
red link to the discussion page. That's right, as there is no discussion
page for that article. But if you click on it, it brings you to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
which is correct, of course. But have a look at the article and discussion tabs:
they are both red! The first, "article", leads to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
when it should lead to
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot and the second,
"discussion", leads to
http://botwiki.sno.cc/w/index.php?title=Talk:Perl_talk:Copyright_Violation_…
, when it should lead to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
.
This is the first time I've dealt with custom namespaces :-( but I have
some ideas about what it could be. Could the problem be with the
$wgContentNamespaces settings, so that it detects everything as ns0? (I
don't think so.)
Or can it be the fact that I haven't used an underscore in the
$wgExtraNamespaces definition?
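On that last question: quite possibly, yes. The custom-namespace documentation says namespace names must not contain spaces; underscores are used instead, and the wiki displays them as spaces. A sketch of the same definition rewritten that way:

```php
// LocalSettings.php sketch: same namespaces, but with underscores,
// since namespace names must not contain spaces.
$wgExtraNamespaces = array(
    100 => "Manual", 101 => "Manual_talk",
    102 => "Python", 103 => "Python_talk",
    104 => "Php",    105 => "Php_talk",
    106 => "Perl",   107 => "Perl_talk",
    108 => "AWB",    109 => "AWB_talk",
    110 => "IRC",    111 => "IRC_talk",
    112 => "Other",  113 => "Other_talk",
);
```

This cannot run outside a MediaWiki installation, so treat it as a config fragment to adapt, not a tested fix.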
Snowolf
Hi there,
I am using Lucene-Search 2.1/MWSearch for my MediaWiki 1.15.1.
It's working fine, but it can't search any Japanese characters.
I have tried (language,ja) in the lsearch-global.conf file, but it doesn't seem to make any difference.
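For what it's worth, in the lsearch-global.conf files I have seen, the (language,ja) tuple goes on the database's own line in the [Database] section. A sketch follows, where the database name wikidb and the (single) architecture tuple are assumptions to adapt to your setup; the index also has to be rebuilt after the change for it to take effect.

```
[Database]
wikidb : (single) (language,ja)
```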
Any idea would be appreciated,
Ross
Hi,
until I upgraded from 1.15.1 (applying the 1.15.2, .3, and .4 patches) I
was happily editing my wiki pages with GNU Emacs via ee.pl
and a bookmarklet that goes:
javascript:location=location+'?action=edit&externaledit=true';
(http://www.mediawiki.org/wiki/Manual:External_editors)
Now external editing is broken, and I suspect that this is related to
bug 23076:
Fixed login CSRF vulnerability. Logins now require a token to be
submitted along with the user name and password. Patch by Roan Kattouw.
Any ideas on what I could/should do to get TRUE external editing back ?
I don't mind installing stuff. I am not much of a programmer/sysadmin,
but I can read :)
I don't want to use something like "It's All Text", since it's not
practical to edit several wiki pages at the same time while having
to keep several browser tabs open just for saving. (I do this at home
on a Windows box and it's a pain.)
- thanx for any help / insight ! - Daniel
PS: Of course I am not sure about the cause, but in my case
identification of the problem (upgrading) is really simple. I got two
wikis on the same machine:
The 1.15.4 one (http://edutechwiki.unige.ch/en/) is broken and the
1.15.1 one (http://edutechwiki.unige.ch/fr/) works just fine.
Hi,
I have been trying to create a maintainable way of grouping pages
together and allowing readers to page through them in sequence. Most
samples I've seen use templates that require you to supply the
'previous' and 'next' page. This however results in three page edits
to insert a new page between two existing pages and does not guarantee
that your prev-sequence is identical to the next-sequence...
Being a programmer, that is way too much duplicate and error-prone
work for me ;-) There must be a better way to do this and I was hoping
to solve it with a plain MW installation (with ParserFunctions and
DynamicPageList at the moment).
Given an 'index' page that holds a list of all page titles in the
preferred order, isn't it possible to create a template that selects
the correct previous and next page title given the current page title?
I get stuck in getting the correct lines from the index page. DPL can
select based on section name (= page title), but then the contents of
that index section must be the prev/next links themselves:
=First Page=
Prev [[Second Page|Next]]
=Second Page=
[[First Page|Prev]] [[Third Page|Next]]
=Third Page=
.. etc.
This works, except that it is still a lot of duplication of page names
(but the edits are contained in a single page, big plus).
I hoped to simplify the index page by creating a template that writes
the section header and prev/next links, but then DPL no longer
recognizes the sections :-( Apparently DPL 'sees' the page text before
the templates are expanded ({{Page|Prev page|Page title|Next page}}):
{{Page||First Page|Second Page}}
{{Page|First Page|Second Page|Third Page}}
{{Page|Second Page|Third Page|Fourth Page}}
Basically my questions are:
1. Am I completely off track here?
2. Can DPL be coerced to evaluate templates before looking at the page
text?
3. Can DPL (or another extension) select the text from a section
before/after a matched section?
4. Is it possible to determine the section sequence number given the
section name (so 'Second Page' results in 2, allowing me to use DPL to
retrieve the names of sections 2 - 1 and 2 + 1 to create the prev/next
links)?
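Short of an extension, one plain-ParserFunctions approach (a sketch; the template and page names here are made up) is a single navigation template built on #switch, so that the whole sequence lives on one page and inserting a page is one edit to that template:

```
<!-- Template:Nav -- the one page that defines the sequence -->
{{#switch: {{PAGENAME}}
 | First Page  = [[Second Page|Next]]
 | Second Page = [[First Page|Prev]] [[Third Page|Next]]
 | Third Page  = [[Second Page|Prev]] [[Fourth Page|Next]]
 | Fourth Page = [[Third Page|Prev]]
}}
```

Each content page then just transcludes {{Nav}}. Inserting a page still means editing the two neighbouring switch cases, so the duplication is not eliminated, but it is contained in a single template, which was the big plus noted above.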
Apologies for the long post, hopefully someone can point me to some
good resources (I've been to Meta, Wikibooks, MediaWiki.org but could
very well have overlooked something there as the amount of info is a
bit overwhelming and it is difficult to judge how up to date it is).
--
Regards,
Jean-Marc
Mediawiki was originally installed by another person on a server
before Apache was configured for virtual hosts. I would have just
wiped it and installed again, were it not that lots of pages we want to
keep are now in there. So I'm just trying to manually reconfigure
Apache and Mediawiki to get the new virtual URL to work.
Originally the paths everything was installed in were /etc/mediawiki
and /var/lib/mediawiki. I have not changed these.
I'm reading this document and trying to apply it to my situation:
http://www.mediawiki.org/wiki/Manual:Short_URL
The host is called "fermat" (no TLD), and that is DNS resolved on the
internal network. A URL that worked previously was
"http://fermat/mediawiki/index.php/Main_Page". Apache now no longer
listens on the IP address that "fermat" resolves to, so that URL now
gets a connection failure. This is expected. Instead, a new name
"wiki" resolves to another IP address. The server has that IP address
configured, and so does Apache. Apache listens on it. It does
connect. I'm just not getting the contents expected.
Old URL: http://fermat/mediawiki/index.php/Main_Page
New URL: http://wiki/wiki/Main_Page
I am also changing the first directory name from "/mediawiki" to just
"/wiki". I would have liked to have it even shorter by eliminating
this first path altogether, but one document I read said not to do
this, so I didn't. I did put in an Apache Alias directive "Alias
/wiki /var/lib/mediawiki/index.php" so that I do not have to use
"index.php" in the URL.
So I have these in "/etc/mediawiki/LocalSettings.php":
$wgScriptPath = "/mediawiki";
$wgArticlePath = "/wiki/$1";
$wgScriptExtension = ".php";
Previously, $wgArticlePath was not assigned (I added it since the
document had that). I copied this from
"http://www.mediawiki.org/wiki/Manual:Short_URL#Setup_steps" but left
"/mediawiki" NOT changed to "/w" (since I didn't see a need to change
the filesystem path, at least not yet).
OK ... so I go to "http://wiki/wiki/Main_Page". I actually get our
page contents (the contents we put in there back when it was under the
other URL). So it seems I mostly got it right. BUT ... what fails is
that I get NONE of the prettiness. No CSS, no images, etc. It just
looks like it was rendered from plain HTML. Are there other steps
which are needed to get that? I don't see any in the
"http://www.mediawiki.org/wiki/Manual:Short_URL" document, and don't
see any link that suggests that. I would have expected that if more
steps are needed, they would be right there with the other steps. So
I don't know if this is something missing or something not done right.
But I got the right contents, which suggests to me the right script
was run in ... maybe ... the right context.
So, anyone have any idea what else needs to be done? A reference to
the right document should work (if it's a big one, what section to
use).
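The symptom described (right contents, no styling) usually means the skin files are unreachable: MediaWiki builds its stylesheet and image URLs from $wgScriptPath, which is still "/mediawiki", a path the new vhost does not serve. A sketch of one way out, assuming the files really are under /var/lib/mediawiki, is to alias that path in the new vhost as well:

```apache
# Sketch for the new "wiki" vhost.
# /wiki stays the pretty article path; /mediawiki is added back so that
# $wgScriptPath-based URLs (skins/*.css, images/, etc.) resolve again.
Alias /wiki /var/lib/mediawiki/index.php
Alias /mediawiki /var/lib/mediawiki
```

The alternative is to change $wgScriptPath in LocalSettings.php to a path the new vhost does serve, but then the Alias lines have to match whatever you pick.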
Hi there,
this question has surely been posted a thousand times before, but...
How do I get all the templates and extensions from e.g. Wikipedia into my own
wiki? Since I needed copies of some of Wikipedia's articles as a
presentation for my own wiki, I need those templates and extensions as
well. Do I have to copy them all separately? Or might there be an easier
way?
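There is no single switch, but Special:Export can bundle the templates a page transcludes (the templates parameter), and the resulting XML can be imported via Special:Import or the importDump.php maintenance script. A sketch, where the article name is a placeholder; note that extensions are PHP code and cannot be exported this way, so they must be installed separately from their own download pages.

```shell
# Export one article plus the templates it transcludes (templates=1),
# then import the XML dump on your own wiki.
curl 'https://en.wikipedia.org/w/index.php?title=Special:Export&pages=Some_Article&templates=1' \
    -o article.xml
php maintenance/importDump.php article.xml
```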
Cheers,
Mathias
Hi List,
this is my first posting to this mailing list, so please excuse me if I am doing something wrong. I maintain a company-wide, internally used MediaWiki installation (version 1.12). Now the users have asked me how to link to locally saved documents stored on a server system. I know this is not the normal way of doing things, but my colleagues asked me to do so, as they don't want to turn the documents into HTML or anything; they just want links that open them (rules for Firefox / Internet Explorer apply). Additionally, I can't force them to type wiki code itself, so we're using the TinyMCE plugin to edit content.
Problem: all the links to be made will contain space characters, so a path will look like this: file:\\srv1\Resources & Databases\WIKI development\used\*.odt. There is no way to remove the space characters on the server itself...
I tried the extensions I found on this topic, (for example http://www.easy-coding.de/mediawiki-filelinkextension-t1475.html), but they did not work.
Are there any other extensions recommended, or any hints? Otherwise I am unfortunately stuck in this MediaWiki project.
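Two things get in the way here, and both can be sketched. First, MediaWiki only renders external links whose scheme is registered in $wgUrlProtocols, and file:// is not registered by default:

```php
// LocalSettings.php sketch: allow file:// external links,
// which are not in $wgUrlProtocols by default.
$wgUrlProtocols[] = "file://";
```

Second, external-link syntax treats a literal space as the end of the URL, so the spaces in the paths have to be percent-encoded in the page text, e.g. [file://srv1/Resources%20&%20Databases/WIKI%20development/used/foo.odt Foo] (the path here is illustrative). Even then, note that browsers often refuse to open file: links from http pages regardless of what the wiki emits, so the Firefox/IE rules mentioned still apply.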
Kind Regards,
Michael
I've posted this question twice before without so much as a courtesy response of
"Sorry, we don't know." I'd really appreciate some help with this, as I have not
been able to track down any information on protecting pages (aside from "click
protect at the top of the page") and asking in the IRC chan on freenode usually
goes unanswered as well.
--------------------------------------------------------------------------------
Greetings all,
Is it possible for an extension to set a page as "protected"? I have a tag that
displays help info for my extension, I'd like to drop it in a page, and have the
tag set the page as protected via a parameter. The idea is to allow the tag to
display its output but have the page be protected from editing.
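One way an extension can get this effect without touching the protection tables is to deny the edit action from a permissions hook. The sketch below uses the standard getUserPermissionsErrors hook; the function name and the hard-coded title are made up, and a real extension would instead record which pages carry its tag (e.g. via a page property) when the tag is parsed.

```php
// Extension setup file -- sketch only, assumes a MediaWiki environment.
$wgHooks['getUserPermissionsErrors'][] = 'myExtHelpPageProtect';

/**
 * Deny 'edit' on the (hypothetical) help page while leaving reads alone.
 */
function myExtHelpPageProtect( $title, $user, $action, &$result ) {
    if ( $action === 'edit'
        && $title->getPrefixedText() === 'MyExtension Help'
    ) {
        $result = array( 'protectedpagetext' ); // error message key
        return false; // stop: edit not permitted
    }
    return true; // defer to the normal permission checks
}
```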
Thanks for any advice, much appreciated.
- Vadtec
Hi,
is it possible to set the default search behavior to use wildcards? That
is, entering the search term "token" would actually search for "token*" and
therefore also find "tokenization".
IMO, not-so-tech-savvy users might expect this to be the default behavior.
Frank