$wgDBtransactions gets set to true if using InnoDB tables. Is there
an advantage to using InnoDB tables?
The disadvantage is that with MySQL there is a file, ibdata1, that
seems to grow endlessly if InnoDB tables are used. See
http://bugs.mysql.com/bug.php?id=1341
We're wondering if we should just convert everything to MyISAM. Any
thoughts?
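For reference, a sketch of the usual mitigations (not a recommendation either way): ibdata1 never shrinks because the shared InnoDB tablespace only grows, so one common workaround is MySQL's innodb_file_per_table option, and individual tables can be converted between engines with ALTER TABLE.

```sql
-- my.cnf sketch: add "innodb_file_per_table" under [mysqld] so that
-- NEW InnoDB tables get their own .ibd files instead of growing ibdata1
-- (it does not shrink an existing ibdata1).
--
-- Converting an existing table to MyISAM (back up the database first;
-- "text" here is just one example of a MediaWiki table):
ALTER TABLE text ENGINE = MyISAM;
```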
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Hi all,
I've created some custom namespaces on one of my wikis, Botwiki
(previously known as pywikipedia).
I've put these lines in my LocalSettings.php file:
- ---
#Custom namespaces
$wgExtraNamespaces =
array(100 => "Manual",
101 => "Manual talk",
102 => "Python",
103 => "Python talk",
104 => "Php",
105 => "Php talk",
106 => "Perl",
107 => "Perl talk",
108 => "AWB",
109 => "AWB talk",
110 => "IRC",
111 => "IRC talk",
112 => "Other",
113 => "Other talk"
);
$wgContentNamespaces[] = 100;
$wgContentNamespaces[] = 102;
$wgContentNamespaces[] = 104;
$wgContentNamespaces[] = 106;
$wgContentNamespaces[] = 108;
$wgContentNamespaces[] = 110;
$wgContentNamespaces[] = 112;
- ---
However, I have a big problem: when I go to a page in one of these new
namespaces (the main ones, not the discussion ones), for example
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot , I see a red
link to the discussion page. That's correct, as there is no discussion
page for that article yet. And if you click on it, it brings you to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
which is correct, of course. But have a look at the article and discussion tabs:
they are both red! The first, "article", leads to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
when it should lead to
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot and the second,
"discussion", leads to
http://botwiki.sno.cc/w/index.php?title=Talk:Perl_talk:Copyright_Violation_…
, when it should lead to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
.
This is the first time I've dealt with custom namespaces :-( but I have
some ideas about what it could be. Could the problem be with the
$wgContentNamespaces settings, so that everything is detected as ns0? (I
don't think so.)
Or can it be the fact that I haven't used an underscore in the
$wgExtraNamespaces definition?
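For reference, the documentation for $wgExtraNamespaces does say namespace names should use underscores rather than spaces; a sketch of the same array written that way (same numbers as above):

```php
# LocalSettings.php sketch: namespace names with underscores, not spaces.
$wgExtraNamespaces = array(
    100 => "Manual",  101 => "Manual_talk",
    102 => "Python",  103 => "Python_talk",
    104 => "Php",     105 => "Php_talk",
    106 => "Perl",    107 => "Perl_talk",
    108 => "AWB",     109 => "AWB_talk",
    110 => "IRC",     111 => "IRC_talk",
    112 => "Other",   113 => "Other_talk"
);
```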
Snowolf
I want to move all pages in a certain namespace (about 60 pages) into
the "main" namespace. I couldn't find how to do this, so I tried
exporting the pages and re-importing them, and I ran into all sorts of
problems. Is there a way to do what I want without using the import and
export features (and without having to move each page manually)?
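In case it helps, a direct-database sketch (an assumption-laden shortcut, not an endorsed procedure: it ignores title collisions and stale link tables, so back up first and run maintenance/refreshLinks.php afterwards; 100 is a hypothetical namespace number):

```sql
-- Sketch: move every page from custom namespace 100 into the main
-- namespace (0). Assumes no page in namespace 0 already has the
-- same title.
UPDATE page
   SET page_namespace = 0
 WHERE page_namespace = 100;
```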
Thanks.
Hello everybody,
When I search for terms that have not been defined yet, or sometimes after editing an entry (but not always!), I get the following error message:
Database error
From MiWiki
For query "sales"
Jump to: navigation, search
A database query syntax error has occurred. This may indicate a bug in the software. The last attempted database query was:
(SQL query hidden)
from within function "". MySQL returned error "1030: Got error 127 from storage engine (localhost)".
Has anybody had this error before? How may I fix it?
Thanks a lot in advance for your help and best regards,
Veronica
P.S.: I found a similar message in the Wikitech-l archives, but in my case no table name was given, so what shall I do?
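For what it's worth, MySQL error 127 usually means a crashed MyISAM table. Since the error shows up on search, the searchindex table is a likely suspect (an assumption, since no table name was reported); a repair sketch:

```sql
-- Sketch, assuming a crashed MyISAM table is the cause.
-- Check and repair the search index (adjust the table name if your
-- wiki uses a table prefix, and repeat for other tables if needed):
CHECK TABLE searchindex;
REPAIR TABLE searchindex;
```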
Morning!
I am building a web site for a family member who is in the process
of completing a PhD, and MediaWiki looks like the ideal CMS to use,
as they will be able to update their own web site (delegation is
always important :-).
Two questions:
1) There may be a need to have certain pages in multiple languages.
Is the best way to do this for a small site to add a language
identifier to the page title, e.g.:
here is the CV in [[CV.en|English]] or in [[CV.nl|Dutch]].
Is there a way to do this and still have a more readable title, e.g.
"English CV" rather than "CV.en"?
2) I see from LocalSettings.php that it is possible to prevent
access to certain pages for users who are not logged in. Do I
need to list all the pages I want blocked, or is it possible
to use wildcards, e.g. internal*?
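For what it's worth, a sketch of the usual core approach (assuming core MediaWiki without extensions): core $wgWhitelistRead takes exact titles rather than wildcards, so the common pattern is inverted, locking the whole wiki for anonymous readers and whitelisting the public pages:

```php
# LocalSettings.php sketch: anonymous users may read only the
# whitelisted pages; everything else requires logging in.
$wgGroupPermissions['*']['read'] = false;
$wgWhitelistRead = array( "Main Page", "Special:Userlogin" );
```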
TIA
James
Sorry everyone for the "repost", but that was my first post to the
MediaWiki-l mailing list, and I guess I didn't fully understand that
HTML would get stripped out, so the last post was truly ugly. A lot of
potential assistance could be turned away by the horrid message format,
so please forgive me and allow me this correction repost. Thanks!
To cut to the chase and get to the point: I've been trying for a *very*
long time now to roll out my own MediaWiki-based website with full-text
search capabilities using the Lucene extension. I only wish I could get
it to just plain *work*! But, alas, I am at a dead end.
I've read anything and everything I can find on the internet about
MediaWiki + Lucene-search2 + MWSearch, but no luck so far. I keep
running into the same problem, which is ZERO SEARCH RESULTS via the
MediaWiki search engine AFTER installing the Lucene-search2 daemon &
MWSearch extension.
Prior to the installation of Lucene-search2 daemon & MWSearch extension, my
MediaWiki search worked without a hitch -- but I *need* the full-text search
capability Lucene brings along, it is *the solution I need*.
Which brings me to my problem, and my plea to the MediaWiki-l mailing
list: whenever I search with the Lucene-search2 daemon & MWSearch
extension installed, I get ZERO search results (specifically, the search
"results" page error reads 'No page text matches'). After
troubleshooting the issue by enabling MediaWiki debugging, my
/var/log/mediawiki/debug_svn_log.txt shows (summarizing here):
Fetching search data from
http://localhost:8123/search/svnwikidb/loopback?namespaces=0&offset=0&limit…
Http::request: GET
http://localhost:8123/search/svnwikidb/loopback?namespaces=0&offset=0&limit…
total [0] hits
OutputPage::sendCacheControl: private caching; **
Request ended normally
Follow me here: if I load the URL from the debug log above in a web
browser like 'lynx' (and likewise every time I search now and read the
debug log), I see this (or something similar):
1
1.0 0 Main_Page
Now that tells me that I am actually getting a REAL result when
following the debug link manually! It seems to say there is 1 mention of
the word I searched for, and it is on the "Main_Page" (which is correct!).
I have put a LOT more information about my problem, including _every
step_ I took starting with my MediaWiki installation on a newly
formatted hard drive and a fresh Slackware Linux v12.1 operating system
install. I also have log files and configuration files posted at this
website (actually, it's the "talk" page for the MWSearch MediaWiki
extension):
http://www.mediawiki.org/wiki/Extension_talk:MWSearch#MediaWiki_SVN.2BLucen…
I hope someone out there has done this and/or has a working setup and
could point me in the right direction (to a website/HOWTO document, or
anything!), or would be willing to troubleshoot this issue with me. I am
NOT a Linux newbie in any sense of the word, and can use the assistance
of anyone who may be able to help me FIX the issue. IMHO the problem is
with the MWSearch extension itself, but it is VERY possible I did
something wrong, and I am willing to admit my fault and correct it.
Someone, ANYONE, please guide me through how to fix this issue!
(I've been troubleshooting this issue for _OVER_ 6 months now, just to
give you an idea of how much searching and troubleshooting I've given
it.)
Feel free to email me, or post here, or even post on the MediaWiki Extension
talk:MWSearch page =
http://www.mediawiki.org/wiki/Extension_talk:MWSearch#MediaWiki_SVN.2BLucen…
Thanks for your time! peace -
agentdcooper(a)gmail.com
PS: One last thing/FYI, the following is my current overall system setup:
* Slackware linux v12.1, (Linux 2.6.24.5-smp Slackware's smp-generic
kernel, unchanged)
* MediaWiki: 1.13alpha (SVN 06-25-2008)
* PHP: 5.2.6 (I used Slackware 12.1's PHP v5.2.6 update package)
* MySQL: 5.0.51b
* MediaWiki Extension(s): MWSearch SVN 06-25-2008, and Lucene-search2
SVN 06-25-2008, + I downloaded & installed mwdumper.jar into the
Lucene-search2 "lib" dir = /usr/local/search/ls2
* other tools: jre-6u6-i586-3, jdk-1_5_0_09-i586-1,
apache-ant-1.7.0-i486, rsync-3.0.2-i486-1
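In case it helps anyone comparing configurations, the MWSearch wiring I would expect in LocalSettings.php looks roughly like this (a sketch based on the extension's documented settings; the port is taken from the debug log above, and the version value must match the lucene-search daemon generation):

```php
# LocalSettings.php sketch for MWSearch + lucene-search2.
require_once( "$IP/extensions/MWSearch/MWSearch.php" );
$wgSearchType          = 'LuceneSearch';
$wgLuceneHost          = '127.0.0.1';
$wgLucenePort          = 8123;  # matches the daemon port in the debug log
$wgLuceneSearchVersion = 2.1;   # 2.x values correspond to lucene-search2
```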
Hi
I was wondering if there's a way to create a group of users that have
access to private content: a group below admin, but at a level different
from regular users. Or is my only option to create another wiki
accessible only to this group?
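For reference, a sketch of how a custom group is declared in LocalSettings.php (core group permissions apply wiki-wide; restricting *reading* of individual pages or namespaces generally needs an extension, so treat this only as a starting point; "staff" is a hypothetical group name):

```php
# LocalSettings.php sketch: a "staff" group whose members are
# assigned through Special:Userrights.
$wgGroupPermissions['staff'] = $wgGroupPermissions['user'];
# Example: only staff may edit, while everyone may still read.
$wgGroupPermissions['user']['edit']  = false;
$wgGroupPermissions['staff']['edit'] = true;
```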
thanks
kate
Hi there,
at the end of
http://de.wikipedia.org/wiki/Hilfe:Listen
you can find an example of an ordered list interspersed with a simple
text line and with a new starting number.
Wiki-text is
..............................
# eins
# zwei
Zwischenüberschrift od. Leerzeile
#<li value="3"> drei</li>
# vier
..............................
and will/should lead to
..............................
~ 1. eins
~ 2. zwei
Zwischenüberschrift od. Leerzeile
~ 3. drei
~ 4. vier
..............................
In our old MW 1.9.3 we got instead:
..............................
~ 1. eins
~ 2. zwei
Zwischenüberschrift od. Leerzeile
~ 1.
~ 3. drei
~ 4. vier
..............................
So we decided to test it in a freshly installed and empty MW 1.12.0,
with the same result!
On de.wikipedia.org (1.13alpha) it works fine.
So my question: what's wrong?
What I saw in the HTML source code was:
<ol>
<li> eins</li>
<li> zwei</li>
</ol>
<p>Zwischenüberschrift od. Leerzeile</p>
<ol>
<li> <li value="3"> drei</li> </li>
<li> vier</li>
</ol>
So why does the second ordered list come with a doubled <li> tag, where
1.13alpha (de.wikipedia.org) correctly shows:
<ol>
<li>eins</li>
<li>zwei</li>
</ol>
<p>Zwischenüberschrift od. Leerzeile</p>
<ol>
<li value="3">drei</li>
<li>vier</li>
</ol>
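One guess (an assumption, not a confirmed diagnosis): Wikipedia's output looks like what HTML Tidy produces after collapsing the invalid nesting, so the difference may simply be that Tidy is enabled there. A LocalSettings.php sketch for enabling it, assuming the tidy binary is installed:

```php
# LocalSettings.php sketch: run parser output through HTML Tidy,
# which cleans up invalid nestings such as <li><li value="3">…</li></li>.
$wgUseTidy = true;
$wgTidyBin = '/usr/bin/tidy';  # hypothetical path; adjust for your system
```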
Uwe (Baumbach)
U.Baumbach(a)web.de
I run a wiki that I use mainly to catalog art pieces. Someone was
wondering if there is a way to show all the images used in pages of a
given category.
If not...
does anyone know the SQL query an image page uses to list the pages it
is used on?
Thanks.
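A sketch based on the standard schema (an assumption about your table layout; 'Art_pieces' is a hypothetical category name): the "used on" list on an image page comes from the imagelinks table, and joining it against categorylinks restricts it to pages in one category:

```sql
-- Sketch: all images used on pages in category "Art_pieces"
-- (underscores, no "Category:" prefix; adjust for a table prefix).
SELECT DISTINCT il.il_to AS image_name
  FROM imagelinks il
  JOIN categorylinks cl ON cl.cl_from = il.il_from
 WHERE cl.cl_to = 'Art_pieces';
```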