$wgDBtransactions gets set to true if using InnoDB tables. Is there
an advantage to using InnoDB tables?
The disadvantage is that with MySQL there is a file, ibdata1, that
seems to grow endlessly if InnoDB tables are used. See
http://bugs.mysql.com/bug.php?id=1341
We're wondering if we should just convert everything to MyISAM. Any
thoughts?
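If you do go the conversion route, the steps are straightforward; a sketch, assuming MySQL 5.0+ and a wiki database named 'wikidb' (both assumptions, adjust to your setup), with a backup taken first:

```sql
-- see which storage engine each wiki table currently uses
SELECT table_name, engine
FROM information_schema.tables
WHERE table_schema = 'wikidb';

-- convert one table to MyISAM (repeat for each InnoDB table)
ALTER TABLE page ENGINE=MyISAM;
```

Note that ibdata1 never shrinks, even after tables are moved out of it; for installs that stay on InnoDB, the usual mitigation is MySQL's innodb_file_per_table option, which only applies to tables created after it is enabled.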
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Hi all,
I've created some custom namespaces on one of my wikis, Botwiki
(previously known as pywikipedia).
I've put these lines in my LocalSettings.php file:
---
#Custom namespaces
$wgExtraNamespaces = array(
    100 => "Manual",
    101 => "Manual talk",
    102 => "Python",
    103 => "Python talk",
    104 => "Php",
    105 => "Php talk",
    106 => "Perl",
    107 => "Perl talk",
    108 => "AWB",
    109 => "AWB talk",
    110 => "IRC",
    111 => "IRC talk",
    112 => "Other",
    113 => "Other talk"
);
$wgContentNamespaces[] = 100;
$wgContentNamespaces[] = 102;
$wgContentNamespaces[] = 104;
$wgContentNamespaces[] = 106;
$wgContentNamespaces[] = 108;
$wgContentNamespaces[] = 110;
$wgContentNamespaces[] = 112;
---
However, I have a big problem: when I go to a page in one of these new
namespaces (the main ones, not the discussion ones), for example
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot , I see a
red link to the discussion page. That is correct, as there is no
discussion page for that article yet. Clicking on it brings you to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
which is correct, of course. But have a look at the article and
discussion tabs: they are both red! The first, "article", leads to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a…
when it should lead to
http://botwiki.sno.cc/wiki/Perl:Copyright_Violation_Bot , and the second,
"discussion", leads to
http://botwiki.sno.cc/w/index.php?title=Talk:Perl_talk:Copyright_Violation_…
when it should lead to
http://botwiki.sno.cc/w/index.php?title=Perl_talk:Copyright_Violation_Bot&a… .
This is the first time I've dealt with custom namespaces :-( but I have
some ideas about what it could be. Could the problem be with the
$wgContentNamespaces settings, so that everything is detected as ns0?
(I don't think so.)
Or could it be that I haven't used underscores in the
$wgExtraNamespaces definitions?
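The underscore idea is worth testing first: the manual does say that namespace names containing spaces must be written with underscores in $wgExtraNamespaces (they are displayed with spaces). A sketch of that form using your numbers, first entries only:

```php
$wgExtraNamespaces = array(
    100 => "Manual",
    101 => "Manual_talk", // underscore, not a space
    102 => "Python",
    103 => "Python_talk",
    // ...and likewise for 104-113
);
```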
Snowolf
Hi,
I have been trying to create a maintainable way of grouping pages
together and allowing readers to page through them in sequence. Most
samples I've seen use templates that require you to supply the
'previous' and 'next' page. This however results in three page edits
to insert a new page between two existing pages and does not guarantee
that your prev-sequence is identical to the next-sequence...
As a programmer, I find that far too much duplicated, error-prone
work ;-) There must be a better way to do this, and I was hoping
to solve it with a plain MW installation (with ParserFunctions and
DynamicPageList at the moment).
Given an 'index' page that holds a list of all page titles in the
preferred order, isn't it possible to create a template that selects
the correct previous and next page title given the current page title?
I get stuck in getting the correct lines from the index page. DPL can
select based on section name (= page title), but then the contents of
that index section must be the prev/next links themselves:
=First Page=
Prev [[Second Page|Next]]
=Second Page=
[[First Page|Prev]] [[Third Page|Next]]
=Third Page=
.. etc.
This works, except that it is still a lot of duplication of page names
(but the edits are contained in a single page, big plus).
I hoped to simplify the index page by creating a template that writes
the section header and prev/next links, but then DPL no longer
recognizes the sections :-( Apparently DPL 'sees' the page text before
the templates are expanded. With {{Page|Prev page|Page title|Next page}}:
{{Page||First Page|Second Page}}
{{Page|First Page|Second Page|Third Page}}
{{Page|Second Page|Third Page|Fourth Page}}
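For reference, a minimal sketch of what such a {{Page}} template could contain, matching the header style of the index page above (my guess at its shape; the empty first parameter of the first page would need an #if guard from ParserFunctions):

```
={{{2}}}=
[[{{{1}}}|Prev]] [[{{{3}}}|Next]]
```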
Basically my questions are:
1. Am I completely off track here?
2. Can DPL be coerced to evaluate templates before looking at the page
text?
3. Can DPL (or another extension) select the text from a section
before/after a matched section?
4. Is it possible to determine the section sequence number given the
section name (so 'Second Page' results in 2, allowing me to use DPL to
retrieve the names of sections 2 - 1 and 2 + 1 to create the prev/next
links)?
Apologies for the long post, hopefully someone can point me to some
good resources (I've been to Meta, Wikibooks, MediaWiki.org but could
very well have overlooked something there as the amount of info is a
bit overwhelming and it is difficult to judge how up to date it is).
--
Regards,
Jean-Marc
I'm seeing this weird anomaly in wiki syntax. Suppose you have this
template, called A:
{{#switch: {{{1}}}
| foo = [[File:foo.png]]
| bar = [[File:bar.png]]
| baz = [[File:baz.png]]
| UNKNOWN
}}
Now, with something like
{{A|foo}} and {{A|bar}}.
I get
<p><img src="foo.png"/>
</p>
<pre>and <img src="bar.png"/>
</pre>
<p>.
</p>
That is, the " and {{A|bar}}" is treated as a new paragraph. However,
if I have
[[File:foo.png]] and [[File:bar.png]].
I get
<p><img src="foo.png"/> and <img src="bar.png"/>.
</p>
So I presume it has something to do with the newlines at the end of
each case. Compacting it all into one line appears not to
help. How can I change Template:A so that I get the more desirable
second outcome?
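One thing worth checking, purely as a guess: besides the newline before each case, any newline left after the final }} on the template page also becomes part of the template's output and can split the paragraph. A compacted sketch with an explicit #default and nothing at all (not even a newline) after the closing braces:

```
{{#switch: {{{1}}}|foo=[[File:foo.png]]|bar=[[File:bar.png]]|baz=[[File:baz.png]]|#default=UNKNOWN}}
```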
Hello all,
After dumping a fairly large database (over 2GB) and restoring it to a
fresh wiki install, the main page fails to load; all I get is a white
page. I'm assuming the backup or the restore failed, but I'm not sure
which. Are there any special considerations when working with large
databases that could be causing my backup and/or import to fail?
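Two settings that commonly bite on multi-gigabyte MySQL dumps are the packet size limit and character-set conversion; a sketch of one way to dump and restore, assuming MySQL and a database named 'wikidb' (hypothetical name):

```shell
# dump without intermediate character-set conversion
mysqldump --default-character-set=binary wikidb > wikidb.sql

# restore with a larger client packet limit; the server-side
# max_allowed_packet in my.cnf may also need raising
mysql --max_allowed_packet=64M wikidb < wikidb.sql
```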
Thanks!
I've been experimenting with the parameters to Special:Export to
retrieve the whole history of an article. I haven't been able to get
more than 1000 revisions (from en wikipedia).
Does anyone know of a way to obtain the full history of an article?
Those huge 7z dumps seem impractical to work with just to extract the
data for a single page.
http://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export
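According to that manual page, en.wikipedia caps each request at 1000 revisions, but the offset/limit parameters let you page through the history: pass the timestamp of the last revision you received as the next offset. A sketch with curl (a POST is needed for offset/limit, hence the empty -d body; the article title and timestamp are placeholders):

```shell
# first chunk: oldest 1000 revisions
curl -d "" "http://en.wikipedia.org/w/index.php?title=Special:Export&pages=Some_Article&offset=1&limit=1000&action=submit" > part1.xml

# next chunk: continue from the timestamp of the last revision in part1.xml
curl -d "" "http://en.wikipedia.org/w/index.php?title=Special:Export&pages=Some_Article&offset=2006-05-01T12:00:00Z&limit=1000&action=submit" > part2.xml
```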
Rob
As our wiki grows I become more aware that any major re-organisation needs
doing now, before it becomes impossible. With that in mind I've taken the
first steps in categorizing the pages. I've been looking at extensions that
provide navigation aids, and see that category pages are essential to most of
those. That brings up the very thorny question of language pages.
Unless something is done about it, the Welcome page could have 20 or more
entries on its category page, depending on how active our translators have
been. This clearly gets to the point where the page is unreadable.
It has been said to me that we should be using namespaces for languages, and
that doing that would mean that in a search, for example, only pages in your
system language would be returned. I've only just started reading the
documentation, but I get the impression that this is not so.
I should add that we are sandbox testing the use of the translatewiki
extension and are currently considering the changes needed before
implementation.
Apart from http://www.mediawiki.org/wiki/Manual:Namespace and the pages linked
from there, is there any other reading that I should be considering?
Anne
--
KDE Community Working Group
New to KDE4? - get help from http://userbase.kde.org
Does anyone here work with multilanguage Wiki content? Are there any
extensions that ease the pain of translating Wiki articles?
On the OpenOffice.org Wiki we have a lot of languages in use. Various
community members pop in, see a favorite page they want in their
language, and they start translating. Unfortunately a lot of these
well-intentioned translations are done directly in/over the original content.
I'm trying to find a way to make this... ummm.. easier. Say for example
an extension that adds a "Translate this page" link next to the "Edit"
link, and then presents the user with a selectable list of predefined
languages. Pick a language, and a new page is started with the right
article naming (something we can do on the OOoWiki thanks to the article
naming styles we use), with the original text copied into the new page's
edit box, ready to be edited/translated.
OK, that might just be a dream extension, but... maybe something that
allows side by side views of articles while translating... or...
I've been poking around looking for info on how people manage this on
other Wikis, but haven't really turned up any useful info yet... does
someone here have any suggestions?
C.
--
Clayton Cornell ccornell(a)openoffice.org
OpenOffice.org Documentation Project co-lead
StarOffice - Sun Microsystems, Inc. - Hamburg, Germany
I sometimes have problems with sessions.
For example, even though the config and tmp directories have the right
permissions, a login problem occurs.
This page: http://www.opensurf.it/w/index.php/Il_pedone_e_l'automobilista:_la_moralità
is not editable by me, even with admin rights.
Instead this one,
http://www.opensurf.it/w/index.php/Il_pedone_e_l'automobilista:_i_binari
,
which I have just created, is editable.
Some days ago my server did something and the problem disappeared, but
now it is present again.
Could this be a MediaWiki problem, or is it a server permissions problem?
Thx
[Sun Nov 29 23:51:25 2009] [error] [client 93.144.132.82] PHP Warning:
Unknown: Failed to write session data (files). Please verify that the
current setting of session.save_path is correct (/var/lib/php/session)
in Unknown on line 0
[Sun Nov 29 23:51:25 2009] [error] [client 93.144.132.82] PHP Warning:
Unknown: open(/var/lib/php/session/sess_udj5jl4j8neugf8m3k3tggdta5,
O_RDWR) failed: Permission denied (13) in Unknown on line 0
[Sun Nov 29 23:51:25 2009] [error] [client 93.144.132.82] PHP Warning:
Unknown: Failed to write session data (files). Please verify that the
current setting of session.save_path is correct (/var/lib/php/session)
in Unknown on line 0
[Sun Nov 29 23:51:34 2009] [error] [client 93.144.132.82] PHP Warning:
Unknown: open(/var/lib/php/session/sess_udj5jl4j8neugf8m3k3tggdta5,
O_RDWR) failed: Permission denied (13) in Unknown on line 0, referer:
http://www.opensurf.it/w/index.php?title=Special:Log&type=protect&page=Il_p…
[Sun Nov 29 23:51:34 2009] [error] [client 93.144.132.82] PHP Warning:
Unknown: Failed to write session data (files). Please verify that the
current setting of session.save_path is correct (/var/lib/php/session)
in Unknown on line 0, referer:
http://www.opensurf.it/w/index.php?title=Special:Log&type=protect&page=Il_p…
[Sun Nov 29 23:51:35 2009] [error] [client 93.144.132.82] PHP Warning:
Unknown: open(/var/lib/php/session/sess_udj5jl4j8neugf8m3k3tggdta5,
O_RDWR) failed: Permission denied (13) in Unknown on line 0
[Sun Nov 29 23:51:35 2009] [error] [client 93.144.132.82] PHP Warning:
Unknown: Failed to write session data (files). Please verify that the
current setting of session.save_path is correct (/var/lib/php/session)
in Unknown on line 0
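The warnings above point at filesystem permissions rather than MediaWiki: PHP cannot write to its session.save_path. A diagnostic/repair sketch, assuming the web server runs as the 'apache' user (adjust user and group to your distribution):

```shell
# inspect ownership and permissions of the directory named in the log
ls -ld /var/lib/php/session

# let the web server user write session files there
chown root:apache /var/lib/php/session
chmod 770 /var/lib/php/session
```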
--
www.opensurf.it
Hello,
I'm looking for an extension, or some other way, to get statistics on the
use of my wiki (number of visits by unique users to particular pages or
sections, etc.).
I found two extensions, but they only report the number of edits made to
the pages by the various users.
I also found an extension, but it requires installing software on a Linux
server, and I'm hosted at OVH. Apparently it can also be used with Google
Chart, which is used to generate the graphs.
Do you think BBClone can be used with MediaWiki, or do you know of another
way to get statistics?
When tracking code has to be inserted into each PHP page, I don't know
where to put that code, since the pages seem to be generated automatically
by the wiki, if I've understood correctly.
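Since MediaWiki renders every page through the skin, per-page tracking code is normally added once, in LocalSettings.php, rather than into each page. A sketch using the SkinAfterBottomScripts hook (available in recent MediaWiki releases); the appended comment is a placeholder for whatever snippet BBClone asks you to embed:

```php
# LocalSettings.php: append tracking markup to every rendered page
$wgHooks['SkinAfterBottomScripts'][] = 'addTrackingCode';

function addTrackingCode( $skin, &$text ) {
    // placeholder for the real BBClone counter snippet
    $text .= '<!-- tracking snippet goes here -->';
    return true;
}
```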
Thanks for your help.
Anna Castilla