Since the update to version 1.19.2, links to our local files (of the form file:///) no longer work: there is no error message, and the browser (IE) does not start any action, even though the line $wgUrlProtocols[] = "file:///" is in LocalSettings.php. What can I do?
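One thing worth checking (a sketch, not a confirmed fix): MediaWiki matches the registered string against the start of the URL, and the third slash is arguably part of the path rather than the protocol, so registering the two-slash prefix is a commonly suggested variant. Note also that IE's security zones can silently refuse to open file:// links from an internet-zone page, independent of MediaWiki.

```php
# LocalSettings.php - register the protocol with two slashes; the third
# slash in file:///C:/... is then matched as part of the path.
$wgUrlProtocols[] = "file://";
```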
I installed the Cite extension files from SVN and added the "require_once"
line to LocalSettings.php. I tried a simple reference in the Sandbox, but
nothing seemed to happen: the <ref> tags are still there and
<references /> is empty.
What do I need to do?
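If the <ref> tags pass through to the page unparsed, the extension is usually not actually being loaded. A minimal sketch of the expected LocalSettings.php line, assuming the files were checked out to extensions/Cite (the path is an assumption; adjust to where you placed them):

```php
# LocalSettings.php - load the Cite extension; $IP is the wiki's
# install path and is set by MediaWiki itself.
require_once "$IP/extensions/Cite/Cite.php";
```

After saving, Special:Version should list Cite under "Installed extensions"; if it doesn't, the require_once path is wrong.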
--
D. E. (Steve) Stevenson
(Almost emeritus) Associate Professor
Director, Institute for Modeling and Simulation Applications.
Clemson University
steve at clemson dot edu
Anyone who has ever looked into the glazed eyes of a soldier dying on the
battlefield will think hard before starting a war. -Otto von Bismarck,
statesman (1815-1898)
I have a wiki running an old installation of version 1.16.4. I'd very
much like to be able to track releases using Git as described in the
manual; however, I'm confused about how to do this. The manual says:
"If using Git, export the files into a clean location. Replace all
existing files with the new versions, preserving the directory
structure. The core code is now up to date."
What does "export the files" mean? Does this mean to clone the core
repo somewhere first?
Is it really as simple as replacing files with new versions? What
about the hidden files that designate my directory as a clone of core,
do I copy those too?
I'm sure I'm making this more complicated than it really is, but I
really want to be able to upgrade in the future with a simple git
pull.
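For what it's worth, "export" in Git usually means `git archive`, which writes a clean copy of a tree with none of the hidden .git metadata. The throwaway repository below stands in for a clone of mediawiki/core (an assumption, so the sketch is self-contained and runnable); with a real clone you would archive a release tag instead of HEAD and copy the result over your wiki directory:

```shell
#!/bin/sh
# Sketch: what "export the files into a clean location" means in Git
# terms. The tiny repo here is a stand-in for a mediawiki/core clone.
set -e
tmp=$(mktemp -d)
mkdir "$tmp/core" && cd "$tmp/core"
git init -q .
git config user.email you@example.com   # throwaway identity for the demo
git config user.name you
echo '<?php // stand-in for index.php' > index.php
git add index.php && git commit -qm 'import'
# Export the tree into a clean location: plain files, no .git directory.
mkdir "$tmp/export"
git archive HEAD | tar -x -C "$tmp/export"
ls "$tmp/export"
# With a real clone you would then copy over the wiki, preserving the
# directory structure, e.g.: cp -R "$tmp/export/." /var/www/wiki/
```

Because the export contains no .git directory, your live wiki tree never becomes a clone itself; the clone lives elsewhere, and future upgrades are a `git pull` (or checkout of a newer tag) in the clone followed by a fresh export.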
Thanks
Bill
So, rationalwiki.org has been *much* faster and more usable with a
coupla squids and a load-balancer in front of the
Apache/Lucene/database node. (We could probably cope with just one
squid, but Trent wanted to experiment.) The nodes are all Ubuntu 10.04
Linodes, the software manually kept up to date.
Our problem now is that the Apache box sometimes ... just goes nuts,
fills memory with Apache processes, then it goes into swap, then the
oom-killer comes out to play and we have to work out what it's killed
or (quicker) reboot the node.
We have had occasional load spikes - where the load-balancer sees
someone or something hammering it at 300 hits/sec or so - but they
*don't* always coincide with Apache going nuts. The squids don't show
any excess load during Apache going nuts either.
If we happen to catch it when it's in swap but before oom-killer comes
out, apache2ctl restart brings things back to normality.
The Apache node has 4GB memory, about 3GB of that being free/cache in
normal operation.
We have NO IDEA what is happening or why. It last happened around
three days ago. Since then it's been lovely, but it always is until it
falls over. Clues welcomed.
- d.
Does anyone know how to detect or log failed logins?
Also, there doesn't seem to be a consensus on file permissions in the wiki
directory - can anyone say with some authority?
thanks,
Chris
I am getting this error instead of a thumbnail rendering of SVG files on my
MediaWikis:
Error creating thumbnail: ERROR: meta.c (179): wmf_header_read: this isn't
a wmf file
convert: Delegate failed `"wmf2svg" -o "%o" "%i"'.
convert: unable to open image `/tmp/magick-XXn4EQJ3': No such file or
directory.
convert: unable to load module
`/usr/lib64/ImageMagick-6.2.8/modules-Q16/coders/svg.la': libdbus-1.so.3:
failed to map segment from shared object: Cannot allocate memory.
convert: unable to open file `/tmp/magick-XXn4EQJ3': No such file or
directory.
convert: missing an image filename `PNG:/tmp/transform_402f06e-1.png'.
Why is MediaWiki trying to treat the SVG as a "wmf" file?
I am getting this on both MW1.18.1 and MW1.19.2 on my shared Web hosting
wikis (hosted at Dathorn.com).
ImageMagick v 6.2.8 is installed and converts an svg to png just fine via
command line.
Any ideas what is wrong?
Thanks,
Roger
You can see the problem and experiment at my temporary fresh MW1.19.2
install at http://www.eslpedia.com/File:Darknight.svg. (To avoid potential
confusion, let me just note that I am serving a plain index.html instead of
the wiki at eslpedia.com/ (no filename) to redirect any user who gets lost
there to my production site of a similar domain name.)
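Reading the error trace, it looks as though MediaWiki itself isn't the one misreading the file: `convert` is failing to load its own SVG coder module ("Cannot allocate memory", likely a shared-hosting memory cap) and then falling through its delegate chain to the wmf2svg delegate, which produces the "this isn't a wmf file" message. A workaround that sidesteps ImageMagick entirely, assuming librsvg's rsvg binary is available on the host (both the converter choice and the path are assumptions to verify with your host):

```php
# LocalSettings.php - render SVG thumbnails with rsvg instead of
# ImageMagick's convert (binary path is an assumption).
$wgSVGConverter = 'rsvg';
$wgSVGConverterPath = '/usr/bin';
```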
Hello,
When Auth_remoteuser is enabled in LocalSettings.php, I am not able to use
"Create Account".
I get the following error:
*Login error*
There was either an authentication database error or you are not allowed to
update your external account.
Is there any way of getting around this?
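One possible workaround, offered as a sketch rather than a confirmed fix: since Auth_remoteuser expects accounts to come from the external authenticator, local account creation conflicts with it by design, and some setups simply disable the local "Create account" path so users never hit the external-account error:

```php
# LocalSettings.php - assumption: accounts are provisioned by the
# external authenticator, so local signup can be turned off.
$wgGroupPermissions['*']['createaccount'] = false;
```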
Thanks.
Sorry. My last post got slightly garbled...still new to using this mailing list, so here's my message again:
I was curious if there is a way to take a bullet-pointed list of internal wiki links to other articles (whether they contain content or not) on a single page and have them automatically categorized into the category of the same name as the article page.
For example, I have the article page "Snow", and I want to add a list of subtopics on the article page "Snow" to the category "Snow", with subtopics being:
* Flurries
* Hail
* Sleet
My question is whether there is a template, extension, or parser function I should use to make all items in a bullet-pointed list be auto-categorized into the category with the same name as the article page.
If anyone has any ideas or knows what I need, any assistance would be greatly appreciated.
I've asked this question on the MediaWiki users forums and at the support desk at MediaWiki.org and came up dry for answers in both places, so I'm reposting the same question to the mailing list here in hopes of an answer:
I was curious if there is a way to take a bullet-pointed list of
internal wiki links to other articles (whether they contain content or
not) on a single page and have them automatically categorized into the
category of the same name as the article page.
For example, I have the article page "Snow", and I want to add a list
of subtopics on the article page "Snow" to the category "Snow", with
subtopics being:
* Flurries
* Hail
* Sleet
My question is whether there is a template, extension, or parser function I should use to effect this, if possible.
If anyone has any ideas or knows what I need, any assistance would be greatly appreciated.
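One constraint worth knowing: plain wikitext on "Snow" cannot categorize *other* pages, because a category can only be added from the page being categorized. A common template-based pattern instead places a small template on each subtopic page. A sketch (the template name is an assumption):

```wikitext
<!-- Template:Subtopic - place {{Subtopic|Snow}} on each subtopic page
     (Flurries, Hail, Sleet); it adds that page to [[Category:Snow]]. -->
[[Category:{{{1}}}]]
```

To drive the categorization from the list on "Snow" itself, you would need something that edits the target pages for you, such as a bot (e.g. pywikibot) reading the list, since no template or parser function on the parent page can do it.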