It looks like searches within MediaWiki 1.9.1 use "or" when searching for more than one word. Is there any way to change the default search to use "and" when a space is between words?
Thanks,
Shannon
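In case it helps whoever answers: MySQL's fulltext search defaults to natural-language mode, where a row matching any one word qualifies. Boolean mode with a `+` prefix on every term gives AND semantics. A raw SQL sketch against MediaWiki's searchindex table (illustration only, not necessarily the exact query the search class builds):

```sql
-- Natural-language mode: rows matching ANY word are returned (OR-like)
SELECT si_page FROM searchindex
WHERE MATCH(si_text) AGAINST('apple banana');

-- Boolean mode with '+' on each term: EVERY word must match (AND)
SELECT si_page FROM searchindex
WHERE MATCH(si_text) AGAINST('+apple +banana' IN BOOLEAN MODE);
```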
I want to be able to do a fulltext search for three-character words in MediaWiki. I am on MySQL 4.1.20 and MediaWiki 1.9.1. Here are the basic steps I followed, but it still doesn't work:
1. edited /etc/my.cnf and added ft_min_word_len=3 to [mysqld] section.
2. ran php maintenance/updateSearchIndex.php.
3. edited LocalSettings.php added: $wgDBmysql4 = $wgEnablePersistentLC = true;
4. restarted mysqld and httpd.
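For what it's worth, two steps that are easy to miss after lowering ft_min_word_len: MySQL only applies the new minimum when the fulltext index is rebuilt, and updateSearchIndex.php only picks up recent edits rather than re-indexing everything. Something like the following (database and user names are placeholders):

```
# Rebuild the fulltext index so the new minimum word length takes effect
mysql -u wikiuser -p wikidb -e "REPAIR TABLE searchindex QUICK;"

# Repopulate the search index for ALL pages, not just recent changes
php maintenance/rebuildtextindex.php
```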
Any help will be appreciated.
Shannon
I switched hosting providers and had them move my files over and
import the database.
Now, all images display black, except new ones, which display fine; but
when I click on one, it takes me to "/wiki/ Image:X.jpg" - with the
space in the URL. I introduced pretty URL rewrite rules on this host
(they weren't supported on the old one) - is this the problem?
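One thing worth checking, purely a guess from the stray space: if $wgArticlePath in LocalSettings.php contains a space before the $1 placeholder, every generated link will carry it. A typical short-URL setup looks like this (paths are examples, adjust to your layout):

```php
## LocalSettings.php
$wgScriptPath  = "/w";
$wgArticlePath = "/wiki/$1";   // note: no space between "/wiki/" and $1
```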
Thanks.
--
Gary Kirk
I'm getting some images where the thumbnail is displaying much more
artifacting than the original, even though the thumbnail is much smaller
than the original. Is there a way to fix this, and does anyone know how
MW determines what compression quality to use when producing thumbnails,
so I can avoid it in the future?
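I don't know offhand where 1.9 sets its JPEG quality, but if you use ImageMagick you can take control of the whole thumbnail command with $wgCustomConvertCommand, where %s, %d, %w and %h stand for the source file, destination file, width and height. A sketch (the -quality value is just an example):

```php
## LocalSettings.php
$wgUseImageMagick = true;
$wgCustomConvertCommand = "/usr/bin/convert %s -resize %wx%h -quality 92 %d";
```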
Thanks,
Ken
Hello all,
Is there a way to make the RecentChanges list ignore changes in certain
namespaces (e.g. USER and USER_TALK)?
I think it's a bit odd that changes on my user page's ToDo list show up
in the global RecentChanges list, so it'd be nice if I could fix that
somehow.
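Not a full answer, but as a workaround Special:Recentchanges accepts a namespace parameter in the URL, and newer versions can also invert the selection (I'm not sure the invert option exists in 1.9):

```
# Show only the main namespace (0), hiding User/User_talk changes:
index.php?title=Special:Recentchanges&namespace=0

# On versions that support it, show everything EXCEPT one namespace:
index.php?title=Special:Recentchanges&namespace=2&invert=1
```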
Thanks,
F.
Hi, there is a bug in the multilang extension
(http://www.mediawiki.org/wiki/Extension:Multilang).
In a template containing the following:
<multilang>
@en|[[{{{1}}}]]
@de|[[{{{2}}}]]
</multilang>
the result is this:
[[{{{2}}}]]
This means that I cannot use a template parameter in a wiki link.
Is there anybody who understands PHP? I asked the author, but he does not
know what causes the problem.
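A guess at the cause, for whoever picks this up: the content inside a tag extension like <multilang> is handed to the extension's hook as raw text, so {{{1}}} is never expanded by the template machinery before the hook sees it. A heavily hedged sketch of the kind of fix (function and variable names are illustrative, not from the actual multilang source, and recursiveTagParse may not exist in older MediaWiki releases):

```php
function renderMultilang( $input, $args, &$parser ) {
    // ...pick out the block for the user's language (e.g. the "@de|" line)...
    $selected = $input; // placeholder for the selected block

    // Run the selected text back through the parser so wiki links
    // (and, on versions that support it, template parameters) expand.
    return $parser->recursiveTagParse( $selected );
}
```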
Armin
Hello all,
After first installing our wiki, I did not activate subpages (I didn't know that you had to). Now I've enabled them by adding
$wgNamespacesWithSubpages[NS_MAIN] = true;
to LocalSettings.php, and all seems to work fine.
However, there are a few previously-created (pseudo-)subpages - e.g. User:WikiSysOp/ToDo - which are not converted to "proper" subpages automatically. They seem to be regarded simply as pages with a slash in their title (thus missing the automatic "breadcrumbs" link to the parent page).
Is there any way I can fix that? Normally I'd delete the respective pages and re-create them from scratch - but since it's not possible to completely erase all records of an existing page (AFAIK), that's not an option here.
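Worth checking before anything drastic: the example you give, User:WikiSysOp/ToDo, lives in the User namespace, but the line above only enables subpages for the main namespace. Subpage behaviour (including the breadcrumb link) is computed per namespace, so it may be enough to also set (a sketch, using the standard namespace constants):

```php
## LocalSettings.php
$wgNamespacesWithSubpages[NS_USER]      = true;
$wgNamespacesWithSubpages[NS_USER_TALK] = true;
```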
Thanks,
F.
Hi all,
Import problem:
I tried to import (in several ways) the latest English dump, converted
to proper SQL, but seemingly there's a limit or setting somewhere halting the
import when the text table reaches 4 GB (1995267 rows) in MyISAM
(page and revision imported without problems, though). What have I missed?
A setting somewhere in MySQL? The table type?
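In case it's the classic MyISAM ceiling: by default a MyISAM table's data pointer tops out at 4 GB, and raising MAX_ROWS/AVG_ROW_LENGTH on the table (or the myisam_data_pointer_size server variable) lifts it. A sketch, with numbers that are only examples to size against your dump:

```sql
-- Inspect the current ceiling (see Max_data_length in the output)
SHOW TABLE STATUS LIKE 'text';

-- Rebuild the table with a larger data pointer
ALTER TABLE text MAX_ROWS = 200000000 AVG_ROW_LENGTH = 2048;
```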
--
Recommended DB tools?:
I also wonder if anyone has seen a really well-working DB tool
that shuffles data between databases efficiently. I need the following
(unfortunately MySQL master/slave replication won't do (all) the tricks
for me):
1. Both Local and Remote replication (via TCP/IP and/or Http/PHP
"tunneling" for more restrictive hosts).
2. "Restore" a database from an SQL dump AND directly from another database,
see #1 (i.e. Data Transfer and Data Synchronization; one way suffices).
3. Data Compression on remote connections.
In short, the tool should be able to do "all the tricks" regarding
Backup/Restore, Transfer and Synchronization - with good speed over
regular ~20Mb(/1) broadband internet connections. Perhaps one needs to
pay for such a tool, but I wouldn't pay much over $100.
What I have tried:
1. MySQL Administrator. I've played around with MySQL Administrator
(Backup/Restore), but it takes a long time to shuffle big tables. It took
something like 5 hours to upload the enwiki_page table. Unless I have missed
some settings, that's way too slow.
2. DumpTimer.
http://www.dumptimer.com/index.php
A lightweight and smart tool; it opens stable connections to almost any host
out there (gzip --> unpacked by a server-side PHP script) and uploads at a
fairly good pace, but only until around 200 Gb is reached, when it
starts to slow down significantly. It's almost stalled at some 300 Gb.
A dead end for big tables.
3. Navicat.
http://www.navicat.com/
Looks really, really (yes, really) good at first; it has all kinds of
combinations possible (Backup/Restore, Transfer, Synchronization, etc.), but
your joy turns into desperation when you realize that the thing first
"processes" the data - say 4.5 million rows - all the while its memory
usage keeps increasing, and increasing, and increasing... After passing
2 GB it finally decides to start shuffling the data too, but just as the
counters start to spin, it's time for "out of memory". Sigh. (It seems
to be a good choice, though, if you don't have huge tables.)
So, anyone, what is the "ultimate tool" out there? Some tool which
deals with both size and speed. I can't be the only one who needs a
tool which does "all the stuff" you need to do with a bunch
of remote databases...
Oh, and all features must be possible to schedule too, of course.
TIA,
// Rolf Lampa
Hello folks!
I have a rather strange problem; if anyone in the know about the inner workings of the MediaWiki login/logout system could help me with it, I'd be truly grateful.
I've been tinkering with a PunBB (www.punbb.org) integration script which uses the AutoAuthenticate hook, and it all works fine in general practice, allowing me to log in and out of the forums and the wiki correctly. The problem occurs when I try to change my preferences. If I attempt to log out directly after changing my preferences, on the logout page I'm displayed as logged out; however, if I navigate from the logout page to the Main_Page, it logs me back in again.
The even stranger thing is that if I don't go from the logout page to the main page - say I go from the logout page to the recent changes page instead - it works correctly and doesn't log me back in. I only seem to get this behaviour when two conditions are true:
*I have pressed the logout button after changing my preferences.
*I go directly from the "Logged Out" page to the Main_Page.
Everything else works fine. Could anyone shed some light on this? I've been trying to fix it and it's driving me mad. The source for the extension I'm using can be viewed at http://www.noopectro.com/PunBBAuth_2.phps
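For anyone else hitting this: a pattern I've seen with AutoAuthenticate-style integrations (hedged, since I haven't read this particular script closely) is that the hook re-authenticates the visitor on every page view, so a cached or revisited page can silently log you back in. One workaround is a short-lived "just logged out" cookie; all names below are made up for illustration:

```php
// Set a flag when the user logs out (via the UserLogout hook)
$wgHooks['UserLogout'][] = 'fnPunBBMarkLoggedOut';
function fnPunBBMarkLoggedOut( &$user ) {
    setcookie( 'punbb_wiki_loggedout', '1', time() + 300, '/' );
    return true;
}

// In the AutoAuthenticate function, honour the flag
function fnPunBBAutoAuthenticate( &$user ) {
    if ( isset( $_COOKIE['punbb_wiki_loggedout'] ) ) {
        return; // leave the visitor anonymous instead of re-logging them in
    }
    // ...normal PunBB session check and $user setup...
}
```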
Hello,
I recently moved to my own server and noticed the following error
message when trying to get into my wiki.
Parse error: syntax error, unexpected T_STRING, expecting T_OLD_FUNCTION or T_FUNCTION or T_VAR or '}' in /home/cwf/public_html/wiki/includes/Exception.php on line 140

I saw a thread on this issue on the mwusers.com help forum,
but there was no solution listed. They mentioned something about PHP 5 and
parsing with it.
I could use any and all help fixing this problem; I'm not sure why I
am receiving this error or how to fix it.
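For what it's worth, T_OLD_FUNCTION is a token that only exists in the PHP 4 parser, so this error usually means the server is running the wiki under PHP 4 while MediaWiki 1.9 requires PHP 5. Quick checks (paths and handler names vary by host, so treat these as examples):

```
# What does the command line say?
php -v

# If the CLI is PHP 5 but the web server isn't, many shared hosts let
# you pick the handler in .htaccess, e.g.:
AddHandler application/x-httpd-php5 .php
```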
Thanks,
Jonathan
--
Thanks,
Jonathan J. Reinhart - jreinhart(a)gmail.com
Secretary, Dedham Public Library Board of Trustees
"I cannot live without books."-Thomas Jefferson