The default page title is its name. Is there a way to change that? For
example, you might have a page named "VPNSetup", but then want the title
(i.e., the top heading) to be "How to Set up a VPN Network".
I tried using a single = (e.g., "=How to Set up a VPN Network="), but
that just adds another heading and doesn't replace the title.
--
Dave Brewster <dbrewster(a)guidewire.com>
I am trying to get the pywikipedia scripts working with my own wiki to do a mass
import of articles. I made the settings described in the article
(http://meta.wikimedia.org/wiki/Using_the_python_wikipediabot).
When I try to log in I get an error: socket.gaierror, 2, name or service is unknown.
Checked for running processes. 1 processes currently running, including the current process.
Password for user Bot on my:de:
Logging in to my:de as Bot
Traceback (most recent call last):
File "login.py", line 210, in ?
main()
File "login.py", line 206, in main
loginMan.login()
File "login.py", line 159, in login
cookiedata = self.getCookie()
File "login.py", line 113, in getCookie
conn.request("POST", pagename, data, headers)
File "/usr/lib/python2.4/httplib.py", line 800, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python2.4/httplib.py", line 823, in _send_request
self.endheaders()
File "/usr/lib/python2.4/httplib.py", line 794, in endheaders
self._send_output()
File "/usr/lib/python2.4/httplib.py", line 675, in _send_output
self.send(msg)
File "/usr/lib/python2.4/httplib.py", line 642, in send
self.connect()
File "/usr/lib/python2.4/httplib.py", line 610, in connect
socket.SOCK_STREAM):
socket.gaierror: (-2, 'Der Name oder der Dienst ist nicht bekannt')
What does this mean, and what can I do to make it run? Is it possible to do a mass
import of articles with the pywikipedia scripts, or are there other solutions (e.g. Perl scripts)?
Thanks
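For what it's worth, socket.gaierror means the host name the bot tried to contact could not be resolved by DNS, which usually points at a typo in the family file or user-config.py rather than at MediaWiki itself. A minimal check, independent of the bot (the host name below is a placeholder; substitute the one from your configuration):

```python
import socket

def can_resolve(host):
    """Return True if DNS can resolve the host name, else False."""
    try:
        socket.getaddrinfo(host, 80)
        return True
    except socket.gaierror:
        return False

# "my.wiki.invalid" is a placeholder; use the host from your family file.
print(can_resolve("my.wiki.invalid"))
```

If this prints False for your wiki's host, fix the host name (or your DNS / /etc/hosts setup) before retrying login.py.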
Hi,
How can I export a MediaWiki page to HTML? What I want is to get a page without the left sidebar, the tabs, and the bottom bar, just the actual content.
Thanks,
Ittay
--
===================================
Ittay Dror (ittayd(a)qlusters.com)
Application Team Leader, R&D
Qlusters Inc.
+972-3-6081994 Fax: +972-3-6081841
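One approach that should work here: MediaWiki's action=render URL parameter returns only the parsed article body, without the sidebar, tabs, or footer. A sketch of building such a URL (the base URL is a placeholder; substitute your wiki's index.php path):

```python
from urllib.parse import urlencode

def render_url(base, title):
    """Build an index.php URL whose response is only the parsed page body."""
    return base + "?" + urlencode({"title": title, "action": "render"})

# wiki.example.org is a placeholder; fetch the resulting URL with any
# HTTP client to get the bare page HTML.
print(render_url("http://wiki.example.org/index.php", "Main_Page"))
```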
Hi Rob:
Many thanks for your suggestion. Unfortunately, the change to the LocalSettings file didn't work. However, the problem was fixed by moving the MediaWiki app over to a web server running Linux. It previously resided on a server running HP-UX.
John
-----Original Message-----
From: mediawiki-l-bounces(a)Wikimedia.org [mailto:mediawiki-l-bounces@Wikimedia.org] On Behalf Of Rob Church
Sent: 23 September 2005 15:41
To: MediaWiki announcements and site admin list
Subject: Re: [Mediawiki-l] MediaWiki link problem following install process
Nip into your LocalSettings.php and find the two lines that control URL
styling; I bet you've got the "pretty" style (i.e. /index.php/PageName)
chosen. Comment out that line, and uncomment the line which uses the
uglier-looking URL style (something like /index.php?title=PageName).
This has worked for me before, although it may not solve this
particular problem.
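The two lines Rob refers to are presumably the URL-path settings in LocalSettings.php; in 1.4/1.5 they usually look something like the following (paths are placeholders and the exact defaults vary by version):

```php
## Uncomment exactly one $wgArticlePath line.
$wgScript      = "$wgScriptPath/index.php";
$wgArticlePath = "$wgScript?title=$1";   # plain style, works everywhere
# $wgArticlePath = "$wgScript/$1";       # "pretty" style; needs PATH_INFO or rewrite support
```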
Rob Church
On 22/09/05, john.struthers(a)agilent.com <john.struthers(a)agilent.com> wrote:
>
> I've run through the MediaWiki 1.4.9 install process, which completed OK. However, when I now try any of the following links on the newly created wiki Main Page I get a "Page cannot be displayed" browser error (IE6).
>
> Main page links causing error message ..
>
> discussion
> edit
> history
> Recent Changes
> Random Page
> What Links Here
> Related Changes
> Special Pages
>
> I welcome comments/suggestions. Thanks.
>
> J Struthers
> _______________________________________________
> MediaWiki-l mailing list
> MediaWiki-l(a)Wikimedia.org
> http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
>
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)Wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
Thanks Brion,
Please let me know if this would suffice for getting results as an XML page:
hack Special:Export to parse the generated HTML page (or get the result set
from LuceneSearch.php directly) and wrap it with tags such as
<title>
<summary>
<relevancy>
<no_of_words>
<lasttimestamp>
Any further pointers on this?
--
Thanks and Best Regards,
Pavan
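If hacking Special:Export, the XML wrapping itself is the easy part; a sketch of producing such an envelope with Python's standard library (the tag names simply mirror the ones proposed above and are not part of any MediaWiki interface):

```python
import xml.etree.ElementTree as ET

def results_to_xml(hits):
    """hits: list of dicts, e.g. keys 'title' and 'summary' per search hit."""
    root = ET.Element("searchresults")
    for hit in hits:
        item = ET.SubElement(root, "result")
        for key, value in hit.items():
            ET.SubElement(item, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(results_to_xml([{"title": "Main Page", "summary": "intro text"}]))
```

The hard part remains extracting the hits themselves from LuceneSearch.php output.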
Dear Mr. Klein,
did I understand you correctly that, in order to enable substring search in
MediaWiki, one does not have to change anything other than adding the PHP
file and updating LocalSettings.php as you recommended in your last posting?
I had to change two lines where you referenced SearchMySQL so that they
reference SearchMySQL4. After that the wiki worked, but the search behaviour
didn't change; I still cannot use * or other wildcards.
Do I have to change anything else to make your suggestion work?
Thanks very much in advance for your help.
Regards / mit freundlichen Gruessen
Sebastian Dosch
sebastian.dosch(a)depre.de
-------------------You wrote:-----------------------
>
>Message: 8
>Date: Wed, 28 Sep 2005 12:32:16 +0200
>From: Klein.Thomas(a)eae.com
>Subject: RE: [Mediawiki-l] mediawiki 1.4 & 1.5rc4, substring search,
> how?
>To: <mediawiki-l(a)Wikimedia.org>
>Message-ID:
> <OF2A6BE39F.59E045B1-ONC125708A.003933F9-C125708A.0039E2F7(a)eae.com>
>Content-Type: text/plain; charset=ISO-8859-1
>
>
>Now, here is the class for substring search with MySQL 4, without changing
>the MediaWiki source!
>
>The class SearchMySQL4SubString is a copy of SearchMySQL4.
>
>Create a new file named 'SearchMySQL4SubString.php' in the 'includes'
>subdirectory and copy the following code into it:
>
><?php
>/**
> * Search engine hook for MySQL 4+
> * @package MediaWiki
> * @subpackage Search
> */
>
>require_once( 'SearchMySQL.php' );
>
>/**
> * @package MediaWiki
> * @subpackage Search
> */
>class SearchMySQL4SubString extends SearchMySQL {
>    var $strictMatching = true;
>
>    /** @todo document */
>    function SearchMySQL4SubString( &$db ) {
>        $this->db =& $db;
>    }
>
>    function legalSearchChars() {
>        return "A-Za-z_'0-9\\x80-\\xFF\\-*?+";
>    }
>
>    /** @todo document */
>    function parseQuery( $filteredText, $fulltext ) {
>        global $wgContLang;
>        $lc = SearchEngine::legalSearchChars();
>        $searchon = '';
>        $this->searchTerms = array();
>
>        wfDebug( "parseQuery filteredText is: '$filteredText'\n" );
>        wfDebug( "parseQuery fulltext is: '$fulltext'\n" );
>
>        # FIXME: This doesn't handle parenthetical expressions.
>        if( preg_match_all( '/([-+<>~]?)(([' . $lc . ']+)(\*?)|"[^"]*")/',
>            $filteredText, $m, PREG_SET_ORDER ) ) {
>            foreach( $m as $terms ) {
>                if( $searchon !== '' ) $searchon .= ' ';
>                $searchon .= $terms[1] . $wgContLang->stripForSearch( $terms[2] );
>                if( !empty( $terms[3] ) ) {
>                    $regexp = preg_quote( $terms[3], '/' );
>                    if( $terms[4] ) $regexp .= "[0-9A-Za-z_]+";
>                } else {
>                    $regexp = preg_quote( str_replace( '"', '', $terms[2] ), '/' );
>                }
>                $this->searchTerms[] = $regexp;
>            }
>            wfDebug( "Would search with '$searchon'\n" );
>            wfDebug( "Match with /\b" . implode( '\b|\b', $this->searchTerms ) . "\b/\n" );
>        } else {
>            wfDebug( "Can't understand search query '$filteredText'\n" );
>        }
>
>        $searchon = $this->db->strencode( $searchon );
>        $field = $this->getIndexField( $fulltext );
>        return " MATCH($field) AGAINST('$searchon' IN BOOLEAN MODE) ";
>    }
>}
>?>
>
>And add these two lines to your 'LocalSettings.php':
>
>require_once( 'SearchMySQL4SubString.php' );
>
>$wgSearchType = 'SearchMySQL4SubString';
>
>
>Now the search function has the same boolean syntax as MySQL 4.
>
>Also, see http://dev.mysql.com/doc/mysql/en/fulltext-boolean.html for
>details about full-text boolean queries in MySQL.
>
>
>Mit freundlichen Grüssen / Kind regards
>
>i.A. Thomas Klein
>OPS Koordination / OPS coordination
>
>EAE software GmbH
>Kornkamp 8
>22926 Ahrensburg/Germany
>
>Tel : +49 4102/480-513
>Fax: +49 4102/480-561
>e-mail: mailto:klein.thomas@eae.com
>http://www.eae.com/
Hi all,
I am using MediaWiki 1.4.3 and have the Lucene search (Java version)
working.
Please let me know how to get the search results as an XML page.
--
Thanks and Best Regards,
Pavan
Hi!
I want to use MediaWiki behind an SSL proxy, so I put the whole URL (the
proxy plus the server's URL) into $wgServer in LocalSettings.php. But
this doesn't work: when I load the main page with the complete URL
https://ssl-proxy/normal-url/mediawiki/index.php... I get the main page, but
without a working stylesheet. The links are displayed, but show an
address without the all-important "normal-url" part in it.
Can anybody help me? I didn't find any information in the archive of this
mailing list.
Thanks
Sebastian Brinkmann
sbrinkmann(a)gmail.com
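For comparison, the path-related settings involved are roughly these (values are placeholders matching the URLs in the message above; note that the stylesheet location comes from $wgStylePath, which defaults to "$wgScriptPath/skins", so $wgScriptPath must carry the proxy's "normal-url" prefix):

```php
$wgServer     = "https://ssl-proxy";
$wgScriptPath = "/normal-url/mediawiki";
# $wgStylePath defaults to "$wgScriptPath/skins"; set it explicitly if
# the skin still loads from the wrong path:
$wgStylePath  = "$wgScriptPath/skins";
```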
Thanks. Do you have a link? I tried it with Google but found nothing
useful so far.
I am not sure if I replied correctly; it's my first time on a mailing list.
Greetz Stephan
>There is a maintenance script which, IIRC, dumps the entire content of
>the wiki as a massive XML file. You could use this, and just back up
>the users tables if you wanted to cut down on the size of the file.
>Never tried this, however, so I don't know if it works; I'd guess it
>does, but...
>
>
>Rob Church
>
>On 27/09/05, Weishaupt, Stephan <weishaupt(a)ramada.de> wrote:
>> Hello, I have a big problem:
>>
>> (a version with screenshots is in my own forum:
>> http://forum.darklevel.org/viewtopic.php?p=15195#15195)
>>
>> I back up via phpMyAdmin and everything seems to be fine.
>>
>> Then I try to import the SQL file via phpMyAdmin and first run into
>> trouble because of a 2 MB upload limit; on a different host I then get
>> "error 1071 - Specified key was too long; max key length is 1000 bytes".
>>
>> The third thing I tried: I set up a MySQL server on my workstation via
>> XAMPP, created a new database from the command line, and loaded the SQL
>> file into it.
>>
>> I had a few errors, but at first everything looked fine; all 26 tables
>> were built. Then I tried to access the wiki, and only the headlines are
>> there; everything else is gone :-(
>>
>> Greetz Stephan
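For the record, the maintenance script mentioned in the reply above is presumably maintenance/dumpBackup.php, which ships with MediaWiki 1.5 (an assumption worth verifying; 1.4 installs may not have it):

```shell
# Run from the wiki's installation directory; writes an XML dump of all
# page revisions to wiki-dump.xml.
php maintenance/dumpBackup.php --full > wiki-dump.xml
```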