Can somebody point me to the installation instructions for a Chinese MediaWiki? The source refers to some daemon; does it work only on Linux?
Chinese MediaWiki? It's just like other languages: when you install MediaWiki, you choose your language, and then you have your "Chinese wiki".
MediaWiki only works with PHP+MySQL.
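For reference, the language choice made at install time boils down to one setting; a minimal sketch of the relevant LocalSettings.php line (the value shown is illustrative):

```php
// LocalSettings.php (excerpt) -- illustrative value
$wgLanguageCode = "zh";   // interface and content language: Chinese
```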
There is zhdaemon; do I need it?
I have some Chinese pages in the wiki. It works fine, but search does not work.
How can I make search work?
_______________________________________________ MediaWiki-l mailing list MediaWiki-l@Wikimedia.org http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
I can only jump to titles that match exactly. When I type some Chinese and press SEARCH, even when I know it is in the title, it does not appear in the search results. It seems to me that the Chinese is not being segmented.
The source says that zhdaemon is used to convert between traditional and simplified Chinese, as well as for segmentation. I am wondering if this is related to the search function. If it is needed, how can I set it up?
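For what it's worth, "segmentation" here just means breaking CJK text into tokens a fulltext index can match. A rough PHP illustration of the idea (this is an illustration only, not MediaWiki's actual code):

```php
// Illustration only: pad each CJK character with spaces so that a
// whitespace-based fulltext index treats every character as a word.
function segmentForSearch( $text ) {
    return preg_replace( '/([\x{4e00}-\x{9fff}])/u', ' $1 ', $text );
}
```

Without some step like this, a whole run of Chinese text looks like one giant "word" to the indexer, which matches the symptom of title-only exact hits.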
On 5/13/06, Brion Vibber brion@pobox.com wrote:
kent sin wrote:
There is zhdaemon; do I need it?
No.
I have some Chinese pages in the wiki. It works fine, but search does not work.
Please be detailed.
-- brion vibber (brion @ pobox.com)
kent sin wrote:
I can only jump to titles that match exactly. When I type some Chinese and press SEARCH, even when I know it is in the title, it does not appear in the search results. It seems to me that the Chinese is not being segmented.
Please confirm the following:
1) Your wiki is set to Chinese ($wgLanguageCode = "zh"; in LocalSettings.php).
2) This was set before you edited the pages. If not, edit them again to ensure they are re-saved into the search index table.
3) Provide some exact sample search terms you are using.
4) Provide the URL to your site and some pages you are trying to search for so we can examine them. If your site is a private site behind a firewall, please try to set up a public one, or else provide the exact texts of the pages you are testing with so we can attempt to duplicate the problem.
The source says that zhdaemon is used to convert between traditional and simplified Chinese, as well as for segmentation. I am wondering if this is related to the search function. If it is needed, how can I set it up?
No, it's not used and not needed; that refers to an experimental tool.
-- brion vibber (brion @ pobox.com)
Thank you. It works now.
I created a new extension. You can see it implemented and not working here: http://wikimoto.org/wikimoto/index.php?title=Test
A working version of it OUTSIDE the wiki is here: http://wikimoto.org/test_bin/styleTests3.php
It needs to run every time the button is hit, but I only get the cached page in the wiki. I'm running version 1.5.8, and http://meta.wikimedia.org/wiki/MediaWiki_extensions_FAQ suggests that I should include
$parser->disableCache();
in my hook function.
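For comparison, here is a minimal tag-hook skeleton following that FAQ advice; every name in it is hypothetical (the real code lives in donation4email.php), so treat it as a sketch rather than a drop-in fix:

```php
// Hypothetical minimal extension skeleton illustrating the FAQ advice.
$wgExtensionFunctions[] = 'wfExampleSetup';

function wfExampleSetup() {
    global $wgParser;
    $wgParser->setHook( 'example', 'renderExample' );   // handles <example> tags
}

function renderExample( $input, $args, &$parser ) {
    $parser->disableCache();            // mark this parse as uncacheable
    return htmlspecialchars( $input );  // dynamic output would go here
}
```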
That same page lists other options for version 1.4 that also don't work, such as:
$ts = mktime();
$now = gmdate("YmdHis", $ts + 120);
$ns = $wgTitle->getNamespace();
$ti = wfStrencode($wgTitle->getDBkey());
$sql = "UPDATE cur SET cur_touched='$now' WHERE cur_namespace=$ns AND cur_title='$ti'";
wfQuery($sql, DB_WRITE, "");
This results in
Fatal error: Call to a member function on a non-object in /home/httpd/vhosts/wikimoto.org/httpdocs/wikimoto/extensions/donation4email.php on line 71
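As a guess (not a confirmed diagnosis): a "non-object" fatal on a line like that usually means a variable such as $wgTitle was null in the function's scope, since PHP functions must import globals explicitly:

```php
// Hypothetical illustration of the scoping pitfall, not the actual extension code.
function wfEmailDonationExtension( $input ) {
    global $wgTitle;                  // without this line, $wgTitle is null here
    $ns = $wgTitle->getNamespace();   // and this call fatals "on a non-object"
    // ...
}
```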
If I back up even further and attempt to utilize
$wgParser->disableCache();
global $wgTitle;
$dbw =& wfGetDB( DB_MASTER );
$dbw->update( 'cur',
    array( 'cur_touched' => $dbw->timestamp( time() + 120 ) ),
    array( 'cur_namespace' => $wgTitle->getNamespace(),
           'cur_title' => $wgTitle->getDBkey() ),
    'wfEmailDonationExtension' );
then I receive the error message: A database query syntax error has occurred. This may indicate a bug in the software. The last attempted database query was: (SQL query hidden) from within function "wfEmailDonationExtension". MySQL returned error "1146: Table 'wiki254915KjkhKJH354.mw_cur' doesn't exist (localhost)".
The code for the extension is here (too many characters to post here): http://wikimoto.org/wikimoto/index.php?title=Mail_Donation&action=edit
I sure could use some help, folks. Thanks, Erik
For version 1.5.8, just include the following in the second function (not the hook function):
global $wgTitle;
$dbw =& wfGetDB( DB_MASTER );
$dbw->update( 'page',
    array( 'page_touched' => $dbw->timestamp( time() + 120 ) ),
    array( 'page_namespace' => $wgTitle->getNamespace(),
           'page_title' => $wgTitle->getDBkey() ),
    'name of your Extension as defined in $wgExtensionFunctions[] =' );
It works fine now as seen here: http://wikimoto.org/wikimoto/index.php?title=WikiMoto:Site_support