The current MediaWiki is running on MySQL, and the upgrade should too; I have no desire to run anything SQLite.
-------- Original Message --------
From: Platonides <Platonides(a)gmail.com>
Sent: Tue, 26/06/2012 07:19 PM
To: mediawiki-l(a)lists.wikimedia.org
CC:
Subject: Re: [MediaWiki-l] problems upgrading from 13.2 to 19.1
On 27/06/12 00:42, Dave Hunsberger wrote:
> * Edited /htdocs/mw119/localSettings.php to point to /mw119
> folder and new db ('mediawiki_1_19')
Note it should be LocalSettings.php (capital L). If it can't be found, MediaWiki behaves as if it were not yet installed.
> Errors when running [URL]/mw119/mw-config/:
>
> * Blank page with error message "Warning: Cannot load module
> 'pdo_sqlite' because required module 'pdo' is not loaded in Unknown on
> line 0"
>
> MediaWiki 1.19.1 Updater
> PHP Fatal error: Call to undefined function mysql_error() in
> /[PATH]/mw119/includes/db/DatabaseMysql.php on line 305
> Fatal error: Call to undefined function mysql_error() in
> /[PATH]/mw119/includes/db/DatabaseMysql.php on line 305
So, are you trying to use a SQLite database or a MySQL one?
It seems you have a problem with the php.ini configuration there.
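A quick way to see which database extensions PHP actually loaded is a
small diagnostic script (a sketch in plain PHP, not MediaWiki code):

<?php
// check-extensions.php -- run with: php check-extensions.php
// If 'mysql' prints MISSING, calls such as mysql_error() are undefined,
// which matches the fatal error above.
foreach ( array( 'mysql', 'mysqli', 'pdo', 'pdo_sqlite' ) as $ext ) {
	echo $ext . ': ' . ( extension_loaded( $ext ) ? 'loaded' : 'MISSING' ) . "\n";
}

If mysql is missing, enabling extension=mysql.so in php.ini (or
extension=php_mysql.dll on Windows) and restarting the web server should
let the updater run; the pdo_sqlite warning just means php.ini tries to
load pdo_sqlite without loading pdo first, which is harmless for a
MySQL-backed wiki.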
Hello everyone,
I was careless enough to run a MediaWiki installation that let people
sign up without a moderator's approval. A few hundred of them duly
signed up and started using the site to swap images.
How can I delete their accounts in the most expeditious way?
Thanks.
Boris.
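Assuming the spam accounts made no edits worth keeping, the bundled
maintenance script is probably the fastest route; a sketch, run from
the wiki's root directory:

# Dry run: list the accounts the script considers unused.
php maintenance/removeUnusedAccounts.php

# Delete them for real.
php maintenance/removeUnusedAccounts.php --delete

removeUnusedAccounts.php only removes accounts with no edits; for
accounts that did post, Extension:Nuke can mass-delete their pages, and
blocking is generally safer than deleting the accounts themselves.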
I'm trying to find some further documentation or guidance (if any) on
site duplication and then master-slave replication.
The scenario is this:
We have two wikis, each in its own network. The first wiki is available
to our users now. The second one is a clone of the first, made a couple
of months ago and not touched since. What I'd like to do is have the
clone be the slave and the first one be the master. I'm thinking that
the database would not contain the extensions, skins, possibly photos,
templates? So I would have to push those over. The clone, or slave,
would only be for reading and not writing.
Can anyone give any guidance?
thanks,
Chris
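A rough sketch of how such a split usually looks, assuming MySQL
replication for the database and rsync for the file tree (host names,
paths, and the database name below are placeholders):

# my.cnf on the master
[mysqld]
server-id    = 1
log_bin      = mysql-bin
binlog_do_db = wikidb

# my.cnf on the slave (read-only replica)
[mysqld]
server-id = 2
read_only = 1

# One-way file sync from master to slave, e.g. from cron:
rsync -az --delete /var/www/wiki/extensions/ slave:/var/www/wiki/extensions/
rsync -az --delete /var/www/wiki/skins/ slave:/var/www/wiki/skins/
rsync -az --delete /var/www/wiki/images/ slave:/var/www/wiki/images/

Templates are wiki pages, so they replicate with the database;
extensions, skins, and uploaded files (the images/ directory) are plain
files and need the rsync. Setting $wgReadOnly in the slave's
LocalSettings.php makes the read-only intent explicit.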
Hi, Mr. All!
When I installed MW 1.19.1 on my hosting, I got error messages like

rename() [function.rename]: SAFE MODE Restriction in effect. The script
whose uid is 55398 is not allowed to access /var/tmp owned by uid 0 in
/home/u48601/whitefossaru/fwiki/includes/upload/UploadStash.php on line 173

when trying to upload any image.
Safe mode was turned OFF, but the error didn't disappear.
Investigation by me and Sheti <sheti(a)furtails.ru> revealed that my
hosting provider is completely incompetent: it uses a single temp
directory, /var/tmp, for all users of the server, and that directory
belongs to root:root. Because of this, any modification in that
directory fails, and the hoster doesn't allow changing it.
So, here is a dirty workaround by Sheti:
1) Create a temp directory somewhere in the filesystem where you have
write access, then chmod -R 777 it (a shell sketch follows the list).
2) Open <mediawiki_dir>/includes/GlobalFunctions.php.
3) Find the function named wfTempDir() (it determines the temp
directory) and comment it out:
// The hoster is an idiot, so we must set the temp dir ourselves
/*
function wfTempDir() {
	foreach ( array( 'TMPDIR', 'TMP', 'TEMP' ) as $var ) {
		$tmp = getenv( $var );
		if ( $tmp && file_exists( $tmp ) && is_dir( $tmp ) && is_writable( $tmp ) ) {
			return $tmp;
		}
	}
	if ( function_exists( 'sys_get_temp_dir' ) ) {
		return sys_get_temp_dir();
	}
	# Usual defaults
	return wfIsWindows() ? 'C:\Windows\Temp' : '/tmp';
}
*/
Then add the new version of the function:
// BEGIN PATCH
function wfTempDir() {
	// !!! THE FULL PATH TO YOUR TEMP DIRECTORY MUST GO HERE !!!
	return '/home/u48601/whitefossaru/fwiki/temp';
}
// END PATCH
4) Save the file and open <mediawiki_dir>/includes/upload/UploadStash.php.
5) Find the function stashFile() and these lines in it:

if ( ! preg_match( "/\\.\\Q$extension\\E$/", $path ) ) {
	$pathWithGoodExtension = "$path.$extension";

Comment those lines out, and between them and the following
'if ( ! rename( $path, $pathWithGoodExtension ) ) {' insert this code:
// BEGIN PATCH
// For moron hosters who don't allow working with files in the system
// tmp directory: stash the upload under our own temp directory instead.
if ( ! preg_match( "/\\.\\Q$extension\\E$/", $path ) ) {
	$file_basename = basename( $path );
	// The explicit '.' keeps the stashed name in the form "foo.ext":
	$pathWithGoodExtension = wfTempDir() . '/' . $file_basename . '.' . $extension;
// END PATCH
This code will move the uploaded file into our own temp directory.
6) Enjoy. It works, but it's very dirty!
7) Write your hoster an email telling them what you think of this setup.
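For step 1, a minimal shell sketch, using the same path as in the patch
above (substitute your own):

mkdir -p /home/u48601/whitefossaru/fwiki/temp
chmod -R 777 /home/u48601/whitefossaru/fwiki/temp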
P.S. My hoster is Majordomo (majordomo.ru, Russia).
I have a small MediaWiki (around 6000 pages), currently using
MWSearch (recent) and Lucene (2.1), and I have a problem where Lucene
gives me "java.io.FileNotFoundException: .../segments_u (No such file
or directory)". I have scripted a workaround, but it is slightly
inefficient to rebuild my indexes that often, as this happens 3-6 times
a day. A few people have posted about this issue on the discussion
portion of the Lucene page, but with no traction. Is this issue just
rare and caused by some incorrect configuration on my part, or do other
wikis have it as well? I am guessing the main Wikipedia is using the
2.1 branch of Lucene, as it has the "Did you mean" functionality, which
appears to be part of the 2.1 tree only; if that is true, how do they
deal with it? Any advice is appreciated, thanks in advance!
Mediawiki (1.16.2)
PHP
MySQL
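As Robert Stojnic notes further down this thread, some people work
around this by wiping the index before each cron rebuild; a crontab
sketch, with the paths as placeholders:

# Nightly at 03:00: remove the possibly-inconsistent index, then rebuild.
0 3 * * * rm -rf /usr/local/search/indexes && cd /usr/local/search/ls2 && ./build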
I am using MediaWiki and SMW and a few plugins. I have the wiki data
loaded and am searching for ways to get more big data into my wiki; I
am thinking about YAGO2 or some of the other datasets, but am a bit
perplexed as to what the best course of action is, so any suggestions
or hints will be much appreciated. Maybe I should just try some more of
the standard plugins/extensions. Ultimately I would like to implement
some targeted ads to go with the content. What is the best combo of
data and extensions, in your opinions? Any feedback appreciated.
Thanks, Shep
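Whichever dataset you pick, a common route into MediaWiki is to convert
it to page XML and feed it to the bundled importer; a sketch, with the
dump file name as a placeholder:

# Import the generated pages, then refresh the recent-changes tables.
php maintenance/importDump.php yago2_pages.xml
php maintenance/rebuildrecentchanges.php

Semantic MediaWiki then picks up annotations in the imported text once
its data is refreshed (via the job queue or SMW's own rebuild scripts).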
On Tue, Jul 31, 2012 at 8:00 AM, <mediawiki-l-request(a)lists.wikimedia.org>wrote:
>
> Today's Topics:
>
> 1. Re: MWSearch and Lucene (Robert Stojnic)
> 2. Re: Proxy for CURL requests, how to set a proxy bypass list?
> (Platonides)
> 3. Re: Section links and redirects in search results (Joel DeTeves)
> 4. Re: [mwlib] Re: Fwd: RE: Status of Collection Extension in
> Powerpedia (Ralf Schmitt)
> 5. Image problems in Safari (Sondre Kvipt)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 30 Jul 2012 23:10:08 +0100
> From: Robert Stojnic <rainmansr(a)gmail.com>
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Subject: Re: [MediaWiki-l] MWSearch and Lucene
>
>
> Hi Zach,
>
> No, WMF is using a version of that script as well. Not sure what is
> wrong in that case.
>
> Cheers, r.
>
> On 30/07/12 18:16, Zach Hilliard wrote:
> > Currently we are running OAI and using the "update" script supplied with
> > Lucene; is there another method outside of this?
> > On Mon, Jul 30, 2012 at 02:34:46PM +0100, Robert Stojnic wrote:
> >
> >> Hi Zach,
> >>
> >> Yes this is a known issue when using the ./build script with cron.
> >> WMF uses incremental updates, which don't have such problems. I
> >> couldn't reproduce the problem when I looked into it; I can
> >> only imagine it has something to do with previous build processes
> >> leaving the files in an inconsistent state. Some people have solved
> >> this problem by adding an rm -rf /path/to/your/index to cron before
> >> running the build script.
> >>
> >> Cheers, Robert
> >>
> >> On 30/07/12 06:53, Zach H. wrote:
> >>> I have a small Mediawiki (around 6000 pages) and currently using
> >>> MWSearch (recent) and Lucene (2.1) and have a problem with Lucene
> >>> where I get "java.io.FileNotFoundException: .../segments_u (No such
> >>> file or directory)"; I have created a scripted solution around this
> >>> but it's slightly inefficient to rebuild my indexes THAT often, as
> >>> this happens 3-6 times a day. It seems a few people have posted
> >>> about this issue on the Discussion portion of the Lucene page but no
> >>> traction; is this issue just rare and caused by some incorrect
> >>> configuration on my part, or do other wikis have this issue as well?
> >>> I am guessing the main Wikipedia is using the 2.1 branch of Lucene
> >>> as it has "Did you mean" functionality which appears to be only a
> >>> part of the 2.1 tree, and if this is true how do they deal with it?
> >>> Any advice is appreciated, thanks in advance!
> >>>
> >>> Mediawiki (1.16.2)
> >>> PHP
> >>> MySQL
>
>
>
>
> ------------------------------
>
> Message: 2
> Date: Tue, 31 Jul 2012 00:26:23 +0200
> From: Platonides <Platonides(a)gmail.com>
> To: mediawiki-l(a)lists.wikimedia.org
> Subject: Re: [MediaWiki-l] Proxy for CURL requests, how to set a proxy
> bypass list?
>
> On 30/07/12 14:23, Roland Wohlfahrt wrote:
> > Hi folks,
> >
> > I am using the parameter $wgHTTPProxy to use our proxy.
> > See http://www.mediawiki.org/wiki/Manual:$wgHTTPProxy
> >
> > Now I need to define a proxy bypass list, because we transclude
> > articles from another internal MediaWiki server.
> >
> > How can I accomplish this?
> >
> > Thx for any hints!
> >
> > Cheers,
> > Roland
>
> Not exactly what it was designed for, but you could do
> $wgConf->localVHosts[] = 'another-server.com';
> and it won't be accessed by the proxy.
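> In LocalSettings.php that could look something like this (the host
> names are placeholders):
>
> $wgHTTPProxy = 'http://proxy.example.com:8080';
> // MediaWiki treats local virtual hosts as directly reachable,
> // so requests to this host will skip the proxy:
> $wgConf->localVHosts[] = 'internal-wiki.example.com';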
>
>
>
>
> ------------------------------
>
> Message: 3
> Date: Mon, 30 Jul 2012 22:13:40 -0700
> From: Joel DeTeves <askteves(a)gmail.com>
> To: mediawiki-l(a)lists.wikimedia.org
> Subject: Re: [MediaWiki-l] Section links and redirects in search
> results
>
> UPDATE: Answered by Rainman
>
> Hi Joel,
>
> Just to clarify, the search results include section titles when the
> section title itself matches the search term (and not when the text in
> the section matches the search term). This should happen automatically
> as long as you have the 2.1 features enabled in MWSearch (i.e.
> $wgLuceneSearchVersion = 2.1). The Lucene-search backend parses section
> titles and collects them into a special field. This assumes that the
> titles were generated using the standard syntax (e.g. == title ==). If
> the section titles are generated via templates or some other extension,
> it won't work.
>
> Hope this helps,
> Cheers, Robert
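> For reference, the relevant MWSearch settings in LocalSettings.php
> look something like this (host and port are placeholders for wherever
> the lucene-search daemon runs):
>
> $wgSearchType = 'LuceneSearch';
> $wgLuceneHost = '127.0.0.1';
> $wgLucenePort = 8123;
> // Enables the 2.1 feature set, including section-title matching:
> $wgLuceneSearchVersion = 2.1;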
>
> On 30/07/12 23:21, Joel DeTeves wrote:
> Good day Robert,
>
> My name is Joel.
>
> I've been poking around the forums, mediawiki.org + IRC channels and
> haven't been able to find an answer to this question - however, it's
> been said by many people I've talked to that you are the great mind
> behind many of the critical search functions being used in Wikimedia
> sites today (including Lucene-Search extension) and you might know the
> answer.
>
> Basically,
>
> I am trying to get search results to display links to the section of the
> article the search result was found in, rather than just a link to the
> article itself - the result I am looking for is much like the way
> Wikipedia + Wikimedia search is working now.
>
> I am currently running MediaWiki 1.19.1, PHP 5.4.4, MySQL 5.5.25, on an
> Arch Linux / Apache 2 backend.
>
> I have Lucene-Search 2.1 up and running, and according to some folks
> I've talked to, it has capability along with MWSearch to do what I'm
> asking.
>
> Are you able to tell me how this is done?
>
> Let me know if you need clarification, and thank you so much for your
> time - PS, I am a great fan of your work... though I'm not a developer,
> and I can't fathom how you are able to do it. If you have a donation
> page set up, I would gladly toss in a small contribution as a token of
> my gratitude.
>
>
> Cheers,
> -Joel DeTeves-
>
>
> On 30/07/2012 12:53 PM, Joel DeTeves wrote:
> > Hello,
> >
> > I have been all over the forums + mediawiki support desk, IRC, etc.
> > and so far no-one seems to know the answer to this.
> >
> > I am wondering how to get Wikipedia-like search results on my Wiki.
> >
> > I am running the following:
> >
> > MediaWiki 1.19.1
> > PHP 5.4.5 (apache2handler)
> > MySQL 5.5.25a-log
> > Lucene 2.1
> > MWSearch + Lucene-Search, both latest
> >
> > Everything works great except for the following:
> >
> > I would like to have links to the section / redirect come up in my
> > results as well, similar to that of Wikipedia / MediaWiki.org.
> >
> > For example, when I search Wikipedia for the term 'Test Concept', it
> > gives results like this:
> >
> > The page "Test concept
> > <http://en.wikipedia.org/w/index.php?title=Test_concept&action=edit&redlink=1>"
> > does not exist. You can ask for it to be created
> > <http://en.wikipedia.org/wiki/Wikipedia:Articles_for_creation>, but
> > consider checking the search results below to see whether the topic is
> > already covered.
> > For search help, please visit Help:Searching
> > <http://en.wikipedia.org/wiki/Help:Searching>.
> >
> > * Concept testing <http://en.wikipedia.org/wiki/Concept_testing>
> >   (redirect from Concept Test <http://en.wikipedia.org/wiki/Concept_Test>)
> >   Concept testing is the process of using quantitative methods and
> >   qualitative methods to evaluate consumer response to a product
> >   idea prior ...
> >   6 KB (819 words) - 13:29, 22 May 2012
> > * Concept inventory <http://en.wikipedia.org/wiki/Concept_inventory>
> >   A concept inventory is a criterion-referenced test designed to
> >   evaluate whether a student has an accurate working knowledge of a
> >   specific ...
> >   14 KB (1,982 words) - 18:54, 5 July 2012
> > * Prototype <http://en.wikipedia.org/wiki/Prototype>
> >   A prototype is an early sample or model built to test a concept or
> >   process or to act as a thing to be replicated or learned from. ...
> >   22 KB (3,188 words) - 00:23, 15 July 2012
> > * Stalking horse <http://en.wikipedia.org/wiki/Stalking_horse>
> >   (section Related concepts
> >   <http://en.wikipedia.org/wiki/Stalking_horse#Related_concepts>)
> >   A stalking horse is a figure that tests a concept with someone or
> >   mounts a challenge against someone on behalf of an anonymous third
> >   party ...
> >   16 KB (2,571 words) - 15:17, 10 July 2012
> >
> >
> >
> > But my own wiki only gives results like this:
> >
> > * Concept testing <http://en.wikipedia.org/wiki/Concept_testing>
> >   == Concept Testing ==
> >   Concept testing is the process of using quantitative methods
> >   and qualitative methods to evaluate consumer response to a
> >   product idea prior ...
> >
> >
> > How can I enable this feature so that == headings == are converted to
> > (section Headings <http://en.wikipedia.org/wiki/Concept_Test>) and
> > redirects show up as (redirect from Concept Test
> > <http://en.wikipedia.org/wiki/Concept_Test>), etc.?
> >
> > I hope this is clear. Thank you so much for any help you can provide!
> >
> >
> >
> >
>
>
>
> ------------------------------
>
> Message: 4
> Date: Tue, 31 Jul 2012 09:17:00 +0200
> From: Ralf Schmitt <ralf(a)brainbot.com>
> To: Jeremy Baron <jeremy(a)tuxmachine.com>
> Cc: Tomasz Finc <tfinc(a)wikimedia.org>, MediaWiki announcements and
> site admin list <mediawiki-l(a)lists.wikimedia.org>
> Subject: Re: [MediaWiki-l] [mwlib] Re: Fwd: RE: Status of Collection
> Extension in Powerpedia
>
> Jeremy Baron <jeremy(a)tuxmachine.com> writes:
>
> >
> > Great. I didn't know about any of that. I don't have access to those
> > boxes so I can't just pull it off the machines directly. Can it be
> > published somewhere?
>
> please try to get someone from the ops team to help you with that.
>
> >
> > re finding someone to puppetize: I can't commit to anything for at
> > least a week; will reevaluate then unless someone else has done it
> > first.
>
> jeff green was once working on this. you may want to contact him.
>
> --
> cheers
> ralf
>
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 31 Jul 2012 11:15:51 +0200
> From: Sondre Kvipt <sondre(a)kustomrama.com>
> To: mediawiki-l(a)lists.wikimedia.org
> Subject: [MediaWiki-l] Image problems in Safari
>
> Hi, after upgrading my MediaWiki to version 1.19.1 strange things have
> been happening, and it seems to be a memory problem or something.
>
> Using Safari on a Mac as the web browser, the site can work fine for
> some time, then suddenly images stop loading on a page. I just get the
> blue boxes with a question mark inside. So far I have found two ways of
> getting the images to show. One is clicking on an image to see the full
> version: it then shows up, and when I hit the back button, all the
> other photos on the page that were missing show up as well. The second
> is to quit Safari; if I open it again on the same page, the images load
> fine. Refreshing the page does not help. In Google Chrome or Firefox,
> these problems never happen.
>
> An example page that doesn't work can be seen here:
>
> http://www.kustomrama.com/index.php?title=California
>
> Another similar problem in Safari often happens when I use the search
> box: a search will often only return a white page.
>
> Does anyone know what might cause these errors? As the site is an
> established one, it is important for me to have these problems fixed.
>
>
> ------------------------------
>
>
>
> End of MediaWiki-l Digest, Vol 106, Issue 33
> ********************************************
>
--
Best Regards,
Shep Husted
opensourceservers.com | opensourcenetworks.com | engineeredcomputer.com
1-207-409-4038
809 congress st. #7
portland, maine
04102
Hi,
On Mon, Jul 30, 2012 at 4:38 AM, Ralf Schmitt <ralf(a)brainbot.com> wrote:
> Right (and at least I did it intentionally). I think the OP's problem
> description is rather incomplete (logs are missing; it's missing a
> description of the exact commands used to run the render server; even
> the exact error message is missing).
A good report of the observed problem may help, and we can argue about
whether the original reports are good. IMHO there's enough information
there to make some guesses about what the problem is. But for now, can
we focus on fixing this big problem:
> I tried the onwiki docs[1], readthedocs[2], and the WMF config for the
> regular cluster[3] and didn't find any answers. I'm not sure where the
> config is stored for the PDF rendering cluster. (the canonical copy
> should live at gerrit so it probably needs to be moved there)
?
Please publish a copy of the PDF rendering cluster configuration
(ideally in git) or at least (if it needs sanitizing) give a copy to
WMF ops or to me off-list. And then commit to maintaining the public copy
that is produced as the canonical copy.
Thanks,
Jeremy