I am using MediaWiki and SMW (Semantic MediaWiki) and a few extensions.
I have the wiki data loaded and am searching for ways to get more big
data into my wiki. I am thinking about YAGO2 or some of the other
datasets, but am a bit perplexed as to what the best course of action
is; any suggestions or hints will be much appreciated. Maybe I should
just try some more of the standard plugins/extensions. Ultimately I
would like to implement some targeted ads to go with the content. What
is the best combination of data and extensions, in your opinion? Any
feedback appreciated.
Thanks, Shep
On Tue, Jul 31, 2012 at 8:00 AM, <mediawiki-l-request(a)lists.wikimedia.org> wrote:
Send MediaWiki-l mailing list submissions to
mediawiki-l(a)lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
or, via email, send a message with subject or body 'help' to
mediawiki-l-request(a)lists.wikimedia.org
You can reach the person managing the list at
mediawiki-l-owner(a)lists.wikimedia.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of MediaWiki-l digest..."
Today's Topics:
1. Re: MWSearch and Lucene (Robert Stojnic)
2. Re: Proxy for CURL requests, how to set a proxy bypass list?
(Platonides)
3. Re: Section links and redirects in search results (Joel DeTeves)
4. Re: [mwlib] Re: Fwd: RE: Status of Collection Extension in
Powerpedia (Ralf Schmitt)
5. Image problems in Safari (Sondre Kvipt)
----------------------------------------------------------------------
Message: 1
Date: Mon, 30 Jul 2012 23:10:08 +0100
From: Robert Stojnic <rainmansr(a)gmail.com>
To: MediaWiki announcements and site admin list
<mediawiki-l(a)lists.wikimedia.org>
Subject: Re: [MediaWiki-l] MWSearch and Lucene
Message-ID: <50170640.5070002(a)gmail.com>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
Hi Zach,
No, WMF is using a version of that script as well. Not sure what is
wrong in that case.
Cheers, r.
On 30/07/12 18:16, Zach Hilliard wrote:
Currently we are running OAI and using the "update" script supplied
with Lucene; is there another method outside of this?
On Mon, Jul 30, 2012 at 02:34:46PM +0100, Robert Stojnic wrote:
> Hi Zach,
>
> Yes this is a known issue when using the ./build script with cron.
> WMF uses incremental updates which don't have such problems. I
> looked into it once but couldn't reproduce the problem; I can
> only imagine it has something to do with previous build processes
> leaving the files in an inconsistent state. Some people have solved
> this problem by adding a rm -rf /path/to/your/index into cron before
> running the build script.
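>
> For example, a crontab entry along these lines (just a sketch; both
> paths are placeholders for your own index directory and build script):
>
> # clear any stale index files, then rebuild nightly at 03:00
> 0 3 * * * rm -rf /path/to/your/index && /path/to/lucene-search/build
>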
>
> Cheers, Robert
>
> On 30/07/12 06:53, Zach H. wrote:
>> I have a small MediaWiki (around 6000 pages), currently using
>> MWSearch (recent) and Lucene (2.1), and have a problem with Lucene
>> where I get "java.io.FileNotFoundException: .../segments_u (No such
>> file or directory)". I have created a scripted solution around this,
>> but it's slightly inefficient to rebuild my indexes THAT often, as
>> this happens 3-6 times a day. It seems a few people have posted
>> about this issue on the Discussion portion of the Lucene page but
>> got no traction. Is this issue just rare and caused by some
>> incorrect configuration on my part, or do other wikis have this
>> issue as well? I am guessing the main Wikipedia is using the 2.1
>> branch of Lucene, as it has "Did you mean" functionality which
>> appears to be only a part of the 2.1 tree, and if this is true, how
>> do they deal with it? Any advice is appreciated, thanks in advance!
>>
>> MediaWiki (1.16.2)
>> PHP
>> MySQL
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
------------------------------
Message: 2
Date: Tue, 31 Jul 2012 00:26:23 +0200
From: Platonides <Platonides(a)gmail.com>
To: mediawiki-l(a)lists.wikimedia.org
Subject: Re: [MediaWiki-l] Proxy for CURL requests, how to set a proxy
bypass list?
Message-ID: <jv71nf$68l$1(a)dough.gmane.org>
Content-Type: text/plain; charset=ISO-8859-1
On 30/07/12 14:23, Roland Wohlfahrt wrote:
Hi folks,
I am using the parameter "$wgHTTPProxy" to use our proxy. See
http://www.mediawiki.org/wiki/Manual:$wgHTTPProxy
*Now I need to define a proxy bypass list* (because we transclude
articles from another *internal* MediaWiki server). How can I
accomplish this?
Thx for any hints!
Cheers,
Roland
Not exactly what it was designed for, but you could do
$wgConf->localVHosts[] = 'another-server.com';
and that host won't be accessed through the proxy.
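For example, in LocalSettings.php this might look roughly like the
following (a minimal sketch; the proxy address and hostname are
placeholders for your own values):

  # route MediaWiki's outgoing HTTP requests through the proxy
  $wgHTTPProxy = 'http://proxy.example.com:8080';
  # hosts registered as local virtual hosts are contacted directly,
  # bypassing $wgHTTPProxy
  $wgConf->localVHosts[] = 'another-server.com';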
------------------------------
Message: 3
Date: Mon, 30 Jul 2012 22:13:40 -0700
From: Joel DeTeves <askteves(a)gmail.com>
To: mediawiki-l(a)lists.wikimedia.org
Subject: Re: [MediaWiki-l] Section links and redirects in search
results
Message-ID: <50176984.3060100(a)gmail.com>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
UPDATE: Answered by Rainman
Hi Joel,
Just to clarify, the search results include section titles when the
section title itself matches the search term (and not when the text in
the section matches the search term). This should happen automatically
as long as you have the 2.1 features enabled in MWSearch (i.e.
$wgLuceneSearchVersion = 2.1). The Lucene-search backend parses section
titles and collects them into a special field. This assumes that the
titles were generated using the standard syntax (e.g. == title ==). If
the section titles are generated via templates or some other extension,
it won't work.
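For reference, the relevant LocalSettings.php lines would look roughly
like this (a sketch; the daemon host and port are assumptions to adapt
to your own install):

  # hand search queries to the lucene-search daemon via MWSearch
  $wgSearchType = 'LuceneSearch';
  $wgLuceneHost = '127.0.0.1';     # host running the lucene-search daemon
  $wgLucenePort = 8123;            # the daemon's default port
  $wgLuceneSearchVersion = 2.1;    # enable 2.1 features (section matching etc.)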
Hope this helps,
Cheers, Robert
On 30/07/12 23:21, Joel DeTeves wrote:
Good day Robert,
My name is Joel.
I've been poking around the forums, mediawiki.org and the IRC channels
and haven't been able to find an answer to this question. However, many
people I've talked to say that you are the great mind behind many of
the critical search functions used on Wikimedia sites today (including
the Lucene-Search extension), and that you might know the answer.
Basically, I am trying to get search results to display links to the
section of the article where the search result was found, rather than
just a link to the article itself. The result I am looking for is much
like the way Wikipedia and Wikimedia search works now.
I am currently running MediaWiki 1.19.1, PHP 5.4.4, MySQL 5.5.25, on an
Arch Linux / Apache 2 backend.
I have Lucene-Search 2.1 up and running, and according to some folks
I've talked to, it has the capability, along with MWSearch, to do
what I'm asking.
Are you able to tell me how this is done?
Let me know if you need clarification, and thank you so much for your
time - PS, I am a great fan of your work... though I'm not a developer,
and I can't fathom how you are able to do it. If you have a donation
page set up, I would gladly toss in a small contribution as a token of
my gratitude.
Cheers,
-Joel DeTeves-
On 30/07/2012 12:53 PM, Joel DeTeves wrote:
Hello,
I have been all over the forums + mediawiki support desk, IRC, etc.
and so far no-one seems to know the answer to this.
I am wondering how to get Wikipedia-like search results on my Wiki.
I am running the following:
MediaWiki 1.19.1
PHP 5.4.5 (apache2handler)
MySQL 5.5.25a-log
Lucene 2.1
MWSearch + Lucene-Search, both latest
Everything works great except for the following:
I would like to have links to the section / redirect come up in my
results as well, similar to those on Wikipedia / MediaWiki.org.
For example, when I search Wikipedia for the term 'Test Concept', it
gives results like this:
/The page "Test concept
<
http://en.wikipedia.org/w/index.php?title=Test_concept&action=edit&…
"
does not exist. You can ask for it to be created
<http://en.wikipedia.org/wiki/Wikipedia:Articles_for_creation>, but
consider checking the search results below to see whether the topic is
already covered./
For search help, please visit Help:Searching
<http://en.wikipedia.org/wiki/Help:Searching>.
*
Concept testing <http://en.wikipedia.org/wiki/Concept_testing>
(redirect from Concept Test
<http://en.wikipedia.org/wiki/Concept_Test>)
Concept testing is the process of using quantitative methods and
qualitative methods to evaluate consumer response to a product
idea prior *...*
6 KB (819 words) - 13:29, 22 May 2012
*
Concept inventory <http://en.wikipedia.org/wiki/Concept_inventory>
A concept inventory is a criterion-referenced test designed to
evaluate whether a student has an accurate working knowledge of a
specific *...*
14 KB (1,982 words) - 18:54, 5 July 2012
*
Prototype <http://en.wikipedia.org/wiki/Prototype>
A prototype is an early sample or model built to test a concept or
process or to act as a thing to be replicated or learned from. *...*
22 KB (3,188 words) - 00:23, 15 July 2012
*
Stalking horse <http://en.wikipedia.org/wiki/Stalking_horse>
(section Related concepts
<http://en.wikipedia.org/wiki/Stalking_horse#Related_concepts>)
A stalking horse is a figure that tests a concept with someone or
mounts a challenge against someone on behalf of an anonymous third
party *...*
16 KB (2,571 words) - 15:17, 10 July 2012
But my own wiki only gives results like this:
*
Concept testing <http://en.wikipedia.org/wiki/Concept_testing>
== Concept Testing ==
Concept testing is the process of using quantitative methods and
qualitative methods to evaluate consumer response to a product idea
prior *...*
How can I enable this feature so that == headings == are converted to
(section Headings <http://en.wikipedia.org/wiki/Concept_Test>) and
redirects show up as (redirect from Concept Test
<http://en.wikipedia.org/wiki/Concept_Test>), etc.?
I hope this is clear. Thank you so much for any help you can provide!
------------------------------
Message: 4
Date: Tue, 31 Jul 2012 09:17:00 +0200
From: Ralf Schmitt <ralf(a)brainbot.com>
To: Jeremy Baron <jeremy(a)tuxmachine.com>
Cc: Tomasz Finc <tfinc(a)wikimedia.org>, MediaWiki announcements and
site admin list <mediawiki-l(a)lists.wikimedia.org>
Subject: Re: [MediaWiki-l] [mwlib] Re: Fwd: RE: Status of Collection
Extension in Powerpedia
Message-ID: <873948tn6b.fsf(a)winserver.brainbot.com>
Content-Type: text/plain
Jeremy Baron <jeremy(a)tuxmachine.com> writes:
Great. I didn't know about any of that. I don't have access to those
boxes so I can't just pull it off the machines directly. Can it be
published somewhere?
Please try to get someone from the ops team to help you with that.
Re finding someone to puppetize: I can't commit to anything for at
least a week; I will reevaluate then unless someone else has done it
first.
Jeff Green was once working on this; you may want to contact him.
--
cheers
ralf
------------------------------
Message: 5
Date: Tue, 31 Jul 2012 11:15:51 +0200
From: Sondre Kvipt <sondre(a)kustomrama.com>
To: mediawiki-l(a)lists.wikimedia.org
Subject: [MediaWiki-l] Image problems in Safari
Message-ID: <FEE62D5C-A9C9-4D5B-A411-EE47C650634F(a)kustomrama.com>
Content-Type: text/plain; charset=us-ascii
Hi, after upgrading my MediaWiki to version 1.19.1 strange things have
been happening, and it seems to be a memory problem or something.
Using Safari on a Mac as the web browser, the webpage can work fine for
some time, then suddenly images stop loading on a page. I just get the
blue boxes with a question mark inside. So far I have found two ways of
getting the images to show. One is by clicking on an image to see the
full version. It then shows up, and when I hit the back button, all the
other photos on the page that were missing show up as well. The second
solution is to quit Safari. If I open it again on the same page, the
images load fine. Refreshing the page does not help. In Google Chrome
or Firefox, these problems never happen.
An example page that doesn't work can be seen here:
http://www.kustomrama.com/index.php?title=California
Another similar problem in Safari happens often if I use the search
box. A search will often only result in a white page.
Does anyone know what these errors might come from? As the page is an
established site, it is important for me to have these problems fixed.
------------------------------
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
End of MediaWiki-l Digest, Vol 106, Issue 33
********************************************
1-207-409-4038
809 congress st. #7
portland, maine
04102