All,
The Citizendium Foundation would like to call for bids from MediaWiki
programmers on this extension:
https://lists.purdue.edu/pipermail/citizendium-tools/2008-January/000239.html
If you like the extension idea, make a proposal... you might end up getting
paid to do something you'd like to do anyway. We can afford to pay for this
now, and we'll certainly pay something that's reasonable. If you have any
questions about what we're asking that might affect the bids, please let me
know and I'll post a clarification.
Let's say that the bids will close (if any reasonable bids are received) in
a week--July 9.
--Larry
Hello,
This is my first message on the list, so please excuse any mistakes I
might make.
I have searched the web a lot, without success, on this topic: how can I
install templates in the MediaWiki instance I just installed?
I read the docs and could successfully create a template manually. What I
want is a way to quickly set up a bunch of templates. For example, I
found this one: http://en.wikipedia.org/wiki/Template:Current. I would
like to copy it (if that is legally permitted) to my own wiki. Copying a
template is not so easy, because the template references other templates,
which in turn reference further templates and images. Maybe I missed
something. I imagine there is a place where I can find templates to start
with.
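A common route is Special:Export with its "Include templates" option, which
bundles a page together with the templates it transcludes into one XML file.
A sketch of how that export URL is built (Python; the base URL and page name
are just the example from above, and the templates=1 parameter corresponds
to the "Include templates" checkbox; for deeply nested templates the export
may need to be repeated on the templates it pulled in):

```python
from urllib.parse import urlencode

def export_url(base, pages, include_templates=True):
    """Build a Special:Export URL bundling pages with their templates."""
    params = {"title": "Special:Export", "pages": "\n".join(pages)}
    if include_templates:
        params["templates"] = "1"  # the "Include templates" checkbox
    return base + "/index.php?" + urlencode(params)

url = export_url("http://en.wikipedia.org/w", ["Template:Current"])
print(url)
```

The XML this URL returns can then be loaded on your own wiki via
Special:Import (sysop rights required) or with
`php maintenance/importDump.php dump.xml`; note that images are not
included in the export and must be copied separately.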
Any advice is welcome.
Best regards,
--
francois.piette(a)overbyte.be
Author of ICS (Internet Component Suite, freeware)
Author of MidWare (Multi-tier framework, freeware)
http://www.overbyte.be
I want to find out what IP address a particular user is using. Which
tables hold that information, so I can pull out the data? If you already
have such queries, please share them; otherwise I can build them myself.
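MediaWiki of this era records the IP of each edit in the recentchanges
table's rc_ip column, but only while the entry remains within the
recent-changes retention window. A minimal sketch of the query, run here
against a throwaway SQLite stand-in for the real table ('Lennie' and the
IPs are made-up sample data; on a real install mind $wgDBprefix):

```python
import sqlite3

# Toy stand-in for MediaWiki's recentchanges table; on a real wiki the
# relevant columns are rc_user_text, rc_ip and rc_timestamp.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE recentchanges (rc_user_text TEXT, rc_ip TEXT, rc_timestamp TEXT)"
)
conn.executemany(
    "INSERT INTO recentchanges VALUES (?, ?, ?)",
    [("Lennie", "10.0.0.5", "20080701120000"),
     ("Lennie", "10.0.0.7", "20080702120000"),
     ("Other",  "10.0.0.9", "20080702130000")],
)

# IPs used by the given user, most recent first
rows = conn.execute(
    "SELECT rc_ip, rc_timestamp FROM recentchanges "
    "WHERE rc_user_text = ? ORDER BY rc_timestamp DESC",
    ("Lennie",),
).fetchall()
print(rows)
```

For IPs older than the recentchanges window, the CheckUser extension keeps
its own cu_changes table.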
Thanks.
Lennie
Welcome to mediawiki-l. This mailing list exists for discussion and questions
about the MediaWiki software[0]. Important MediaWiki-related announcements
(such as new versions) are also posted to this list.
Other resources.
If you only wish to receive announcements, you should subscribe to
mediawiki-announce[1] instead.
MediaWiki development discussion, and all Wikimedia technical questions, should
be directed to the wikitech-l[2] mailing list.
Several other MediaWiki-related lists exist:
- mediawiki-api[5] for API discussions,
- mediawiki-enterprise[6] for discussion of MediaWiki in the enterprise,
- mediawiki-cvs[7] for notification of commits to the Subversion repository,
- mediawiki-i18n[8] for discussion of MediaWiki internationalisation support,
- wikibugs-l[9] for notification of changes to the bug tracker.
List administrivia (unsubscribing, list archives).
To unsubscribe from this mailing list, visit [12]. Archives of previous postings
can be found at [3].
This list is also gatewayed to the Gmane NNTP server[4], which you can use to
read and post to the list.
Posting to the list.
Before posting to this list, please read the MediaWiki FAQ[10]. Many common
questions are answered here. You may also search the list archives to see if
your question has been asked before.
Please try to ask your question in a way that enables people to answer you.
Provide all relevant details, explain your problem clearly, etc. You may
wish to read [13], which explains how to ask questions well.
To post to the list, send mail to <mediawiki-l(a)lists.wikimedia.org>. This is a
public list, so you should not include confidential information in mails you
send.
When replying to an existing thread, use the "Reply" or "Followup" feature of
your mail client, so that clients that understand threading can sort your
message properly. When quoting other messages, please use the "inline" quoting
style[11], for clarity.
When creating a new thread, do not reply to an existing message and change
the subject. This will confuse people's mail readers and result in fewer
people reading your mail. Instead, compose a new message for your post.
Messages posted to the list have the "Reply-To" header set to the mailing list,
which means that by default, replies will go to the entire list. If you are
posting a reply which is only interesting to the original poster, and not the
list in general, you should change the reply to only go to that person. This
avoids cluttering the list with irrelevant traffic.
About this message.
This message is posted to the list once per week by <river(a)wikimedia.org>.
Please contact me if you have any questions or concerns about this mailing.
References.
[0] http://www.mediawiki.org/
[1] http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[2] http://lists.wikimedia.org/mailman/listinfo/wikitech-l
[3] http://lists.wikimedia.org/pipermail/mediawiki-l/
[4] http://dir.gmane.org/gmane.org.wikimedia.mediawiki
[5] http://lists.wikimedia.org/mailman/listinfo/mediawiki-api
[6] http://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise
[7] http://lists.wikimedia.org/mailman/listinfo/mediawiki-cvs
[8] http://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
[9] http://lists.wikimedia.org/mailman/listinfo/wikibugs-l
[10] http://www.mediawiki.org/wiki/FAQ
[11] http://en.wikipedia.org/wiki/Posting_style#Inline_replying
[12] http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
[13] http://www.catb.org/~esr/faqs/smart-questions.html
Does anyone have both extensions installed and working together? I am
not able to use FCKeditor because it rewrites the semantic properties
into other markup.
How do I prevent a property: [[qd title::test]]
from turning into this:
test
<div class="smwfact"><span class="smwfactboxhead">Facts about <span
class="swmfactboxheadbrowse">[[Special:Browse/Sandbox-2Fsemantic|Sandbox/semantic]]</span></span><span
class="smwrdflink"><span
class="rdflink">[[Special:ExportRDF/Sandbox/semantic|RDF
feed]]</span></span>
{| class="smwfacttable"
|-
| class="smwpropname" | [[Property:Qd title|Qd title]]
| class="smwprops" | test
|}
</div>
I have the latest nightly build of the fckeditor extension
<http://www.mediawiki.org/wiki/Extension:FCKeditor_%28by_FCKeditor_and_Wikia…>
with fckeditor 2.6.2 <http://www.fckeditor.net/whatsnew> with
MediaWiki <http://www.mediawiki.org/> 1.12.0
PHP <http://www.php.net/> 5.2.5 (cgi-fcgi)
MySQL <http://www.mysql.com/> 5.0.24-community-nt
Thank you
Matt
Markus: really great work you've done.
I made a special page using only your indexes!
I also use it as a fast search facility (with Ctrl-F): it finds the links
very fast and exactly.
In my case it would therefore be very convenient if the indexes were
always open, with no need for the [+].
As it is, I have to explain that, for searching, people first have to open
the indexes with the [+], beginning with the last one (the indexes are very
long...).
Is there a way this can be done?
Rein
>
> Subject:
> Re: [Mediawiki-l] MediaWiki + Lucene-Search2 + MWSearch extension =
> ZERO search results
> From:
> Tim Starling <tstarling(a)wikimedia.org>
> Date:
> Tue, 01 Jul 2008 08:02:05 +1000
>
>
> It sounds like you've isolated the problem to within a couple of
> hundred lines of code. Maybe you should spend less time searching the
> web for someone with your exact problem, and more time reading that code.
=) I'd agree with ya, if I wasn't so much of a PHP newbie... I'd
consider myself more of a Perl and Bash type coder, but I definitely
understand where you are coming from with your suggestion. Luckily, I
found someone on MediaWiki.org's MWSearch Extension_talk page who helped
me troubleshoot my issue!
>> Follow me here: if I load the URL from the debug log above in a web
>> browser like 'lynx' (or *every time* I search now and read the debug
>> log), I see this (or something similar):
>>
>> 1
>> 1.0 0 Main_Page
>
> Is this the same response text that MWSearch sees? If yes, where does
> MWSearch go wrong in interpreting it? If no, what is different about
> the way MWSearch requests pages compared to lynx? Is it timing out?
> You can use tcpdump to snoop on the communication between MWSearch and
> the search server. You can use telnet to generate requests manually
> and see how the search daemon responds.
>
> -- Tim Starling
Here's what "Brian" from the MWSearch Extension_talk page helped
identify, summing up his last post and the results we found from some
troubleshooting:
"we can conclude from this that: 1) PHP can connect to Lucene properly
and 2) Your HTTP fetch capabilities are broken. I'm not sure what we can
do about it. The proper way is of course to fix the HTTP functions, but
I don't know how we can do that. The other option is to write a new HTTP
layer which will surely work."
<(root@/var/www/htdocs/wiki-svn06252008)> cd /var/www/htdocs/wiki-svn06252008
<(root@/var/www/htdocs/wiki-svn06252008)> php maintenance/eval.php
> $sock = fsockopen('127.0.0.1', 8123); fwrite($sock, "GET /search/svnwikidb/loopback?namespaces=0&offset=0&limit=20&version=2&iwlimit=10 HTTP/1.0\r\nHost: localhost\r\n\r\n"); print fread($sock, 8192);
HTTP/1.1 200 OK
Content-Type: text/plain
1
1.0 0 Main_Page
> print Http::get('http://127.0.0.1:8123/search/svnwikidb/loopback?namespaces=0&offset=0&limit…');
>
> print Http::get('http://localhost:8123/search/svnwikidb/loopback?namespaces=0&offset=0&limit…');
>
What are the chances some kind soul on the Mediawiki-l mailing list
knows, or can point me to more in-depth information about, MediaWiki's
HTTP get function, which may be causing my queries to my Lucene-Search-2
daemon on port 8123 to get stripped out?
When using PHP to talk directly to my LuceneSearch2 daemon I get a valid
response, and everything works great: the response is displayed as
search results. The problem comes into play within my MediaWiki site
once I enable the MWSearch extension (ZERO search results), or, as seen
above, when I start up MediaWiki's PHP debug script and try to use
Http::get to talk to the LuceneSearch2 daemon: it seems I get no
response... but my LS2 daemon is definitely responding to the Http::get
request! It sounds like MediaWiki is the culprit, and MW's HTTP fetch
function is somehow stripping the search results, as demonstrated above.
I can also get the search results from my LS2 daemon with the 'lynx' web
browser, with telnet, or with PHP.
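The raw-socket versus HTTP-client comparison from the eval.php session can
be reproduced outside MediaWiki. A sketch in Python, with a throwaway local
HTTP server standing in for the Lucene daemon (the response body is copied
from the transcript above; the path and port choice are made up for
illustration):

```python
import socket
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

BODY = b"1\n1.0 0 Main_Page\n"  # response body copied from the transcript

class StubDaemon(BaseHTTPRequestHandler):
    """Tiny stand-in for the Lucene-Search-2 daemon."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(BODY)
    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), StubDaemon)  # port 0 = pick a free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# 1) raw socket, mirroring the fsockopen() check in eval.php
sock = socket.create_connection(("127.0.0.1", port))
sock.sendall(b"GET /search HTTP/1.0\r\nHost: localhost\r\n\r\n")
raw = b""
while chunk := sock.recv(8192):
    raw += chunk
sock.close()

# 2) a high-level HTTP client, playing the role of Http::get()
with urllib.request.urlopen("http://127.0.0.1:%d/search" % port) as resp:
    fetched = resp.read()
server.shutdown()

print(raw.endswith(BODY), fetched == BODY)
```

If the equivalent comparison inside eval.php shows fsockopen() succeeding
while Http::get() returns nothing, that confirms the problem lies in
MediaWiki's HTTP layer rather than in the daemon.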
I really hope someone can point me in the right direction, or help a
fella' out with diagnosing the issue! Thanks for your time, peace -
agentdcooper(a)gmail.com
Hi All,
I have a few queries about this page, sorry if this gets a little long!
*Part 1*
I am trying to include the content of Special:Recentchanges, but only for
a certain number of days.
This works if I do {{Special:Recentchanges}}; however, if I put
{{Special:Recentchanges&days=5}} I get a red link to create the page.
Is there a way of performing what I am trying to accomplish?
*Part 2*
Is there a way to have only unique page names appear (i.e. if a
particular page has been edited multiple times over the period, it
appears only once)?
*Part 3*
Is there a way to get this emailed out as a report every X days?
Thanks heaps in advance!
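On Parts 2 and 3: Special:Recentchanges can also be fetched as a feed
(index.php?title=Special:Recentchanges&feed=rss&days=5), which a cron job
could retrieve every X days and mail out. A sketch of deduplicating page
titles from such a feed (Python; the sample XML is made up and far smaller
than a real feed):

```python
import xml.etree.ElementTree as ET

# Made-up miniature of the RSS that &feed=rss returns
SAMPLE = """<rss version="2.0"><channel>
<item><title>Main Page</title></item>
<item><title>Main Page</title></item>
<item><title>Sandbox</title></item>
</channel></rss>"""

def unique_titles(rss_text):
    """Return each edited page once, in first-seen order."""
    seen, out = set(), []
    for item in ET.fromstring(rss_text).iter("item"):
        title = item.findtext("title")
        if title not in seen:
            seen.add(title)
            out.append(title)
    return out

print(unique_titles(SAMPLE))  # ['Main Page', 'Sandbox']
```

A cron entry could pipe the output of a script like this into mail(1) to
produce the periodic report.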