Here is an interesting alternative implementation for MediaWiki/Wikipedia:
* http://armstrongonsoftware.blogspot.com/2008/06/itching-my-programming-nerv…
* http://video.google.com/videoplay?docid=6981137233069932108 (the Wikipedia
discussion starts 30 minutes into the video)
Basically, it is a p2p backend that claims order-of-magnitude performance gains
for writing pages. They ignore the front caches etc. It is done in Erlang (+Java).
I was trying to figure out whether this would really achieve feature parity,
but couldn't fully tell.
For the rendering they use plog4u; does anyone know whether this has
feature parity with MediaWiki markup? We used JAMWiki (a Java
implementation of MediaWiki) only to see later that no ParserFunctions
extension was available for it. (Why is this an extension rather
than a core feature in the first place?)
Thanks!
Dirk
--
Phone: + 1 (650) 215 3459, Web: http://www.riehle.org
Hi,
Is there a way for the wikisysop to send an email to all users?
Best Regards
Steph
--
Using Opera's revolutionary e-mail client: http://www.opera.com/mail/
Hello,
I am receiving the following error whenever I try to update points
within the SocialProfile extension:
Invalid argument supplied for foreach() in
/webroot/extensions/SocialProfile/UserStats/UserStatsClass.php on line 433
Line 433 starts:
foreach ($this->point_values as $point_field => $point_value) {
    if ($this->stats_fields[$point_field]) {
        $field = $this->stats_fields[$point_field];
        $new_total_points += $point_value * $row->$field;
    }
}
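For context, PHP raises that warning when the value given to foreach() is not an
array (or an object), so $this->point_values presumably isn't initialized at that
point. A minimal defensive sketch, purely illustrative and not the extension's
actual code, would be:

    // Sketch only: skip the loop when point_values is not an array.
    // $this->point_values and $this->stats_fields are the extension's own
    // properties; the is_array()/isset() guards are hypothetical additions.
    if ( is_array( $this->point_values ) ) {
        foreach ( $this->point_values as $point_field => $point_value ) {
            if ( isset( $this->stats_fields[$point_field] ) ) {
                $field = $this->stats_fields[$point_field];
                $new_total_points += $point_value * $row->$field;
            }
        }
    }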
This is on MediaWiki 1.12.0 with SocialProfile extension version 1.1.
Does anyone have any ideas? I appreciate it.
Hi,
I'm new to this list, and I'm not sure I'm posting in the right place. I have a question about text indexing.
I have a PostgreSQL-backed MediaWiki instance installed on a very small server, and have been working on tweaking its performance over the last few days. When everything that could be done from
the OS and database parameters was done, I decided to take a look at the SQL queries.
The main problem I'm seeing is with text searches. I don't really know how it's handled with MySQL, but with PostgreSQL I think there is some optimization that could be done,
though I'm not sure about it, as I don't know all the code.
The text search query is this one:

EXPLAIN ANALYZE
SELECT page_id, page_namespace, page_title, old_text AS page_text,
       ts_rank(titlevector, to_tsquery('default','postgres')) AS rnk
FROM page p, revision r, pagecontent c
WHERE p.page_latest = r.rev_id
  AND r.rev_text_id = c.old_id
  AND textvector @@ to_tsquery('default','postgres')
  AND page_is_redirect = '0'
  AND page_namespace IN (0,9,11)
ORDER BY rnk DESC, page_id DESC
LIMIT 20 OFFSET 0;
The plan is this one:
Limit (cost=718.49..718.50 rows=1 width=621) (actual time=305.943..305.982 rows=20 loops=1)
-> Sort (cost=718.49..718.50 rows=1 width=621) (actual time=305.939..305.952 rows=20 loops=1)
Sort Key: rank(p.titlevector, '''postgr'''::tsquery), p.page_id
-> Nested Loop (cost=0.00..718.48 rows=1 width=621) (actual time=4.278..305.671 rows=44 loops=1)
-> Nested Loop (cost=0.00..695.00 rows=21 width=204) (actual time=0.829..76.740 rows=3210 loops=1)
-> Seq Scan on page p (cost=0.00..524.95 rows=21 width=204) (actual time=0.804..19.686 rows=3210 loops=1)
Filter: (((page_is_redirect)::text = '0'::text) AND (page_namespace = ANY ('{0,9,11}'::integer[])))
-> Index Scan using revision_rev_id_key on revision r (cost=0.00..8.09 rows=1 width=8) (actual time=0.012..0.013 rows=1 loops=3210)
Index Cond: (p.page_latest = r.rev_id)
-> Index Scan using pagecontent_pkey on pagecontent c (cost=0.00..1.11 rows=1 width=425) (actual time=0.069..0.069 rows=0 loops=3210)
Index Cond: (r.rev_text_id = c.old_id)
Filter: (textvector @@ '''postgr'''::tsquery)
Total runtime: 306.118 ms
This plan joins page and revision to determine the latest revision of each page, then scans every matching pagecontent row to see which ones match my query.
There is also another plan, depending on the amount of RAM available and the estimated number of 'latest' pagecontent rows:
Limit (cost=2979.49..2979.50 rows=4 width=504) (actual time=224.594..224.646 rows=20 loops=1)
-> Sort (cost=2979.49..2979.50 rows=4 width=504) (actual time=224.591..224.610 rows=20 loops=1)
Sort Key: (ts_rank(p.titlevector, '''postgr'''::tsquery)), p.page_id
Sort Method: top-N heapsort Memory: 37kB
-> Hash Join (cost=2689.31..2979.45 rows=4 width=504) (actual time=211.141..224.432 rows=43 loops=1)
Hash Cond: (p.page_latest = r.rev_id)
-> Seq Scan on page p (cost=0.00..276.86 rows=3527 width=82) (actual time=0.460..10.202 rows=3118 loops=1)
Filter: ((page_is_redirect = '0'::bpchar) AND (page_namespace = ANY ('{0,9,11}'::integer[])))
-> Hash (cost=2688.26..2688.26 rows=84 width=430) (actual time=210.409..210.409 rows=1517 loops=1)
-> Hash Join (cost=534.76..2688.26 rows=84 width=430) (actual time=26.557..207.725 rows=1517 loops=1)
Hash Cond: (r.rev_text_id = c.old_id)
-> Seq Scan on revision r (cost=0.00..1836.94 rows=84194 width=8) (actual time=0.023..98.850 rows=84194 loops=1)
-> Hash (cost=533.59..533.59 rows=93 width=430) (actual time=18.182..18.182 rows=1515 loops=1)
-> Bitmap Heap Scan on pagecontent c (cost=190.83..533.59 rows=93 width=430) (actual time=0.585..15.663 rows=1515 loops=1)
Recheck Cond: (textvector @@ '''postgr'''::tsquery)
-> Bitmap Index Scan on ts2_page_text2 (cost=0.00..190.81 rows=93 width=0) (actual time=0.431..0.431 rows=1515 loops=1)
Index Cond: (textvector @@ '''postgr'''::tsquery)
Total runtime: 224.765 ms
Times are different because this machine is much more powerful.
This time, PostgreSQL decides to get all matching rows from pagecontent, whatever their version, and then determines which ones are the latest.
In both cases this is rather inefficient, since I assume we only want to search the latest version of the articles.
So I'm coming to the point ...
Is there a reason we index every version of every page's content?
For instance, with my database, I've tested removing all textvectors from pagecontent except for the latest version of each page. My text index size went from 400 MB to 15 MB, and
my text search times went down to a near-constant 10 ms for all queries. I can then maintain the textvectors by modifying the trigger on pagecontent a bit so that it clears the previous
record's textvector while updating the table.
If I'm posting in the wrong place, please tell me. If the idea is stupid, please tell me also :)
Cheers
Marc Cousin
Hi all,
I'm trying to port my MediaWiki v1.10.0 installation from one server to another. I've
copied over all the files and the MySQL database, and the pages are
appearing, with one problem:
The redirect wants to send everything to the wrong directory. The
directory on the original server was "mwiki". It would access pages
through the ugly style of
"http://olddomain.com/mwiki/index.php?title=Main_Page". The new server
redirects everything to "wiki" instead, so on the main page it tries for
"http://newdomain.com/wiki/index.php?title=Main_Page" and gets an error.
The info in LocalSettings.php that seems pertinent is
$wgScriptPath = "/mwiki";
and
$cpInstallURL = "http://newdomain.com/mwiki";
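For comparison, this is a minimal sketch of how the path-related settings usually
look in LocalSettings.php; the values here are placeholders rather than the actual
configuration, and $wgArticlePath is only present when short URLs were set up:

    # Sketch only; adjust to the real host and directory.
    $wgServer      = "http://newdomain.com";          # protocol and host, no path
    $wgScriptPath  = "/mwiki";                        # directory containing index.php
    $wgScript      = "$wgScriptPath/index.php";
    # Only set if pretty URLs were configured; a stale value such as "/wiki/$1"
    # here (or a leftover web-server rewrite rule) would send requests to /wiki.
    $wgArticlePath = "$wgScriptPath/index.php?title=$1";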
That's all I got. Any pointers?
Thanks,
Dave Smey
Brooklyn, NY
I really like FCKeditor for editing tables, but some of my users still
want the old edit toolbar, for nostalgia or whatever. I came up with two
solutions for moving it, both of which work as long as FCKeditor is not
installed. The first is to move {$toolbar} in EditPage.php, and the second
is a bit of JavaScript:
document.getElementById("editform").insertBefore(
    document.getElementById("toolbar"),
    document.getElementById("editpage-copywarn")
);
You can run that in Firefox by prefixing javascript: to it to get an exact idea
of what I want to do. Unfortunately, it looks like FCKeditor registers
itself as the official editor, so my efforts are for naught. Does anyone know a
way to do this?
I'm looking for an extension, magic word, or similar that would allow me to
display the page ID in an article. I have searched Google, MediaWiki,
etc., but can't find one. I figured I would see if someone out there knows
of existing functionality like this.
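To make the idea concrete, here is a rough sketch of a custom parser function
that would expose the page ID, written in the style of 1.12-era extensions. It
is hypothetical, untested code, not something that ships with MediaWiki; the
{{#pageid:}} name and the wfPageId* functions are made up for illustration:

    # Sketch of a hypothetical PageId extension (would live in PageId.php
    # and be included from LocalSettings.php).
    $wgExtensionFunctions[] = 'wfPageIdSetup';
    $wgHooks['LanguageGetMagic'][] = 'wfPageIdMagic';

    function wfPageIdSetup() {
        global $wgParser;
        $wgParser->setFunctionHook( 'pageid', 'wfPageIdRender' );
    }

    function wfPageIdMagic( &$magicWords, $langCode ) {
        $magicWords['pageid'] = array( 0, 'pageid' );
        return true;
    }

    function wfPageIdRender( $parser, $pageName = '' ) {
        // With no argument, use the page being parsed; otherwise look up the
        // named page. getArticleID() returns 0 for pages that do not exist.
        $title = ( $pageName === '' )
            ? $parser->getTitle()
            : Title::newFromText( $pageName );
        return $title ? (string)$title->getArticleID() : '';
    }

Usage in a page would then be {{#pageid:}} or {{#pageid:Main Page}}.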
Thanks,
Bobby
(Thanks to Brion for helping to correct my posting problem) ...
> I apologize in advance for asking what must surely be a simple
> question (or simple answer). Image uploaded to wiki. What syntax can
> I use to allow a click on the image's thumbnail to go directly to some
> site, rather than to the image at higher resolution?
>
> thanks in advance.
> Rob
Hi Rob,
The Baltimore Collective is working on this problem for our own site.
You can see our progress here ...
http://BaltoCo.org/wiki/Help:Images
http://BaltoCo.org/wiki/Help:Images2
http://BaltoCo.org/wiki/Help:Images3
If this text-link works for you (for example):
[http://wikipedia.org WIKIPEDIA]
Then this image-link should work for you:
[http://wikipedia.org http://upload.wikimedia.org/wikipedia/en/thumb/b/bc/Wiki.png/100px-Wiki.png]
Instead of the textual link WIKIPEDIA, it substitutes an image of the
Wikipedia logo. There are still issues with the little blue arrow icon
that indicates an off-site external link.
I can't recall exactly how we have our image URLs set. Since we are still
low-traffic, we have it set so that users can link to external pictures,
which is not recommended.
Good luck, hope this helps ...
Richard_BaltoCo
Hi,
I have MW 1.12.0, PHP 5.25, MySQL 5.0.51a-3ubuntu1, and ParserFunctions 1.1.1 installed. However, my wiki shows some strange behaviour with the #ifexist function: for {{#ifexist:page|Yes|No}} the response is always No, regardless of whether the page exists. Can anyone help? Thanks!
I want to leave this group, but I don't know the details I used to register :-(
2008/7/28 <mediawiki-l-request(a)lists.wikimedia.org>
>
> Today's Topics:
>
> 1. Re: external links (Katharina Wolkwitz)
> 2. Re: Text search performance and postgresql (Marc Cousin)
> 3. Re: Text search performance and postgresql (Platonides)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 28 Jul 2008 08:51:40 +0200
> From: Katharina Wolkwitz <wolkwitz(a)fh-swf.de>
> Subject: Re: [Mediawiki-l] external links
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID: <488D6C7C.4070100(a)fh-swf.de>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Hi Matt,
>
> have a look at the bottom of the extension's talk page. The behaviour you
> describe is a bug with Firefox 3 and there's a fix that works...
>
> Matt Long wrote on 25.07.2008 at 18:39:
> > The problem with that extension is that it grabs the filename only.. it
> > cuts off the path info. In the browse for file window it shows the full
> > path, but after clicking ok it truncates everything down to just the
> > file name.
> > Matt
> >
> > Katharina Wolkwitz wrote:
> >> Hi Matt,
> >>
> >> have a look at the FileLink-extension:
> >>
> >> http://www.mediawiki.org/wiki/Extension:FileLink
> >>
> >> Regards
> >> K. Wolkwitz
> >>
> >>> Matt Long wrote on 24.07.2008 at 19:55:
> >>
> >>> Hello, I am trying to find a much easier way for people to use external
> >>> file links in my wiki. Currently every space has to be replaced with a
> >>> %20. Is there something like {{urlencode:}} that would work? The
> >>> urlencode magic word converts spaces to +; how do I get %20 instead?
> >>>
>
> --
>
> Kind regards,
>
> Katharina Wolkwitz
>
> Fachhochschule Südwestfalen
> Hochschulbibliothek
> Haldener Straße 182
> 58095 Hagen
> Tel.: 02331/987-2706
>
>
>
> ------------------------------
>
> Message: 2
> Date: Mon, 28 Jul 2008 11:16:19 +0200
> From: Marc Cousin <mcousin(a)sigma.fr>
> Subject: Re: [Mediawiki-l] Text search performance and postgresql
> To: MediaWiki announcements and site admin list
> <mediawiki-l(a)lists.wikimedia.org>
> Message-ID: <200807281116.19095.mcousin(a)sigma.fr>
> Content-Type: text/plain; charset="ansi_x3.4-1968"
>
> I totally agree with you.
>
> Is there a way of getting it into a future MediaWiki release (I can do the
> work if necessary)?
>
>
>
> On Friday 25 July 2008 17:05:48 Platonides wrote:
> > Marc Cousin wrote:
> > > For PostgreSQL, there are at least 2 solutions :
> > > - Put the textvector into the page table
> > > - Do the same as for mysql : create a searchindex table (even if the
> > > myisam justification for this table doesn't hold for mysql)
> > >
> > > Plus the temporary fix I was talking about in my mail : put an empty
> > > textvector for all versions of a document except its last one, by
> > > changing the trigger a bit. That divided the size of my fulltext index by
> > > 30 and made the text search fast again.
> > >
> > > What would be the preferred way to solve this problem ?
> >
> > Probably to create a new table. The less you differ from the mysql
> > setup, the better.
> >
> >
>
>
>
>
>
> ------------------------------
>
> Message: 3
> Date: Mon, 28 Jul 2008 13:34:41 +0200
> From: Platonides <Platonides(a)gmail.com>
> Subject: Re: [Mediawiki-l] Text search performance and postgresql
> To: mediawiki-l(a)lists.wikimedia.org
> Message-ID: <g6kasg$emi$1(a)ger.gmane.org>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Marc Cousin wrote:
> > I totally agree with you.
> >
> > Is there a way of getting it into a future MediaWiki release (I can do the
> > work if necessary)?
> >
>
> Sure. Ask Brion for an SVN account (you'll need to send him a public key).
> The PostgreSQL backend is not too well maintained; people wanting to use
> Postgres appear from time to time and give it a kick.
> I don't remember who is currently supposed to be taking care of it, but
> since he hasn't commented on this, I take it he doesn't oppose ;)
>
>
>
>
> ------------------------------
>
>
>
> End of MediaWiki-l Digest, Vol 58, Issue 34
> *******************************************
>
--
I am happy to answer any questions or provide further information.
Kind regards,
Stephan Weishaupt
____________________________________________________________
Gottschedstr. 5, 22301 Hamburg, Germany
Mobil: +49 1577 7830232
http://www.weishaupt-stephan.de
stephan.weishaupt(a)gmail.com
____________________________________________________________