Some users on the English Wikipedia have started using a JavaScript tool
that causes them to HTTP GET every article that shows up on recent
changes, rapid-fire (see
http://en.wikipedia.org/w/index.php?title=User:Lupin/recent2.js ).
We wouldn't normally permit a bot to hit the wiki that fast, and unlike
this tool, bots aren't potentially in the hands of thousands of users
equipped with nothing more than a web browser.
Should we set up some guidelines on this now, or just wait until we are
forced to implement per-source HTTP request throttling?
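If per-source throttling does become necessary, its core is a
sliding-window hit counter keyed by client IP. The class below is a
minimal sketch under assumed limits, not anything that exists in
MediaWiki; a real deployment would need shared state (e.g. memcached)
across web servers rather than an in-memory array.

```php
<?php
// Hypothetical per-source request throttle: allow at most $maxHits
// requests per $window seconds from each client IP.
class SimpleThrottle {
    private $hits = array();   // ip => list of request timestamps
    private $maxHits;
    private $window;

    public function __construct( $maxHits, $window ) {
        $this->maxHits = $maxHits;
        $this->window  = $window;
    }

    // Returns true if the request from $ip at time $now is allowed.
    public function allow( $ip, $now ) {
        $cutoff = $now - $this->window;
        $recent = array();
        if ( isset( $this->hits[$ip] ) ) {
            foreach ( $this->hits[$ip] as $t ) {
                if ( $t > $cutoff ) {
                    $recent[] = $t;   // keep only hits inside the window
                }
            }
        }
        if ( count( $recent ) >= $this->maxHits ) {
            $this->hits[$ip] = $recent;
            return false;             // over the limit: throttle
        }
        $recent[] = $now;
        $this->hits[$ip] = $recent;
        return true;
    }
}
```

Different source IPs are tracked independently, so a single greedy
client is slowed without affecting anyone else.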
Hello,
I would like to ask for the creation of a Ripuarian Wikipedia. The
discussion has been going on at Meta-Wiki for some time and seems to
be finished. The test wiki at http://wikoelsch.dergruenepunk.de already
has more than 500 articles, and there have been several requests from
users who want to join as soon as it is a real Wikipedia.
Thanks,
Dominik
Job Description
Wanted: an experienced programmer with a background in the MediaWiki code
base.
ManyOne Networks has been tasked by a client to build an encyclopedia in a
wiki format. The wiki platform we are using is MediaWiki, the same
underlying platform as Wikipedia. Our client is aiming, like Wikipedia, to
build a large body of freely available content developed in a massively
collaborative mode via a wiki. Unlike Wikipedia, this client is working in a
specific topic area and with a specific (though large) body of potential
contributors.
Because of these (and other) differences, we find ourselves in need of
someone who can work with us on making some changes or enhancements to the
underlying MediaWiki code. It is our intent, by the way, that changes made
be offered back as open source to the MediaWiki developers. Some of the
areas where we hope to get help include:
- content replication across wiki namespaces;
- registration and authentication;
- role-based access control.
Interested applicants should contact Mike Matthews at ManyOne Networks with
resume and contact information.
Contract-to-hire possibilities.
Mike Matthews
ManyOne Networks
100 Enterprise Way, Suite G-370
Scotts Valley, CA 95066
831-438-9800 ext 132
mike(a)manyone.net
I figured this list would be a great place to find this person.
___________________________________________________
John Anthony Hartman
website: http://www.websage.org
podcast: Multi-Media Me- http://pmo.websage.org
"Any sufficiently advanced technology is indistinguishable from magic. "
--Arthur C. Clarke
Has anyone tried integrating a reputation system with MediaWiki, so
that editors are awarded points/feedback for their work?
Any tips or pointers very welcome.
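As a starting point, a scoring function could weight each saved edit by
its size; one would presumably call it from a save hook such as
ArticleSaveComplete. The point values below are purely illustrative
assumptions, not part of any existing extension:

```php
<?php
// Hypothetical per-edit scoring: a flat base score plus a capped bonus
// for larger changes. All values are made up for illustration.
function editPoints( $oldLen, $newLen, $isMinor ) {
    $base  = $isMinor ? 1 : 2;                  // minor edits earn less
    $delta = abs( $newLen - $oldLen );          // size of the change
    $bonus = min( 5, intval( $delta / 500 ) );  // cap the size bonus
    return $base + $bonus;
}
```

Capping the size bonus discourages gaming the system by pasting in huge
blocks of text.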
Paul
--
Yellowikis is to Yellow Pages, as Wikipedia is to The Encyclopedia Britannica
Just a heads up; I'm moving this weekend and may be scarce for a few
days until my phone/internet gets set up.
Any priority site/software issues should be directed to this list or the
bug tracker, rather than private email to me, to make sure somebody sees
them.
-- brion vibber (brion @ pobox.com)
Yes, take a look at page 7 of this PDF:
<http://std.dkuug.dk/jtc1/sc2/wg2/docs/n2633r.pdf>
If you notice, words like ''sudah'', ''orang'' and ''anaknya'' have the
virama diacritic (we call it ''ma''). Though those are Indonesian,
certain words in Buginese like ''padang'', ''barek'' and ''sallatang''
need this diacritic. The underline technique is Andy Mallarangeng's
idea; he was the first person to create Lontara software.
Thank you.
Muhammad Zainuddin
Does anyone know if the mwdumper program has image import capabilities?
I'm able to import Wikipedia data into my wiki, but I also need to
import certain images, and I can't find a way to do that effectively.
Ryan
I have set up wiki which can only be read by logged in users, by setting:
$wgGroupPermissions['*' ]['createaccount'] = false;
$wgGroupPermissions['*' ]['read'] = false;
$wgGroupPermissions['*' ]['edit'] = false;
$wgGroupPermissions['user' ]['move'] = true;
$wgGroupPermissions['user' ]['read'] = true;
$wgGroupPermissions['user' ]['edit'] = true;
$wgGroupPermissions['user' ]['upload'] = true;
However, when a non-logged-in user visits any page, an error page is
shown; this appears to call:
function returnToMain( $auto = true, $returnto = NULL )
which contains a 10-second refresh (to the main page) with the notice
***
Login Required
You must login to view other pages.
Return to Main Page.
***
The refresh comes from this line:
$wgOut->addMeta( 'http:Refresh', '10;url=' . $titleObj->escapeFullURL() );
I have set the refresh to 600 seconds to reduce server load. This
wiki is on the Internet (that is, not an intranet).
Any other thoughts or suggestions?
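For reference, the 600-second refresh described above amounts to
changing the delay value in the line quoted earlier (the number before
the semicolon is the seconds the browser waits before reloading):

```php
// In returnToMain(): wait 600 seconds instead of 10 before redirecting
// the logged-out visitor back to the main page.
$wgOut->addMeta( 'http:Refresh', '600;url=' . $titleObj->escapeFullURL() );
```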
Regards,
Gordo
--
"Think Feynman"/////////
http://pobox.com/~gordo/
gordon.joly(a)pobox.com///
On 11/4/05, Magnus Manske <magnus_manske(a)users.sourceforge.net> wrote:
> Update of /cvsroot/wikipedia/phase3/includes
> In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv21820/includes
>
> Modified Files:
> EditPage.php
> Log Message:
> Additional hook
>
> Index: EditPage.php
> ===================================================================
> RCS file: /cvsroot/wikipedia/phase3/includes/EditPage.php,v
> retrieving revision 1.231
> retrieving revision 1.232
> diff -C2 -d -r1.231 -r1.232
> *** EditPage.php 3 Nov 2005 22:40:02 -0000 1.231
> --- EditPage.php 4 Nov 2005 15:32:26 -0000 1.232
> ***************
> *** 155,158 ****
> --- 155,161 ----
> function edit() {
> global $wgOut, $wgUser, $wgRequest, $wgTitle;
> + $l = strlen ( $wgOut->mBodytext ) ;
> + wfRunHooks( 'AlternateEdit', array( &$this ) ) ;
> + if ( $l != strlen ( $wgOut->mBodytext ) ) return ; # Something's changed the text, my work here is done
>
> $fname = 'EditPage::edit';
>
This is the wrong way to do something like this; it should look like:
if ( !wfRunHooks( 'AlternateEdit', array( &$this ) ) )
return;
That way you avoid two calls to strlen() and allow the extension using
the hook to override the function even if it doesn't change
$wgOut->mBodytext (and you should really use the ->getHTML() accessor
if you want it).
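An extension using the hook under that convention might look like the
following sketch (the handler name is hypothetical; the point is that a
handler returning false makes wfRunHooks() return false, which aborts
the default edit page):

```php
<?php
// Hypothetical handler for the 'AlternateEdit' hook. Under the
// return-value pattern suggested above, returning false suppresses the
// standard EditPage::edit() form.
$wgHooks['AlternateEdit'][] = 'myAlternateEditor';

function myAlternateEditor( &$editPage ) {
    // ... render a replacement edit interface here, e.g. via
    // $wgOut->addHTML( ... ) ...
    return false; // false = we handled the edit page ourselves
}
```

A handler that only wants to observe the edit, not replace it, would
return true instead.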
Hi all,
First of all, I have my own wiki running on MediaWiki 1.5. I have the
latest current-pages database dump in XML format, which I imported into
MediaWiki using the PHP import script.
Before I installed MediaWiki 1.5 I had MediaWiki 1.4, and I find that
the database schema has changed. My observations can be summarized by
the following table:
MediaWiki 1.5 (output of "SHOW TABLES;" in MySQL)
N.B. I used the prefix "wiki" for the tables during installation.
wikiarchive
wikicategorylinks
wikihitcounter
wikiimage
wikiimagelinks
wikiinterwiki
wikiipblocks
wikilogging
wikimath
wikiobjectcache
wikioldimage
wikipage
wikipagelinks
wikiquerycache
wikirecentchanges
wikirevision
wikisearchindex
wikisite_stats
wikitext
wikitrackbacks
wikiuser
wikiuser_groups
wikiuser_newtalk
wikivalidate
wikiwatchlist
Apart from the tables above, MediaWiki 1.4 also had the tables listed
below.
wikiblobs
wikibrokenlinks
wikicur
wikilinks
wikilinkscc
wikilogging
wikitranscache
wikiuser_rights
wikivalidate
Having described the background, I have the following questions:
1. Does this mean that the database schema of Wikipedia has changed?
2. The schema looks really simple! Is that all there is in the backend
of the mighty Wikipedia?
3. I have observed that under the history tab in Wikipedia, we get all
the versions of the current page, and we can select any two versions and
click on compare to find the changes. Apparently Wikipedia stores a
snapshot of every change to a page (article). I presume it's stored in
the recentchanges table. Am I correct in my presumption?
4. When does a page or article enter the wikiarchive table? I
experimented with my own wiki (running on MediaWiki 1.5) and discovered
that if I add a new article, its entry appears in the wikirecentchanges,
wikipage and wikirevision tables, but not in the wikiarchive table. So
when does it enter the latter, and under what circumstances?
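For what it's worth, in the 1.5 schema revision metadata lives in the
revision table and the text itself in the text table; recentchanges is
only a rolling cache of recent events, and archive receives a page's
revisions when the page is deleted, not when it is edited. A history
query against the prefixed tables listed above could be built like this
(a sketch only; MediaWiki's own Database class would normally handle the
quoting):

```php
<?php
// Build the SQL to list all revisions of one page in a MediaWiki 1.5
// schema with table prefix "wiki". Sketch only: real code should
// escape $title instead of interpolating it directly.
function historyQuery( $prefix, $title, $ns = 0 ) {
    return "SELECT rev_id, rev_timestamp, rev_comment" .
           " FROM {$prefix}revision" .
           " JOIN {$prefix}page ON rev_page = page_id" .
           " WHERE page_namespace = $ns AND page_title = '$title'" .
           " ORDER BY rev_timestamp DESC";
}
```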
Please clarify these queries; if you know the answer to any of them,
please feel free to reply.
Thanking you
C