For several months now, I have been busy with the interwiki links, using Rob
Hooft's bot. The more I work with it, the more work there seems to be.
Others are also working hard on it. Actually, I think the system of
interwiki links is not optimal at all. It would be better to have a central
location where all the interwikis are listed, like Commons. The big
advantage would be that there is only one location where you have to put
your interwikis, and where you can correct the errors. The number of edits
would be reduced enormously.
Maybe it could even be combined with the existing Commons. The system could
start by making a dump of all interwikis from the English Wikipedia, which I
think has the most interwikis.
For instance, for the article Babelfish:
- On the central location, there is a page Babelfish which lists only the
interwikis. Maybe it needs one or two sentences of explanation, if required
to explain the exact meaning.
- On your local-language Wikipedia you only have to link to this page
Babelfish. The links then appear automatically at the side of the article,
the same way as we are used to now.
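The saving the proposal describes can be sketched roughly like this (the page data below is hypothetical, just to illustrate the idea): today each of n language wikis stores links to the n - 1 others, while a central store would need only one entry per language.

```python
# Sketch of a central interwiki store (hypothetical data).
# Today every wiki keeps its own copy of the links, so n languages
# need n * (n - 1) link entries; a central page needs only n.

central = {
    "Babelfish": {           # one central page per concept
        "en": "Babel fish",
        "de": "Babelfisch",
        "nl": "Babelvis",
    }
}

def links_for(concept, my_lang):
    """Links a local wiki would display: every language except its own."""
    return {lang: title
            for lang, title in central[concept].items()
            if lang != my_lang}

n = len(central["Babelfish"])
print(links_for("Babelfish", "nl"))   # what the Dutch article would show
print(n * (n - 1), "link entries today vs", n, "centrally")
```

With per-wiki storage, fixing one wrong link means edits on many wikis; with the central page, a single edit to the dictionary above fixes it everywhere.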
Of course there will be difficulties to solve, such as which language to
use, and whether it will be allowed to rename those pages (please don't).
Maybe both problems could be solved by redirects. I personally wouldn't care
if the main language is English. Surely, these pages are only a technical
means to streamline the enormous number of interwikis. Has anyone ever
estimated how many there are? And how much effort all the editing costs? The
name of these pages could be a number as well, but that is not how people
like to work.
Maybe this has been suggested before; in that case, my apologies.
Elly Waterman
Cool, thanks David! I had read some of the stuff about the validation project, but no specifics. I checked it out at test.leuksman.com, and it looks great; I just want to see how flexibly it's implemented, whether the questions are easily configurable, etc. (for epinions-style ratings).
I saw somewhere concerns about people promoting their own articles, and about people making retaliatory negative votes. I see that the voting has the option to "Clear my older validation data"; setting this to "yes" and using the median instead of the mean should help reduce the impact of self-promoters and others.
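The point about the median being more robust than the mean is easy to demonstrate; here is a small sketch with made-up ratings on a 1-5 scale:

```python
from statistics import mean, median

# Hypothetical ratings on a 1-5 scale.
honest = [3, 3, 3, 4, 4]
attacked = honest + [1, 1]   # two retaliatory minimum votes

print(mean(honest), mean(attacked))      # the mean drops sharply
print(median(honest), median(attacked))  # the median is unchanged
```

A couple of extreme votes pull the mean noticeably, while the median stays at 3 until the attackers make up close to half of all voters.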
Well, I guess it's time to take a deep breath, set up a test installation of 1.5, and dig in!
David Gerard wrote:
> [[m:Article validation feature]]
> [[m:Category:Article validation]]
> [[m:En validation topics]]
> [[m:De Validierungsthemen]]
>
> It's live on Wikimedia wikis (or at least en: and de: Wikipedias) for 1.5.
> You can try it out on http://test.leuksman.com/ - click on 'Validate' in
> the Monobook skin.
>
> At the moment nothing is planned to be done with the data other than gather
> it, make it viewable (you can view every rating, just like you can view
> every edit) and show a numerical average. We'll see what people do with the
> test data, then we'll probably write some apps for it, tweak the questions
> and then throw away the data and start over.
>
> I think a *lot* of MediaWiki installations will want to play with this
> feature.
Best Regards,
Aerik
I've added my latest LDAP Authentication patch to bugzilla:
http://bugzilla.wikipedia.org/show_bug.cgi?id=814
I will update my corresponding documentation to match the current patch
level.
This documentation is located here:
http://meta.wikimedia.org/wiki/LDAP_Authentication
Is this still being considered for inclusion in MediaWiki 1.5? I'm almost
positive that all of the core-code changes required for my planned
functionality have been added. Almost all of the changes were hooks; the
rest were for security. If there are any required changes, bugfixes, or
security concerns, let me know.
At this time, the LDAP patch has support for:
* Simple authentication through SSL using direct binds, or
proxy authentication
** Note: proxy authentication does not currently work with multiple
domains. Also, you will not yet be able to add LDAP users when using proxy
authentication; this will be added in the next version.
* Storage/Retrieval of some user preferences
* Ability to add new users to LDAP from Mediawiki
* Ability to change LDAP passwords through Mediawiki
* Ability to mail a temporary password so that users can change their LDAP
password
* Ability to do all of the above on multiple domains (including the local
database)
Future versions will eventually have the following functionality:
* A custom schema for LDAP
* Access control using security groups (authentication only)
* Ability to use smart cards or CAC cards to log in to MediaWiki using
certificates
* Ability to use LDAP as a complete backend for user information, using a
single or multiple domains (or a combination of LDAP and the local database
as cache or backup)
If anyone can think of other features that should be added, let me know.
V/r,
Ryan Lane
NAVOCEANO
> Would it be possible to give username priority first to sysops,
> then by number of accounts owned across different wikis, then
> most recent edit activity, and finally date of first registration? or
> some combination of these. That should reduce the number of
> conflicts to almost zero.
This would lead to a conflict of values.
We should be as technocratic as possible with all this stuff,
so that the fewest 'value conflicts' can be raised. And it can
certainly be implemented faster that way.
The best selector is actually the most ancient revision
(which nearly matches the date of first registration).
Number of accounts on different wikis? Sure, there are
people who register accounts on every possible wiki.
Most recent edit activity? How is that relevant to anything?
Sysops? Should we encourage people to climb that ladder
just so their accounts are safer? ;-)
Cheers,
Domas
How's the antivirus filtering on mail.wikimedia.org? The less
obviously dispensable crap in the mailing list queue, the closer it
will be to humanly manageable ...
I'm not sure the spam has abated, but having any mail from real
people rejected is a bad thing. (wikien-l, unlike wikitech-l, has many
legitimate users who can't work computers. Also, it's a semi-official
contact point for blocked users on en:.)
Mind you, the queue is going to be a nightmare to deal with. The Nazi
spam hasn't abated in the slightest, apparently, with a fresh attack
going out about a week ago. Good article from Der Spiegel (in English)
on the Nazi virus, which sends people to Spiegel articles (which
they're not happy with):
http://service.spiegel.de/cache/international/0,1518,356297,00.html
- d.
Hello
I'm constructing a large literary resource, and would like to query
articles about authors automatically and receive Wikipedia articles
back as RSS/XML documents to present on my website, with all the
relevant backlinks to Wikipedia.
Does Wikipedia offer customised syndication of its content?
Regards
Amit
quotationsbook.com
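As far as I know there is no per-article RSS feed, but MediaWiki's Special:Export page returns any article as an XML document, which could then be fetched and transformed for syndication. A minimal sketch (URL construction only; the actual fetch is a network call, left to the caller):

```python
from urllib.parse import quote

def export_url(title, lang="en"):
    """URL of the Special:Export XML dump for one article
    on a given language Wikipedia."""
    return ("https://%s.wikipedia.org/wiki/Special:Export/%s"
            % (lang, quote(title)))

url = export_url("Douglas Adams")
print(url)
# urllib.request.urlopen(url).read() would return the article as XML.
```

The XML includes the page title and wikitext, so backlinks to the source article are easy to reconstruct from the title and language code.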
Thanks for the recent advice on including PHP. I've made a modification to
my installation of MediaWiki that uses <loggedin> </loggedin> to display
what is within the tags only if the person is logged in to their account.
I am now trying to make a different tag that displays what is between a
pair of tags only if a person with that IP address is listed as logged in
on my local installation of phpBB.
I can ask the logged-in question on a normal PHP page by including this at
the top:
<?php
// phpBB 2.x session bootstrap; must run before any output is sent
define('IN_PHPBB', true);
$phpbb_root_path = './forum/';
include($phpbb_root_path . 'extension.inc');   // sets $phpEx (script extension)
include($phpbb_root_path . 'common.'.$phpEx);  // config, DB connection, session code
$userdata = session_pagestart($user_ip, PAGE_INDEX); // look up this visitor's session
init_userprefs($userdata);                     // load the user's preferences
?>
and this to display the alternatives:
<?php
// session_pagestart() fills in 'session_logged_in' on $userdata
if( $userdata['session_logged_in'] )
{ echo('Logged in'); }
else { echo('Not logged in'); }
?>
Does anyone know of a similar extension that queries another database from
within mediawiki?
Thanks in advance.
The user_name field in the user table didn't have a unique index and
wasn't properly locked for creations, so under some circumstances (e.g.
heavy load plus users clicking 'submit' repeatedly, exacerbating race
conditions) duplicate user accounts were getting created.
In CVS HEAD I've added an updater and a manual script (userDupes.*) for
cleaning up duplicate accounts so the unique index can be applied; the
script can also be copied in and run on 1.4 and I've gone ahead and done
this on the Wiki*edia wikis.
-- brion vibber (brion @ pobox.com)
I'm sorry if this is a duplicate request; I already posted it in Bugzilla,
but there was no answer... The question is simple: who can I contact to
change namespaces? Who can do this?
Thank you
Andre