Hi,
I use the Lockdown extension to grant access to special pages for
specific user groups. My problem is that users with "normal" rights
shouldn't realize that there are pages they are not allowed to access.
So I wanted to program something so that a link in the sidebar is visible
depending on which user group the logged-in user belongs to:
user belongs to the special user group --> link is visible in the sidebar
user belongs to the normal user group --> link is not visible in the sidebar
Does anybody know what possibilities I have? I thought of programming a
JavaScript hook implemented in monobook.js, but I don't know how to
manipulate the sidebar (the position of the link).
My second idea was to manipulate monobook.php...
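To make that second idea a bit more concrete, I imagine wrapping the sidebar
link output in skins/MonoBook.php in a group check, roughly like below
(completely untested; 'specialgroup' and the link text 'Secret pages' are
only placeholders I made up):

# Inside the loop in skins/MonoBook.php that prints the sidebar links,
# skip the restricted entry unless the user is in the right group.
global $wgUser;
$isSpecial = in_array( 'specialgroup', $wgUser->getGroups() );

foreach ( $cont as $key => $val ) {
    if ( $val['text'] == 'Secret pages' && !$isSpecial ) {
        continue; # normal users never see that this link exists
    }
    # ... the existing code that outputs the <li> for this link ...
}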
Is there another way, or has anybody already done something similar, or
has an idea?
Thank you very much for any hint,
Julia
On 1/3/08, huji(a)svn.wikimedia.org <huji(a)svn.wikimedia.org> wrote:
> Revision: 29235
> . . .
> Log Message:
> -----------
> (bug 12489) Special:Userrights should be listed under restricted special pages
> . . .
> - 'Userrights' => array( 'SpecialPage', 'Userrights' ),
> + 'Userrights' => array( 'SpecialPage', 'Userrights', 'userrights' ),
Uh? This locks out anyone who has $wgAddGroups/$wgRemoveGroups but
doesn't have the userrights right. That's why Werdna didn't do it to
start with.
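For context, the point of Werdna's scheme is that LocalSettings.php can hand
out partial rights without the blanket 'userrights' right, along these lines
(the group names are only an example):

# LocalSettings.php -- example group names only.
# Bureaucrats may grant and revoke the 'bot' flag via Special:Userrights
# even though they do not hold the 'userrights' right itself.
$wgAddGroups['bureaucrat'][]    = 'bot';
$wgRemoveGroups['bureaucrat'][] = 'bot';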
Hi
According to http://www.mediawiki.org/wiki/Wiki_farm , on wiki farms created
using symlinks we can run update.php for each wiki separately. The question
is: is there a way to run update.php for other types of wiki farms? To my
knowledge Wikimedia doesn't use symlinks for its wikis; how do they get
updated? (There are hundreds of them, so there must be an easy way to do
it.)
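(The brute-force approach I can think of is a small driver script that points
maintenance/update.php at each wiki's LocalSettings.php via the --conf option
of the maintenance scripts -- an untested sketch with placeholder paths below
-- but surely there is something smarter for hundreds of wikis?)

<?php
# run-all-updates.php -- untested sketch: run the schema updater once per wiki.
$wikis = array(
    '/srv/farm/wiki1/LocalSettings.php',
    '/srv/farm/wiki2/LocalSettings.php',
);

foreach ( $wikis as $settings ) {
    echo "=== Updating $settings ===\n";
    # --quick skips update.php's confirmation countdown;
    # --conf tells the maintenance scripts which settings file to load.
    passthru( 'php maintenance/update.php --quick --conf ' .
        escapeshellarg( $settings ) );
}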
Hojjat (aka Huji)
On 1/3/08, huji(a)svn.wikimedia.org <huji(a)svn.wikimedia.org> wrote:
> Revision: 29220
> ...
>
> Log Message:
> -----------
> More explanatory messages shown when an upload error happens.
>
> ...
> + case 1: # The uploaded file exceeds the upload_max_filesize directive in php.ini.
> + return new WikiErrorMsg( 'importuploaderror' );
> ...
> -'importuploaderror' => 'Upload of import file failed; perhaps the file is bigger than the allowed upload size.',
> ...
> Modified: trunk/phase3/maintenance/language/messages.inc
> ...
> - 'importuploaderror',
You seem to have removed the 'importuploaderror' message, while still using it.
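If the intention really is one message per PHP upload error code, the mapping
could look roughly like this -- the '-size', '-partial' and '-nofile' keys are
names I made up, and whichever keys are used have to stay listed in
maintenance/language/messages.inc with English defaults:

# Sketch: turn PHP's $_FILES[...]['error'] code into a wiki error message.
function importUploadError( $code ) {
    switch ( $code ) {
        case UPLOAD_ERR_INI_SIZE:  # exceeds upload_max_filesize in php.ini
        case UPLOAD_ERR_FORM_SIZE: # exceeds the form's MAX_FILE_SIZE
            return new WikiErrorMsg( 'importuploaderror-size' );
        case UPLOAD_ERR_PARTIAL:   # file was only partially uploaded
            return new WikiErrorMsg( 'importuploaderror-partial' );
        case UPLOAD_ERR_NO_FILE:   # no file was uploaded at all
            return new WikiErrorMsg( 'importuploaderror-nofile' );
        default:                   # anything else keeps the generic message
            return new WikiErrorMsg( 'importuploaderror' );
    }
}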
Hi Stacey,
Great suggestion to have context hints for messages. Niklas thought the same
thing a few days ago and implemented it. Now we need 'people in the know' to
document the messages. The hints are displayed when editing messages.
We keep the documentation under 'qqq', a language code reserved for local use
in ISO 639-3. An overview of the currently documented messages, which can be
edited by everyone with the 'translator' role, can be found here:
* core messages:
http://translatewiki.net/w/?title=Special%3ATranslate&task=reviewall&group=core&language=qqq&limit=100
* extension messages:
http://translatewiki.net/w/?title=Special%3ATranslate&task=reviewall&group=ext-0-all&language=qqq&limit=100
If you already have editing rights, you can see the hints in action on
http://translatewiki.net/w/?title=MediaWiki:Ogg-long-audio/de&action=edit
It is of course a huge task to document everything, but if we all pitch in,
it's only 3600 pieces of documentation ;)
We are still considering whether we should commit the message documentation
at some point... especially because developers may edit MessagesQqq.php and
definitely not edit in Betawiki.
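(A documented message in MessagesQqq.php is just an ordinary message entry
whose 'translation' is the hint; the hint texts below are only an
illustration.)

<?php
# languages/messages/MessagesQqq.php -- documentation in place of translations.
$messages = array(
    'editingsection' => 'Page title shown while editing a single section of a page; $1 is the page title.',
    'editingcomment' => 'Page title shown while adding a new section ("comment") to a page; $1 is the page title.',
);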
Cheers! Siebrand
_____
From: mediawiki-i18n-bounces(a)lists.wikimedia.org
[mailto:mediawiki-i18n-bounces@lists.wikimedia.org] On Behalf Of Stacey Doljack
Borsody
Sent: Thursday, 3 January 2008 17:09
To: MediaWiki internationalisation; Wikimedia developers
CC: 'Wikimedia Translators'
Subject: Re: [Mediawiki-i18n] [Wikitech-l] An update on localisation
in MediaWiki
<snip>
Now if only we could get some better context documentation for each message,
there wouldn't be such problematic translations. Take for example
'editingsection' or 'editingcomment'. The English is pretty vague, and sure
enough the translation was rather silly when I finally saw it in Wikipedia.
Siebrand claims it only takes 16 hours to translate the most used messages,
but that's only if you know exactly what the variables in messages are and
which messages are used inside other messages.
<snip>
Dear Wikitech list members,
This is my first post here; I have been redirected by Alfio, who said you
might have some answers regarding my research.
Here's my original question
(http://it.wikipedia.org/wiki/Discussioni_utente:Alfio#Long_Tail_of_Wikipedia_Usage)
and Alfio's answer
(http://en.wikipedia.org/wiki/User_talk:Junjulien) right below:
Dear Alfio,
I am part of an organization that tries, amongst other things, to promote
the use of wikipedias in native languages. I believe you take an active part
in compiling these statistics:
http://en.wikipedia.org/wiki/Wikipedia:Multilingual_statistics, and I hope
you might point me in the right direction for my research. I am interested
in establishing a matrix which would give the number of users for each
"below 100,000 articles" Wikipedia (from #16 onward in this list:
http://meta.wikimedia.org/wiki/List_of_Wikipedias), against the countries
the visitors' traffic originates from, as well as against where the
editors are editing from. Obviously it would be great to have time as a
third dimension to follow trends...
Where to start? Whom should I ask?
Please contact me on my talk page
http://en.wikipedia.org/wiki/User_talk:Junjulien
Thanks a lot for your time,
Jun Julien Matsushita Project Coordinator Internews Europe
-----------------------
Hello,
Sorry for the late answer (holidays...). It is true that I compile part of
the Multilingual statistics, but my contribution is limited to getting the
current copy of http://meta.wikimedia.org/wiki/List_of_Wikipedias and
feeding it to a script which generates the table. The list of Wikipedias
itself, as far as I know, is bot-generated, but I have only the foggiest
idea of how (Wikipedia's ways can be strange at times... :-)
Your project would need a great deal of data about editors and readers, and
data about the readers is probably unavailable as it would require
collecting server logs, and Wikimedia servers do not have the capability of
recording visitor logs at our current load. I remember seeing on wikitech-l
that someone is recording decimated data, e.g. one in 10 or 100 visitors,
but deleting personal info like the originating IP, which would defeat
geolocation.
About the editors: again, the IP addresses of logged-in users are not
collected. For anonymous editors, though, the IP is recorded in the page
history, so you could download a full history dump from
http://download.wikimedia.org/ and see what you can recover. In short, I
don't really know how to help you. Try writing to wikitech-l (see
http://lists.wikimedia.org/mailman/listinfo/wikitech-l) and see if someone
has the data you need.
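If you do go that route, pulling the anonymous-contributor IPs out of such a
dump is not hard with PHP's XMLReader -- a rough, untested sketch (the file
name is a placeholder, and the real dumps are bzip2-compressed, so decompress
or stream them first):

<?php
# Count edits per anonymous IP in a pages-meta-history dump.
# In the export XML, <ip> appears inside <contributor> only for anonymous edits.
$reader = new XMLReader();
$reader->open( 'enwiki-pages-meta-history.xml' ); # placeholder file name

$counts = array();
while ( $reader->read() ) {
    if ( $reader->nodeType == XMLReader::ELEMENT && $reader->name == 'ip' ) {
        $reader->read();      # advance to the text node inside <ip>
        $ip = $reader->value;
        $counts[$ip] = isset( $counts[$ip] ) ? $counts[$ip] + 1 : 1;
    }
}
arsort( $counts );
print_r( array_slice( $counts, 0, 20, true ) ); # the 20 most active IPs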
Cheers,
Alfio
-------------------------
Has anyone a clue as to where to direct my efforts?
Thanks a lot for your time,
Jun Julien Matsushita
Radio Connect Project Coordinator
Internews Europe
14, cité Griset - 75011 Paris
France - www.internews.eu
skype: junjulien
Starting a new thread since the conversation has diverged.
I'm noticing that with both MediaWiki 1.11 and MediaWiki trunk there are many,
many templates missing after running the enwiki-20071018 dump through
mwdumper.
I searched around and noticed this thread:
http://www.gossamer-threads.com/lists/wiki/wikitech/108392?do=post_view_thr…
, where Jeff mentions that increasing the memory_limit and execution time
allowed certain templates to render. I've incorporated these changes and
still notice that certain templates are missing.
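(For anyone following along, the changes meant here are php.ini /
LocalSettings.php settings roughly like the following; the values are only
examples:)

ini_set( 'memory_limit', '256M' );     # heavy template chains exhaust the default limit
ini_set( 'max_execution_time', 120 );  # give slow page renders more time (seconds)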
What's more, when I try to copy the template from en.wikipedia.org and create
it on my local dev machine, I get the following error, both with
1.11 and trunk:
"""
A database query syntax error has occurred. This may indicate a bug in the
software. The last attempted database query was:
(SQL query hidden)
from within function "efUpdateCheckUserData". MySQL returned error "1146:
Table 'wikidb.cu_changes' doesn't exist (localhost)".
"""
Shouldn't the installation script have created this table? I ran mysqlcheck
and everything is fine (see below).
I was under the assumption that the dump contained all the templates in use.
Was this assumption incorrect? If it is correct, what is missing from my setup
that the normal setup (browsing to config) would have done?
Thanks,
Yousef
root@dreadnought:/home/srv/www/dreadnought/wiki# mysqlcheck -p wikidb
Enter password:
wikidb.archive OK
wikidb.categorylinks OK
wikidb.externallinks OK
wikidb.filearchive OK
wikidb.hitcounter
note : The storage engine for the table doesn't support check
wikidb.image OK
wikidb.imagelinks OK
wikidb.interwiki OK
wikidb.ipblocks OK
wikidb.job OK
wikidb.langlinks OK
wikidb.logging OK
wikidb.math OK
wikidb.objectcache OK
wikidb.oldimage OK
wikidb.page OK
wikidb.page_restrictions OK
wikidb.pagelinks OK
wikidb.querycache OK
wikidb.querycache_info OK
wikidb.querycachetwo OK
wikidb.recentchanges OK
wikidb.redirect OK
wikidb.revision OK
wikidb.searchindex OK
wikidb.site_stats OK
wikidb.templatelinks OK
wikidb.text OK
wikidb.trackbacks OK
wikidb.transcache OK
wikidb.user OK
wikidb.user_groups OK
wikidb.user_newtalk OK
wikidb.watchlist OK
Hello all,
I am trying to get hold of the latest dump of all English pages, all
revisions (enwiki-pages-meta-history.xml.bz2).
However, I am having real trouble tracking it down.
The most recent dump of the file (18th October 2007) seems to have
failed:
http://download.wikimedia.org/enwiki/20071018/
The directory for the December 18th dump is a blank page:
http://download.wikimedia.org/enwiki/20071218/
The progress page for all dumps shows 'abort' across the board:
http://download.wikimedia.org/backup-index.html
Does anyone know what is going on here?
I would really appreciate it if someone could point me in the right
direction.
Thank you for your time,
Mark Truran
University of Teesside
m.a.truran(a)tees.ac.uk
P.S. I have cross-posted this enquiry to the Wiki-research mailing list,
as my request stems from a research project. Apologies if this was
unwarranted.
Happy Holidays everyone.
I'm going through http://www.mediawiki.org/wiki/Special:Version -- it says
that the version of MediaWiki currently deployed is r28966, but that commit
was made only yesterday by Siebrand -- is that correct? Also, is the extension
list fully up to date, i.e. if I download and install everything on that page
and run mwdumper, will I have an exact mirror of Wikipedia?
Thanks,
Yousef