Hi,
we use LDAP for user authentication. But the question is: what happens
if LDAP fails? The documentation on how to set up or repair the LDAP
server is inside our wiki.
So I wonder: is it possible to use LDAP for "normal" users, but have
one local user (let's call him "admin") who authenticates against the
database?
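Something along these lines in LocalSettings.php is what I have in
mind (a rough, untested sketch; I'm assuming the LdapAuthentication
extension here, and the domain and server names are only placeholders):

require_once "$IP/extensions/LdapAuthentication/LdapAuthentication.php";
$wgAuth = new LdapAuthenticationPlugin();
$wgLDAPDomainNames = array( 'example' );
$wgLDAPServerNames = array( 'example' => 'ldap.example.com' );
// Adds a "local" choice on the login form, so an account like "admin"
// can still log in against the wiki database when LDAP is unreachable.
$wgLDAPUseLocal = true;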
Thanks
--
|Michael Renner E-mail: michael.renner(a)gmx.de |
|81541 Munich skype: michael.renner.gmx.de |
|Germany Don't drink as root! ESC:wq
I installed this extension on 2 MediaWiki servers, the decision being
made that if it's on Wikipedia I want it on mine. Well, I've decided
that it's way too undeveloped to use on my servers, with too many
issues in installation.
Here's the question: I want to remove the extension (easy enough) &
every vestige of it that I may have imported in templates or scripts.
I do use a lot of downloaded material from Wikipedia & its sister
sites, including templates. The need is to get rid of Scribunto as a
required extension. So how do I locate those pages, templates, etc.
that depend on it? My understanding is that it was developed for using
Lua in embedded scripts, & as I don't have that anywhere & don't plan
to, what's the best choice for my needs?
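From what I can tell, Scribunto only comes into play where a page or
template contains a {{#invoke:...}} call to a Lua module, so presumably
searching the imported wikitext for "#invoke:" (or checking what links
to pages in the Module: namespace) would turn up everything that still
depends on it. I'd welcome confirmation that that's the right approach.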
John
There is no documentation on this anywhere on the web. There seems to be a variable missing for the user's IP address that can be passed in to the function. I tried user_name, but that didn't work, and I'm not sure about the range parameter either... I tried "66.187.0.0/16", but that didn't seem to work.
Much appreciated.
I too was recently struggling with Scribunto on a non-WMF site. We have an
old version of the GNU C library and now need to figure out how to upgrade
it and otherwise get the environment ready. I'm sure Scribunto is a
wonderful tool whose time has come; I just hope we can get the
documentation to the point that it's easier for people to figure out
exactly how to install and configure it in the various environments they
will encounter.
Of course, there is always the option of not using Scribunto, and instead
using an older version of the templates. Given the complexities of template
dependencies, it could be a bit tricky. Maybe there should be a way to
export not the latest version of the various templates, but the version
that existed as of a certain date. E.g., get the Infobox, Infobox person,
etc. templates as they existed a few years ago. That could be a reasonably
good stopgap solution while we wait for more people to get on board the
Scribunto train. (Filed as bug 56077;
https://bugzilla.wikimedia.org/show_bug.cgi?id=56077) In the long run,
people will have to adopt Scribunto, because they'll more and more often
need the latest template functionality when they import pages from
Wikipedia.
I am working on a wiki that has documentation for our various processes.
I am trying to figure out how to add a "signoff" to each article.
When the article is being developed, it's tagged with a
[[Category:fixme]] so that we can easily find all articles in development.
When an article reaches final form, someone - a manager, a technician, a
designer - approves the article. That approver is typically not the
article writer, and may not even have a user account on the wiki.
I want to be able to find all signed off articles and all articles
signed off by a particular person.
I've thought about having a Category:signoff and then subcategories for
each person who signs off, but I can't figure out how to automate this.
It has to be automated, since my pool of authors will grow to include
fairly non-technical people.
The pool of people who can sign off articles is open-ended; often we go
out on the shop floor and find "Chuck" who has been there for 40 years
and is the only person who knows this process, so "Chuck" signs off on
the article.
I can create two categories, one "Signed_off" and one
"Signed_off_by_Chuck", and that would work, but it just seems clunky.
Any ideas on how to make this work?
--
Project Management Consulting and Training
http://www.ridgelineconsultingllc.com
Hello again,
Right now we have users who are partly identified by the color they use
in their edits. They currently need to put <span> tags around anything
that they edit to color their text. I want to simplify this by possibly
changing user CSS values so anything they write gets colored without
their having to remember to put the <span> tags in their entries.
I'm having trouble finding the class that defines user edits, and
anything I change in User:username/common.css isn't yielding any results.
Curious if anyone else on the list has encountered this. This would make
these users' lives a lot easier and would make it easier to enforce the
policy as well.
Thanks,
Devin
cd root-of-destination
wget -r http://site.to.be.copied/wiki
But I don't think that's what the OP wants.
I'd just copy the database tables over. But since he's copying to an "upgraded server," that may not work well.
Maybe copy the database tables over to an install of the same rev, then upgrade the new server.
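In practice that would be something like: mysqldump the wiki database on
the old box, load the dump into MySQL on a same-version MediaWiki install
on the new box, and then run php maintenance/update.php there once you
upgrade the code. Untested for this particular case, but that's the usual
pattern.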
On 2013-10-21, at 17:43, mediawiki-l-request(a)lists.wikimedia.org wrote:
> From: John <phoenixoverride(a)gmail.com>
>
> I can do a little better than that, I can whip up something that copies
> everything based off Special:Allpages. If you want me to do that drop me a
> email off list and we can work out the details. ~~~~
>
> On Mon, Oct 21, 2013 at 8:19 PM, Wjhonson <wjhonson(a)aol.com> wrote:
>
>> How about a script that Googles site:www.myurl.com and then walks every
>> page and copies it ;)
>>
>> -----Original Message-----
>> From: John Foster <jfoster81747(a)verizon.net>
>> To: mediawiki-list <mediawiki-l(a)lists.wikimedia.org>
>> Cc: MediaWiki announcements and site admin list <
>> mediawiki-l(a)lists.wikimedia.org>
>> Sent: Mon, Oct 21, 2013 4:58 pm
>> Subject: Re: [MediaWiki-l] Mediawiki articla export
>>
>>
>> On Mon, 2013-10-21 at 16:46 -0700, Yan Seiner wrote:
>>> John W. Foster wrote:
>>>> Is there any way to export ALL the articles & or pages from a very slow
>>>> but working mediawiki. I want to move them to a much faster upgraded
>>>> mediawiki server.
>>>> I have tried the dumpbackup script in /maintainence, but that didn't get
>>>> all the pages, only some, a& I dont know why. Any tips are appreciated.
>>>> Thanks
>>>> john
>>>>
>>>>
>>> If it's the same version of mediawiki you can always try dumping the database
>>> directly and importing it into mysql on the new server. I'm not sure but you
>>> might have to create the exact file structure as well....
>>>
>> Thanks.
>> I am aware of that solution & in fact it is my preferred method for
>> moving a wiki. However; the reason the mediawiki is slow is a totally
>> messed up MySql database system, & I don't know how to fix it. I tried
>> for over a year, as the wiki has thousands of pages/articles. Therefore
>> I don't want to move the table structure for this db into the new,
>> properly functioning wiki.
>> Anything else, maybe.
>
:::: If all the advertising in the world were to shut down tomorrow, would people still go on buying more soap, eating more apples, giving their children more vitamins, roughage, milk, olive oil, scooters and laxatives, learning more languages by iPod, hearing more virtuosos by radio, re-decorating their houses, refreshing themselves with more non-alcoholic thirst-quenchers, cooking more new, appetizing dishes, affording themselves that little extra touch which means so much? Or would the whole desperate whirligig slow down, and the exhausted public relapse upon plain grub and elbow-grease? -- Dorothy Sayers
:::: Jan Steinman, EcoReality Co-op ::::
:::: (Send email to Quote(a)Bytesmiths.com to get a random quote, or Quotes(a)Bytesmiths.com to get 50 random quotes. Put a word in the Subject line to filter for that word.)
I have a server with Bitnami MediaWiki installed on Windows Server 2008 and would like to have multiple wikis running on the same server, each with a different upload files folder. Please advise.
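To be concrete, something like this in a shared LocalSettings.php is
roughly what I am after (a rough, untested sketch; the hostnames,
database names and folder names are only placeholders):

// Pick per-wiki settings based on the requested hostname.
switch ( $_SERVER['SERVER_NAME'] ) {
    case 'wiki1.example.com':
        $wgSitename = 'Wiki One';
        $wgDBname = 'wiki1';
        $wgUploadDirectory = "$IP/images/wiki1";
        $wgUploadPath = "$wgScriptPath/images/wiki1";
        break;
    case 'wiki2.example.com':
        $wgSitename = 'Wiki Two';
        $wgDBname = 'wiki2';
        $wgUploadDirectory = "$IP/images/wiki2";
        $wgUploadPath = "$wgScriptPath/images/wiki2";
        break;
}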
Is there any way to export ALL the articles and/or pages from a very slow
but working MediaWiki? I want to move them to a much faster, upgraded
MediaWiki server.
I have tried the dumpBackup script in /maintenance, but that didn't get
all the pages, only some, and I don't know why. Any tips are appreciated.
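For reference, the script I mean is maintenance/dumpBackup.php; my
understanding is that running it with --full should dump every page with
its full history, while --current dumps only the latest revision of each
page, but either way it should cover all pages.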
Thanks
john
I had git-cloned mediawiki-core for development purposes 6 months back.
However, I could not contribute much. Yesterday, I did a git pull and it
got updated. I also did a git pull for all the extensions present.
When I tried to do a fresh MediaWiki install yesterday, it didn't work.
When I pointed my browser to 127.0.0.1/mediawiki, it gave me the
following PHP errors:
Warning: Invalid argument supplied for foreach() in
C:\wamp\www\open_source\mediawiki\includes\objectcache\SqlBagOStuff.php on
line 232
Fatal error: Call to a member function numRows() on a non-object in
C:\wamp\www\open_source\mediawiki\includes\objectcache\SqlBagOStuff.php on
line 512
When I pointed my browser to 127.0.0.1/mediawiki/mw-config/ , it gave me
the following errors:
Warning: Class 'ParamProcessor\ParamDefinitionFactory' not found in
C:\wamp\www\open_source\mediawiki\extensions\Validator\ParamProcessor.php
on line 85
Warning: Class 'ParamProcessor\ParamDefinition' not found in
C:\wamp\www\open_source\mediawiki\extensions\Validator\ParamProcessor.php
on line 86
Warning: Class 'ParamProcessor\Definition\StringParam' not found in
C:\wamp\www\open_source\mediawiki\extensions\Validator\ParamProcessor.php
on line 87
and other such errors.
Am I doing anything wrong?
Please help me fix this.