It appears to me that many PHP files would have to be modified to make
MS SQL work, because PHP queries MySQL tables with different commands
than MS SQL tables. As a result, I have given up trying to use the site
with MS SQL and have moved to one using MySQL, where it works fine.
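(For anyone finding this in the archives: as I understand it, MediaWiki selects its database backend in LocalSettings.php; a minimal sketch with placeholder values, with the caveat that the 'mssql' path is reportedly incomplete.)

```php
<?php
// LocalSettings.php -- backend selection (placeholder values).
$wgDBtype     = 'mysql';     // 'mssql' is also recognized, but its
                             // support is reportedly incomplete.
$wgDBserver   = 'localhost';
$wgDBname     = 'wikidb';
$wgDBuser     = 'wikiuser';
$wgDBpassword = 'secret';
```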
Thanks to all who responded.
At 11:36 PM 11/24/2010, you wrote:
>For certain values of "support" greater than "it doesn't exist at all."
>On Nov 24, 2010 4:15 PM, "Platonides" <Platonides(a)gmail.com> wrote:
>Gary Roush wrote:
> > Will MediaWiki work with MS SQL? I have not been able to find
> > a reference to...
>There is support for the MS SQL database, so it could work. It may not be as
>complete or stable as it should be, though.
>MediaWiki-l mailing list
I would like to prevent specific users from viewing some special pages:
Export and a few others.
Here is what I did:
Downloaded and extracted Lockdown-MW1.16-r70092.tar.gz to
Inserted the following into LocalSettings.php:
- require_once( "$IP/extensions/Lockdown/Lockdown.php" );
- $wgSpecialPageLockdown['Export'] = array('bureaucrat');
It didn't prevent users in the bureaucrat group from accessing the
Export page. Any ideas about what could be the cause would be appreciated.
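For what it's worth, my reading of the extension's documentation (an assumption on my part; please correct me if wrong) is that the groups listed in $wgSpecialPageLockdown are the ones *permitted* to view the page, which would explain the behaviour above. A sketch:

```php
<?php
// LocalSettings.php -- sketch. My assumption: the listed groups are the
// ones ALLOWED to view the special page; everyone else is denied.
require_once( "$IP/extensions/Lockdown/Lockdown.php" );

// Allow only sysops to use Special:Export; bureaucrats would then be
// denied unless they are also sysops.
$wgSpecialPageLockdown['Export'] = array( 'sysop' );
```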
I'd be glad if someone could help me with the following question:
Imagine that *some* articles are using a given template. At the end of
this template I have inserted the following code (Facebook comments):
When users view a page with that template, the first thing they see is:
1. the template filled with data,
2. then FB comments
3. finally the free text
But I would like them to see this:
1. the template filled with data,
2. Free text
3. FB Comments
I think that the best option would be to add the code
"<fb:comments></fb:comments>" outside this template, but where?
Is there any internal template which is loaded at the end of an
article, like a footer? I mean something like:
- internal header template
- internal content template
- internal footer template
Maybe I'm answering my own question, but this manual page:
http://www.mediawiki.org/wiki/Manual:Footer says that I have to modify
the skin to achieve it:
To add or remove items from the footer on your MediaWiki page, you
must edit the skin.
For example: if you go in to MonoBook.php (located by default in the
skins folder) you will find the following code:
$footerlinks = array(
    'lastmod', 'viewcount', 'numberofwatchingusers', 'credits', 'copyright',
    'privacy', 'about', 'disclaimer', 'tagline',
);
I don't like that approach, because it would mean the footer is the same
for all pages, and I don't want that.
Is it the only way?
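One alternative I can think of (a sketch only, assuming the SkinAfterContent hook, which I believe exists in recent MediaWiki versions; the namespace check is a hypothetical example) is to append the markup after the page content from LocalSettings.php rather than editing the skin:

```php
<?php
// LocalSettings.php -- sketch: append markup after the rendered article
// content via the SkinAfterContent hook, instead of editing the skin.
$wgHooks['SkinAfterContent'][] = 'wfAppendFbComments';

function wfAppendFbComments( &$data ) {
	global $wgTitle;
	// Hypothetical condition: only append on main-namespace pages.
	if ( $wgTitle && $wgTitle->getNamespace() === NS_MAIN ) {
		$data .= '<fb:comments></fb:comments>';
	}
	return true;
}
```

This would put the comments after the free text; whether it lands before or after the skin's own footer may depend on the skin.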
I installed the Collection extension and have a problem which seems
not to be uncommon:
When I try to download the document, the PDF file is opened in the
browser and shown as
%PDF-1.3 %“Œ‹ž ReportLab Generated PDF document http://www.reportlab.com
% 'BasicFonts': class PDFDictionary 1 0 obj % The standard fonts
dictionary << /F1+0 117 0 R /F2+0 121 0 R /F3+0 125 0 R /F4+0 129 0 R
/F5 19 0 R /F6+0 133 0 R >> endobj % 'Annot.NUMBER1': class LinkA ...
So the file is sent as "text/html" instead of "application/pdf".
I never have problems downloading PDF files otherwise, so I don't think
it's a problem with my browser (FF 3.6.12).
Saving the file to disk doesn't help either; it can't be read as a PDF.
The only workaround so far is to download the file directly via
curl -D - -o test.pdf -d collection_id=1234567890 -d writer=rl -d
But that's no fun...
Does anyone have a solution for this problem? Could it be a problem
with the server at PediaPress?
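A quick sanity check (a sketch, nothing specific to the Collection extension): the first bytes of a real PDF are the magic string "%PDF-", so you can test what the server actually returned:

```php
<?php
// Sketch: return true if the file starts with the PDF magic bytes,
// i.e. the server sent an actual PDF rather than an error page.
function looks_like_pdf( $path ) {
	$fh = fopen( $path, 'rb' );
	if ( $fh === false ) {
		return false;
	}
	$magic = fread( $fh, 5 );
	fclose( $fh );
	return $magic === '%PDF-';
}
```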
Thanks for the reply, but no, that's not quite what I'm looking for:
this is not a wiki "family" of multiple wikis that need a shared
repository, but a single large wiki that has multiple web servers.
Setting up a second single-node wiki just to serve images for the main
wiki would be inefficient and a single point of failure.
On 11/25/10 11:00 PM, mediawiki-l-request(a)lists.wikimedia.org wrote:
> From: Laurent Savaete<laurent.savaete(a)googlemail.com>
> Subject: Re: [Mediawiki-l] Managing uploads for a cluster of Wiki
> I think you'll find what you're after here:
2010/11/25 Jani Patokallio <jpatokal(a)iki.fi>:
> > I'm managing a two-node cluster of wikis, and I'd like to enable uploads.
> > Only problem is, since uploads are stored locally, a file uploaded to node 1
> > is not visible from node 2. How is this usually managed?
I'm managing a two-node cluster of wikis, and I'd like to enable uploads.
The only problem is that, since uploads are stored locally, a file uploaded
to node 1 is not visible from node 2. How is this usually managed? I'm a
little surprised that there appears to be no option to upload directly to,
and store files in, the database (which is already shared between nodes),
but then Apache would have to be hacked to find the files there instead.
One option I was considering was setting up an NFS mount, so uploads go to a
shared remote directory. The nodes already share a database, so node 2 will
know if node 1 gets a file and can find it in the same place. The drawback
is that this requires managing the NFS mount, which can be a little fiddly.
Any other ideas?
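If the NFS route turns out to be the least bad option, the relevant settings would be something like this (a sketch; /mnt/shared-uploads is a placeholder for the actual mount point):

```php
<?php
// LocalSettings.php on BOTH nodes -- point uploads at the shared mount.
$wgEnableUploads   = true;
$wgUploadDirectory = '/mnt/shared-uploads';  // placeholder path
$wgUploadPath      = "$wgScriptPath/images"; // public URL path
// Apache on each node must map $wgUploadPath to the shared directory,
// e.g. with an Alias directive.
```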
In a message dated 11/24/2010 3:52:08 PM Pacific Standard Time,
> <<Well after upgrading to the newest version and using a few extensions I
> was able to clean up the mess, but it still took a good day's worth of work.
> There were three "users" making edits so rolling them back created
> conflicts unless you did them in the right order. I resolved the conflicts by
> copying the revision of the page I wanted to keep, deleting the page and then
> pasting back in the content.
> One thing I really don't like about most of the solutions I used is that
> they still leave traces of the spam in your database.
> Anyway .. I don't have a lot of traffic on my wiki, so it's my fault for not
> checking it frequently enough that I could have just restored to a backup
> of the database. I need to see if my service provider will configure a
> longer period of backups. >>
I have the same issue from time to time. I also used this trick of
deleting the page and posting back my last good version, but of course that
wipes out all the past history as well; not that most people running
MediaWiki really care about that history.... I'm just not certain whether,
when you wipe the history in this tricky way, the space is actually released
in the database for use by another article, or whether it's still sitting
there, unlinked, but using up space.
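If I remember right (worth double-checking), deleted revisions are moved to the archive table rather than freed, and MediaWiki ships a maintenance script that purges them for real; something along these lines, run from the wiki's root directory:

```
# --delete actually performs the purge; without it the script only reports.
php maintenance/deleteArchivedRevisions.php --delete
```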
I've also wondered about the case where you leave the spam in an old
version. Does that still get indexed by Google? Or is Google clever enough
to only see the current version of your page? I don't know the answer myself.
I'm writing an extension now where I have to know, for a given link,
whether it's red or not. How can I easily do that? I read some articles,
like http://meta.wikimedia.org/wiki/Help:Page_existence, and saw how I can
check existence as a user.
Is there any function to check page existence from my PHP code?
Thanks in advance!
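From reading the core code, something like the following looks possible (a sketch; 'Some page' is a placeholder, and this assumes it runs inside MediaWiki where the Title class is loaded):

```php
<?php
// Sketch: decide from extension code whether a link would be red.
$title = Title::newFromText( 'Some page' ); // placeholder page name
if ( $title !== null && $title->isKnown() ) {
	// Blue link: the page exists (or is "known", e.g. a special page).
} else {
	// Red link: the page does not exist.
}
```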