Hi Chad,
It appears to me that modifications would be needed in many
PHP files to make MS SQL work, because PHP queries MySQL tables with
different commands than MS SQL tables. As a result I have given up
trying to use the site with MS SQL and have moved to one using MySQL,
where it works fine.
Thanks to all who responded.
Gary
At 11:36 PM 11/24/2010, you wrote:
>For certain values of "support" greater than "it doesn't exist at all."
>
>-Chad
>
>On Nov 24, 2010 4:15 PM, "Platonides" <Platonides(a)gmail.com> wrote:
>
>Gary Roush wrote:
> > Will Mediawiki work with MS SQL? I have not been able to find
> > a reference to...
>There is support for MS SQL database, so it could work. It may not be as
>complete or stable as it should, though.
>
>
>
>_______________________________________________
>MediaWiki-l mailing list
>MediaWiki-l(a)lists.wikimedia.org
>https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Hello,
I would like to prevent specific users from viewing some Special pages:
Special:ListFiles
Special:upload log
& a few others.
Here is what I did:
Downloaded and extracted, Lockdown-MW1.16-r70092.tar.gz to
(/extensions/Lockdown)
Inserted the following into LocalSettings.php:
- require_once( "$IP/extensions/Lockdown/Lockdown.php" );
- $wgSpecialPageLockdown['Export'] = array('bureaucrat');
It didn't prevent users in the bureaucrat group from accessing the
Export page. Any ideas about what could be the cause would be appreciated.
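For reference, the Lockdown documentation suggests configuration along these lines. I believe the array key has to be the special page's canonical internal name (e.g. "Listfiles"), which may be where my setup goes wrong; the page names and groups below are only illustrative:

```php
# LocalSettings.php -- a sketch based on the Lockdown documentation;
# page names and group names here are examples, not verified settings.
require_once( "$IP/extensions/Lockdown/Lockdown.php" );

# Restrict each special page to the listed group(s).
$wgSpecialPageLockdown['Export']    = array( 'bureaucrat' );
$wgSpecialPageLockdown['Listfiles'] = array( 'bureaucrat' );
$wgSpecialPageLockdown['Log']       = array( 'bureaucrat' );
```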
Thanks
Hi,
I'd be glad if someone could help me with the following question:
Imagine that *some* articles are using a given template. At the end of
this template I have inserted the following code (Facebook comments):
<fb:comments></fb:comments>
When users view a page with that template, the first thing they see is:
1. the template filled with data,
2. then FB comments
3. finally the free text
But I would like them to see this:
1. the template filled with data,
2. Free text
3. FB Comments
I think the best option would be to add the
"<fb:comments></fb:comments>" code outside this template but, where?
Is there any internal template that is loaded at the end of an
article, like a footer? I mean something like:
Articles:
- internal header template
- internal content template
- internal footer template
Maybe I'm answering my own question, but this manual page,
http://www.mediawiki.org/wiki/Manual:Footer, says that I have to modify
the skin to achieve it:
"""
To add or remove items from the footer on your MediaWiki page, you
must edit the skin.
For example: if you go in to MonoBook.php (located by default in the
skins folder) you will find the following code:
$footerlinks = array(
    'lastmod', 'viewcount', 'numberofwatchingusers', 'credits', 'copyright',
    'privacy', 'about', 'disclaimer', 'tagline',
);
"""
I don't like that approach because it would mean the footer is the same
for all pages, and I don't want that.
Is it the only way?
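One alternative I'm considering instead of editing the skin is a hook in LocalSettings.php. This is only an untested sketch using the SkinAfterContent hook, which as I understand it lets you append raw HTML after the article content:

```php
# LocalSettings.php -- a sketch, untested; assumes the SkinAfterContent
# hook, which appends raw HTML after the page content area.
function wfAppendFbComments( &$data ) {
	global $wgTitle;
	# Only append the comments box on ordinary articles.
	if ( $wgTitle && $wgTitle->getNamespace() == NS_MAIN ) {
		$data .= '<fb:comments></fb:comments>';
	}
	return true;
}
$wgHooks['SkinAfterContent'][] = 'wfAppendFbComments';
```

That would put the comments after the free text rather than inside the template, though I don't know whether it plays well with the fb: tag extension.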
Regards
Hi,
I installed the Collection extension and have a problem which seems to
be not uncommon:
When I try to download the document, the PDF file is opened in the
browser and shown as
--------------------------------------------------
%PDF-1.3 %“Œ‹ž ReportLab Generated PDF document http://www.reportlab.com
% 'BasicFonts': class PDFDictionary 1 0 obj % The standard fonts
dictionary << /F1+0 117 0 R /F2+0 121 0 R /F3+0 125 0 R /F4+0 129 0 R
/F5 19 0 R /F6+0 133 0 R >> endobj % 'Annot.NUMBER1': class LinkA ...
--------------------------------------------------
So the file is sent as "text/html" instead of "application/pdf".
I never have problems downloading PDF files otherwise, so I don't think
it's a problem with my browser (Firefox 3.6.12).
Saving the file to disk doesn't work either; it can't be read as a PDF.
The only workaround so far is to download the file directly via

curl -D - -o test.pdf -d collection_id=1234567890 -d writer=rl \
     -d command=download http://tools.pediapress.com/mw-serve/

But that's not much fun...
Does anyone have a solution for this problem? Could it be a problem with
the server at pediapress?
Daniel
Greetings,
Thanks for the reply, but no, that's not quite what I'm looking for:
this is not a wiki "family" of multiple wikis that need a shared
repository, but a single large wiki that has multiple web servers.
Setting up a second single-node wiki just to serve images for the main
wiki would be inefficient and a single point of failure.
Cheers,
-jani
On 11/25/10 11:00 PM, mediawiki-l-request(a)lists.wikimedia.org wrote:
> From: Laurent Savaete <laurent.savaete(a)googlemail.com>
> Subject: Re: [Mediawiki-l] Managing uploads for a cluster of Wiki
> servers
>
> I think you'll find what you're after here:
> http://www.mediawiki.org/wiki/Manual:Wiki_family#Upload
>
> 2010/11/25 Jani Patokallio <jpatokal(a)iki.fi>:
>> I'm managing a two-node cluster of Wikis, and I'd like to enable uploads.
>> Only problem is, since uploads are stored locally, a file uploaded to node 1
>> is not visible from node 2. How is this usually managed?
Greetings,
I'm managing a two-node cluster of Wikis, and I'd like to enable uploads.
Only problem is, since uploads are stored locally, a file uploaded to node 1
is not visible from node 2. How is this usually managed? I'm a little
surprised that there appears to be no option to upload directly to and store
files in a database (which is already shared between nodes), but then Apache
would have to be hacked to find the files there instead.
One option I was considering was setting up an NFS mount, so uploads go to a
shared remote directory. The nodes already share a database, so node 2 will
know if node 1 receives a file, and can find it in the same place. The drawback
is that this requires managing the NFS mount, which can be a little fiddly.
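Concretely, I imagine the MediaWiki side of the shared-directory approach would just be pointing both nodes at the mount; the mount point below is hypothetical:

```php
# LocalSettings.php on both nodes -- a sketch; /mnt/wiki-uploads is a
# hypothetical NFS mount point shared between node 1 and node 2.
$wgEnableUploads   = true;
$wgUploadDirectory = '/mnt/wiki-uploads';
$wgUploadPath      = "$wgScriptPath/images"; # URL path Apache serves on each node
```

Each node's Apache would still serve the files from its local view of the mount, so no web-server changes should be needed beyond that.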
Any other ideas?
Cheers,
-jani
Welcome to mediawiki-l. This mailing list exists for discussion and questions
about the MediaWiki software[0]. Important MediaWiki-related announcements
(such as new versions) are also posted to this list.
Other resources.
If you only wish to receive announcements, you should subscribe to
mediawiki-announce[1] instead.
MediaWiki development discussion, and all Wikimedia technical questions, should
be directed to the wikitech-l[2] mailing list.
Several other MediaWiki-related lists exist:
- mediawiki-api[5] for API discussions,
- mediawiki-enterprise[6] for discussion of MediaWiki in the enterprise,
- mediawiki-cvs[7] for notification of commits to the Subversion repository,
- mediawiki-i18n[8] for discussion of MediaWiki internationalisation support,
- wikibugs-l[9] for notification of changes to the bug tracker.
List administrivia (unsubscribing, list archives).
To unsubscribe from this mailing list, visit [12]. Archives of previous postings
can be found at [3].
This list is also gatewayed to the Gmane NNTP server[4], which you can use to
read and post to the list.
Posting to the list.
Before posting to this list, please read the MediaWiki FAQ[10]. Many common
questions are answered there. You may also search the list archives to see if
your question has been asked before.
Please try to ask your question in a way that enables people to answer you.
Provide all relevant details, explain your problem clearly, etc. You may
wish to read [13], which explains how to ask questions well.
To post to the list, send mail to <mediawiki-l(a)lists.wikimedia.org>. This is a
public list, so you should not include confidential information in mails you
send.
When replying to an existing thread, use the "Reply" or "Followup" feature of
your mail client, so that clients that understand threading can sort your
message properly. When quoting other messages, please use the "inline" quoting
style[11], for clarity.
When creating a new thread, do not reply to an existing message and change the
subject. This will confuse people's mail readers, and will result in fewer
people reading your mail. Instead, compose a new message for your post.
Messages posted to the list have the "Reply-To" header set to the mailing list,
which means that by default, replies will go to the entire list. If you are
posting a reply which is only interesting to the original poster, and not the
list in general, you should change the reply to only go to that person. This
avoids cluttering the list with irrelevant traffic.
About this message.
This message is posted to the list once per week by <river(a)wikimedia.org>.
Please contact me if you have any questions or concerns about this mailing.
References.
[0] http://www.mediawiki.org/
[1] http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[2] http://lists.wikimedia.org/mailman/listinfo/wikitech-l
[3] http://lists.wikimedia.org/pipermail/mediawiki-l/
[4] http://dir.gmane.org/gmane.org.wikimedia.mediawiki
[5] http://lists.wikimedia.org/mailman/listinfo/mediawiki-api
[6] http://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise
[7] http://lists.wikimedia.org/mailman/listinfo/mediawiki-cvs
[8] http://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
[9] http://lists.wikimedia.org/mailman/listinfo/wikibugs-l
[10] http://www.mediawiki.org/wiki/FAQ
[11] http://en.wikipedia.org/wiki/Posting_style#Inline_replying
[12] http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
[13] http://www.catb.org/~esr/faqs/smart-questions.html
For certain values of "support" greater than "it doesn't exist at all."
-Chad
On Nov 24, 2010 4:15 PM, "Platonides" <Platonides(a)gmail.com> wrote:
Gary Roush wrote:
> Will Mediawiki work with MS SQL? I have not been able to find
> a reference to...
There is support for MS SQL database, so it could work. It may not be as
complete or stable as it should, though.
In a message dated 11/24/2010 3:52:08 PM Pacific Standard Time,
2007(a)gmaskfx.com writes:
> <<Well after upgrading to the newest version and using a few extensions I
> was able to clean up the mess but it still took a good day's worth of work.
>
> There were three "users" making edits, so rolling them back created
> conflicts unless you did them in the right order. I resolved the conflicts
> by copying the revision of the page I wanted to keep, deleting the page,
> and then pasting back in the content.
>
> One thing I really don't like about most of the solutions I used is that
> they still leave traces of the spam in your database.
>
> Anyway .. I don't have a lot of traffic on my wiki, so it's my fault for not
> checking it frequently enough that I could have just restored a backup
> of the database. I need to see if my service provider will configure a
> longer period of backups. >>
>
I have the same issue from time to time. I have also used this trick of
deleting the page and posting back my last good version, but of course that
wipes out all the past history as well; not that most people running MediaWiki
really care about that history. I'm just not certain that when you wipe the
history in this tricky way, it actually releases that space in the database
for use by another article, or whether it's still sitting there, unlinked,
but using up space.
I've also wondered about the case where you leave the spam in an old
version. Does that still get indexed by Google? Or is Google clever enough to
only see the top version of your page? I don't know the answer myself.
W
Hello, everyone!
I'm writing an extension where I need to know, for a given link, whether
it's red or not. How can I do that easily? I have read some articles, like
http://meta.wikimedia.org/wiki/Help:Page_existence, and I see how I can check
existence as a user.
Is there any function to check page existence from my PHP code?
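The closest I have found so far is something like this (a sketch built on the core Title class; I'm not sure it's the intended API for this):

```php
# Inside extension code -- a sketch: check whether the target of a link
# exists, i.e. whether the link would be blue rather than red.
$title = Title::newFromText( 'Some page name' );
if ( $title !== null && $title->exists() ) {
	# Page exists: the link would be blue.
} else {
	# Page missing (or invalid title): the link would be red.
}
```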
Thanks in advance!
Yury