We are experiencing some form of timeout when editing a record. It
*appears* to be approximately 15-20 minutes. If you don't save within
that time (preview doesn't count, only save), then when you do save it
says you are not logged in.
I have been searching through the wiki documentation for this value but
have not found it anywhere.
Is this a wiki setting? The server admin does not know what it might be
and points to the wiki <G> until I point him at something else.
Ideas??
DSig
David Tod Sigafoos | SANMAR Corporation
PICK Guy
206-770-5585
davesigafoos(a)sanmar.com
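One likely culprit, assuming a stock PHP setup rather than a MediaWiki setting: login sessions depend on PHP's session garbage collection, and the default lifetime of 1440 seconds (24 minutes) lands right around the observed 15-20 minute window once collection timing is factored in. A sketch of the relevant php.ini values (the numbers shown are the usual PHP defaults; raise gc_maxlifetime to keep edit sessions alive longer):

```ini
; php.ini -- session lifetime in seconds (1440 = 24 minutes by default)
session.gc_maxlifetime = 1440
; each request has a gc_probability/gc_divisor chance of triggering cleanup
session.gc_probability = 1
session.gc_divisor     = 100
```

Note that some distributions disable in-request garbage collection and sweep sessions from a cron job instead, so the effective timeout can differ from gc_maxlifetime.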
Good day,
I would like to host several projects on several domains with only one installation of mediawiki and one database.
Questions:
1) Can this be done with one installation of mediawiki?
2) Do I have to install one mediawiki for each domain?
3) Can I use a different layout for each project?
4) Domain 1 should display only articles of category 1, domain 2 of category 2, etc. Is this possible?
5) Is it possible to have "shared" content too: domain 1 displays only category 1, but domain 2 displays articles of categories 1 AND 2?
Thank you
Smo
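Questions 1-3 describe a common "wiki family" setup: one checkout of the code, with LocalSettings.php branching on the requested domain. A minimal sketch, assuming hypothetical domain names and separate table prefixes in a shared database; note that the per-domain category filtering in questions 4-5 is not built into MediaWiki and would need an extension or separate wikis:

```php
<?php
// LocalSettings.php sketch -- one MediaWiki install serving several domains.
// Domain names, prefixes, and skins below are hypothetical examples.
switch ( $_SERVER['SERVER_NAME'] ) {
    case 'projectone.example.com':
        $wgSitename    = 'Project One';
        $wgDBprefix    = 'p1_';        // separate tables, shared database
        $wgDefaultSkin = 'monobook';   // question 3: per-project layout
        break;
    case 'projecttwo.example.com':
        $wgSitename    = 'Project Two';
        $wgDBprefix    = 'p2_';
        $wgDefaultSkin = 'simple';
        break;
}
```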
Hello,
I think this is the right place to ask my question. :)
I'm trying to search for a 3-letter word and it's not returning anything.
I'm using the latest version of MediaWiki, but this happened in the 1.9.x releases as well.
This is probably a config option but I'm not finding where or what =)
Can anyone help?
Best Regards,
Hugo Picão <mailto:hugo.picao@link.pt>
Link Consulting <http://www.link.pt/> - Redes e Segurança
Tel: 213 100 182
Av. Duque de Ávila, 23
1000-138 Lisboa
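If the wiki runs on MySQL's default full-text search, words shorter than four characters are never indexed, which would explain the empty results for a 3-letter word. Assuming you can edit the MySQL server configuration, a sketch of the fix:

```ini
# my.cnf -- lower MySQL's full-text minimum word length (the default is 4)
[mysqld]
ft_min_word_len = 3
```

After restarting MySQL, the search index has to be rebuilt, e.g. with MediaWiki's maintenance/rebuildtextindex.php script; until then, short words still won't be found.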
I forgot to add that MW is not sending the user email confirmation mails either.
Regards,
Jack
-----Original Message-----
From: mediawiki-l-bounces(a)lists.wikimedia.org [mailto:mediawiki-l-bounces@lists.wikimedia.org] On Behalf Of Jack Eapen C
Sent: Friday, June 22, 2007 2:45 PM
To: MediaWiki announcements and site admin list
Subject: [Mediawiki-l] Watchlist is not notified
Hi,
I'm using MW 1.9.3. I first set it up on Windows and then ported it to Linux. On Windows, the users were getting notification emails when a watched page was changed. But in the production version on Linux, that's not happening. There are some other programs (like Moodle) on the same server which have mail notification features, and they are working correctly. So I believe the PHP mail is working fine.
I checked the watchlist table and I could see that, for the watched pages, the wl_notificationtimestamp column is updated; i.e. the system seems to be sending mails and they are lost en route?
Any clues to this?
Regards,
Jack Eapen C
SunTec Knowledge Centre
------------------------------------------------------------------------------------
"Techies are like stars--they rise and set, they have the worship of the world, but no repose"
This electronic mail (including any attachment thereto) may be confidential and privileged and is intended only for the individual or entity named above. Any unauthorized use, printing, copying, disclosure or dissemination of this communication may be subject to legal restriction or sanction. Accordingly, if you are not the intended recipient, please notify the sender by replying to this email immediately and delete this email (and any attachment thereto) from your computer system...Thank You
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
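Both symptoms (no watchlist mails, no confirmation mails) point at MediaWiki's own e-mail settings rather than PHP mail. A hedged LocalSettings.php sketch of the settings that gate outgoing mail, assuming some of them were lost in the Windows-to-Linux move (addresses are hypothetical):

```php
<?php
// LocalSettings.php sketch -- settings that control outgoing MediaWiki mail.
$wgEnableEmail         = true;
$wgEnableUserEmail     = true;
$wgEmergencyContact    = 'admin@example.com';  // hypothetical address
$wgPasswordSender      = 'wiki@example.com';   // hypothetical address
$wgEnotifWatchlist     = true;  // watchlist change notifications
$wgEnotifUserTalk      = true;  // user talk page notifications
$wgEmailAuthentication = true;  // send/require confirmation mails
```

One caveat on reading wl_notificationtimestamp: once it is set for a watched page, no further mail is sent for that page until the user visits it again, which can look like mail being "lost" on subsequent edits.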
Hi dave,
Thanks for the hint on SpecialDeleteOldRevisions. That was very useful.
Regards,
Jack
----------------------------------------------------------------
"Techies are like stars--they rise and set, they have the worship of the
world, but no repose"
-----Original Message-----
From: mediawiki-l-bounces(a)lists.wikimedia.org
[mailto:mediawiki-l-bounces@lists.wikimedia.org] On Behalf Of Dave
Sigafoos
Sent: Thursday, June 21, 2007 7:06 PM
To: MediaWiki announcements and site admin list
Subject: Re: [Mediawiki-l] Managing history
I have not found such an animal, though with my users it might be a good
idea.
There is an article on meta for SpecialDeleteOldRevisions which *could
be* modified to remove the oldest revisions.
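For reference, MediaWiki also ships a maintenance script along these lines, run from the command line rather than the UI. A hedged sketch (check that the script is present in your release's maintenance/ directory, and note it deletes ALL old revisions, not "keep the latest 5"):

```shell
# Run from the wiki root. Without --delete it only reports a count.
php maintenance/deleteOldRevisions.php            # dry run
php maintenance/deleteOldRevisions.php --delete   # actually delete them
```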
DSig
David Tod Sigafoos | SANMAR Corporation
PICK Guy
206-770-5585
davesigafoos(a)sanmar.com
-----Original Message-----
From: mediawiki-l-bounces(a)lists.wikimedia.org
[mailto:mediawiki-l-bounces@lists.wikimedia.org] On Behalf Of Jack Eapen
C
Sent: Thursday, June 21, 2007 6:01
To: MediaWiki announcements and site admin list
Subject: [Mediawiki-l] Managing history
Hi,
I don't want all the past versions of an article to be maintained,
because for each minor change a new version is created and the db size
is growing rapidly. I want to keep only the 5 most recent versions or
so. Can we limit the number of versions stored in MediaWiki? Or can we
delete the past unwanted versions through the UI?
Hi fellow MediaWiki listeners,
Rob built a great start on an "instruction/reference manual" for "namespace" at http://www.mediawiki.org/wiki/Manual:Namespace
I, of course, have additional questions, which I've posted behind it at http://www.mediawiki.org/wiki/Manual_talk:Namespace
And I thought I'd ask the same here. My MediaWiki support plan is to copy everything I've learned (all 80 words! No, just kidding...) to MediaWiki.org, so if you know something that's not here or there, please pitch in if you can. I notice some inquiries on MediaWiki.org are months old and still unanswered, yet this group often responds pretty quickly. As always, if you know links that answer this, links that I have missed, please redirect me - thanks! Here goes:
I'm looking for examples of "namespace" features and benefits, for instance:
- How to export or print all articles in a namespace or group of namespaces?
- How to control who can see or edit the articles in a particular namespace or group of namespaces?
- How to restrict search to include or exclude a particular namespace or group of namespaces?
- How to auto-build a table of contents for all articles in a namespace or group of namespaces?
- How to export and import one or more namespaces?
... more?!?
In other words, what are the features and benefits of using the "namespace" function, and how (examples and or links, please) can a wiki admin or wiki user take advantage of those features and benefits?
Versus Category?
Versus Sub-page?
I understand what I've read about each of these MediaWiki features, but I'm not sure I understand the benefits of each feature, especially comparatively. Here's my goal (of any database system): I want to read, create, save, edit, search, sort, print, export, and import (is there more?) and I wonder how the features of Namespace vs. Category vs. Sub-page (or other? Special:page-features?) assist in organizing the contents of any MediaWiki such that I can execute each of those tasks over my own choice of a pre-organized subset of the main data.
An example: say I have a wiki with job descriptions, and each job description has a series of tasks. I may want to see or print, say, all job descriptions, with no sub-tasks. Or, I may want to see and print only one entire job description and all its tasks, but no other job descriptions. How would I organize that information best using Namespace vs. Category vs. Sub-page (or other)? Each job description goes on its own page (I presume). Should I create a separate namespace for each job? Should I create sub-pages for each job's tasks? Would categories be helpful? What search, sort, and select tools in the MediaWiki software have powers over Namespace vs. Category vs. Sub-page such that I would know in advance, before I enter or organize all the data? I'm not asking you to build my database and queries for me; I'm asking for examples of the MediaWiki features and benefits so I or anyone can know in advance how MediaWiki can manipulate its contents (or not) if we pre-organize our data entry appropriately.
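To make the job-description example concrete: a custom namespace is declared in LocalSettings.php, after which tools like Special:Allpages and the search namespace checkboxes can be scoped to it. A hedged sketch (the namespace name is hypothetical; 100/101 are the conventional first free ID pair for custom namespaces):

```php
<?php
// LocalSettings.php sketch -- a custom "Jobs" namespace.
define( 'NS_JOBS', 100 );
define( 'NS_JOBS_TALK', 101 );
$wgExtraNamespaces[NS_JOBS]      = 'Jobs';
$wgExtraNamespaces[NS_JOBS_TALK] = 'Jobs_talk';
// Include the namespace in default search results.
$wgNamespacesToBeSearchedDefault[NS_JOBS] = true;
```

Pages then live at titles like [[Jobs:Welder]], with sub-pages such as [[Jobs:Welder/Tasks]], and Special:Allpages can list the namespace on its own, which covers several of the "show all pages" and "restrict search" rows in the table above.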
I'm imagining a table, but I may be way off base:
Benefit: → Show all pages: Auto table of contents: Read: Create: Save: Edit: Search: Sort: Select: Print: Export: Import:
↓ Feature:
Page: Y Y Y Y Y Y Y Y Y
Namespace Y ? ? ? ? ?
Category: ? ? ? ? ? ?
Sub-page: ? ? ? ? ?
Search:
Print:
... And so on
How would YOU recommend organizing a Table of Benefits for every MediaWiki Feature?
I guess what I'm asking in a backwards way is, how powerful are the search, sort, select, report, print features, and also any auto-features such as auto table of contents or auto index features in MediaWiki? How would anyone using a MediaWiki database say, "show me and print all and only ... such and such"?
Or has no one evolved the search/sort/select and auto features over the database contents of a MediaWiki yet, and these are SQL/PHP challenges as yet unaddressed inside MediaWiki?
Thanks for taking a moment to explore this with me.
-- Peter Blaise
Hi,
I need help to figure out how to do the following:
I have 200 pages in my wiki. The content of each page is something
like this:
PRODUCT TITLE
Product_description
Product_price
Product_size
Product_website
I need a way to get the page that matches "Product_website", passing
the "website" via URL.
Example:
mydomain.com/wiki/index.php?website=www.billabong.com
must show the page with this content:
Billabong T-shirt <-- product title
Description of this product blah blah... <-- product description
$60 <-- price
M - L - XL <-- size
www.billabong.com <-- WEBSITE
How can I do this?
thanks in advance
Lichi.
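MediaWiki has no built-in "look up a page by field value" URL, but if the website string appears in the page, the Go-search can get close. One hedged sketch: a tiny standalone redirector (filename and URL are hypothetical) that maps ?website=... onto the wiki's search, jumping straight to a page when one matches and falling back to search results otherwise:

```php
<?php
// lookup.php (hypothetical) -- redirect ?website=... to the wiki's Go-search.
$wiki    = 'http://mydomain.com/wiki/index.php';
$website = isset( $_GET['website'] ) ? $_GET['website'] : '';
header( 'Location: ' . $wiki . '?search=' . urlencode( $website ) . '&go=Go' );
```

For an exact field-based lookup you would normally reach for an extension that supports structured queries, along the lines of Semantic MediaWiki, which can select pages by property values.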
Hi all,
My wiki is sometimes not loading at all, or when it does load it can be
very, very slow. Only the wiki directory is affected, no other
directories, so it is not the server speed.
The wiki directory has security on it in the form of simple htaccess.
The MediaWiki version is 1.6.8, with the server on PHP 5.2.2 and MySQL
on 5.0.21.
This problem has only just started to happen, and the host WebFusion
is certain it is nothing to do with them, although I am not too sure.
They argue it is the security, but I don't see how that could be, as it
has always been fine.
There are some extensions installed; is anything not supposed to work
behind htaccess?
Any help would be very much appreciated.
Kind regards,
Niall
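One way to show the host where the time goes, assuming shell access (URLs and credentials below are hypothetical): time a static file against the wiki entry point with curl. If the static file comes back fast through the same htaccess-protected directory, the auth layer is probably not the bottleneck and suspicion shifts to PHP or MySQL.

```shell
# Hypothetical URLs -- compare total request time for a static file
# versus a wiki page, both behind the same htaccess protection.
curl -u user:pass -o /dev/null -s -w 'static: %{time_total}s\n' \
     http://example.com/wiki/skins/common/wikibits.js
curl -u user:pass -o /dev/null -s -w 'wiki:   %{time_total}s\n' \
     http://example.com/wiki/index.php/Main_Page
```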
Hi,
I don't want all the past versions of an article to be maintained, because for each minor change a new version is created and the db size is growing rapidly. I want to keep only the 5 most recent versions or so. Can we limit the number of versions stored in MediaWiki? Or can we delete the past unwanted versions through the UI?
Regards,
Jack Eapen C
SunTec Knowledge Centre