Brion, help. There are too many mails around - I cannot track them any longer.
May I propose: could you set up wikis for mediawiki-l and wikitech-l ***with e-mail notification***?
Newcomers could post their questions on the wiki, developers could answer, and whole message threads including solutions and patches would stay together and would be easier to maintain.
Of course, an installation with working e-mail notification would support this communication with much less mail traffic (remember, only _one_ e-mail is sent out for a watched page until the watching user re-visits it - which can be weeks later).
Perhaps: both wikis writable only for users who have registered with one of the mailing lists.
I propose installing my current en201+mw1.3.10 from http://meta.wikipedia.org/Enotif on something like wikitech.wikipedia.org.
Or be courageous and start trying it on meta.wikimedia.org.
Good or bad idea ? Tom
On Fri, 18 Feb 2005 22:26:41 +0100, Thomas Gries mail@tgries.de wrote:
Brion, help. There are too many mails around - I cannot track them any longer.
Same. I was reading everything for a day or two, decided which topics I was interested in, found I couldn't even read those, and basically gave up. I don't understand why things can't go on the talk pages or on meta:, and since gmail doesn't thread conversations, it would even be easier to read.
(I'm subscribed to wikitech, wikipedia and wikiEN)
Tomer Chachamu wrote in gmane.science.linguistics.wikipedia.technical:
On Fri, 18 Feb 2005 22:26:41 +0100, Thomas Gries mail@tgries.de wrote:
Brion, help. There are too many mails around - I cannot track them any longer.
Same. I was reading everything for a day or two, decided which topics I was interested in, found I couldn't even read those, and basically gave up. I don't understand why things can't go on the talk pages or on meta:, and since gmail doesn't thread conversations, it would even be easier to read.
have you tried reading the lists through gmane (http://www.gmane.org/)? it's a mail-to-news gateway which lets you read (and post to) mailing lists from a news reader. as news readers are rather more suited to the threaded discussion style that occurs on newsgroups and mailing lists, it's a fair bit easier.
kate.
Kate Turner schrieb:
have you tried reading the lists through gmane (http://www.gmane.org/)? it's a mail-to-news gateway which lets you read (and post to) mailing lists from a news reader. as news readers are rather more suited to the threaded discussion style that occurs on newsgroups and mailing lists, it's a fair bit easier.
Kate: A wiki was asked for, not a newsgroup-style thing. A wiki where you work together to solve a problem - basically one "page". The GMANE interface also lacks this feature. Thus, if you want to have the "whole" story and all the information, you still have to visit every posting in a thread.
This is why I and several others have proposed to "go wiki" with the mailing lists. Tom
Thomas Gries wrote in gmane.science.linguistics.wikipedia.technical:
Kate Turner schrieb:
have you tried reading the lists through gmane (http://www.gmane.org/)? it's a mail-to-news gateway which lets you read (and post to) mailing lists from a news reader. as news readers are rather more suited to the threaded discussion style that occurs on newsgroups and mailing lists, it's a fair bit easier.
Kate: A wiki was asked for, not a newsgroup-style thing. A wiki, where you work together to solve a problem - basically one "page".
wiki is not a good forum for general, open discussion; see http://c2.com/cgi/wiki?ThreadMode and http://c2.com/cgi-bin/wiki?WhyWikiWorksNot for example. ThreadMode should be used for discussions related to forming an eventual page in DocumentMode; on Wikipedia, this means talk pages being used to shape the content of an article. trying to read long discussions on a talk page is rather difficult, as is finding (and replying to) archived discussions.

what wiki is good for is archiving the results of previous discussions. when i open a list in my news reader, i see a list of current threads, and can read which ones i want. what i don't have, and what wiki has, is a whiteboard for preserving permanent discussion. i can link to a previous mailing list post, but the content i want to give someone may be split over several messages. what wiki can help with is taking the results of a discussion and allowing the participants to write up a concise summary of the important discussion points.

for example, for typical mediawiki-l questions, or for wikitech-l things such as discussing how to implement .htaccess authentication, wiki offers little benefit. there's no easy way to locate new discussions, and a talk page on its own offers nothing over a newsgroup. however, once a basic strategy for the .htaccess method has been designed (for example), it can be written up on the wiki, and changes can be discussed on the talk page. similarly, once a discussion on something has taken place on mediawiki-l, the result can be written up as a MediaWiki FAQ entry on the wiki.
that's where wiki is useful. it's not a replacement for mailing lists--it's just another discussion platform.
The GMANE interface also lacks this feature. Thus, if you want to have the "whole" story and all information, you must also visit all postings in a thread.
This is why I and several others have proposed to "go wiki" with the mailing lists.
Tom
kate.
I'm a member of the Green Party in New Zealand and we are increasingly using email lists for announcements, phpBB for general discussion, and MediaWiki for document creation and relevant discussion.
They all have their pros and cons. A wiki can be a very messy and difficult way to follow discussion.
Christiaan
On 20 Feb 2005, at 1:53 am, Thomas Gries wrote:
Kate Turner schrieb:
have you tried reading the lists through gmane (http://www.gmane.org/)? it's a mail-to-news gateway which lets you read (and post to) mailing lists from a news reader. as news readers are rather more suited to the threaded discussion style that occurs on newsgroups and mailing lists, it's a fair bit easier.
Kate: A wiki was asked for, not a newsgroup-style thing. A wiki, where you work together to solve a problem - basically one "page". The GMANE interface also lacks this feature. Thus, if you want to have the "whole" story and all information, you must also visit all postings in a thread.
This is why I and several others have proposed to "go wiki" with the mailing lists. Tom
Christiaan Briggs schrieb:
[...] A wiki can be a very messy and difficult way to follow discussion.
Not if you use email notification ;-)). Then everyone interested in a subject can follow it (but is not obliged to).
http://meta.wikipedia.org/Enotif
http://bugzilla.wikipedia.org/show_bug.cgi?id=454
Tom
Thomas Gries wrote in gmane.science.linguistics.wikipedia.technical:
Christiaan Briggs schrieb:
[...] A wiki can be a very messy and difficult way to follow discussion.
Not if you use email notification ;-)). Then everyone interested in a subject can follow it (but is not obliged to)
i'm not sure about anyone else, but for me it's significantly harder to receive every message as a notice in my inbox, and then have to load a web browser to actually participate in the discussion, as well as having to follow some kind of thread index page and "subscribe to" (watch) every thread i want to read. not to mention the problems with deeply threaded conversations on a wiki becoming rather hard to read (but i already discussed that earlier...)
also, most news clients will allow you to watch threads you're interested in, and ignore the ones you aren't.
Tom
kate.
Kate Turner schrieb:
but for me it's significantly harder to receive every message as a notice in my inbox
_One_ email notification is sent once the page you are watching is changed by someone other than you. Further changes do not trigger another mail. Only when you revisit the page is the flag cleared (and a further change by someone other than yourself would then trigger one new enotif). This in itself is not so sexy, and other wiki engines might have this feature too, but:
With Enotif, you also receive in your personalized email a link to the DIRECT DIFFERENCE VIEW BETWEEN YOUR LAST-SEEN(*) AND THE CURRENT REVISION (no matter when you read the enotif). This is so extremely useful that I should ask a bounty for it ... but it was an idea by Chris Phoenix ... (thanks again).
(*) The lvr feature relies on oldid, which currently does not survive delete/undelete cycles, but this will change at some point.
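A rough illustration of the notify-once logic described above - the names below are purely illustrative and are not the actual Enotif code:

  // on save of a watched page by someone other than the watcher
  if (!$watchItem->notificationPending) {
      // the mail carries a diff link: last-seen revision -> current revision
      sendEnotifMail($watcher, $title, $watchItem->lastSeenOldid);
      $watchItem->notificationPending = true;   // suppress mails for further changes
  }

  // when the watcher next views the page, the flag is cleared,
  // re-arming notification for the next change
  $watchItem->notificationPending = false;
  $watchItem->lastSeenOldid = $currentOldid;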
On Sun, 20 Feb 2005 14:59:53 +0100, Thomas Gries mail@tgries.de wrote:
but for me it's significantly harder to receive every message as a notice in my inbox
_One_ email notification is sent once the page, you are watching, is changed by someone else than you. Further changes do not trigger.
I think Kate's point was not about how many notifications, but about the difference between being notified that a change has been made, and actually reading the message directly.

Using a mailing-list subscription, or a newsgroup viewer, there are 2 steps to reading the message:
1) pick the conversation you want to read (by its subject)
2) read the messages therein

With a mail-notifying discussion forum *of any kind*, there are 3:
1) pick the conversation
2) follow the link in the notification
3) read the message

This would make it significantly *harder* to "browse" a large number of discussions, because you would be constantly switching between mail client and browser, and you would be less able to use additional features of your client to deal with the discussions in ways that suited you personally.
What's more, the "receive one notification and then no more" feature is actually something of a *disadvantage*, because (if I understand rightly) it removes the behaviour, common in traditional threaded environments, of new posts "bumping" a thread, even if they are replies to something that happened a long time ago. If you go away, don't check the notifications, and come back, you will not know which discussions are still active.
Please don't think I'm criticising your work on eNotif *in general*, but with the general consensus being that discussions on MediaWiki leave something to be desired anyway, I don't see that this one enhancement suddenly makes them better than longer-established media *for the same purpose*.
On Sat, 19 Feb 2005 12:31:35 +0000, Tomer Chachamu the.r3m0t@gmail.com wrote:
... since gmail doesn't thread conversations, it would even be easier to read.
So don't use GMail! Either, as Kate suggests, find a usenet client, and point it at the appropriate gmane group[s], or use a mail client that supports fully threaded discussions - Mozilla, for instance (and therefore, I presume, Thunderbird, which I haven't used) can switch any mailbox into a collapsible tree [think Windows explorer] thread view at the click of a button. And with GMail now offering POP access [1], you don't even need a different account to direct your subscriptions to!
While I (obviously) think Wikis are great, and MediaWiki in particular has some great functions, threaded discussion is *not* something to which they are well adapted.
[1] See http://gmail.google.com/support/bin/answer.py?ctx=gmail&answer=12103
Hello,
I wrote a couple of tag extensions that really need to bypass the cache mechanism, i.e. any page that uses that tag should always be re-created.
I know, this obviously impacts performance, but we are talking "small server" here, not Wikipedia :)
Nevertheless, there must be a way to achieve that, am I wrong ? I mean, it seems pretty reasonable that an extension might perform SQL queries, or inspect the current context, return time-dependent values, etc., and therefore the caching mechanism will just make them useless most of the time.
So far, the only two settings that I know of that disable the cache are:
- User preferences: misc settings: Disable page caching
- DefaultSettings.php: $wgCachePages
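For reference, the latter can be overridden site-wide with a one-liner in LocalSettings.php - a blunt instrument, since it turns off browser-side caching for every page and every visitor, not just for pages using the tags:

  # LocalSettings.php
  $wgCachePages = false;   # disallow client-side (browser) caching of pages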
In my extension callback, I tried the following code:

  $wgOut->addMeta("http:Pragma", "no-cache");
  $wgOut->addMeta("http:no-cache", NULL);
  $wgOut->addMeta("http:EXPIRES", "TUES, 31 DEC 1996 12:00:00 GMT");

which adds the following headers to the page:

  <meta http-equiv="Pragma" content="no-cache" />
  <meta http-equiv="no-cache" content="" />
  <meta http-equiv="EXPIRES" content="TUES, 31 DEC 1996 12:00:00 GMT" />

but without success.
So I assume my extension callback is called *after* the decision is made that the page should be served from the cache and not re-created ?
Unless I'm wrong, OutputPage::checkLastModified() seems to be the point where that very decision is made. Is there any way to influence the outcome of that function from an extension? It sure tests for $wgCachePages and the user's "no-cache" option, then does some magic on HTTP_IF_MODIFIED_SINCE.
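Roughly, that "magic" amounts to something like the following - a paraphrase of the behaviour described above, not the actual MediaWiki source, and the helper and option names are illustrative only:

  // OutputPage::checkLastModified($timestamp), roughly:
  // returns true (and sends "304 Not Modified") if the browser's cached copy is still fresh,
  // in which case the page body is never regenerated at all
  if (!$wgCachePages) return false;                        // site-wide kill switch
  if ($wgUser->getOption('nocache')) return false;         // per-user "Disable page caching" preference
  if (empty($_SERVER['HTTP_IF_MODIFIED_SINCE'])) return false;
  $since = toMwTimestamp($_SERVER['HTTP_IF_MODIFIED_SINCE']);  // illustrative helper
  if ($timestamp <= $since) {                               // page (cur_touched) untouched since the browser cached it
      header('HTTP/1.0 304 Not Modified');
      return true;
  }
  return false;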
Any clue ?
Thanks
-- Sebastien Barre
On Sun, 20 Feb 2005 12:43:58 -0500, Sebastien BARRE sebastien.barre@kitware.com wrote:
Hello, In my extension callback, I tried the following code:
  $wgOut->addMeta("http:Pragma", "no-cache");
  $wgOut->addMeta("http:no-cache", NULL);
  $wgOut->addMeta("http:EXPIRES", "TUES, 31 DEC 1996 12:00:00 GMT");
The problem is (possibly) that there are multiple levels of caching within MediaWiki, some of which are completely independent of the HTTP dialogue with the browser. Don't ask me what they all are, or how they work, but I gather that they are many and complex. :)
See also another recent thread on the same topic: * http://mail.wikimedia.org/pipermail/mediawiki-l/2005-February/thread.html#36... (particularly http://mail.wikimedia.org/pipermail/mediawiki-l/2005-February/003632.html and http://mail.wikimedia.org/pipermail/mediawiki-l/2005-February/003635.html)
Basically, MediaWiki is designed for dynamic editing of static content; for static editing of dynamic content, you need a different tool.
At 2/20/2005 01:12 PM, Rowan Collins wrote:
On Sun, 20 Feb 2005 12:43:58 -0500, Sebastien BARRE sebastien.barre@kitware.com wrote:
Hello, In my extension callback, I tried the following code:
  $wgOut->addMeta("http:Pragma", "no-cache");
  $wgOut->addMeta("http:no-cache", NULL);
  $wgOut->addMeta("http:EXPIRES", "TUES, 31 DEC 1996 12:00:00 GMT");
The problem is (possibly) that there are multiple levels of caching within MediaWiki, some of which are completely independent of the HTTP dialogue with the browser. Don't ask me what they all are, or how they work, but I gather that they are many and complex. :)
I think I nailed it though:
  global $wgTitle;
  if ($wgTitle) {
      $ts = mktime();
      $now = gmdate("YmdHis", $ts + 60);
      $ns = $wgTitle->getNamespace();
      $ti = wfStrencode($wgTitle->getDBkey());
      $sql = "UPDATE cur SET cur_touched='$now' WHERE cur_namespace=$ns AND cur_title='$ti'";
      wfQuery($sql, DB_WRITE);
  }
Note that this code is nearly identical to Title::invalidateCache(). The difference is that if I call invalidateCache() from my extension code, it sets cur_touched to 'now' and the cache is created *afterwards*, so the cache is newer than cur_touched anyway. The trick here is that I set cur_touched in the future - nothing too intrusive, let's say 'now' + 60 seconds - on the assumption that the cache entry will be created within 60 or 120 seconds after my extension code has been executed. That way, cur_touched is always fresher than the cache, and the page always gets re-created. Am I missing something?
I also threw in the following code, which I found in various threads. It doesn't hurt, I guess.

  global $wgOut;
  $wgOut->enableClientCache(false);
  $wgOut->addMeta("http:Pragma", "no-cache");
  $wgOut->addMeta("http:no-cache", NULL);
  $wgOut->addMeta("http:EXPIRES", "TUES, 31 DEC 1996 12:00:00 GMT");
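To put the trick in context, a minimal tag-extension skeleton along these lines might look as follows - the tag and function names are made up for illustration, and the exact setHook callback signature varies between MediaWiki versions:

  # in extensions/MyTime.php, included from LocalSettings.php
  $wgExtensionFunctions[] = "wfMyTimeSetup";

  function wfMyTimeSetup() {
      global $wgParser;
      $wgParser->setHook("mytime", "wfRenderMyTime");   # handles <mytime>...</mytime>
  }

  function wfRenderMyTime($input) {
      global $wgTitle;
      # cache-busting trick from above: push cur_touched slightly into the future
      if ($wgTitle) {
          $now = gmdate("YmdHis", mktime() + 60);
          $ns = $wgTitle->getNamespace();
          $ti = wfStrencode($wgTitle->getDBkey());
          wfQuery("UPDATE cur SET cur_touched='$now' WHERE cur_namespace=$ns AND cur_title='$ti'", DB_WRITE);
      }
      # the actual time-dependent output
      return "Rendered at " . gmdate("H:i:s") . " UTC";
  }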
-- Sebastien Barre
"Sebastien" == Sebastien BARRE sebastien.barre@kitware.com writes:
Hello, I wrote a couple of tag extensions that really need to bypass the cache mechanism, i.e. any page that uses that tag should always be re-created.
I know, this obviously impacts performance, but we are talking "small server" here, not Wikipedia :)
Nevertheless, there must be a way to achieve that, am I wrong ? I mean, it seems pretty reasonable that an extension might perform SQL queries, or inspect the current context, return time-dependent values, etc., and therefore the caching mechanism will just make them useless most of the time.
So far, the only two settings that I know that disable the cache are:
- User preferences: misc settings: Disable page caching
- DefaultSettings.php: $wgCachePages
In my extension callback, I tried the following code:
  $wgOut->addMeta("http:Pragma", "no-cache");
  $wgOut->addMeta("http:no-cache", NULL);
  $wgOut->addMeta("http:EXPIRES", "TUES, 31 DEC 1996 12:00:00 GMT");
You may want to look at http://article.gmane.org/gmane.science.linguistics.wikipedia.technical/14068 for a somewhat simpler way of doing the same.
At 2/21/2005 12:40 AM, Anders Wegge Jakobsen wrote:
You may want to look at http://article.gmane.org/gmane.science.linguistics.wikipedia.technical/14068
for a somewhat simpler way of doing the same.
No, that simple way does not work for extensions - other people besides me have reported this on mediawiki-l too. Neither does Title::invalidateCache().
I'm not 100% sure why, but I'll give it a shot: once you call enableClientCache(), it's already too late. The decision about serving the page from the cache or re-creating it from scratch is most probably made *before* the extension code is ever executed, since, basically, the extension is only executed to create the contents of the page (which makes sense). My solution works because I'm not saying "don't use my cache now", but "don't cache me next time". And the next time, of course, it re-executes "don't cache me next time", and so on.
-- Sebastien Barre