> I would suggest that it only applies when creating new pages -
> particularly if it is editable on-wiki,
I can imagine three places where the check would need to occur:
  1) On page creation
  2) On page move (in case someone tries to move a page to a restricted title)
  3) On page retrieval (in case the rules have changed since the page was
     first created)
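To make that concrete, here's a rough sketch of what I have in mind - Python rather than PHP, purely to illustrate the logic, with placeholder names rather than real MediaWiki hooks. One predicate, consulted from all three places:

    import re

    def title_is_blocked(title, blacklist, whitelist):
        """True if the title matches a blacklist rule and no whitelist rule."""
        if any(rx.search(title) for rx in whitelist):
            return False
        return any(rx.search(title) for rx in blacklist)

    # The same predicate gets consulted from all three places:
    #   1) page creation  - refuse to save an edit that would create the page
    #   2) page move      - refuse a move whose destination title is blocked
    #   3) page retrieval - flag (or hide) a page whose title became invalid
    #                       after the rules changed

    blacklist = [re.compile(r'^Buy cheap ')]   # toy example rules
    whitelist = []

    print(title_is_blocked('Buy cheap watches', blacklist, whitelist))  # True
    print(title_is_blocked('Main Page', blacklist, whitelist))          # False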
> otherwise you risk hiding legitimate content,
That's true - I certainly agree that this is a risk that must be evaluated by the person making the change. Perhaps a "what would this affect" special page to test out a filter prior to application?
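Something like that would only need the proposed pattern and an iterable of existing titles to test against (all_titles() below is an imaginary helper standing in for a query against the page table):

    import re

    def what_would_this_affect(candidate_pattern, existing_titles):
        """Preview which existing titles a proposed blacklist rule would
        match, without actually enabling the rule."""
        rx = re.compile(candidate_pattern)
        return [title for title in existing_titles if rx.search(title)]

    # e.g. what_would_this_affect(r'^Buy cheap ', all_titles())
    #      lists the pages that would be caught if the rule went live.

The same scan, run against the whole active ruleset rather than a single candidate rule, would also produce the "pages with invalid titles" report mentioned below.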
> or being unable to delete the vandalism.
A special page that shows all pages with invalid titles, with options to delete/rename them?
> A bad regex could disable the whole wiki, for example.
That's certainly possible. Such a problem could be avoided by hardcoding a rule that Sysops can bypass the filter for administration purposes.
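Roughly like this (again just illustrative Python, and the group name 'sysop' is an assumption):

    def title_is_blocked_for(user_groups, title, blacklist, whitelist):
        """Hardcoded escape hatch: anyone in the 'sysop' group skips the
        filter entirely, so even a catastrophically broad rule can't lock
        administrators out of cleaning up after it.
        blacklist / whitelist: lists of compiled regexes."""
        if 'sysop' in user_groups:
            return False
        if any(rx.search(title) for rx in whitelist):
            return False
        return any(rx.search(title) for rx in blacklist)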
> i.e. this is not a way of implementing regex-based page protection.
Does everyone feel this way? I like the idea of putting the filters in a MediaWiki page because it allows people with little or no PHP experience, or Sysops without filesystem-level access, to administer the site filters.
And the filters don't necessarily have to be regexes. Firefox's AdBlock add-on has a good filter mechanism that permits both regular expressions and simple '*'-based wildcarding. This wouldn't be hard to imitate.
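For illustration, here's how such a rule page might be parsed. The exact line format below (slashes around raw regexes, '*' as a wildcard everywhere else, '#' for comments) is my own assumption, loosely imitating AdBlock rather than copying its syntax:

    import re

    def parse_rule(line):
        """Turn one line of a [[MediaWiki:TitlesBlacklist]]-style page into
        a compiled regex, or None for blank lines and comments."""
        line = line.strip()
        if not line or line.startswith('#'):
            return None
        if len(line) > 1 and line.startswith('/') and line.endswith('/'):
            return re.compile(line[1:-1])             # raw regular expression
        # plain pattern: escape everything, then turn '*' back into '.*'
        return re.compile('^' + re.escape(line).replace(r'\*', '.*') + '$')

    def load_rules(page_text):
        """Parse the newline-separated rule page into a list of regexes."""
        return [r for r in map(parse_rule, page_text.splitlines()) if r]

    # A page containing the two lines
    #     Buy cheap *
    #     /^[0-9]+ free/
    # yields one wildcard rule and one raw regex.

Because plain patterns are escaped before the '*' substitution, someone who only knows the wildcard form can't accidentally write a rule that breaks as a regex.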
Your thoughts?
-- Jim
On 2/7/07, Mark Clements <gmane@kennel17.co.uk> wrote:
> "Jim Wilson" <wilson.jim.r@gmail.com> wrote in message
> news:ac08e8d0702071247x28a130dfva87b91e4859fb060@mail.gmail.com...
> > I suppose it goes without saying that this could be achieved with an
> > Extension. Perhaps some kind of regex blacklist or whitelisting extension?
> >
> > So for example you could have a page called [[MediaWiki:TitlesBlacklist]]
> > which could be a newline separated list of regular expressions to block
> > for title creation. (With of course an associated
> > [[MediaWiki:TitlesWhitelist]]). Then have an extension which applies the
> > rules prior to page submission, and also on retrieval (in case some
> > malicious user finds a way through).
>
> [Example snipped]
> I would suggest that it only applies when creating new pages -
> particularly if it is editable on-wiki, otherwise you risk hiding
> legitimate content, or being unable to delete the vandalism. A bad regex
> could disable the whole wiki, for example. i.e. this is not a way of
> implementing regex-based page protection.
>
> On a slightly tangential note, I don't know if other people have realised
> this, but with the new cascading protection you can block named pages from
> being created, by creating e.g. [[Project:Banned pages]], transcluding any
> pages that users shouldn't be able to create, and enabling cascading
> protection. All the included pages can now no longer be created (or
> require login to create, depending on protection level).
>
> - Mark Clements (HappyDog)