Tim Starling wrote:
Robert Rohde wrote:
For Andrew or anyone else who knows: can we assume that the filter is smart enough that if the first part of an AND clause fails, the other parts don't run (and similarly, if the first part of an OR succeeds)? If so, we can probably optimize rules by doing easy checks before complex ones.
No, everything will be evaluated.
I've since written and deployed branch optimisation code, which reduced run-time by about one third.
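With that in place, it's worth ordering a filter's conditions so the cheap checks come first. A made-up example (not a live filter; it assumes the documented user_editcount, article_namespace and added_lines variables and the rlike operator behave as described on the docs page):

    /* Cheap integer comparisons first: the expensive regex over
       added_lines only runs when both of them match. */
    user_editcount < 50 &
    article_namespace == 0 &
    added_lines rlike "replace-with-the-real-pattern"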
Note that the problem with rule 48 was that added_links triggers a complete parse of the pre-edit page text. It could be replaced by a check against the externallinks table. No amount of clever shortcut evaluation would have made it fast.
I've fixed this to use the DB instead for that particular context.
On Thu, Mar 19, 2009 at 11:54 AM, Platonides <Platonides@gmail.com> wrote:
PS: Why isn't there a link to Special:AbuseFilter/history/$id on the filter view?
There is.
I've disabled a filter or two which were taking well in excess of 150 ms to run, appeared to be targeted at specific vandals, and had no hits. The culprit seemed to be running about 20 regexes to determine whether an IP is in a particular range, where one call to ip_in_range would suffice. Of course, this is also a documentation issue, which I'm working on.
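The pattern looks roughly like this (made-up ranges, not the actual filter; for anonymous edits user_name holds the IP address):

    /* Before: one regex per /24 */
    user_name rlike "^10\.0\.0\." |
    user_name rlike "^10\.0\.1\." |
    user_name rlike "^10\.0\.2\." |
    /* ...and so on... */
    user_name rlike "^10\.0\.15\."

    /* After: a single range check covering the same sixteen /24s */
    ip_in_range(user_name, "10.0.0.0/20")

One function call instead of a pile of regex evaluations, and it reads a lot more clearly.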
To help a bit more with performance, I've also added a profiler within the interface itself. Hopefully this will encourage self-policing with regard to filter performance.