"Filters" based on meta tags are utterly useless in a world-editable website like Wikipedia. Parents wouldn't be able to trust that 5 seconds before Junior accessed the Tigger article that Goat Man hadn't replaced the article with an obscene story about Mickey, Minnie and Pluto with external links to the goat sex and other similar pictures under every clickable link. If I were the Goat Man I would specifically target "G" tagged articles.
IMO the only effective way to filter is to have humans hand-select specific article versions and have those versions automatically uploaded to another website that is not world-editable. One click and you are done. Hm, sounds like Larry's Sifter project (the software being developed is GPL'd, so there is no reason why there can't be many Sifters -- each with a different focus -- but each Sifter project would have to respect Wikipedia's policies and conventions to use the easy upload feature).
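To make the "one click" idea concrete, here is a minimal sketch of what a Sifter-style publisher could look like. This is not the actual Sifter code, just an illustration: it assumes MediaWiki-style permanent revision links (the ?oldid= URL parameter), and the article title and revision ID in the table are hypothetical placeholders.

    import os
    import urllib.parse
    import urllib.request

    # Hypothetical hand-picked revisions: article title -> revision ID.
    # In a real Sifter, human reviewers would choose these; they would
    # not be a hard-coded table.
    APPROVED_REVISIONS = {
        "Tigger": 123456789,  # placeholder revision ID
    }

    def fetch_pinned_revision(title, oldid):
        # Fetch one specific, immutable revision of an article.
        # Because the URL names an exact revision ID, later vandalism
        # of the live article cannot change what this returns.
        url = ("https://en.wikipedia.org/w/index.php?title="
               + urllib.parse.quote(title) + "&oldid=" + str(oldid))
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8")

    def publish_site(out_dir="sifter_site"):
        # Write each approved revision out as a static, non-editable page.
        os.makedirs(out_dir, exist_ok=True)
        for title, oldid in APPROVED_REVISIONS.items():
            html = fetch_pinned_revision(title, oldid)
            path = os.path.join(out_dir, title + ".html")
            with open(path, "w", encoding="utf-8") as f:
                f.write(html)

    if __name__ == "__main__":
        publish_site()

The point is that the published site contains only frozen, reviewed revisions, so Goat Man can vandalize the live wiki all he likes without any of it showing up downstream.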
We should face the fact that, since Wikipedia is a wiki, it has always been, and will always be, a work in progress. I don't see anything wrong with that at all, as long as there are stable Wikipedia article versions that people can put some trust in. Under the Sifter plan, Wikipedia will simply be a common resource that all the Sifter projects use and do their editing on.
-- Daniel Mayer (aka mav)
Free Software analogy: Wikipedia is like CVS. Yes, you can compile the raw, untested code that somebody wrote 10 minutes ago and run it on your computer, but don't complain if it makes your hard drive spin backwards and your computer catch fire. Sifter is more like a stable, pre-compiled release that has been deemed functional and mostly bug-free by somebody with a reputation on the line.