Hoi,
I am afraid that as the number of articles grows, redirects become
increasingly problematic, because more and more disambiguation will be
needed, and existing redirects are not considered when disambiguation is
implemented. Redirects ARE problematic, and by automagically creating
vastly more of them we only make the nightmare worse.
Thanks,
GerardM
On 10/24/07, Andrew Garrett <andrew(a)epstone.net> wrote:
On 10/24/07, Steve Bennett <stevagewp(a)gmail.com> wrote:
Possible implementation:
Without knowing the MediaWiki DB schema at all, I speculated on a
possible implementation that would be a good tradeoff between size and
speed. Two new tables are needed:
<snip>
No need for the complex setup you envisage. For MySQL, at least, we
could create a new table 'article_aliases' and run "select aa_page from
article_aliases where 'my_title' like aa_alias". Of course, we'd need
to do some built-in, potentially expensive checking on each alias when
it is first introduced: check whether any other page's title matches
the pattern (if so, block the alias), and whether the article title
itself matches the pattern (if not, block the alias).
I have no idea how portable this would be to postgres and other
database engines, but it could potentially work as an extension.
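To make the idea above concrete, here is a minimal sketch of the proposed alias table and lookup, using Python's sqlite3 in place of MySQL (SQLite's LIKE is case-insensitive for ASCII, much like MySQL's default collation). The table and column names (article_aliases, aa_page, aa_alias) come from the mail; everything else, including the add_alias/resolve helpers and the sample titles, is hypothetical illustration, not MediaWiki code:

```python
import sqlite3

# In-memory stand-in for the proposed MySQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE article_aliases (aa_page TEXT, aa_alias TEXT)")

def add_alias(page_title, alias_pattern, all_titles):
    """Insert an alias after the two sanity checks described above:
    block it if another page's title matches the pattern, or if the
    target page's own title does not match it. (all_titles stands in
    for a scan of the page table -- the 'potentially expensive' part.)"""
    matches = [t for t in all_titles
               if conn.execute("SELECT ? LIKE ?",
                               (t, alias_pattern)).fetchone()[0]]
    if any(t != page_title for t in matches):
        raise ValueError("alias pattern matches another page")
    if page_title not in matches:
        raise ValueError("alias pattern does not match the target title")
    conn.execute("INSERT INTO article_aliases VALUES (?, ?)",
                 (page_title, alias_pattern))

def resolve(title):
    # The lookup from the mail:
    #   select aa_page from article_aliases where 'my_title' like aa_alias
    row = conn.execute(
        "SELECT aa_page FROM article_aliases WHERE ? LIKE aa_alias",
        (title,)).fetchone()
    return row[0] if row else None

titles = ["Colour", "Color (band)"]
# '%' matches any run of characters in LIKE, so "Colo%r" covers both
# "Colour" and "Color" but not "Color (band)".
add_alias("Colour", "Colo%r", titles)
print(resolve("Color"))  # -> Colour
```

Note that LIKE patterns are far weaker than regexes (only '%' and '_' wildcards), which is what keeps the checks tractable; an index on aa_alias would not help the LIKE lookup itself, so the table would need to stay small or be cached.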
--
Andrew Garrett
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
http://lists.wikimedia.org/mailman/listinfo/wikitech-l