On 16/09/05, Tony Sidaway f.crdfa@gmail.com wrote:
On 9/16/05, Phil Boswell phil.boswell@gmail.com wrote:
Would it be really stupid to ask why the URL isn't presented in this form on the actual page?
I'm sure Tim or Brion know the answer to this, but I'd hazard a guess that the /w form bypasses the squid caches.
Actually, I think one of the prime reasons is precisely so that they can be excluded in robots.txt (it seems odd to me that an RSS aggregator would read robots.txt that way, but never mind). Any URL with extra parameters is likely to be a view unsuitable for crawlers, being either highly transitory or a different view of something already crawled, so those URLs aren't rewritten into the /wiki/ form. I could be wrong about this, though.
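For illustration, the exclusion described above could be expressed with a robots.txt along these lines, blocking the raw /w/ script path while leaving the canonical /wiki/ article URLs crawlable. This is a sketch of the general convention, not necessarily what the live site actually serves:

```
# Hypothetical robots.txt sketch: disallow the parameterised
# script URLs under /w/, so crawlers only see the /wiki/ form.
User-agent: *
Disallow: /w/
```

Since /wiki/ paths are not listed, a compliant crawler is free to fetch them, while anything like /w/index.php?title=...&action=history is skipped.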