On 21/01/07, beachboy22@verizon.net beachboy22@verizon.net wrote:
How exactly is MediaWiki indexed by Google in its out-of-box state (no extensions, etc.)? Does it create meta-tags or anything for each page?
Generally, regular pages will be indexed as normal, assuming robots.txt isn't blocking this; special pages shouldn't be indexed (they carry "noindex,nofollow" meta tags), and external links have rel="nofollow" applied.
You can customise the robot policies per namespace with $wgNamespaceRobotPolicies, and of course you can fine-tune things further with robots.txt; e.g. the Wikipedias usually block robots from indexing their deletion discussions.
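As a rough sketch, a LocalSettings.php entry for $wgNamespaceRobotPolicies might look like the following (the namespace choices here are just illustrative, not a recommendation):

```php
<?php
// In LocalSettings.php: map namespace constants to robot policy strings.
// Keys are namespace indexes (NS_* constants); values are standard
// robots meta content such as "noindex,nofollow" or "index,follow".
$wgNamespaceRobotPolicies = array(
    NS_TALK      => 'noindex,nofollow', // keep talk pages out of search results
    NS_USER      => 'noindex',          // index links from user pages, but not the pages
    NS_USER_TALK => 'noindex,nofollow',
);
```

And a robots.txt fragment along the lines of what the Wikipedias use to keep deletion discussions out of search engines (the exact paths depend on your wiki's URL layout and project namespace name):

```
User-agent: *
Disallow: /wiki/Wikipedia:Articles_for_deletion
Disallow: /wiki/Wikipedia:Miscellany_for_deletion
```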
Rob Church