On 11/22/05, S. Woodside sbwoodside@yahoo.com wrote:
Conjecture 1. That the distance between any two wikipedia pages, randomly chosen, as measured by wikilinks, is on average 6.
Conjecture 2. That wikipedia is sufficiently formal and complete that you could build a useful general purpose AI knowledge base using it.
Conjecture 3. That wikipedia has low information entropy.
Conjecture 4. That the development of a wikipedia article over time occurs in a manner consistent with the biological evolution of a species.
Conjecture 5. That the relationship between the amount of material in wikipedia and the number of article views is exponential.
Conjecture 6. That wikipedia is, on average, factually accurate.
Ohoh! Can I produce one?
Conjecture 7. The number of wanky conjectures produced by bloggers is bounded only by the number of sites that will allow them to spam their URLs.
Do I get a prize? ;)
On a more constructive note, if you're interested in AI taught using wikipedia... you should take a look at the state of the art in English parsers (such as [[Link Grammar]]) and how they fare on Wikipedia. It's impressive that it works at all, but I don't think we need to worry about a hostile takeover of the planet by a self-aware encyclopedia any time soon.
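For what it's worth, Conjecture 1 is at least easy to state precisely: "distance as measured by wikilinks" is just the shortest path in the link graph, which you can compute with a breadth-first search. Here's a minimal sketch on a made-up toy graph (the page names and links are invented for illustration, not real Wikipedia data):

```python
from collections import deque

# Toy wikilink graph: page -> pages it links to.
# These pages and links are made up for illustration only.
links = {
    "Kevin Bacon": ["Film", "Actor"],
    "Film": ["Camera", "Actor"],
    "Actor": ["Theatre"],
    "Camera": ["Lens"],
    "Theatre": ["Greece"],
    "Lens": [],
    "Greece": [],
}

def wikilink_distance(start, goal):
    """Minimum number of link hops from start to goal (BFS), or None."""
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        page, dist = queue.popleft()
        for nxt in links.get(page, []):
            if nxt == goal:
                return dist + 1
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # goal not reachable from start

print(wikilink_distance("Kevin Bacon", "Greece"))  # -> 3
```

Testing the conjecture for real would mean running this over a full link dump and averaging over random page pairs; note that wikilinks are directed, so distance isn't even symmetric.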