Jonathan Nowacki wrote:
Hello, I'm looking for the best way to dump the information in a locally installed MediaWiki so it can be subsequently parsed and then displayed in a network visualization program. For an example of this kind of visualization program, see http://www.centennial-software.com/_gfx/screenshots/discovery_visual_map.jpg or perform a Google search for "network visualization".
This will show how each page is related and allow curators to keep an eye on how the wiki is growing. I can write a parsing program myself and visualization programs are freely available on the web. I'm simply wondering what is the best way to dump everything into a static file.
Sounds like you want to dump a list of link relationships. You can pull a full list fairly simply:
SELECT page_namespace, page_title, pl_namespace, pl_title FROM page, pagelinks WHERE page_id = pl_from;
(That'd include the numeric namespace IDs; transform them to text appropriately if you're producing a full list of titles.)
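As a minimal sketch of that transformation step: the rows from the query above could be turned into a Graphviz DOT edge list that most network visualizers can load. The rows would normally come from a MySQL client connection; here they're passed in directly so the namespace mapping is the focus. The namespace table below covers only a few of MediaWiki's defaults, and the function names are purely illustrative.

```python
# Turn (page_namespace, page_title, pl_namespace, pl_title) rows from the
# query above into a Graphviz DOT digraph. The namespace-ID-to-name mapping
# here is a partial, hypothetical default set -- adjust it for your wiki.

NAMESPACE_NAMES = {
    0: "",            # main namespace has no prefix
    1: "Talk",
    2: "User",
    4: "Project",
    10: "Template",
    14: "Category",
}

def full_title(ns, title):
    """Prefix a title with its namespace name, roughly as MediaWiki displays it."""
    name = NAMESPACE_NAMES.get(ns, "NS%d" % ns)
    title = title.replace("_", " ")
    return "%s:%s" % (name, title) if name else title

def rows_to_dot(rows):
    """Render link rows as a DOT digraph: one edge per page -> linked page."""
    lines = ["digraph wiki {"]
    for page_ns, page_title, pl_ns, pl_title in rows:
        lines.append('  "%s" -> "%s";' % (full_title(page_ns, page_title),
                                          full_title(pl_ns, pl_title)))
    lines.append("}")
    return "\n".join(lines)

if __name__ == "__main__":
    # Example rows as the query would return them.
    sample = [(0, "Main_Page", 0, "Help_Contents"),
              (0, "Help_Contents", 14, "Help_pages")]
    print(rows_to_dot(sample))
```

The resulting .dot file is a static dump that any DOT-aware visualization tool can render, which keeps the parsing step independent of the wiki itself.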
-- brion vibber (brion @ wikimedia.org)