Here's one option:
1. Install the pywikipediabot and configure it for your site.
2. Concatenate the data for your pages into one large file, preferably
named dict.txt, with a {{start}} tag at the top of each page, a {{stop}}
tag at the bottom, and the '''pagename''' as the first bolded item on
each page (optionally enclosed in an HTML-style <!-- comment -->).
3. Run the login.py script to log in.
4. Run the pagefromfile.py script. It parses dict.txt and quickly
creates each page in the file on your wiki.
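To make the format in step 2 concrete, here is an illustrative (not from pywikipediabot's docs) dict.txt holding two pages, one with the title bolded in the text and one with it hidden in an HTML comment:

```text
{{start}}
'''First Page'''
Text of the first page goes here.
{{stop}}
{{start}}
<!-- '''Second Page''' -->
Text of the second page goes here.
{{stop}}
```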
This should be a fairly quick process, finishing in under an hour or
two. The pagefromfile.py script can occasionally bomb out. If that
happens, look at your wiki's recentchanges log, note the last page
created, delete everything in dict.txt from the top of the file down
through that page, and restart.
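If concatenating the files is the sticking point, a short script can build dict.txt in the format above. This is only a sketch, not part of pywikipediabot: it assumes one page per .txt file, and the filename (minus extension) as the page title; build_dict_txt is my own name for it.

```python
# Sketch only: build a pagefromfile.py-compatible dict.txt from a
# directory of per-page .txt files. Assumes each file holds exactly one
# page and that the filename (minus .txt) should become the page title.
from pathlib import Path

def build_dict_txt(src_dir, out_path):
    out_path = Path(out_path)
    with out_path.open("w", encoding="utf-8") as out:
        for page in sorted(Path(src_dir).glob("*.txt")):
            if page.resolve() == out_path.resolve():
                continue  # don't swallow our own output file
            title = page.stem  # assumption: filename is the pagename
            body = page.read_text(encoding="utf-8").strip()
            out.write("{{start}}\n")
            # pagename as the first bolded item, inside an HTML comment
            out.write(f"<!-- '''{title}''' -->\n")
            out.write(body + "\n{{stop}}\n")
```

Point it at the directory holding your page files, then feed the resulting dict.txt to pagefromfile.py as in step 4.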
Good luck!
Tim
-------- Original Message --------
Subject: [Mediawiki-l] Automating the import page?
From: "Glen Faires" <glen(a)madnys.com>
Date: Tue, August 15, 2006 11:58 pm
To: "'MediaWiki announcements and site admin list'"
<mediawiki-l(a)Wikimedia.org>
I'm still in the process of attempting to set up a wiki. However, that
involves importing some 7,000 files/pages. Concatenating the files hasn't
worked, so I am looking for an alternate method.
Does anyone have any ideas on how to get 7,000 files, 128 MB of text, onto
a wiki without having to upload one file at a time?
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)Wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l