Hello,
> Hello Victor,
> Your host will certainly be blocked if you keep requesting Special:Export.
> You should use the database dumps to get all the data at once :o)
> http://download.wikimedia.org/index.php?thingumy=wiktionary
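Downloading a dump by itself would be easy enough; a minimal sketch (the dump file name below is only a placeholder, to be picked from that download page):

    import urllib.request

    # Placeholder file name: the real one has to be picked from the
    # download page linked above.
    DUMP_URL = "http://download.wikimedia.org/wiktionary/SOME-DUMP-FILE.xml.bz2"
    urllib.request.urlretrieve(DUMP_URL, "wiktionary-dump.xml.bz2")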
The problem is what comes after the download: it is hard for me right now to set up such big databases quickly enough.
So please let us agree on a compromise:
- I will keep a cache of Wiktionary articles (currently maybe around 20-30 MB
of bzip2-compressed data, with the cache size increasing in the months to follow).
- I will use XML export to download several (e.g. 10) articles at once, and
update the cache only when articles are very old (a week? two weeks? a month?)
or when a user of my site explicitly requests a cache update. (Admittedly, when
somebody requests an article to be downloaded for the _first_ time, I do need
to make an XML export request for just that one article, not ten.)
- I will send an "Accept-Encoding: gzip" HTTP header with the export requests
so that the batches of downloaded articles are compressed with gzip (see the
sketch after this list).
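To make the compromise concrete, here is a rough sketch of the update step. The host name, cache layout, batch size, and the two-week threshold are only placeholders for whatever we agree on; it assumes the standard Special:Export form parameters ("pages" as newline-separated titles, "curonly" for current revisions):

    import gzip
    import os
    import time
    import urllib.parse
    import urllib.request

    EXPORT_URL = "https://en.wiktionary.org/wiki/Special:Export"  # assumed endpoint
    CACHE_DIR = "cache"        # hypothetical local cache directory
    MAX_AGE = 14 * 24 * 3600   # staleness threshold: two weeks, say

    def is_stale(title):
        # Stale = not cached yet, or cached longer ago than MAX_AGE.
        path = os.path.join(CACHE_DIR, urllib.parse.quote(title, safe="") + ".xml")
        return (not os.path.exists(path)
                or time.time() - os.path.getmtime(path) > MAX_AGE)

    def fetch_batch(titles):
        # One POST to Special:Export fetches several articles at once:
        # "pages" takes newline-separated titles, "curonly" skips old
        # revisions, and the gzip header compresses the transfer.
        data = urllib.parse.urlencode({
            "pages": "\n".join(titles),
            "curonly": "1",
            "action": "submit",
        }).encode()
        req = urllib.request.Request(EXPORT_URL, data=data,
                                     headers={"Accept-Encoding": "gzip"})
        with urllib.request.urlopen(req) as resp:
            body = resp.read()
            # Decompress only if the server actually honoured the header.
            if resp.headers.get("Content-Encoding") == "gzip":
                body = gzip.decompress(body)
        return body  # one XML document containing all requested pages

    # Usage: refresh up to ten stale articles in a single request.
    wanted_titles = ["dictionary", "word", "language"]  # example titles
    stale = [t for t in wanted_titles if is_stale(t)][:10]
    if stale:
        xml = fetch_batch(stale)
        # ...split the XML on its <page> elements and store each article
        # bzip2-compressed under CACHE_DIR here.

Batching this way should cut my request count roughly tenfold compared to one Special:Export hit per article.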
OK?
Additionally, I can now donate a little to help pay for the traffic.
--
== Victor Porton (porton(a)ex-code.com) ==
# http://ex-code.com - software company, custom software for low price #
# http://ex-code.com/~porton/ - Christian revelations, math discoveries #
# http://ex-code.com/articles/ - original programming/XML articles #