You're looking at a few technical challenges in accomplishing this. The
first is finding time to do a backup of the MySQL database.
Question: does that already happen? I would hope so. If so, you can
produce a dump from that backup with no extra load on the main server.
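Producing a compressed dump is essentially a one-liner once a backup copy or
replica exists. A rough sketch, assuming the database is called `wikidb`
(a placeholder name, as are the credentials in `~/.my.cnf`):

```shell
# Dump without holding long locks on live tables
# (--single-transaction gives a consistent snapshot for InnoDB;
#  MyISAM tables will still lock while being read).
# "wikidb" is a placeholder database name.
mysqldump --single-transaction --quick wikidb \
    | gzip > wikidb-$(date +%Y%m%d).sql.gz
```

Run against a replica or backup host rather than the main server, this costs
the live site nothing.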
Second, there's the issue of getting the dump to the duplicate site. How
big is the dump of 1.1 million articles? Probably too large for a normal
download. This leaves you with a few options.
1. You can burn it to multiple DVDs once and ship it, and he can live with
the increasingly out-of-date version.
2. He can pay for a subscription, and you can pay someone to burn DVDs of
the backup on a regular basis.
3. If you wind up with enough demand for it to pay for a separate server,
you can set up a BitTorrent feed on a regular (weekly or monthly) schedule.
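Publishing each dump as a torrent can itself be scripted. A sketch using
`mktorrent` — the tracker URL and filenames here are placeholders, not
anything the project actually runs:

```shell
# Create a .torrent for the latest dump; the announce URL is a placeholder.
mktorrent -a http://tracker.example.org/announce \
    -o wikidb-latest.torrent wikidb-latest.sql.gz
```

After that it's just seeding the file and posting the .torrent somewhere
subscribers can fetch it.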
Getting it loaded shouldn't be too much trouble. Essentially it would be no
different from restoring a backup after a crash. It would even be fairly easy to
automate on a subscription basis.
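On the subscriber's end the restore is the mirror image of the dump, and a
cron entry handles the periodic refresh. A sketch under the same assumed
names (`wikidb`, and a hypothetical `refresh-wikidb.sh` wrapper script):

```shell
# Load the compressed dump into a local MySQL instance
# ("wikidb" is a placeholder database name).
gunzip -c wikidb-latest.sql.gz | mysql wikidb

# Example crontab line: fetch and reload every Sunday at 03:00.
# 0 3 * * 0  /usr/local/bin/refresh-wikidb.sh
```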
Seriously, folks, the guy wants to hand someone cash money to get this
done. Why are you psychoanalyzing him? I'm certain that the Wikipedia
project could put any profit from said cash money to good use.