On 11/29/07, Steve Bennett <stevagewp(a)gmail.com> wrote:
> What is "the live system/site"? Are all the Wikimedia sites,
> including MediaWiki.org, Meta etc., running the same version of
> the software?
Yes, they're all run from a single set of files stored on NFS, which
are synchronized to the various Apache servers using the scap utility.
The differences between sites, such as different databases and
settings, are handled by what is effectively a single
LocalSettings.php equivalent, with conditionals based on the Host
header of the request.
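To illustrate the idea of one shared configuration file with per-site
conditionals keyed on the request's Host header, here is a minimal
sketch. All names and values (the hostnames, the settings keys, the
`settings_for` helper) are hypothetical, not the actual Wikimedia
configuration:

```python
# Shared defaults applied to every wiki (hypothetical values).
DEFAULT_SETTINGS = {
    "db_cluster": "s1",
    "read_only": False,
}

# Per-host overrides, all living in the one shared file.
OVERRIDES = {
    "test.wikipedia.org": {"db_cluster": "s-test"},
    "meta.wikimedia.org": {"db_cluster": "s7"},
}

def settings_for(host: str) -> dict:
    """Return the effective settings for the wiki served at `host`:
    the shared defaults, with any host-specific overrides applied."""
    settings = dict(DEFAULT_SETTINGS)
    settings.update(OVERRIDES.get(host, {}))
    return settings
```

The point is that every site reads the same file; only the Host value
selects which branch of the configuration applies.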
Large changes are usually handled by small-scale testing (on
non-content wikis such as private wikis, or perhaps
test.wikipedia.org), followed by a rollout to all sites, and, if
necessary, a revert and another round of small-scale testing. I
imagine it's not considered acceptable to have code with possibly
major bugs running on even small sites, and if you think the code
isn't buggy, the best way to test that is with as wide a deployment as
possible. The problems with the new parser were detected within
minutes by a flood of English-speaking users from major wikis coming
into #wikimedia-tech; how long would it have taken if we had only
rolled it out on some of the tiny wikis?
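The rollout pattern described above (test small, deploy wide, revert
on trouble) can be sketched as follows. This is purely an
illustration of the control flow, not how scap works; the wiki lists,
the `healthy` probe, and the `deploy` helper are all hypothetical:

```python
SMALL_WIKIS = ["test.wikipedia.org"]
ALL_WIKIS = ["test.wikipedia.org", "en.wikipedia.org", "meta.wikimedia.org"]

def deploy(version, wikis, deployed):
    """Record `version` as live on each wiki (stand-in for a real sync)."""
    for wiki in wikis:
        deployed[wiki] = version

def rollout(version, previous, healthy, deployed):
    """Deploy to small wikis first, then everywhere; revert on failure.

    `healthy(wiki)` is a hypothetical health probe returning bool.
    Returns True if the new version is live everywhere.
    """
    deploy(version, SMALL_WIKIS, deployed)
    if not all(healthy(w) for w in SMALL_WIKIS):
        deploy(previous, SMALL_WIKIS, deployed)  # revert, test more
        return False
    deploy(version, ALL_WIKIS, deployed)
    if not all(healthy(w) for w in ALL_WIKIS):
        deploy(previous, ALL_WIKIS, deployed)  # revert everywhere
        return False
    return True
```

The trade-off in the text maps onto the `healthy` probe: a wide
deployment gives you many more "probes" (users reporting problems), so
failures surface in minutes rather than lingering on tiny wikis.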