Hi!
One of the bigger questions I have about the potential shift to requiring services is the fate of shared hosting deployments of MediaWiki. What will happen to the numerous MediaWiki installs on shared hosting providers like 1and1, Dreamhost or GoDaddy when running MediaWiki requires multiple node.js/java/hack/python stand-alone processes to function? Is the MediaWiki community making a conscious decision to abandon these customers? If so, should we start looking for a suitable replacement that can be recommended, and possibly develop tools to ease the migration away from MediaWiki to another monolithic wiki application? If not, how are we going to ensure that pure-PHP alternate implementations get equal testing and feature development if they are not actively used on the Foundation's project wikis?
I think we're trying to fulfill a somewhat contradictory requirement here - running both a site the size of *.wikipedia.org and a 1-visit-a-week-maybe $2/month shared hosting install on the same software. I think it would be increasingly hard to be committed to both equally. However, with a clear API architecture we could maybe have alternatives - i.e. the same service could be performed by a superpowered cluster or by a PHP implementation at the same URL on the same host. Of course, the latter would have much worse performance and maybe a reduced feature set - how large that delta is, we'd need to decide. But if we have a clear architecture, I think it should be possible to define a minimum capability and have the ability to degrade from a full-power install to a reduced-capability install (e.g. blazingly fast fulltext search vs. just a limited, slow 'LIKE %word%' search), while still keeping the same architecture. Is this something we could do (or, having the API, let willing people from the community do it)? Is the $2/month use case important enough to actually do it?
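To make that a bit more concrete, here is a rough sketch of what such a swappable service boundary could look like in PHP. The interface, class and config key names are made up for illustration - this is not existing MediaWiki code, just the shape of the idea:

<?php
// Hypothetical sketch only -- interface, class and config key names
// are illustrative, not existing MediaWiki code.

interface SearchBackend {
	/**
	 * @param string $query
	 * @param int $limit
	 * @return string[] page titles matching the query
	 */
	public function search( $query, $limit = 20 );
}

// Full-power install: delegate to an external search service over HTTP
// (assumes allow_url_fopen and a JSON array response).
class ClusterSearchBackend implements SearchBackend {
	private $serviceUrl;

	public function __construct( $serviceUrl ) {
		$this->serviceUrl = $serviceUrl;
	}

	public function search( $query, $limit = 20 ) {
		$url = $this->serviceUrl . '?q=' . urlencode( $query ) . '&limit=' . (int)$limit;
		$response = file_get_contents( $url );
		$titles = $response === false ? null : json_decode( $response, true );
		return is_array( $titles ) ? $titles : array();
	}
}

// Reduced-capability install: plain SQL LIKE, slow but needs nothing
// beyond the database the wiki already has.
class LikeSearchBackend implements SearchBackend {
	private $db;

	public function __construct( PDO $db ) {
		$this->db = $db;
	}

	public function search( $query, $limit = 20 ) {
		$stmt = $this->db->prepare(
			'SELECT page_title FROM page WHERE page_title LIKE :q LIMIT ' . (int)$limit
		);
		$stmt->execute( array( 'q' => '%' . $query . '%' ) );
		return $stmt->fetchAll( PDO::FETCH_COLUMN );
	}
}

// Configuration decides which backend a given install gets; callers only
// ever see the SearchBackend interface.
function makeSearchBackend( array $config ) {
	if ( isset( $config['searchServiceUrl'] ) ) {
		return new ClusterSearchBackend( $config['searchServiceUrl'] );
	}
	return new LikeSearchBackend( new PDO( $config['dsn'] ) );
}

The caller never needs to know which backend it got; the shared-hosting install just silently gets the slow one, and the real question becomes how much of the feature set we're willing to let degrade that way.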