Google can certainly index our beloved, well-behaved, text- and context-rich, low-bandwidth sites.*
The fact that this happens with Google's index and not with others implies it's within their control.
If you're getting boilerplate responses about SEO, you may not be talking to the people who care about this or can resolve it.
I wonder if we can make this easier for indexers to understand and address by
a) maintaining an index of essential free knowledge
-- a star catalog of sites in the constellation: including our core sites, MDwiki, &c,
-- pointers for each to a sitemap or equivalent, and a change-feed or equivalent
b) maintaining visualizations of index speed and coverage, via spot checks (a rough sketch of both follows below)
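
To make (a) and (b) a bit more concrete, here is a minimal sketch, assuming Python with the `requests` library. The catalog entries are placeholders (including the mdwiki.org API path), the sampling uses the standard MediaWiki recentchanges API, and is_indexed() is deliberately left as a stub, since how to check presence in Google's index is exactly the open question:

    # Sketch only: a "star catalog" of sites plus a periodic spot check.
    # Assumes Python 3 and the `requests` library; entries are placeholders.
    import time
    import requests

    CATALOG = [
        # name + MediaWiki API endpoint; real entries would also carry a
        # pointer to each site's sitemap or change-feed equivalent
        {"name": "enwiki", "api": "https://en.wikipedia.org/w/api.php"},
        {"name": "eswiki", "api": "https://es.wikipedia.org/w/api.php"},
        {"name": "mdwiki", "api": "https://mdwiki.org/w/api.php"},  # assumed path
    ]

    def recent_new_pages(api, limit=20):
        """Sample recently created mainspace pages via the recentchanges API."""
        params = {
            "action": "query", "list": "recentchanges", "rctype": "new",
            "rcnamespace": 0, "rclimit": limit,
            "rcprop": "title|timestamp", "format": "json",
        }
        r = requests.get(api, params=params, timeout=30)
        r.raise_for_status()
        return r.json()["query"]["recentchanges"]

    def is_indexed(site, title):
        """Stub: whatever check we settle on -- Search Console inspection,
        a polite site: probe, or a partner feed report."""
        raise NotImplementedError

    def spot_check():
        """Print, per site, how many newly created pages were sampled;
        with is_indexed() filled in, also the fraction already findable."""
        stamp = time.strftime("%Y-%m-%d %H:%M")
        for site in CATALOG:
            pages = recent_new_pages(site["api"])
            # hits = sum(is_indexed(site["name"], p["title"]) for p in pages)
            print(f'{stamp}  {site["name"]}: sampled {len(pages)} new pages')

    if __name__ == "__main__":
        spot_check()

Run periodically, the per-site sample sizes and hit rates would be enough to plot index speed and coverage over time.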
SJ
* Jorge wrote: "we don’t have any influence or can decide what Google indexes..." -- we seem to have a good deal of soft influence.
"...or where Wikimedia content ranks in their search" -- as I understand it, this isn't about search rank at all. It's about being able to find newly added knowledge, that doesn't exist anywhere else online, in a range of languages. (asking about search rank may rightly trigger a boilerplate immune response)
** Scholar and Patents have their own feeds they prioritize; this could be a similar carve-out of attention. The sitemaps don't need to be accessible to "any spider on the web" (if this is why we turned them off). Something that only shows pages created or changed in the last window would also suffice.
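
A rough sketch of that last point, again assuming Python with `requests`: a sitemap fragment covering only mainspace pages created or changed in the last 24 hours, built from the public recentchanges API. The endpoint, base URL, and window size are placeholders, and title-to-URL encoding is simplified:

    # Sketch only: a sitemap fragment listing pages created or changed in the
    # last WINDOW_HOURS, built from the public recentchanges API.
    # Assumes Python 3 and `requests`; endpoint and base URL are placeholders.
    from datetime import datetime, timedelta, timezone
    from urllib.parse import quote
    import requests

    API = "https://en.wikipedia.org/w/api.php"
    BASE = "https://en.wikipedia.org/wiki/"
    WINDOW_HOURS = 24

    def changed_in_window(hours=WINDOW_HOURS, limit=500):
        since = datetime.now(timezone.utc) - timedelta(hours=hours)
        params = {
            "action": "query", "list": "recentchanges",
            # default order is newest-first, so rcend is where enumeration stops
            "rcend": since.strftime("%Y-%m-%dT%H:%M:%SZ"),
            "rcnamespace": 0, "rclimit": limit,
            "rcprop": "title|timestamp", "format": "json",
        }
        r = requests.get(API, params=params, timeout=30)
        r.raise_for_status()
        return r.json()["query"]["recentchanges"]

    def sitemap_fragment(changes):
        urls = []
        for c in changes:
            loc = BASE + quote(c["title"].replace(" ", "_"))
            urls.append(f"  <url><loc>{loc}</loc><lastmod>{c['timestamp']}</lastmod></url>")
        return ('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                + "\n".join(urls) + "\n</urlset>")

    if __name__ == "__main__":
        print(sitemap_fragment(changed_in_window()))

A fragment like this could sit behind whatever access restriction we like; it never has to enumerate the whole site.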