In the context of <https://bugzilla.wikimedia.org/show_bug.cgi?id=10621>,
the concept of using wiki pages as databases has come up. We're already
beginning to see this:
(over 30,000 lines)
(over 7,400 lines)
At large enough sizes, the in-browser syntax highlighting is currently
problematic. But it's also becoming clear that the larger underlying
problem is that treating a single wiki page, fetched in one request, as a
database isn't really scalable or sane.
(As I understand it, the performance of ParserFunctions' #switch used to
rule out most ideas of using a wiki page as a database.)
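To make the #switch concern concrete: a page of #switch cases is effectively one flat lookup table that gets rescanned on every query, whereas a real database can index its keys. A minimal sketch of that difference, assuming a hypothetical page of ~30,000 key/value entries (the entry names here are invented for illustration):

```python
def switch_lookup(entries, key):
    # Models #switch behavior: scan the cases in order until one matches.
    # Cost grows linearly with the size of the page.
    for k, v in entries:
        if k == key:
            return v
    return None  # no #default case in this sketch

# Hypothetical page contents: ~30,000 entries, as in the examples above.
entries = [("item%d" % i, i) for i in range(30000)]

# What a real database index provides instead: near-constant-time lookup.
index = dict(entries)

assert switch_lookup(entries, "item29999") == 29999  # worst case: full scan
assert index["item29999"] == 29999                   # hash lookup
```

The point of the sketch is only that every #switch lookup pays for the whole page, so the cost compounds as both the page and the number of lookups grow.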
Has any thought been given to what to do about this? Will it require
manually paginating the data over collections of wiki pages? Will this be
something to use Wikidata for?