Hey SJ!
Intuitively one model per wiki has a lot of merit.
< if we continued down that path, Lift Wing would eventually be hosting 3000+ models (i.e. 330 models per new feature) pretty quickly
< But with language agnostic models we can make that model available to all communities.
Thanks! Agreed that having a model available to all communities is good for equity :) In the Automoderator case, is it that the multilingual model incorporates the language-agnostic model, but not vice versa? Is there a way to have the inverse: a generalized multilingual model that can be fine-tuned for different communities, but still does its best with input in less-known languages or variants? [Perhaps w/ context cues for users, estimating how far out of distribution the input is.]
I like the idea of a general model that can be tuned, since I can imagine community groups maintaining datasets for fine-tuning more easily than maintaining entire models of their own.
Warmly, SJ