Hi Amir & others,
I’m glad we are making changes to improve DB storage/query efficiency, but I want to echo Tacsipacsi’s point that dropping the data before the migration has completed is a really bad outcome. Tool maintainers now either have to track which wikis have migrated and which haven’t, or carry extra code complexity to handle both schemas at once, and there is little time to make those changes for those of us who had planned to wait until the new data was available.
> Commons has grown to 1.8TB already
That’s a big number, yes, but it doesn’t really answer the question: is the database actually about to fill up? How much time do you have until that happens, and how much time until s1/s8 finish their migration? Is there a reason you can share why this work wasn’t started earlier, if the timing is so close?
> so you need to use the old way for the list of thirty-ish wikis (s1, s2, s6, s7, s8) and for any wiki not a member of that, you can just switch to the new system
IMO, a likely outcome is that some tools/bots will simply be broken on a subset of wikis until the migration is completed across all DBs.
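For what it’s worth, below is a rough sketch of the kind of branching every affected tool now has to carry. The wiki list, table, and column names are placeholders I made up, not the real schema:

```python
# Minimal sketch only: the wiki list, table, and column names are
# placeholders, not the actual schema involved in this migration.

# Wikis on the sections (s1, s2, s6, s7, s8) that still use the old schema.
OLD_SCHEMA_WIKIS = {"enwiki", "wikidatawiki"}  # ...plus the rest of the ~30

OLD_QUERY = "SELECT old_column FROM some_table WHERE some_id = %s"
NEW_QUERY = "SELECT new_column FROM some_table WHERE some_id = %s"

def pick_query(dbname: str) -> str:
    """Return the query shape matching the schema this wiki currently has."""
    return OLD_QUERY if dbname in OLD_SCHEMA_WIKIS else NEW_QUERY
```

And each tool then has to keep that wiki list in sync with the migration’s progress, which is exactly the sort of churn many of us were hoping to avoid by waiting.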
Thanks for all your work on this task so far.
Ben / Earwig