Hi Huji and Xabriel,
On Sat, Jan 20, 2024 at 9:43 PM Amir Sarabadani wrote, replying to Huji Lee huji.huji@gmail.com:
Yes, a communication would be great (including on this thread).
As it stands right now, the linktarget table is incomplete on the wikis I checked. For instance, this fawiki query https://superset.wmcloud.org/superset/sqllab/?savedQueryId=71 returns 388 results, but the equivalent query https://superset.wmcloud.org/superset/sqllab/?savedQueryId=72 using linktarget returns only 4 results.
As an update: the s7 https://noc.wikimedia.org/conf/dblists/s7.dblist run is now backfilling eswiki, which comes right before fawiki. Once that's done (in a couple of days at most), it will start backfilling fawiki. The rows you're seeing come from the write-both stage (new pages, new edits, etc.). Many wikis, for example frwiki and arwiki, are already fully backfilled.
My bots use quite a few queries that rely on pl_title or pl_namespace. I'd rather update the queries only after the linktarget table is fully backfilled, so I can compare the results of the old and new queries as a validation step.
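The old-vs-new comparison described above can be sketched as follows. This is a minimal illustration using SQLite with simplified schemas and made-up data; the table and column names (pagelinks.pl_target_id joining linktarget.lt_id) follow the MediaWiki schema change under discussion, but the page IDs and titles here are hypothetical:

```python
import sqlite3

# Simplified stand-ins for the MediaWiki tables involved in the migration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE linktarget ("
    " lt_id INTEGER PRIMARY KEY, lt_namespace INTEGER, lt_title TEXT)"
)
cur.execute(
    "CREATE TABLE pagelinks ("
    " pl_from INTEGER, pl_namespace INTEGER, pl_title TEXT,"
    " pl_target_id INTEGER)"
)

# One link target, written both ways (old columns and new target id),
# as during the write-both stage.
cur.execute("INSERT INTO linktarget VALUES (1, 0, 'Some_page')")
cur.execute("INSERT INTO pagelinks VALUES (42, 0, 'Some_page', 1)")

# Old query, reading the columns that will eventually be dropped:
old = cur.execute(
    "SELECT pl_from FROM pagelinks"
    " WHERE pl_namespace = 0 AND pl_title = 'Some_page'"
).fetchall()

# New query, joining through linktarget via pl_target_id instead:
new = cur.execute(
    "SELECT pl_from FROM pagelinks"
    " JOIN linktarget ON pl_target_id = lt_id"
    " WHERE lt_namespace = 0 AND lt_title = 'Some_page'"
).fetchall()

# Validation step: once the backfill is complete, both forms should
# return identical rows.
print(old == new)
```

The point of waiting for the backfill is exactly this check: while linktarget is incomplete, the new query silently returns fewer rows, so the equality comparison only becomes a meaningful regression test after the backfill finishes.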
I will inform you once the fawiki run is finished.
On Fri, Jan 19, 2024 at 4:08 PM Xabriel Collazo Mojica <xcollazo@wikimedia.org> wrote:
Amir,
To summarize: the only wiki that will soon have the old columns dropped is commonswiki; the rest of the wikis will keep the old columns until the migration to the new columns is complete on all wikis, at which point there will be a communication.
Is this correct?
Yes, until further communication, only s4 (commonswiki and testcommonswiki) and testwiki (s3) will have their old columns removed.
Thanks,
-- Xabriel J. Collazo Mojica (he/him, pronunciation: https://commons.wikimedia.org/wiki/File:Xabriel_Collazo_Mojica_-_pronunciation.ogg), Sr Software Engineer, Wikimedia Foundation
On Wed, Jan 17, 2024 at 9:57 PM Ben Kurtovic wikipedia.earwig@gmail.com wrote:
Thanks for the clear explanation, this gives more context for the urgency.
On Jan 17, 2024, at 3:04 PM, Amir Sarabadani <asarabadani@wikimedia.org> wrote:
What about only dropping it from Commons to reduce the risk of outage, and leaving the rest until all are finished (or all except Wikidata)? You'd have to write something for the new schema regardless.
If we insist on dropping now, I do think treating Commons as special and waiting until all are finished before dropping the rest would be better. It’s a small difference but potentially makes the logic simpler to implement.
Ben / Earwig
_______________________________________________
Cloud mailing list -- cloud@lists.wikimedia.org List information: https://lists.wikimedia.org/postorius/lists/cloud.lists.wikimedia.org/