https://bugzilla.wikimedia.org/show_bug.cgi?id=63327
Bug ID: 63327
Summary: Migrate i18n file format to JSON
Product: Pywikibot
Version: unspecified
Hardware: All
OS: All
Status: NEW
Keywords: i18n
Severity: normal
Priority: Unprioritized
Component: i18n
Assignee: Pywikipedia-bugs@lists.wikimedia.org
Reporter: siebrand@kitano.nl
CC: niklas.laxstrom@gmail.com, valhallasw@arctus.nl
Web browser: ---
Mobile Platform: ---
We've recently started using JSON for i18n files in MediaWiki and for some other products. We think it's a good non-executable format for i18n/L10n.
It would be nice if Pywikibot would also start using this format. It would also allow us to remove the less than excellent Python file format support in the Translate extension.
--- Comment #1 from Merlijn van Deen valhallasw@arctus.nl --- One of the reasons we are using the current file format is its structure: in pywikibot, the typical use case is 'getting a few strings from a lot of languages', while the general use case (e.g. for MediaWiki) is 'getting a lot of strings from a single language'.
On tools-login, parsing all of MediaWiki's JSON files takes 3-5 seconds. For many bot users it might take a lot longer (slower computer, slower hard drive), which (I think) is not reasonable.
Switching to a JSON-based format with the same file structure would be OK with me.
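For illustration, a minimal sketch of what reading such a per-component JSON file could look like. The file name "add_text.json" and the message keys are hypothetical, not an agreed format; the layout simply mirrors the current Python dict structure, keyed language -> message key -> string.

    import json

    # Expected shape: {"en": {"add_text-summary": "..."}, "de": {...}, ...}
    # (file name and keys are made up for this sketch)
    with open("add_text.json", encoding="utf-8") as f:
        msgs = json.load(f)

    def translate(lang, key):
        # Fall back to English when a translation is missing (simplified).
        return msgs.get(lang, {}).get(key, msgs["en"][key])

    print(translate("de", "add_text-summary"))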
--- Comment #2 from Niklas Laxström niklas.laxstrom@gmail.com --- It shouldn't matter much for parsing time whether N messages are split across 50 or 1000 files (made-up numbers).
Betacommand Phoenixoverride@gmail.com changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |Phoenixoverride@gmail.com
--- Comment #3 from Betacommand Phoenixoverride@gmail.com --- Actually it would: each new file introduces more disk read time, plus file open and close time. Then, depending on the language, it may have to create a new JSON parser, parse the file, and then destroy the parser. With 1-2 files it's not that big of a deal, but as it scales up, the issue becomes more and more of a bottleneck.
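As a rough illustration of that per-file overhead, a toy benchmark that writes the same messages as one big file and as many one-message files, then compares parse times. The sizes are made up and absolute numbers depend heavily on the disk and OS caches.

    import json
    import os
    import tempfile
    import time

    # 10,000 synthetic messages (made-up size for this sketch).
    data = {"msg%d" % i: "some text" for i in range(10000)}
    tmp = tempfile.mkdtemp()

    # One big file.
    big = os.path.join(tmp, "all.json")
    with open(big, "w") as f:
        json.dump(data, f)

    # Many one-message files.
    small = []
    for i, (k, v) in enumerate(data.items()):
        p = os.path.join(tmp, "part%05d.json" % i)
        with open(p, "w") as f:
            json.dump({k: v}, f)
        small.append(p)

    t = time.time()
    with open(big) as f:
        json.load(f)
    print("one file:    %.3f s" % (time.time() - t))

    t = time.time()
    for p in small:
        with open(p) as f:
            json.load(f)
    print("many files:  %.3f s" % (time.time() - t))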
--- Comment #4 from Niklas Laxström niklas.laxstrom@gmail.com --- I can understand that, but I'm not convinced it is a bottleneck currently.
I am weighing this against the development effort needed in Translate to support it.
--- Comment #5 from Merlijn van Deen valhallasw@arctus.nl --- (In reply to Niklas Laxström from comment #4)
> I can understand that, but I'm not convinced it is a bottleneck currently.
As I mentioned, I *measured* the time needed to parse JSON files. For MediaWiki core, this takes *multiple seconds*. That's *multiple seconds* during *startup* of *each* script.
Currently, each script needs to read only a single file, which is also in a format that loads quickly (the Python bytecode is cached).
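A sketch of how such a measurement can be reproduced, assuming it is run from a MediaWiki core checkout (languages/i18n is where core keeps its per-language JSON files; numbers will vary per machine):

    import glob
    import json
    import time

    # Parse every per-language JSON file in MediaWiki core's i18n
    # directory and report the total time taken.
    start = time.time()
    files = glob.glob("languages/i18n/*.json")
    for path in files:
        with open(path, encoding="utf-8") as f:
            json.load(f)
    print("parsed %d files in %.2f s" % (len(files), time.time() - start))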
(In reply to Niklas Laxström from comment #4)
> I am weighing this against the development effort needed in Translate to support it.
I assume you are aware this switch won't exactly be free in terms of *our* time either, right? I'd rather work on site-based configuration or Python 3 support than change a file format for the sake of changing it.
--- Comment #6 from Niklas Laxström niklas.laxstrom@gmail.com --- Are you saying that you parse *all* of *MediaWiki's* i18n files on *startup* for all pywikibot scripts?
--- Comment #7 from Merlijn van Deen valhallasw@arctus.nl --- No, I'm saying we would need to parse *all* of *pywikibot's* i18n files on *startup* of all scripts if we were to use the JSON format currently used by MediaWiki.
--- Comment #8 from Niklas Laxström niklas.laxstrom@gmail.com --- MediaWiki core has about 500,000 strings (all languages together). Pywikibot has 11,000 strings. Assuming parsing time is linear in the number of strings, your example would take about 0.1 seconds for Pywikibot (11,000/500,000 of 3-5 seconds). That leaves some room for growth, non-linear parsing time and slower machines.
I would also imagine that, if necessary, you could dump all the messages on first startup into an efficient serialized format provided by Python and use that on subsequent loads. I might be able to help with that.
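A minimal sketch of that idea (the cache file name and helper are hypothetical): parse the JSON once, pickle the result, and reuse the pickle for as long as it is newer than every JSON source.

    import json
    import os
    import pickle

    CACHE = "i18n-cache.pickle"  # hypothetical cache file name

    def load_messages(json_paths):
        # Reuse the cached dump if it is newer than every JSON source.
        if os.path.exists(CACHE) and all(
                os.path.getmtime(CACHE) >= os.path.getmtime(p)
                for p in json_paths):
            with open(CACHE, "rb") as f:
                return pickle.load(f)
        # Otherwise parse the JSON files and refresh the cache.
        messages = {}
        for p in json_paths:
            lang = os.path.splitext(os.path.basename(p))[0]
            with open(p, encoding="utf-8") as f:
                messages[lang] = json.load(f)
        with open(CACHE, "wb") as f:
            pickle.dump(messages, f)
        return messages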
--- Comment #9 from Merlijn van Deen valhallasw@arctus.nl --- You're right, MW is much bigger. Could you provide the JSON-formatted output for pywikibot somewhere?
Siebrand Mazeland siebrand@kitano.nl changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |siebrand@kitano.nl
--- Comment #10 from Siebrand Mazeland siebrand@kitano.nl --- (In reply to Merlijn van Deen from comment #9)
> You're right, MW is much bigger. Could you provide the JSON-formatted output for pywikibot somewhere?
There isn't such a thing yet: there's a file format, but the conversion script does not yet exist. Pywikibot currently has 41 distinct i18n components with 129 strings.
Let's roughly double that: 100 components and 300 strings in, say, 70 languages. That's some 5,000 files and 21,000 strings.
MediaWiki core has 3,000 strings for English alone in a single file, so it's hard to compare. I'd suggest creating some dummy data based on the assumptions above.
Sample files are abundant: see i18n/en.json in any MediaWiki extension.
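A small sketch that generates dummy data along the lines of those assumptions (all component, key and language names are synthetic):

    import json
    import os

    # 100 components with 3 strings each (300 strings total) in 70
    # synthetic languages, one JSON file per component per language.
    langs = ["lang%02d" % i for i in range(70)]
    for c in range(100):
        component = "component%03d" % c
        os.makedirs(component, exist_ok=True)
        for lang in langs:
            data = {"%s-msg%d" % (component, k): "dummy text %d" % k
                    for k in range(3)}
            path = os.path.join(component, lang + ".json")
            with open(path, "w", encoding="utf-8") as f:
                json.dump(data, f, ensure_ascii=False)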
--- Comment #11 from Niklas Laxström niklas.laxstrom@gmail.com --- Or you can use the following Python script to create JSON files for testing:
import glob
import importlib
import json
import os

# Convert each legacy *.py i18n module into one JSON file per language.
for path in glob.glob("*.py"):
    module = path[:-3]  # module name without the ".py" extension
    try:
        messages = importlib.import_module(module).msg
    except AttributeError:
        continue  # module has no msg dictionary; skip it
    os.mkdir(module)
    for lang in messages:
        with open(os.path.join(module, lang + ".json"), "w",
                  encoding="utf-8") as f:
            json.dump(messages[lang], f, ensure_ascii=False)
--- Comment #12 from Gerrit Notification Bot gerritadmin@wikimedia.org --- Change 151113 had a related patch set uploaded by Ladsgroup: [BREAKING] [Bug 63327] Use JSON for i18n files
https://gerrit.wikimedia.org/r/151113
Gerrit Notification Bot gerritadmin@wikimedia.org changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|NEW                         |PATCH_TO_REVIEW
--- Comment #13 from Gerrit Notification Bot gerritadmin@wikimedia.org --- Change 151114 had a related patch set uploaded by Ladsgroup: [BREAKING] [Bug 63327] Use JSON for i18n files
https://gerrit.wikimedia.org/r/151114
John Mark Vandenberg jayvdb@gmail.com changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
             Blocks|                            |70936
John Mark Vandenberg jayvdb@gmail.com changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
           Priority|Unprioritized               |High
                 CC|                            |jayvdb@gmail.com
           Severity|normal                      |critical
--- Comment #14 from John Mark Vandenberg jayvdb@gmail.com --- This needs to be done before the next version is pushed to PyPI, which would be another beta or a release candidate.
--- Comment #15 from Gerrit Notification Bot gerritadmin@wikimedia.org --- Change 151114 had a related patch set uploaded by John Vandenberg: Use JSON for i18n files
https://gerrit.wikimedia.org/r/151114
John Mark Vandenberg jayvdb@gmail.com changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
           See Also|                            |https://bugzilla.wikimedia.org/show_bug.cgi?id=66897