The one in the API. They should be identical, though, except when someone has updated the wiki pages and I haven’t synced the update into the code yet – the wiki pages are just harder to parse. (The wiki pages also have some templates that aren’t available in the tool yet, but those are blocked on unresolved issues/questions, and I wouldn’t recommend using them.)

On 24.07.19 19:17, Denny Vrandečić wrote:
Lucas, what would you consider the canonical representation of the language knowledge – the one in the API or the one on the wiki pages?

On Wed, Jul 24, 2019 at 5:34 AM Lucas Werkmeister <mail@lucaswerkmeister.de> wrote:

Just a side note – if anyone wants to use the templates, it’s probably better to use the tool’s templates API rather than the wiki page itself: transcribing the templates into structured form takes some time, and there’s no need for someone else to do it again :)
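
A minimal sketch of fetching the templates in structured form, in Python – the endpoint URL and the shape of the response are assumptions here, so check the tool’s documentation for the actual templates API:

# Minimal sketch: fetch the Lexeme Forms templates as JSON.
# The endpoint URL and the response shape are assumptions; see the
# tool's documentation for the actual templates API.
import requests

TEMPLATES_API = 'https://lexeme-forms.toolforge.org/api/v1/template/'  # assumed

response = requests.get(TEMPLATES_API)
response.raise_for_status()
templates = response.json()  # assumed: {template_name: template_definition}

for name, template in templates.items():
    print(name, template.get('language_code', '?'), sep='\t')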

Cheers,
Lucas

On 24.07.19 02:03, Denny Vrandečić wrote:
Hey,

is anyone working on, or has anyone worked on, generating EntitySchemas from the Wikidata Lexeme Forms data that Lucas is collecting?

It seems that most of the data necessary for these is already there.

E.g. generating


from


(Pretend Danish and German were the same language – they obviously are not, but this is just to illustrate the idea.)
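
To make the idea concrete, here is a rough sketch of what such a generator could look like, in Python. The key names of the structured template are assumptions (the tool’s actual templates API may differ), and the emitted ShEx is only a skeleton rather than a finished EntitySchema:

# Rough sketch: derive a ShEx EntitySchema skeleton from a Lexeme Forms
# template. The key names ('language_item_id', 'lexical_category_item_id',
# 'forms', 'grammatical_features_item_ids') are assumptions about the
# structured template format.

def template_to_shex(template):
    """Build a ShEx skeleton for lexemes matching a template."""
    shex = [
        'PREFIX wd: <http://www.wikidata.org/entity/>',
        'PREFIX dct: <http://purl.org/dc/terms/>',
        'PREFIX ontolex: <http://www.w3.org/ns/lemon/ontolex#>',
        'PREFIX wikibase: <http://wikiba.se/ontology#>',
        '',
        'start = @<lexeme>',
        '',
        '<lexeme> {',
        f'  dct:language [ wd:{template["language_item_id"]} ] ;',
        f'  wikibase:lexicalCategory [ wd:{template["lexical_category_item_id"]} ] ;',
    ]
    # Require one lexical form per form in the template.
    for i, _ in enumerate(template['forms']):
        shex.append(f'  ontolex:lexicalForm @<form{i}> ;')
    shex.append('}')
    # Each form shape requires that form's grammatical features.
    for i, form in enumerate(template['forms']):
        features = ' '.join(f'wd:{q}' for q in form['grammatical_features_item_ids'])
        shex += ['', f'<form{i}> {{', f'  wikibase:grammaticalFeature [ {features} ]+ ;', '}']
    return '\n'.join(shex)

# Example with a made-up two-form template:
example = {
    'language_item_id': 'Q9035',          # Danish, as in the example above
    'lexical_category_item_id': 'Q1084',  # noun
    'forms': [
        {'grammatical_features_item_ids': ['Q110786']},  # singular
        {'grammatical_features_item_ids': ['Q146786']},  # plural
    ],
}
print(template_to_shex(example))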

If not, does anyone want to work on that, or cooperate on it?

Cheers,
Denny



_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata