*Here's your quick overview of what has been happening around Wikidata over the last week.*
Discussions
- New request for comments: How to handle heat treating as a qualifier for material properties? https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/How_to_handle_heat_treating_as_a_qualifier_for_material_properties_%3F
- New development input: Identify problems with adding new languages into Wikidata https://www.wikidata.org/wiki/Wikidata:Identify_problems_with_adding_new_languages_into_Wikidata
Events https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:Events
- Upcoming: "Researcher meets Curator", with a subquestion: "What are the consequences of collecting born-digital sources, working with digital network analysis and engaging with linked open data initiatives such as Wikidata", in Maastricht on 22 March 2019. Call for papers: https://www.academischerfgoed.nl/call-for-papers-researcher-meets-curator/
- Upcoming: Advanced Wikidata Training https://meta.wikimedia.org/wiki/CIS-A2K/Events/Advanced_Wikidata_Training_2018 in India, December 15-16
- Past: "Wikibase: configure, customize, and collaborate" workshop at SWIB 18 http://swib.org/swib18/programme.html in Bonn, Germany, on November 26, 2018. Workshop materials: https://stuff.coffeecode.net/2018/wikibase-workshop-swib18.html
- Past: EveryPolitician https://www.wikidata.org/wiki/Wikidata:WikiProject_every_politician event to identify political data sources for Wikidata, in Madrid, Spain, on December 1, 2018.
Other Noteworthy Stuff
- Author Disambiguator https://tools.wmflabs.org/author-disambiguator/ (source on GitHub: https://github.com/arthurpsmith/author-disambiguator/), a new tool by User:ArthurPSmith https://www.wikidata.org/wiki/User:ArthurPSmith (based on SourceMD) for linking author items to their works.
- OpenRefine 3.1 https://github.com/OpenRefine/OpenRefine/releases/tag/3.1 was released.
- New API module to format multiple entity IDs: https://lists.wikimedia.org/pipermail/wikidata-tech/2018-December/001356.html
- You can now access the number of Forms and Senses of Lexemes through the API and a special page: https://www.wikidata.org/wiki/Wikidata_talk:Lexicographical_data#Access_the_number_of_Forms_and_Senses_for_Lexemes
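As a rough illustration of the new formatting module announced above, the sketch below builds a request URL for it. It assumes the standard action-API parameter names (`action=wbformatentities`, `ids`, `uselang`); check the linked announcement for the authoritative details, and note that the actual HTTP call and session handling are left out.

```python
from urllib.parse import urlencode

# Standard Wikidata action-API endpoint.
API = "https://www.wikidata.org/w/api.php"

def format_entities_url(entity_ids, lang="en"):
    """Build a wbformatentities request URL for a list of entity IDs."""
    params = {
        "action": "wbformatentities",
        "ids": "|".join(entity_ids),  # several IDs in one request
        "uselang": lang,
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = format_entities_url(["Q2", "P31"])
# To actually fetch the formatted entity HTML you would then do, e.g.:
#   import requests; requests.get(url).json()
```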
Did you know?
- Newest properties https://www.wikidata.org/wiki/Special:ListProperties:
  - General datatypes: reference has role https://www.wikidata.org/wiki/Property:P6184, tautomer of https://www.wikidata.org/wiki/Property:P6185, eponymous category https://www.wikidata.org/wiki/Property:P6186, language style https://www.wikidata.org/wiki/Property:P6191, ratified by https://www.wikidata.org/wiki/Property:P6193, funding scheme https://www.wikidata.org/wiki/Property:P6195
  - External identifiers: GameFAQs company ID https://www.wikidata.org/wiki/Property:P6182, AJOL journal ID https://www.wikidata.org/wiki/Property:P6183, LEGO set ID https://www.wikidata.org/wiki/Property:P6187, BDFA player ID https://www.wikidata.org/wiki/Property:P6188, Sabinet journal ID https://www.wikidata.org/wiki/Property:P6189, NSW State Archives and Records Authority Agency ID https://www.wikidata.org/wiki/Property:P6190, Bygdeband location ID https://www.wikidata.org/wiki/Property:P6192, Austrian Biographical Encyclopedia ID https://www.wikidata.org/wiki/Property:P6194, Badtaste ID https://www.wikidata.org/wiki/Property:P6196, Badgames ID https://www.wikidata.org/wiki/Property:P6197, Mexican female soccer players ID https://www.wikidata.org/wiki/Property:P6198, member of the Portuguese parliament ID https://www.wikidata.org/wiki/Property:P6199, BBC News topic ID https://www.wikidata.org/wiki/Property:P6200, OBV editions ID https://www.wikidata.org/wiki/Property:P6201, Geolex ID https://www.wikidata.org/wiki/Property:P6202, CNPJ https://www.wikidata.org/wiki/Property:P6204, Defined Term ID https://www.wikidata.org/wiki/Property:P6205, Guida al Fumetto Italiano ID https://www.wikidata.org/wiki/Property:P6206
- New property proposals https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:Property_proposal to review:
  - General datatypes: gained independence from https://www.wikidata.org/wiki/Wikidata:Property_proposal/gained_independence_from, nachgewiesen mittels https://www.wikidata.org/wiki/Wikidata:Property_proposal/nachgewiesen_mittels, measured by (KPI) https://www.wikidata.org/wiki/Wikidata:Property_proposal/measured_by_(KPI), OpenTrials ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/OpenTrials_ID, administrated by the administrative territorial entity https://www.wikidata.org/wiki/Wikidata:Property_proposal/administrated_by_the_administrative_territorial_entity, level of description https://www.wikidata.org/wiki/Wikidata:Property_proposal/level_of_description, stored as lexeme https://www.wikidata.org/wiki/Wikidata:Property_proposal/stored_as_lexeme, taxon author citation https://www.wikidata.org/wiki/Wikidata:Property_proposal/taxon_author_citation, Astronomical coordinates https://www.wikidata.org/wiki/Wikidata:Property_proposal/Astronomical_coordinates, catchphrase https://www.wikidata.org/wiki/Wikidata:Property_proposal/catchphrase, real estate developer https://www.wikidata.org/wiki/Wikidata:Property_proposal/real_estate_developer, Danske Taler speaker https://www.wikidata.org/wiki/Wikidata:Property_proposal/Danske_Taler_speaker
  - External identifiers: TrENSmissions person ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/TrENSmissions_person_ID, LIGA profile https://www.wikidata.org/wiki/Wikidata:Property_proposal/LIGA_profile, SEINet ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/SEINet_ID, BIBSYS work ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/BIBSYS_work_ID, Jewish Museum Berlin person ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/Jewish_Museum_Berlin_person_ID, UK Parliament Identifier https://www.wikidata.org/wiki/Wikidata:Property_proposal/UK_Parliament_Identifier, HAER ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/HAER_ID, Vesti.kz person ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/Vesti.kz_person_ID, Genius album ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/Genius_album_ID, Genius song ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/Genius_song_ID, TASS reference https://www.wikidata.org/wiki/Wikidata:Property_proposal/TASS_reference, DIR3 ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/DIR3_ID, NooSFere story ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/NooSFere_story_ID, L'Encyclopédie philosophique ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/L%27Encyclop%C3%A9die_philosophique_ID, RegiowikiAT ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/RegiowikiAT_ID, Discord Store game SKU https://www.wikidata.org/wiki/Wikidata:Property_proposal/Discord_Store_game_SKU, kohanimeregister https://www.wikidata.org/wiki/Wikidata:Property_proposal/kohanimeregister, ARLLFB member ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/ARLLFB_member_ID, ARB person ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/ARB_person_ID, Biographie nationale de Belgique ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/Biographie_nationale_de_Belgique_ID, protected area authority ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/protected_area_authority_ID, Flora of Wisconsin ID https://www.wikidata.org/wiki/Wikidata:Property_proposal/Flora_of_Wisconsin_ID, identifiant Monument aux morts https://www.wikidata.org/wiki/Wikidata:Property_proposal/identifiant_Monument_aux_morts
- Query examples:
  - Timeline of early Western movies https://query.wikidata.org/#%23defaultView%3ATimeline%0ASELECT%20distinct%20%3Fevent%20%3FeventLabel%20%3Fdate%20%3Flocation%20%3Fimage%20WHERE%20%7B%0A%20%20%20%20%3Fevent%20wdt%3AP136%20wd%3AQ172980%20%3B%0A%20%20%20%20%20%20%20%20%20%20%20wdt%3AP577%20%3Fdate%20FILTER%20(YEAR(%3Fdate)%20%3C1925)%20.%0A%20%20%20%20%20%20%20%20%20%20%20%0A%20%20%20%20OPTIONAL%20%7B%3Fevent%20wdt%3AP18%20%3Fimage%20.%7D%20%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%2C%20fr%2C%20de%2C%20it%2C%20es%22.%20%7D%0A%7D (source https://twitter.com/SciHiBlog/status/1069173727651393536)
  - Species represented in the exhibition "Espèces en voies d'illumination" in the natural history museum of Paris https://query.wikidata.org/#%23defaultView%3AImageGrid%0ASELECT%20%3Fitem%20%3FitemLabel%20%3Fimage%20WHERE%20%7B%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22fr%2C%20en%22.%20%7D%0A%20%20wd%3AQ59312512%20wdt%3AP180%20%3Fitem.%0A%20%20OPTIONAL%20%7B%20%3Fitem%20wdt%3AP18%20%3Fimage.%20%7D%0A%7D%0ALIMIT%20500 (source https://twitter.com/K_rho/status/1068897422485741570)
  - Properties most used to describe cats in Wikidata https://query.wikidata.org/#SELECT%20%3Fprop%20%3FpropLabel%20%3Fcount%20WHERE%20%7B%0A%20%20%20%20%7B%0A%20%20%20%20%20%20%20%20SELECT%20%3Fprop%20%28COUNT%28DISTINCT%20%3Fitem%29%20AS%20%3Fcount%29%20WHERE%20%7B%0A%20%20%20%20%20%20%20%20%20%20%20%0A%20%20%20%20%20%20%20%20%20%20%20hint%3AQuery%20hint%3Aoptimizer%20%22None%22%20.%0A%20%20%20%20%20%20%20%20%20%20%20%3Fitem%20wdt%3AP31%20wd%3AQ146%20.%0A%20%20%20%20%20%20%20%20%20%20%20%3Fitem%20%3Fp%20%3Fid%20.%0A%20%20%20%20%20%20%20%20%20%20%20%3Fprop%20wikibase%3AdirectClaim%20%3Fp%20.%0A%20%20%20%20%20%20%20%20%7D%20%20GROUP%20BY%20%3Fprop%0A%20%20%20%20%7D%0A%20%20%0A%20%20%20%20SERVICE%20wikibase%3Alabel%20%7B%0A%20%20%20%20%20%20%20%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22%20.%0A%20%20%20%20%7D%0A%7D%20ORDER%20BY%20DESC%20%28%3Fcount%29 (source https://twitter.com/fagerving/status/1068230850196647936)
  - List of UK embassies https://query.wikidata.org/#SELECT%20distinct%20%3Fitem%20%3FitemLabel%20%3FP31Label%20%3FP17Label%20%3FP131Label%20%20%3FP137Label%20%3FP361Label%20WHERE%20%0A%7B%0A%20%20%3Fitem%20wdt%3AP31%2Fwdt%3AP279%2a%20wd%3AQ213283%20.%0A%20%20%3Fitem%20wdt%3AP137%20wd%3AQ145.%0A%20%20%20%20optional%20%7B%3Fitem%20wdt%3AP31%20%3FP31%20.%20%7D%0A%20%20%20%20optional%20%7B%3Fitem%20wdt%3AP17%20%3FP17%20.%20%7D%0A%20%20%20%20optional%20%7B%3Fitem%20wdt%3AP131%20%3FP131%20.%20%7D%20%0A%20%20%20%20optional%20%7B%3Fitem%20wdt%3AP137%20%3FP137%20.%20%20%7D%0A%20%20%20%20optional%20%7B%3Fitem%20wdt%3AP361%20%3FP361%20.%20%7D%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22.%20%7D%0A%7D (source https://twitter.com/Tagishsimon/status/1068217770347831297)
  - Map of places of residence for accused witches in Scotland with a layer for occupations https://query.wikidata.org/#%23Map%20of%20places%20of%20residence%20of%20accused%20witches.%0A%23defaultView%3AMap%0A%0A%23You%20need%20to%20use%20the%20name%20%3Flayer%20as%20the%20variable%20you%20are%20colour%20coding%20by%20%0A%23It's%20much%20easier%20to%20change%20later%20if%20you%20do%20this%20in%20the%20SELECT%20area%2C%20using%20e.g.%20(%3FGender%20as%20%3Flayer)%0A%0ASELECT%20%3Fperson%20%3FpersonLabel%20%3Faccusedwitch%20%3Fprecision%20%3Fcoords%20%3FOccupation%20%3FResidence%20%3Fimage%20(%3FOccupation%20as%20%3Flayer)%20%0AWHERE%20%7B%0A%20%20%3Fperson%20wdt%3AP4478%20%3Faccusedwitch%20.%0A%20%20%3Fperson%20wdt%3AP31%20wd%3AQ5.%0A%20%20%3Fperson%20wdt%3AP21%20%3Fgender%20.%0A%20%20%3Fperson%20wdt%3AP551%20%3Fresidence%20.%0A%20%20OPTIONAL%20%7B%20%3Fperson%20wdt%3AP106%20%3Foccupation%7D%20%0A%20%20OPTIONAL%20%7B%20%3Fperson%20wdt%3AP3716%20%3Fsocialclassification%20%7D%0A%20%20%3Fresidence%20wdt%3AP625%20%3Fcoords.%0A%20%20OPTIONAL%20%7B%20%3Farticle%20schema%3Aabout%20%3Fperson.%20%7D%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%0A%20%20%20%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22.%0A%20%20%20%20%3Fperson%20rdfs%3Alabel%20%3FpersonLabel.%0A%20%20%20%20%3Fresidence%20rdfs%3Alabel%20%3FResidence%20.%0A%20%20%20%20%3Foccupation%20rdfs%3Alabel%20%3FOccupation.%0A%20%20%20%20%3Fgender%20rdfs%3Alabel%20%3FGender.%0A%20%20%20%20%3Fsocialclassification%20rdfs%3Alabel%20%3FSocialClassification%20.%0A%20%20%7D%0A%7D (source https://twitter.com/emcandre/status/1068137424893685760)
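The query links above embed the full SPARQL text, percent-encoded, after the `#` in the URL. A small sketch for turning such a share link back into readable SPARQL (the shortened link below is a made-up example in the same style, not one of the queries above):

```python
from urllib.parse import unquote

def decode_wdqs_link(url):
    """Recover readable SPARQL from a query.wikidata.org share link.

    The query text is percent-encoded in the URL fragment after '#'.
    """
    fragment = url.split("#", 1)[1]
    return unquote(fragment)

# A shortened, hypothetical link in the style of the ones above:
link = ("https://query.wikidata.org/#"
        "SELECT%20%3Fitem%20WHERE%20%7B%20%3Fitem%20wdt%3AP31%20wd%3AQ146%20.%20%7D")
print(decode_wdqs_link(link))
# → SELECT ?item WHERE { ?item wdt:P31 wd:Q146 . }
```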
Development
- Add Lexeme to Wikibase's ontology.owl (phab:T195368 https://phabricator.wikimedia.org/T195368)
- Prepare to drop the change_tag.ct_tag column (phab:T194163 https://phabricator.wikimedia.org/T194163)
- Create a federated Wikibase instance on Beta Commons (phab:T204748 https://phabricator.wikimedia.org/T204748)
- More work on preparing a new termbox for the mobile version of Wikidata
You can see all open tickets related to Wikidata here https://phabricator.wikimedia.org/maniphest/query/4RotIcw5oINo/#R. If you want to help, you can also have a look at the tasks needing a volunteer https://phabricator.wikimedia.org/project/board/71/query/zfiRgTnZF7zu/?filter=zfiRgTnZF7zu&order=priority.
Monthly Tasks
- Add labels, in your own language(s), for the new properties listed above.
- Comment on property proposals: all open proposals https://www.wikidata.org/wiki/Wikidata:Property_proposal/Overview
- Help with suggested and open tasks https://www.wikidata.org/wiki/Wikidata:Contribute/Suggested_and_open_tasks
- Contribute to a Showcase item https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:Showcase_items
- Help translate https://www.wikidata.org/wiki/Special:LanguageStats or proofread the interface and documentation pages, in your own language!
- Help merge identical items https://www.wikidata.org/wiki/User:Pasleim/projectmerge across Wikimedia projects.
- Help write the next summary! https://www.wikidata.org/wiki/Wikidata:Status_updates/Next
I recall reading in an announcement for Wikidata lexemes that we should not (yet) run (large) import jobs for Wikidata lexemes. I cannot immediately find that message.
I am wondering what the attitude of the Wikidata developers and users is regarding import jobs for Wikidata lexemes?
Specifically, I am thinking of importing DanNet, a Danish lexical resource in RDF, for which we recently gained a property for one type of its data: https://www.wikidata.org/wiki/Property:P6140
Finn Årup Nielsen http://people.compute.dtu.dk/faan/
On Tue, Dec 4, 2018 at 6:48 PM fn@imm.dtu.dk wrote:
Hey Finn :)
From my side: I am ok with ramping up imports now, but please don't go all crazy. The reasons we asked not to do this in the beginning were:
- The community needs to get an understanding of the new domain first and play around with how to model the data before a large import sets everything in stone. I think by now this is fine.
- We introduced a completely new part of Wikidata that we didn't want to potentially push to its limits right away.
- Wiktionary editors explicitly asked not to import data from Wiktionary at this point. This is still the case as far as I know. Anything that's not CC-0 can't be imported.
- The API was bound to change, and I didn't want everyone to write bots and tools just to have to rewrite them a few days later. We still don't give any guarantees, but from my side no massive changes are planned right now. If you're ok with things potentially changing, that's fine.
Cheers Lydia
Finn,
I wouldn't upload the entire DanNet initially, but instead something more manageable that you can verify: say, 50-100 Lexemes, Statements, Senses, etc. Get community feedback, then go for larger.
Thad https://www.linkedin.com/in/thadguidry/
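To make the "start small" advice concrete, here is a minimal sketch of preparing such a first batch. The JSON shape follows the Wikibase lexeme data model as used with `action=wbeditentity&new=lexeme`; the lemmas are placeholder examples rather than real DanNet entries, and the actual authenticated upload step is deliberately left out.

```python
# Item IDs for the language and lexical category (real Wikidata items).
DANISH = "Q9035"   # Danish
NOUN = "Q1084"     # noun

def lexeme_payload(lemma, language=DANISH, category=NOUN):
    """Build the 'data' JSON for creating one lexeme via wbeditentity."""
    return {
        "type": "lexeme",
        "lemmas": {"da": {"language": "da", "value": lemma}},
        "language": language,
        "lexicalCategory": category,
    }

# A manageable first batch to verify with the community before going larger.
sample_lemmas = ["hund", "kat", "hus"]  # placeholder lemmas, not DanNet data
batch = [lexeme_payload(w) for w in sample_lemmas]
assert len(batch) <= 100  # keep the first run small, per the advice above
```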
-- Lydia Pintscher - http://about.me/lydia.pintscher Product Manager for Wikidata
Wikidata mailing list Wikidata@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata
I'd also say that it would be a very good idea NOT TO RUN BOTS THAT ADD GRAMMATICAL FORMS.
Wiktionaries in some languages have templates and modules that show things like verb conjugation and noun declension. If I'm allowed to fantasize, the ideal thing would be to have something like a common API framework in Wikibase Lexeme (probably in Lua or JavaScript), so that different languages would be able to auto-generate forms in a robust way. The problem with doing it with bots is that if something is wrong or incomplete, fixing it later will be a nightmare. In fact, doing it with a bot would be a regression compared to doing it with a template, as it's done in Wiktionary.
I don't know whether Wikibase developers actually plan to create such a framework—it's just a wish :)
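The difference between the two approaches can be sketched roughly as follows: with a template-style framework, one paradigm rule generates all forms on demand, so fixing the rule fixes every lexeme at once, whereas bot-imported forms are frozen in place. The paradigm below is a deliberately simplified Danish common-gender noun pattern, for illustration only, not part of any planned Wikibase framework.

```python
def danish_noun_forms(lemma):
    """Generate (form, grammatical features) pairs from a single rule.

    Simplified regular paradigm: hund -> hunden, hunde, hundene.
    """
    return [
        (lemma,         ["singular", "indefinite"]),
        (lemma + "en",  ["singular", "definite"]),
        (lemma + "e",   ["plural", "indefinite"]),
        (lemma + "ene", ["plural", "definite"]),
    ]

for form, features in danish_noun_forms("hund"):
    print(form, features)
```

A bot import would instead write the four strings for every lexeme individually; a mistake in the paradigm would then have to be fixed in every affected lexeme, which is the regression Amir describes.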
-- Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי http://aharoni.wordpress.com “We're living in pieces, I want to live in peace.” – T. Moore
On Tue, Dec 4, 2018 at 10:16 PM Amir E. Aharoni amir.aharoni@mail.huji.ac.il wrote:
Your wish will be granted at some point ;-) (though I have no idea yet how we're going to make this happen beyond what you outlined) https://phabricator.wikimedia.org/T202282
Cheers Lydia