Hi everyone,
Remember the Wikidata primary sources tool?
https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool
While the StrepHit team is building its next version, I'd like to
invite you to have a look at a new project proposal.
The main goal is to add a high volume of identifiers to Wikidata,
ensuring live maintenance of links.
Do you think that Wikidata should become the central linking hub of
open knowledge?
If so, I'd be really grateful if you could endorse the *soweego* project:
https://meta.wikimedia.org/wiki/Grants:Project/Hjfocs/soweego
Of course, any comment is more than welcome on the discussion page.
Looking forward to your valuable feedback.
Best,
Marco
From the OSM list, FYI
[iD and JOSM are OSM editing tools]
---------- Forwarded message ----------
From: Yuri Astrakhan <yuriastrakhan(a)gmail.com>
Date: 19 September 2017 at 21:03
Subject: [OSM-talk] Adding wikidata tags to the remaining objects with
only wikipedia tag
To: OpenStreetMap talk mailing list <talk(a)openstreetmap.org>
There is now a relatively small number of OSM nodes and relations
remaining that have wikipedia tags but no wikidata tags. The iD
editor already automatically adds wikidata to all new edits, so
finishing up the rest automatically seems like a good thing to do, as
that will enable many new quality-control queries. I would like to
auto-add the corresponding wikidata tags, based on wikipedia, for all
remaining objects, using JOSM's "Fetch Wikidata IDs".
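For anyone curious how a wikipedia tag resolves to a wikidata ID: the standard route is the MediaWiki API's pageprops lookup, which returns the linked Wikibase item for a page title. A rough Python sketch of that lookup (request construction only; this is an illustration, not JOSM's actual implementation):

```python
from urllib.parse import urlencode

def wikidata_lookup_url(wikipedia_tag):
    """Build the MediaWiki API URL that returns the wikibase_item
    page property for an OSM wikipedia=lang:Title tag."""
    lang, title = wikipedia_tag.split(":", 1)
    params = {
        "action": "query",
        "prop": "pageprops",
        "ppprop": "wikibase_item",
        "titles": title,
        "format": "json",
    }
    return "https://%s.wikipedia.org/w/api.php?%s" % (lang, urlencode(params))
```

Fetching that URL returns JSON in which each page's pageprops.wikibase_item is the Q-ID that would go into the wikidata tag.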
This way, we will be able to quickly find all the problematic objects
with the Wikidata+OSM service. For example, thanks to the community, we
have already fixed over 600 incorrect links to wiki disambiguation
pages, and this will find many more of them. We will also be able to fix
cases where things are tagged as people (e.g. wikidata -> person,
instead of subject:wikidata -> person) and find location errors (e.g.
wikidata and OSM point to very different locations, implying an
incorrect link).
Some statistics (I don't plan to add wikipedia tags to those with only wikidata)
Query: http://tinyurl.com/yafxe4co

               nodes (n)   ways (w)   relations (r)
has both         323,745    137,097        248,588
no wikidata       68,330    131,796         11,602
no wikipedia      77,224     47,402         17,408
_______________________________________________
talk mailing list
talk(a)openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk
--
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk
Hi Thad,
"Assignment" can be done with CONSTRUCT, so something like this would work
to split a name into two parts:
PREFIX ex: <http://example.org#>
CONSTRUCT {
  ?human ex:hasFirstName ?first .
  ?human ex:hasSecondName ?second .
} WHERE {
  ?human wdt:P31 wd:Q5 ;
         rdfs:label ?label .
  BIND (STRBEFORE(?label, " ") AS ?first) .
  BIND (STRAFTER(?label, " ") AS ?second) .
  FILTER (LANG(?label) = "en") .
}
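For intuition: STRBEFORE/STRAFTER on " " behave like a single split at the first space, and both return the empty string when the label contains no space at all (so mononyms like "Madonna" yield two empty parts, and middle names end up in the second part). A hypothetical Python equivalent of that pair of BINDs:

```python
def split_name(label):
    """Mimic SPARQL STRBEFORE/STRAFTER on the first space: both
    return "" when the label contains no space at all."""
    before, sep, after = label.partition(" ")
    if not sep:  # no space found in the label
        return "", ""
    return before, after
```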
Christopher Johnson
Scientific Associate
Universitätsbibliothek Leipzig
On 19 September 2017 at 14:00, <wikidata-request(a)lists.wikimedia.org> wrote:
> Send Wikidata mailing list submissions to
> wikidata(a)lists.wikimedia.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.wikimedia.org/mailman/listinfo/wikidata
> or, via email, send a message with subject or body 'help' to
> wikidata-request(a)lists.wikimedia.org
>
> You can reach the person managing the list at
> wikidata-owner(a)lists.wikimedia.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wikidata digest..."
>
>
> Today's Topics:
>
> 1. Weekly Summary #278 (Léa Lacroix)
> 2. How to split a label by whitespace in WDQS ? (Thad Guidry)
> 3. Re: How to split a label by whitespace in WDQS ? (Marco Neumann)
> 4. Re: How to split a label by whitespace in WDQS ?
> (Nicolas VIGNERON)
> 5. Re: How to split a label by whitespace in WDQS ?
> (Lucas Werkmeister)
> 6. Re: How to split a label by whitespace in WDQS ? (Thad Guidry)
> 7. Categories in RDF/WDQS (Stas Malyshev)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 18 Sep 2017 17:36:38 +0200
> From: Léa Lacroix <lea.lacroix(a)wikimedia.de>
> To: "Discussion list for the Wikidata project."
> <wikidata(a)lists.wikimedia.org>
> Subject: [Wikidata] Weekly Summary #278
> Message-ID:
> <CAERksTZPK-wMkwcr4hXBTA3TPTQcBPntN3dHFpZJ
> 1798eM4oYQ(a)mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> *Here's your quick overview of what has been happening around Wikidata over
> the last week.*
>
> Events
> <https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:Events>/
> Press/Blogs
> <https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:Press_coverage>
>
> - Upcoming: Wikidata Wahldaten Workshop 2017
> <https://www.wikidata.org/wiki/Wikidata:Events/Wikidata_
> Wahldaten_Workshop_2017>
> – 30 September 2017 in Vienna, Austria
> - Upcoming: Wikimedia Research Showcase
> <https://meta.wikimedia.org/wiki/Wikimedia_Research/
> Showcase#September_2017>
> - Past: Wikidata workshop in Zurich
> <https://www.wikidata.org/wiki/Wikidata:Events/Wikidata_Zurich> (the
> slides of the speakers are linked on the page)
> - Past: GLAMhack Wikidata workshop in Lausanne (see the slides of the
> Query
> Service introduction
> <https://docs.google.com/presentation/d/1hwUBbtP0TppAKrEpjtSjdOXePZ_
> 7OIRNDWsAHzVk0NA/edit#slide=id.g1f4d0124c0_0_0>
> )
> - Past: Wikidata workshop in Kolkata
> <https://www.wikidata.org/wiki/Wikidata:Events/Wikidata_
> workshop_Kolkata_2017>,
> India
> - Bridging real and fictional worlds
> <https://medium.com/wiki-playtime/bridging-real-and-
> fictional-worlds-1af32ee65a26>
> in Wikidata, by Martin Poulter
> - Weekend at the Museum (of Brittany)
> <https://www.lehir.net/weekend-at-the-museum-of-brittany/>, by Envel Le
> Hir <https://www.wikidata.org/wiki/User:Envlh>
> - Wiki Loves Monuments und Wikidata
> <http://archivalia.hypotheses.org/67371>, by SW
> - The French Connection at the Wikimania 2017 Hackathon
> <https://www.lehir.net/the-french-connection-at-the-
> wikimania-2017-hackathon/>,
> by Envel Le Hir <https://www.wikidata.org/wiki/User:Envlh>
>
> Other Noteworthy Stuff
>
> - Wikidata ontology explorer
> <https://lucaswerkmeister.github.io/wikidata-ontology-explorer/>:
> creates a tree of a class or property, shows common properties and
> statements
> - Join the mysterious group of Wikidata:Flashmob
> <https://www.wikidata.org/wiki/Wikidata:Flashmob> who improve labels,
> or
> summon them on an item
> - A breaking change to the *wbcheckconstraints* API output format was
> announced
> <https://www.wikidata.org/wiki/Wikidata:Project_chat#BREAKING_CHANGE:_
> wbcheckconstraints_API_output_format>
> - Q40000000 <https://www.wikidata.org/wiki/Q40000000> was created
> - Improvements coming soon to Recent Changes
> <https://www.wikidata.org/wiki/Wikidata:Project_chat#
> Improvements_coming_soon_to_Recent_Changes>
> - Several new catalogs in Mix'n'Match
> <https://tools.wmflabs.org/mix-n-match/#.2Fcatalog.2F547> incl.
> Encyclopædia Britannica, National Gallery artists and ArtCyclopedia
>
> Did you know?
>
> - Newest properties
> <https://www.wikidata.org/wiki/Special:ListProperties>: United Nations
> Treaty Series Registration Number
> <https://www.wikidata.org/wiki/Property:P4231>, Sefaria ID
> <https://www.wikidata.org/wiki/Property:P4230>, ICD-10-CM
> <https://www.wikidata.org/wiki/Property:P4229>, Encyclopedia of
> Australian Science ID
> <https://www.wikidata.org/wiki/Property:P4228>, Indonesian
> Small Islands Directory ID <https://www.wikidata.org/
> wiki/Property:P4227>,
> Cyworld ID <https://www.wikidata.org/wiki/Property:P4226>, IPA Braille
> <https://www.wikidata.org/wiki/Property:P4225>, category contains
> <https://www.wikidata.org/wiki/Property:P4224>, Enciclopedia Italiana
> ID
> <https://www.wikidata.org/wiki/Property:P4223>, United Nations Treaty
> Series Volume Number
> <https://www.wikidata.org/wiki/Property:P4222>, National
> Criminal Justice ID <https://www.wikidata.org/wiki/Property:P4221>,
> order
> of battle <https://www.wikidata.org/wiki/Property:P4220>, Tyrolean Art
> Cadastre inventory ID <https://www.wikidata.org/wiki/Property:P4219>,
> shelf
> life <https://www.wikidata.org/wiki/Property:P4218>, UK Electoral
> Commission ID <https://www.wikidata.org/wiki/Property:P4217>, LNB Pro A
> player ID <https://www.wikidata.org/wiki/Property:P4216>, nLab ID
> <https://www.wikidata.org/wiki/Property:P4215>, highest observed
> lifespan
> <https://www.wikidata.org/wiki/Property:P4214>, Unicode hex codepoint
> <https://www.wikidata.org/wiki/Property:P4213>, PACTOLS thesaurus ID
> <https://www.wikidata.org/wiki/Property:P4212>, Bashkir encyclopedia
> (Russian version) ID <https://www.wikidata.org/wiki/Property:P4211>,
> Bashkir
> encyclopedia (Bashkir version) ID
> <https://www.wikidata.org/wiki/Property:P4210>
> - Query examples:
> - Algorithms and the problems they solve
> <https://query.wikidata.org/#SELECT%20%3Falgorithm%20%
> 3FalgorithmLabel%20%20%3FprobLabel%0A%7B%0A%09%
> 3Falgorithm%20wdt%3AP2159%20%3Fprob%20.%0A%09SERVICE%
> 20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%
> 3Alanguage%20%22en%2Cen%22%20%20%7D%20%20%20%20%0A%7D>
> (source <https://twitter.com/WikiDigi/status/908717947937591298>)
> - Swiss items with article in exactly one of German-, French-,
> Italian-, and Romansh-language Wikipedias
> <https://query.wikidata.org/#%23%20map%20of%20Swiss%20items%
> 20with%20article%20in%20exactly%20one%20of%20German-
> %2C%20French-%2C%20Italian-%2C%20and%20Romansh-language%
> 20Wikipedias%0A%23defaultView%3AMap%0ASELECT%20%3Fitem%20%
> 28SAMPLE%28%3Ftitle%29%20AS%20%3FitemLabel%29%20%28SAMPLE%
> 28%3Flocation%29%20AS%20%3Flocation%29%20%28SAMPLE%28%
> 3Flanguage%29%20AS%20%3Flayer%29%20WITH%20%7B%0A%20%
> 20SELECT%20%2a%20WHERE%20%7B%0A%20%20%20%20wd%3AQ39%20p%
> 3AP1332%2Fpsv%3AP1332%2Fwikibase%3AgeoLatitude%20%
> 3Fn%3B%0A%20%20%20%20%20%20%20%20%20%20%20p%3AP1333%2Fpsv%
> 3AP1333%2Fwikibase%3AgeoLatitude%20%3Fs%3B%0A%20%
> 20%20%20%20%20%20%20%20%20%20p%3AP1334%2Fpsv%3AP1334%
> 2Fwikibase%3AgeoLongitude%20%3Fe%3B%0A%20%20%20%20%20%20%
> 20%20%20%20%20p%3AP1335%2Fpsv%3AP1335%2Fwikibase%
> 3AgeoLongitude%20%3Fw.%0A%20%20%7D%0A%7D%20AS%20%25switzerlandBoundingBox%
> 20WHERE%20%7B%0A%20%20VALUES%20%3Fwiki%20%7B%20%3Chttps%3A%
> 2F%2Fde.wikipedia.org%2F%3E%20%3Chttps%3A%2F%2Ffr.wikipedia.org%2F%3E%20%
> 3Chttps%3A%2F%2Fit.wikipedia.org%2F%3E%20%3Chttps%3A%2F%
> 2Frm.wikipedia.org%2F%3E%20%7D%0A%20%20%3Fitem%20wdt%
> 3AP17%20wd%3AQ39%3B%0A%20%20%20%20%20%20%20%20wdt%3AP625%
> 20%3Flocation.%0A%20%20%3Farticle%20a%20schema%
> 3AArticle%3B%0A%20%20%20%20%20%20%20%20%20%20%20schema%
> 3Aabout%20%3Fitem%3B%0A%20%20%20%20%20%20%20%20%20%20%
> 20schema%3AisPartOf%20%3Fwiki%3B%0A%20%20%20%20%20%20%20%20%
> 20%20%20schema%3AinLanguage%20%3Flanguage%3B%0A%20%20%20%
> 20%20%20%20%20%20%20%20schema%3Aname%20%3Ftitle.%0A%20%20%
> 23%20filter%20out%20some%20stray%20results%20that%20have%20country%
> 20Switzerland%20but%20coordinates%20outside%20it%
> 20%28e.%E2%80%AFg.%20rivers%29%0A%20%20INCLUDE%20%
> 25switzerlandBoundingBox.%0A%20%20BIND%28geof%3Alatitude%
> 28%3Flocation%29%20AS%20%3Flat%29%0A%20%20BIND%28geof%
> 3Alongitude%28%3Flocation%29%20AS%20%3Flon%29%0A%20%
> 20FILTER%28%3Fs%20%3C%3D%20%3Flat%20%26%26%20%3Flat%20%3C%
> 3D%20%3Fn%20%26%26%0A%20%20%20%20%20%20%20%20%20%3Fw%20%
> 3C%3D%20%3Flon%20%26%26%20%3Flon%20%3C%3D%20%3Fe%29%0A%
> 7D%0AGROUP%20BY%20%3Fitem%0AHAVING%28COUNT%28DISTINCT%
> 20%3Fwiki%29%20%3D%201%29>
> (source <https://twitter.com/WikidataFacts/status/908444126999441408
> >)
> - Popular gender-neutral given names
> <https://query.wikidata.org/#%23%20names%20that%20were%
> 20used%20gender-neutrally%20among%20people%20on%
> 20Wikidata%0ASELECT%20%3Fname%20%3FnameLabel%20%3Fwomen%20%
> 3Ftotal%20%3Fratio%20%28ABS%28%3Fratio-0.5%29%20AS%20%
> 3FdiffFrom5050%29%20WHERE%20%7B%0A%20%20%7B%0A%20%20%20%
> 20SELECT%20%3Fname%20%28COUNT%28%2a%29%20AS%20%3Ftotal%29%
> 20%28SUM%28%3Fwoman%29%20AS%20%3Fwomen%29%20%28SUM%28%
> 3Fwoman%29%2FCOUNT%28%2a%29%20AS%20%3Fratio%29%20WHERE%20%
> 7B%20%23%20should%20be%20%28%3Fwomen%2F%3Ftotal%20AS%20%
> 3Fratio%29%20%E2%80%93%20see%20T172113%0A%20%20%20%20%20%
> 20%3Fperson%20wdt%3AP31%20wd%3AQ5%3B%0A%20%20%20%20%20%20%
> 20%20%20%20%20%20%20%20wdt%3AP735%20%3Fname%3B%0A%20%20%
> 20%20%20%20%20%20%20%20%20%20%20%20wdt%3AP21%20%
> 3FsexOrGender%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%
> 20wdt%3AP569%20%3Fdob.%0A%20%20%20%20%20%20BIND%28IF%28%
> 3FsexOrGender%20IN%20%28wd%3AQ6581072%2C%20wd%3AQ1052281%
> 29%2C%201%2C%200%29%20AS%20%3Fwoman%29%0A%20%20%20%20%7D%
> 0A%20%20%20%20GROUP%20BY%20%3Fname%0A%20%20%20%20HAVING%
> 28%3Ftotal%20%3E%3D%2010%20%26%26%200.4%20%3C%3D%20%
> 3Fratio%20%26%26%20%3Fratio%20%3C%3D%200.6%29%20%23%
> 20arbitrary%20limits%2C%20feel%20free%20to%20tweak%0A%
> 20%20%7D%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%
> 3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_
> LANGUAGE%5D%2Cen%22.%20%7D%0A%7D%0AORDER%20BY%20DESC%28%3Ftotal%29>
> (source <https://twitter.com/WikidataFacts/status/907740126150873089
> >)
> - Computer network protocols and their ports
> <https://query.wikidata.org/#%0ASELECT%20%3Fitem%20%
> 3FitemLabel%20%3FportLabel%0AWHERE%20%0A%7B%0A%20%20%
> 3Fitem%20wdt%3AP31%20wd%3AQ15836568%20.%0A%20%20%3Fitem%20wdt%3AP1641%20%
> 3Fport%20.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%
> 3AserviceParam%20wikibase%3Alanguage%20%22%5BAUTO_
> LANGUAGE%5D%2Cen%22.%20%7D%0A%7D>
> (source <https://twitter.com/WikiDigi/status/907614182564036609>)
> - Software developers by number of software titles
> <https://query.wikidata.org/#%23defaultView%3ABubbleChart%
> 0ASELECT%20%3Fdeveloper%20%3FdeveloperLabel%20%28COUNT%
> 28%3Fsoftware%29%20AS%20%3Fcount%29%20WHERE%20%7B%0A%
> 20%20%3Fsoftware%20%28p%3AP31%2Fps%3AP31%2Fwdt%3AP279%2a%29%
> 20wd%3AQ7397.%0A%20%20%3Fsoftware%20wdt%3AP178%20%
> 3Fdeveloper.%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%20bd%
> 3AserviceParam%20wikibase%3Alanguage%20%22en%22.%20%7D%
> 0A%7D%0AGROUP%20BY%20%3Fdeveloper%20%3FdeveloperLabel%0AORDER%20BY%
> 20DESC%28%3Fcount%29%0ALIMIT%20100>
> (source <https://twitter.com/WikiDigi/status/907961753606045696>)
> - Spacecraft and what they were named after
> <https://query.wikidata.org/#SELECT%20DISTINCT%20%3Fobj%20%
> 3FobjLabel%20%3FobjDescription%20%3Fnom%20%3FnomLabel%20%
> 3FnomDescription%0AWHERE%20%7B%0A%7B%3Fobj%20wdt%3AP31%
> 2Fwdt%3AP279%2a%20wd%3AQ40218%20%7D%20%23%20type%20of%
> 20spacecraft%0AUNION%20%7B%20%3Fobj%20wdt%3AP31%2Fwdt%
> 3AP279%2a%20wd%3AQ13226541%20%7D%20%23%20or%20spaceflight%
> 20programme%0A%3Fobj%20wdt%3AP138%20%3Fnom%20%23named%
> 20after%0ASERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%
> 3Alanguage%20%22%5BAUTO_LANGUAGE%5D%2Cen%22%20%7D%0A%
> 7D%20ORDER%20BY%20ASC%28%3FobjLabel%29>
> (source <https://twitter.com/mlpoulter/status/908774078080802818>)
> - Newest WikiProjects
> <https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:WikiProjects
> >:
> WikiProject property constraints
> <https://www.wikidata.org/wiki/Wikidata:WikiProject_
> property_constraints>
>
> Development
>
> - Worked more on the constraints gadget in order to make it also work
> for references and qualifiers
> - Made progress on persistently storing edits for the new Lexeme entity
> type (next to items and properties)
> - Worked on the RDF mapping for full URIs of external identifiers (
> phabricator:T121274 <https://phabricator.wikimedia.org/T121274>)
>
> You can see all open tickets related to Wikidata here
> <https://phabricator.wikimedia.org/maniphest/query/4RotIcw5oINo/#R>.
> Monthly Tasks
>
> - Add labels, in your own language(s), for the new properties listed
> above.
> - Comment on property proposals: all open proposals
> <https://www.wikidata.org/wiki/Wikidata:Property_proposal/Overview>
> - Suggested and open tasks
> <https://www.wikidata.org/wiki/Wikidata:Contribute/
> Suggested_and_open_tasks>
> !
> - Contribute to a Showcase item
> <https://www.wikidata.org/wiki/Special:MyLanguage/
> Wikidata:Showcase_items>
> .
> - Help translate <https://www.wikidata.org/wiki/Special:LanguageStats>
> or proofread the interface and documentation pages, in your own
> language!
> - Help merge identical items
> <https://www.wikidata.org/wiki/User:Pasleim/projectmerge> across
> Wikimedia projects.
> - Help write the next summary!
> <https://www.wikidata.org/wiki/Wikidata:Status_updates/Next>
>
>
>
> --
> Léa Lacroix
> Project Manager Community Communication for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Registered in the register of associations at the Amtsgericht
> Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by
> the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
>
Hi!
I'd like to announce that the category tree of certain wikis is now
available as RDF dump and in Wikidata Query Service.
More documentation is at:
https://www.mediawiki.org/wiki/Wikidata_query_service/Categories
which I will summarize shortly below.
The dumps are located at
https://dumps.wikimedia.org/other/categoriesrdf/. You can use these
dumps any way you wish; the data format is described at the link above[1].
The same dump is loaded into "categories" namespace in WDQS, which can
be queried by
https://query.wikidata.org/bigdata/namespace/categories/sparql?query=SPARQL.
Sorry, no GUI support yet (that will probably happen later). See the
example in the docs[2].
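In other words, the categories namespace is queried exactly like the main endpoint, just against a different URL. A minimal sketch of building such a request (URL construction only; the probe query is deliberately generic since the category data model is described in the docs above):

```python
from urllib.parse import urlencode

CATEGORIES_ENDPOINT = (
    "https://query.wikidata.org/bigdata/namespace/categories/sparql"
)

def categories_query_url(sparql):
    """Build a GET URL for a SPARQL query against the categories
    namespace of WDQS."""
    return CATEGORIES_ENDPOINT + "?" + urlencode({"query": sparql})

# A deliberately generic probe query; see the documentation above for
# the actual predicates used by the category dumps.
url = categories_query_url("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
```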
These datasets are not updated automatically yet, so they'll be up to
date roughly as of the latest dump. Hopefully this will soon be
automated, and then the datasets will be updated daily.
The list of currently supported wikis is here:
https://noc.wikimedia.org/conf/categories-rdf.dblist - these are
basically all 1M+ wikis and a couple more that I added for various
reasons. If you have a good candidate wiki to add, please tell me or
write on the talk page for the document above.
Please note this is only the first step for the project, so there might
still be some rough edges. I am announcing it early since I think it
would be useful for people to look at the dumps and SPARQL endpoint and
see if something is missing or does not work properly, and share ideas
on how it can be used.
We plan eventually to use it for search improvement[3] - this work is
still in progress.
As always, we welcome any comments and suggestions.
[1]
https://www.mediawiki.org/wiki/Wikidata_query_service/Categories#Data_format
[2]
https://www.mediawiki.org/wiki/Wikidata_query_service/Categories#Accessing_…
[3] https://phabricator.wikimedia.org/T165982
Thanks,
--
Stas Malyshev
smalyshev(a)wikimedia.org
Say I have this query...
SELECT ?human ?label
WHERE
{
?human wdt:P31 wd:Q15632617; rdfs:label ?label.
FILTER(LANG(?label) = "en").
FILTER(STRSTARTS(?label, "Mr. ")).
}
What if I wanted to see if any part of a human's name ends with "y",
as my last name does? First name, last name, it doesn't matter.
Splitting my own name by whitespace would return an array whose parts
end with a "d" and a "y".
I did not see any special syntax, FILTER, or label service commands to
help with splitting a label apart by whitespace and then applying a
filter to each string.
How would I accomplish this?
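WDQS has no built-in function that splits a string into an array of parts; the usual workaround is a regular-expression filter such as FILTER(REGEX(?label, "y( |$)")), which matches a "y" followed by a space or the end of the string. The predicate being asked for can be sketched in plain Python:

```python
def any_part_ends_with(label, suffix):
    """True if any whitespace-separated part of the label ends
    with the given suffix."""
    return any(part.endswith(suffix) for part in label.split())
```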
Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>
Hi,
I'd like to share a small project I've worked on over the last couple of
weeks.
I've been building the backend APIs of my last couple of projects in
GraphQL (http://graphql.org/), a new query language specification started
by Facebook. It solves many problems in modern web application
development; read their website for more if you're interested.
Meanwhile, I'm really interested in using Wikidata's structured data for
some of my scripts and toy projects. It seemed to me we could somehow
combine the two to provide an easy-to-use API for application users.
So here's a simple demo using GraphQL to query Wikidata. It's just a
proof of concept, certainly not ready for any serious use. The underlying
fetching is done using the REST API at https://www.wikidata.org/w/api.php
Demo time: you could write a query to
1. *Get all of The Beatles' members (has_part), with their
award_received, place_of_birth, and twitter_username*
(permalink for this query: http://bit.ly/2eXTmbG )
{
  entity(id: "Q1299") {
    label(lang: "en")
    has_part {
      mainsnak {
        label(lang: "en")
        award_received {
          mainsnak {
            label(lang: "en")
          }
        }
        place_of_birth {
          mainsnak {
            label(lang: "en")
          }
        }
        twitter_username {
          mainsnak
        }
      }
    }
  }
}
*2. Get all of China's past head_of_government, where they were
educated_at, and what other position_held they had before*
(permalink for this query: http://bit.ly/2xmxaT1 )
{
  entity(id: "Q148") {
    label(lang: "en")
    head_of_government {
      mainsnak {
        label(lang: "en")
        educated_at {
          mainsnak {
            id
            label(lang: "en")
          }
        }
        position_held {
          mainsnak {
            label(lang: "en")
          }
        }
      }
    }
  }
}
You can visit the API explorer here to try writing some queries:
http://wikidata.notimportant.org/graphql
Certainly there is much left to do to get the most out of Wikidata:
1. add references and qualifiers
2. support entity search
3. add other datatypes (time, media, geographic, etc.) to the GraphQL
schema
4. and maybe many more things to make reverse lookup possible (like
getting all of an artist's albums)
I also realize this certainly can't do a lot of the things that SPARQL
does, but it provides a more intuitive way to get much of the useful data
out of Wikidata, and it's really expressive for deeply nested info
compared to what we have. It's still an early project; ideas and comments
are welcome.
Source code available: https://github.com/seansay/wikidata-graphql
--
Best,
李松, Sean Lee
https://notimportant.org
Hello all,
The program of the WikidataCon is now published on-wiki
<https://www.wikidata.org/wiki/Wikidata:WikidataCon_2017/Program>. You will
find there a lot of different formats, topics and speakers, during the two
days of the conference.
More information will be added in the coming weeks. If you think that any
useful information is missing, feel free to leave a comment on the talk page.
Thanks to all the speakers who committed to present something during the
conference, and thanks to the program committee who selected and sorted all
the submissions. Thanks to the 200 people who registered for the
conference and all the enthusiasm shared around the WikidataCon!
Note: the event is now full; no more tickets are available. The last
tickets have been allocated to the first people who registered on the
waitlist. If more seats become free because an attendee cancels, the next
person on the waitlist will be informed.
If you have any questions, feel free to reach out to me.
Cheers,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
Hi all!
This is an announcement for a breaking change to the output format of the
WikibaseQualityConstraints constraint checking API, to go live on 10
October 2017. It affects all clients that use the *wbcheckconstraints* API
action. (We are not aware of any such clients apart from the
*checkConstraints* gadget, which has been adapted.)
We are soon going to check constraints not just on the main snak of a
statement, but also on qualifiers and references (T168532
<https://phabricator.wikimedia.org/T168532>). However, the current API
output format of the *wbcheckconstraints* API action cannot accommodate any
other constraint check results. To resolve this issue, we are introducing a
new, more flexible output format for the API, which can contain constraint
check results on all kinds of snaks and also leaves room for future
expansion (e. g. for T168626 <https://phabricator.wikimedia.org/T168626>).
The new format is based on the Wikibase JSON format, and documented (along
with the old format) on mw:Wikibase/API#wbcheckconstraints
<https://www.mediawiki.org/wiki/Wikibase/API#wbcheckconstraints>.
If you use the *wbcheckconstraints* API action in your tools, the safest
option is to make them support both output formats for the transitional
period. It’s easy to determine which format the API returned, because the
new format contains the fixed key "claims" on the second level, which will
never happen in the old format. You can see an example of this for the
*checkConstraints* gadget in change I99379a96cd
<https://gerrit.wikimedia.org/r/#/c/373323/>, specifically the new
extractResultsForStatement function.
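That detection rule can be applied before any other processing of the response. A hedged sketch (the shape of the per-entity results beyond the "claims" key is an assumption for illustration):

```python
def is_new_format(response):
    """Detect the new wbcheckconstraints output format: a per-entity
    result at the second level carries the fixed key "claims", which
    the old format never does."""
    entities = response.get("wbcheckconstraints", {})
    return any("claims" in result for result in entities.values())
```

A tool supporting both formats during the transition can branch on this check and fall back to its old parsing code when it returns False.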
The new API output format is already enabled on the Wikidata constraints
test system <https://wikidata-constraints.wmflabs.org/>. You can test your
tools or other code there.
Please let us know if you have any comments or objections.
-- Lucas
Relevant tickets:
- T168532 <https://phabricator.wikimedia.org/T168532>
- T174544 <https://phabricator.wikimedia.org/T174544>
Relevant patches:
- https://gerrit.wikimedia.org/r/#/c/369420
- https://gerrit.wikimedia.org/r/#/c/373323/
--
Lucas Werkmeister
Software Developer (Intern)
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. That's our commitment.
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.