Hi Scott,

One way to do that would be to get the list of language-code labels from WDQS with this query:

SELECT ?label WHERE {
  ?s wdt:P424 ?code ;
     rdfs:label ?label .
  FILTER(lang(?label) = "en")
}
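If it helps, here is a small Python sketch of fetching those labels programmatically (a sketch, assuming the standard WDQS SPARQL endpoint and its JSON results format; the function names are just for illustration):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

WDQS = "https://query.wikidata.org/sparql"

QUERY = """SELECT ?label WHERE {
  ?s wdt:P424 ?code ;
     rdfs:label ?label .
  FILTER(lang(?label) = "en")
}"""

def wdqs_url(query):
    # Build a GET URL for the WDQS endpoint, asking for JSON results
    return WDQS + "?" + urlencode({"query": query, "format": "json"})

def fetch_labels():
    # Run the query and pull the plain label strings out of the bindings
    with urlopen(wdqs_url(QUERY)) as resp:
        data = json.load(resp)
    return [b["label"]["value"] for b in data["results"]["bindings"]]
```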

and then stream the list into LDF client [1] requests, substituting each label for <language>:

ldf-client https://query.wikidata.org/bigdata/ldf http://fragments.dbpedia.org/2015-10/en \
  'SELECT * WHERE { ?s rdfs:label "<language>"@en . ?s owl:sameAs ?link }'
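One way to drive that from a script (a sketch, assuming ldf-client is installed and takes the two endpoints plus a query as arguments, as in the command above):

```python
import subprocess  # used by run_label below

LDF_WIKIDATA = "https://query.wikidata.org/bigdata/ldf"
LDF_DBPEDIA = "http://fragments.dbpedia.org/2015-10/en"

def ldf_command(label):
    # Argument list for one ldf-client call matching a single language label
    query = ('SELECT * WHERE { ?s rdfs:label "%s"@en . '
             '?s owl:sameAs ?link }' % label)
    return ["ldf-client", LDF_WIKIDATA, LDF_DBPEDIA, query]

def run_label(label):
    # Invoke ldf-client and return its (JSON) output as text
    return subprocess.run(ldf_command(label), capture_output=True,
                          text=True, check=True).stdout
```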

The client returns results as JSON. This should give a relatively complete list of the DBpedia resources corresponding to Wikidata language-code entities.

Also, a simple way to get a DBpedia resource with TPF is via the entity label, which is one of the properties shared by both datasets. So,

?s rdfs:label "German"@en .

will return the matching DBpedia and Wikidata resources for that label. This could perhaps also be done with a federated query in WDQS (untested).
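The federated variant might look like the following (untested, as noted, and it also assumes the DBpedia SPARQL endpoint is on the WDQS federation allowlist):

```python
def federated_label_query(label):
    # WDQS query that also matches the label on the DBpedia endpoint
    # via SERVICE (an untested sketch)
    return '''SELECT ?item ?dbp WHERE {
  ?item rdfs:label "%(l)s"@en .
  SERVICE <https://dbpedia.org/sparql> {
    ?dbp rdfs:label "%(l)s"@en .
  }
}''' % {"l": label}
```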

Christopher Johnson

Message: 3
Date: Sun, 29 Apr 2018 17:48:10 -0700
From: Scott MacLeod <worlduniversityandschool@gmail.com>
To: Discussion list for the Wikidata project
Subject: Re: [Wikidata] How to find the Dbpedia data for a Wikidata
Content-Type: text/plain; charset="utf-8"

Hi Paris Writers' News/PWN, Markus, and Wikidatans,

Based on your example (http://tinyurl.com/yahwql2n), Markus, I'm seeking to
learn how to do a similar query for all languages.

In Wikidata I found a Q item for "language" - Q34770 (
https://www.wikidata.org/wiki/Q34770) - plugged it into your query,
replaced the word "countries" with "languages," and so on, but didn't get a
result, whereas your query yields 209 countries, Markus.

In a parallel way, how would one compute them from the names of the
articles in Wikipedia?


On Fri, Apr 27, 2018 at 2:31 PM, Markus Kroetzsch <
markus.kroetzsch@tu-dresden.de> wrote:

> Hi,
> (English) DBpedia URIs are basically just (English) Wikipedia URIs with
> the first part exchanged. So one can compute them from the names of the
> articles. Example: a query for DBpedia URIs for all countries:
> http://tinyurl.com/yahwql2n
> """
> SELECT ?dbpediaId
> {
>   ?item wdt:P31 wd:Q6256 . # for the example: get IDs for all countries
>   ?sitelink schema:about ?item ;
>             schema:isPartOf <https://en.wikipedia.org/> .
> BIND(URI(CONCAT("http://dbpedia.org/resource/",SUBSTR(STR(?sitelink),31)))
> as ?dbpediaId)
> }
> """
> Of course, depending on your use case, you can do the same offline
> (without requiring SPARQL to rewrite the id strings for you).
> In theory, one could use federation to pull in data from the DBpedia
> endpoint, but in practice I could not find an interesting query that
> completes within the timeout (but I did not try for very long to debug
> this).
> Best regards,
> Markus
> On 23/04/18 06:41, PWN wrote:
>> If one knows the Q code (or URI) for an entity on Wikidata, how can one
>> find the Dbpedia Id and the information linked to it?
>> Thank you.
>> Sent from my iPad
>> _______________________________________________
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
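
For reference, the offline URI-rewriting step Markus describes can be sketched in a few lines (a sketch, assuming the standard English Wikipedia URL prefix; the function name is just for illustration):

```python
def dbpedia_uri(sitelink):
    # Swap the English Wikipedia prefix for the DBpedia resource prefix,
    # mirroring SUBSTR(STR(?sitelink), 31) in the query above
    prefix = "https://en.wikipedia.org/wiki/"  # 30 characters long
    if not sitelink.startswith(prefix):
        raise ValueError("not an English Wikipedia sitelink: " + sitelink)
    return "http://dbpedia.org/resource/" + sitelink[len(prefix):]
```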


- Scott MacLeod - Founder & President
- World University and School
- http://worlduniversityandschool.org

- 415 480 4577
- http://scottmacleod.com

- CC World University and School - like CC Wikipedia with best STEM-centric
CC OpenCourseWare - incorporated as a nonprofit university and school in
California, and is a U.S. 501 (c) (3) tax-exempt educational organization.





End of Wikidata Digest, Vol 77, Issue 18