Hi,
I'm looking into ways to use tabular data like
https://commons.wikimedia.org/wiki/Data:Zika-institutions-test.tab
in SPARQL queries but could not find anything on that.
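For now the only route I can see is client-side: fetch the .tab page's
JSON via the jsondata API and combine it with query results myself. A
rough Python sketch of that workaround (the response layout is what
the Data: pages store, as far as I can tell):

import requests

API = "https://commons.wikimedia.org/w/api.php"
resp = requests.get(API, params={
    "action": "jsondata",
    "format": "json",
    "title": "Zika-institutions-test.tab",  # title without the Data: prefix
})
table = resp.json()["jsondata"]
print(table["schema"]["fields"])  # column definitions
print(table["data"][:3])          # first few rows

What I am after is a way to do this kind of join inside SPARQL itself.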
My motivation here comes in part from the timeout limits: the basic
idea would be to split a query that typically times out into a set of
queries that do not, such that aggregating their results yields what
the original query would have returned had it not timed out.
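To make that concrete, here is a minimal sketch (Python with
SPARQLWrapper; the query and the year range are just placeholders) of
running one small sub-query per slice and summing the results locally:

from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://query.wikidata.org/sparql"
TEMPLATE = """
SELECT (COUNT(?item) AS ?count) WHERE {{
  ?item wdt:P31 wd:Q5 ;
        wdt:P569 ?dob .
  FILTER(YEAR(?dob) = {year})
}}
"""

def slice_count(year):
    sparql = SPARQLWrapper(ENDPOINT, agent="slice-sketch/0.1")
    sparql.setQuery(TEMPLATE.format(year=year))
    sparql.setReturnFormat(JSON)
    res = sparql.query().convert()
    return int(res["results"]["bindings"][0]["count"]["value"])

# Each per-year sub-query stays within the timeout; the sum is what the
# single query over the whole range should have returned.
print(sum(slice_count(y) for y in range(1900, 1910)))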
The second line of motivation is keeping track of how things develop
over time, which would be interesting both for content and maintenance
queries and for the usage of things like classes, references, lexemes
or properties.
I would appreciate any pointers or thoughts on the matter.
Thanks,
Daniel
Hi everyone,
We are delighted to announce that Wiki Workshop 2020 will be held in
Taipei on April 20 or 21, 2020 (exact date to be finalized soon) as
part of the Web Conference 2020 [1]. In the past years, Wiki Workshop
has traveled to Oxford, Montreal, Cologne, Perth, Lyon, and San
Francisco.
You can read more about the call for papers and the workshops at
http://wikiworkshop.org/2020/#call. Please note that the deadline for
submissions to be considered for the proceedings is January 17. All
other submissions should be received by February 21.
If you have questions about the workshop, please let us know on this
list or at wikiworkshop@googlegroups.com.
Looking forward to seeing you in Taipei.
Best,
Miriam Redi, Wikimedia Foundation
Bob West, EPFL
Leila Zia, Wikimedia Foundation
[1] https://www2020.thewebconf.org/
(usual apologies for cross-posting!)
The board of the Wikipedia & Education User Group invites you to attend our
user group's next Open Meeting, one week from today, on Thursday, April 2,
at 15:00 UTC, as always via Zoom. We'll be discussing the Wikimedia &
Education community's response to the COVID-19 pandemic. Guest speakers
include:
* Nichole Saad and Melissa Guadalupe Huertas from the WMF Education Team
will talk about their strategy and how you can help. (
https://lists.wikimedia.org/pipermail/education/2020-March/002511.html)
* User:TiagoLubiana, a graduate student in Computational Biology from the
University of São Paulo, and a leading editor at Wikidata:WikiProject
COVID-19, will discuss Wikidata's work around COVID-19 and how the
Wikimedia and education community can help. (
https://www.wikidata.org/wiki/Wikidata:WikiProject_COVID-19)
As usual, the board will provide an update on user group activities, and
we'll offer an opportunity for others to briefly share what they've been up
to in light of the COVID-19 pandemic. Join us!
What: Wikipedia & Education User Group Open Meeting
When: Apr 2, 2020 15:00 UTC
Where: https://zoom.us/j/759620545
Meeting ID: 759 620 545
Hi,
I'm about to import around 7,000 P2347 mappings (YSO ID authority links)
between Wikidata items and YSO (General Finnish Ontology) concepts to
Wikidata using QuickStatements2. I'm following the excellent example of
Joachim Neubert's work at ZBW, documented e.g. here:
http://zbw.eu/labs/en/blog/wikidata-as-authority-linking-hub-connecting-rep…
The mappings were collected from several sources:
1. Mappings between KOKO (related to YSO) and Wikidata curated by the
Finnish Broadcasting Company Yle (kindly given to us, but not publicly
available AFAIK)
2. Indirect mappings derived from Wikidata-LCSH and YSO-LCSH mappings
3. Algorithmic matching suggestions for frequently used YSO concepts
In all these cases, the mappings have been verified by vocabulary
managers here at the National Library of Finland, so we're not just
blindly copying the information from the above sources.
I'm wondering whether to add source/qualifier statements to the
mapping statements I'm about to add. I see that in most cases, authority
links don't have any source information. For this batch, I could
potentially document several bits of provenance information:
1. Where the (suggested) statement originally came from (e.g. Yle and/or
indirect LCSH mapping)
2. That we have verified it here at NLF
I see that Joachim used source statements like this for his imported links:
title (P1476):
Derived from ZBW's RAS-GND authors mapping (English)
reference URL (P854):
https://github.com/zbw/repec-ras/blob/master/doc/RAS-GND-author-id-mapping.…
Is this still best practice or should I use something else? Or just
import the raw links without any qualifiers or sources?
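For concreteness, this is roughly how I would generate the
QuickStatements V2 input with ZBW-style references (the item IDs, YSO
IDs and documentation URL below are made-up placeholders, not our real
data):

# Emit one QuickStatements V2 command per mapping, attaching a
# reference (S-prefixed properties) in the style quoted above.
mappings = [
    ("Q42", "12345"),  # (Wikidata item, YSO concept ID) - placeholders
    ("Q64", "67890"),
]
DOC_URL = "https://example.org/yso-wikidata-mapping-doc"  # placeholder

for qid, yso_id in mappings:
    print(f'{qid}|P2347|"{yso_id}"'
          f'|S1476|en:"Derived from KOKO-Wikidata mapping, verified at NLF"'
          f'|S854|"{DOC_URL}"')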
Thanks in advance,
Osma
--
Osma Suominen
D.Sc. (Tech), Information Systems Specialist
National Library of Finland
P.O. Box 15 (Unioninkatu 36)
00014 HELSINGIN YLIOPISTO
Tel. +358 50 3199529
osma.suominen@helsinki.fi
http://www.nationallibrary.fi