Hi, I really like where this conversation is going!
This is very much duplicating the (good) GLAM model for other types of
content.
Formalizing a Wikidata-Scientist/WiR status beyond GLAM gives us access to a
broader pool of organizations that generate or own good sources of content.
I see great value in having WS/WiRs working on various economic development
areas while embedded with the World Bank or the OECD,
enriching health- and disease-related content while embedded with WHO,
NIH, CDC... and the same with policy, with WiRs working with Brookings, CATO,
Pew, CID...
The community can decide where we want to expand, but this would create an
unprecedented reach of our projects and our mission.
Quim, I'll defer to you/Lydia for a decision on which tools/mechanisms are
best for having these next discussions, but I don't see any reason not to
start.
Daniel, I would love to learn more about your work with NIH, and I'm excited
by your offer to help with a pilot!
Sylvia
--
Sylvia Ventura
Strategic Partnerships
Wikimedia Foundation
sventura(a)wikimedia.org
Hi,
I work in the Strategic Partnerships team at the Wikimedia Foundation, and
I'm in initial conversations with the World Bank and several other large
NGOs about using their open data sets in our projects.
The World Bank maintains a large data set of statistics on countries, and
is ready to start a pilot test with us. They suggested that we look for a
sample of specific indicators missing from Wikidata/Wikipedia that
could either be linked to or imported into Wikidata. Examples range from
basic indicators like population to more specific data like “% of a country's
population with access to water”.
I'm looking for technical help to work with our contacts at the World Bank.
For instance, what is the best way to:
* compare the World Bank's indicators with Wikidata's properties, and see
what we are missing today that would be interesting to collect, either in
Wikidata or directly through templates in Wikipedia
* pull/connect that content from the World Bank into our servers
This is what is available today: http://data.worldbank.org/indicator/all
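As a first pass at the comparison, one could fetch an indicator series from the World Bank API and line it up against the matching Wikidata property. Here is a minimal sketch: the helper and the indicator-to-property mapping are my own illustration, not an existing tool (SP.POP.TOTL is the World Bank's "Population, total" indicator code, and P1082 is Wikidata's population property).

```python
from urllib.parse import urlencode

# Illustrative mapping from World Bank indicator codes to Wikidata
# properties. Only the population pair is verified here; a real pilot
# would build this table out indicator by indicator.
INDICATOR_TO_PROPERTY = {
    "SP.POP.TOTL": "P1082",  # total population
}

def indicator_url(indicator, country="all", per_page=100):
    """Build a request URL for the World Bank indicator API (JSON format)."""
    base = "http://api.worldbank.org/countries/%s/indicators/%s" % (
        country, indicator)
    return base + "?" + urlencode({"format": "json", "per_page": per_page})

for code, prop in INDICATOR_TO_PROPERTY.items():
    print(code, "->", prop, indicator_url(code))
```

A script along these lines could iterate over the indicator catalog, check which codes have no property mapping yet, and produce the "what are we missing" list.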
I also welcome your advice on starting this project following community
processes and standards. Could
https://www.wikidata.org/wiki/Wikidata:WikiProject_Economics serve as a
starting point?
Thank you and happy Hackathon for those in Lyon!
Sylvia
--
Sylvia Ventura
Strategic Partnerships
Wikimedia Foundation
sventura(a)wikimedia.org
Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2015 - the 12th Semantic MediaWiki
Conference:
* Dates: October 28th to October 30th 2015 (Wednesday to Friday)
* Location: Fabra i Coats, Art Factory. Carrer Sant Adrià 20 (Sant
Andreu), Barcelona.
* Conference wikipage:
https://semantic-mediawiki.org/wiki/SMWCon_Fall_2015
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g., users, developers, consultants, business
representatives, researchers.
SMWCon Fall 2015 will be supported by Institut de Cultura de Barcelona
(http://lameva.barcelona.cat/barcelonacultura/en/), Amical Wikimedia
(https://www.wikimedia.cat) and Open Semantic Data Association e. V.
(https://opensemanticdata.org/).
Following the success of this format, the SMWCon will have one tutorial
day preceding two conference days.
Participating in the conference: To help us plan, you can already
informally register on the wikipage, although a firm registration will
be needed later.
Contributing to the conference: If you want to present your work at the
conference, please go to the conference wikipage and add your talk there.
To create an attractive program for the conference, we will later ask
you for further information about your proposals. Tutorials and
presentations will be video- and audio-recorded and will be made
available for others after the conference.
Among others, we encourage contributions on the following topics:
Applications of semantic wikis:
* Semantic wikis for enterprise workflows and business intelligence
* Semantic wikis for corporate or personal knowledge management
* Exchange on business models with semantic wikis
* Lessons learned (best/worst practices) from using semantic wikis or
their extensions
* Semantic wikis in e-science, e-learning, e-health, e-government
* Semantic wikis for finding a common vocabulary among a group of people
* Semantic wikis for teaching students about the Semantic Web
* Offering incentives for users of semantic wikis
* Challenges and obstacles for Semantic Wikis in business environments
Development of semantic wikis:
* Semantic wikis as knowledge base backends / data integration platforms
* Comparisons of semantic wiki concepts and technologies
* Community building, feature wishlists, roadmapping of Semantic MediaWiki
* Improving user experience in a semantic wiki
* Speeding up semantic wikis
* Integrations and interoperability of semantic wikis with other
applications and mashups
* Modeling of complex domains in semantic wikis, using rules, formulas
etc.
* Access control and security aspects in semantic wikis
* Multilingual semantic wikis
For any other question and sponsorship opportunities, please do not
hesitate to contact Toni Hermoso <toniher at cau.cat>
Hope to see you in Barcelona!
Lia Veja, Karsten Hoffmeyer
(Program Board)
I'd like to announce a new Labs tool to show a periodic table
<https://tools.wmflabs.org/ptable/>.
It is based on WikiPeriod's PHP code (in turn ported from JavaScript)
and features several improvements:
* 'tiles' are wider and taller;
* most of them are now provided with a background color (the same as
Wikipedia's
<https://en.wikipedia.org/wiki/Periodic_table#Periodic_table_legend_for_cate…>)
based on the elements' "subclass of" property
<https://www.wikidata.org/wiki/Property:P279> (the same that powers
period/group detection);
* for labels, Wikidata's built-in language fallback is used instead of
just falling back to English;
* a public JSON API <https://tools.wmflabs.org/ptable/api> is
available for everyone!
And some more under the hood:
* rewritten in Python with Jinja2:
o more object-oriented
o presentation is split from actual logic
o less vulnerable to XSS attacks
* an LRU (least recently used) cache with a maximum TTL (per-item
time-to-live) of 6 hours is used to avoid hitting the data sources
on every request;
* both the Wikidata API and Wikidata Query can be used interchangeably
as sources.
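The TTL-bounded LRU cache mentioned above can be sketched like this (a simplified illustration of the caching strategy, not the tool's actual code):

```python
import time
from collections import OrderedDict

class TTLCache:
    """LRU cache whose entries also expire after a fixed time-to-live:
    at most `maxsize` entries, each valid for `ttl` seconds."""

    def __init__(self, maxsize=128, ttl=6 * 3600):
        self.maxsize = maxsize
        self.ttl = ttl
        self._data = OrderedDict()  # key -> (timestamp, value)

    def get(self, key):
        item = self._data.get(key)
        if item is None:
            return None
        ts, value = item
        if time.time() - ts > self.ttl:  # stale entry: drop it, report a miss
            del self._data[key]
            return None
        self._data.move_to_end(key)      # mark as most recently used
        return value

    def put(self, key, value):
        self._data[key] = (time.time(), value)
        self._data.move_to_end(key)
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict the least recently used

cache = TTLCache(maxsize=100)
cache.put("H", {"symbol": "H", "group": 1})
print(cache.get("H"))
```

Any fetch from the Wikidata API or Wikidata Query would first consult `get()` and only hit the remote source on a miss or after the 6-hour TTL has elapsed.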
I had to create some items such as Q19753344
<https://www.wikidata.org/wiki/Q19753344> and Q19753345
<https://www.wikidata.org/wiki/Q19753345> to properly categorize
elements. My knowledge of chemistry is limited, so please report/fix
every mistake you can find ;-)
Future plans include:
* oxidation state
* images
* responsive design
* alternative table structures
Is it possible to get claims made in Wikidata using the SQL interface to
the replicas of the production databases run by Wikimedia Labs
(https://wikitech.wikimedia.org/wiki/Help:Tool_Labs/Database)?
I see on http://quarry.wmflabs.org/ (a service to "Run SQL queries
against Wikipedia & other databases from your browser!") queries
involving the wikidatawiki_p database -- e.g.,
http://quarry.wmflabs.org/query/3152. But I've not been able to find
queries that pull out values for Wikidata claims.
To provide a concrete example: can one write a SQL query on the
Wikimedia databases to return claims about the population of Alameda
County, California (https://www.wikidata.org/wiki/Q107146), which was
1,510,271 in the 2010 census?
P1082 (https://www.wikidata.org/wiki/Property:P1082) is the population
property. In
https://www.wikidata.org/wiki/Special:EntityData/Q107146.json, we see
the following JSON fragment:
"P1082": [
    {
        "id": "Q107146$8938463B-C469-4162-80F7-96DDDE1429DD",
        "mainsnak": {
            "snaktype": "value",
            "property": "P1082",
            "datatype": "quantity",
            "datavalue": {
                "value": {
                    "amount": "+1510271",
                    "unit": "1",
                    "upperBound": "+1510271",
                    "lowerBound": "+1510271"
                },
                "type": "quantity"
            }
        },
        ...
    }
],
I've looked through the tables in wikidatawiki_p but can't find any
table that has the claim data.
So I suspect the answer is no: although there is a lot of metadata
about Wikidata available to SQL queries on the replicas, the claim data
itself is not. (SQL is not mentioned in
https://www.wikidata.org/wiki/Wikidata:Data_access.) Is this correct?
Thanks,
-Raymond
It is rather clear that everyone wants Wikidata to also support Wiktionary,
and there have been plenty of proposals in the last few years. I think that
the latest proposals are sufficiently similar to go for the next step: a
breakdown of the tasks needed to get this done.
Currently, the idea of having Wikidata support Wiktionary is stalled
because it is regarded as a large monolithic task, and as such it is hard
to plan and commit to. I tried to come up with a task breakdown,
discussed it with Lydia and Daniel, and now, as mentioned in the last office
hour, here it is for discussion and community input.
https://www.wikidata.org/wiki/Wikidata:Wiktionary/Development/Proposals/201…
I think it would be really awesome if we would start moving in this
direction. Wiktionary supported by Wikidata could quickly become one of the
crucial pieces of infrastructure for the Web as a whole, but in particular
for Wikipedia and its future development.
Cheers,
Denny
Hello,
we here at Bibnet (a government agency working for public libraries) in Brussels greatly admire your work on the Wikidata project! Thanks to the addition of Wikidata identifiers to the viaf.org<http://viaf.org> project, we will finally be able to relate the personal names in our library catalog to Wikipedia.
What we do now (without identifiers):
We use the Wikipedia API to query the 'titles' parameter, based on the name of an author in our database. Lots of things can go wrong, of course.
Example query:
http://nl.wikipedia.org/w/api.php?action=query&prop=pageprops%7Cinfo%7Cextr…<http://nl.wikipedia.org/w/api.php?action=query&prop=pageprops|info|extracts…>
Result in our library catalog (AquaBrowser software):
http://zbb.staging.aquabrowser.be/?q=author:%22Dimitri%20Verhulst%22&uilang…
What we want to do (with identifiers):
Keep on using the Wikipedia API (because then there are no costs for extra development), but query for a Wikidata identifier (I believe it is called wikibase_item). I only find pageids as a parameter on http://nl.wikipedia.org/w/api.php?action=help&modules=query. Is querying for a Wikidata identifier possible? If not, what is a possible workaround?
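To make the question concrete, this is the kind of request I have in mind (a sketch, assuming the `pageprops` module exposes the item id under `wikibase_item`; the title is just our earlier example author):

```python
from urllib.parse import urlencode

# Sketch of a query for the Wikidata item id of a given page title,
# via the MediaWiki API's pageprops module.
params = {
    "action": "query",
    "prop": "pageprops",
    "ppprop": "wikibase_item",
    "titles": "Dimitri Verhulst",
    "format": "json",
}
url = "http://nl.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```

If a query like this works, we could keep our existing API-based workflow and simply read the identifier out of the `pageprops` section of the response.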
Thank you,
Johan
Johan Mijs
Technology Manager
johan.mijs(a)bibnet.be<x-msg://13/johan.mijs@bibnet.be>
+32 (0)2 213 10 27
+32 (0)473 81 78 70
skype:johanmijs
Bibnet vzw
www.bibnet.be<http://www.bibnet.be/>
Priemstraat 51
B-1000 Brussel
Hi everyone,
On Tuesday, the 19th of May, there is the intention to rename the wikidata-l
mailing list to wikidata <https://phabricator.wikimedia.org/T99136>. This
will drop the -l suffix. This is being done with the intention of unifying
and standardising the naming scheme for all Wikimedia mailing lists,
something that has been waiting for several years.
This is taking place during an existing planned maintenance window <
https://phabricator.wikimedia.org/T99098>, so no issues are expected to
arise from it. Existing URLs and email addresses will continue to function
correctly.
This is just an email giving a brief notice that this will happen. If you
have any more questions, please reply and I'll try my best to respond as
soon as possible.
Thanks to Lydia for kindly agreeing to a rename of the list
during the Tuesday window as well.
John Lewis