...but don't focus too much on the 1% #¤%& wrong thing, focus on the 99%
right thing.
And I do think Wikibase is done 99% right!
(And the 1% WrongThing™ is just there so I can nag Danny and Duesentrieb...)
John
Some time ago I learned through this mailing list of the existence of the
direct URL of the (then) wdqs (namely
http://wdqs-beta.wmflabs.org/bigdata/namespace/wdq/sparql). I have been
happily using it in a variety of applications, ranging from federated
queries and R to simply submitting SPARQL queries from desktop SPARQL
editors.
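To illustrate how I have been using it from outside the GUI, here is a
minimal Python sketch of the kind of request that worked against the old
address (the query itself is just an example):

    import requests

    # The old direct endpoint that accepted plain SPARQL-over-HTTP requests.
    ENDPOINT = "http://wdqs-beta.wmflabs.org/bigdata/namespace/wdq/sparql"
    # wdt:/wd: prefixes are predefined by the service; Q5 = human.
    query = "SELECT ?item WHERE { ?item wdt:P31 wd:Q5 } LIMIT 5"

    r = requests.get(ENDPOINT,
                     params={"query": query},
                     headers={"Accept": "application/sparql-results+json"})
    r.raise_for_status()
    for binding in r.json()["results"]["bindings"]:
        print(binding["item"]["value"])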
With the release of http://query.wikidata.org, this long address no longer
seems to work, and neither does the same pattern on query.wikidata.org (i.e.
http://query.wikidata.org/bigdata/namespace/wdq/sparql); both return an
HTTP 301 redirect.
Is there an alternative URL, usable from outside the GUI, to which I could
submit SPARQL queries?
Kind regards,
Andra
Hey everyone :)
We just enabled data access for Wikibooks as previously announced.
Please welcome our newest sister and keep an eye on
https://www.wikidata.org/wiki/Wikidata:Wikibooks in case there are
questions.
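For those wondering what "data access" means in practice: Wikibooks pages
can now use statements from the Wikidata item connected to a page, for
example via the property parser function, e.g. {{#property:P50}} to render
the author statement(s) of the connected item (the property ID here is
just an illustration).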
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognised as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hey everyone :)
As promised we just enabled support for quantities with units on
Wikidata. So from now on you'll be able to store fancy things like the
height of a mountain or the boiling point of an element.
Quite a few properties have been waiting on unit support before they
could be created. I assume they will be created within the next hours,
and then you can go ahead and add all of the measurements.
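For those curious what this looks like in the entity JSON, here is a
sketch of a quantity datavalue carrying a unit (written as a Python
literal; the numbers are illustrative and Q11573 is the item for the
metre):

    # Sketch of a quantity-with-unit datavalue as exposed in the API JSON.
    # The unit is referenced by the concept URI of the unit item; the
    # bounds express the uncertainty interval around the amount.
    value = {
        "amount": "+8848",
        "upperBound": "+8849",
        "lowerBound": "+8847",
        "unit": "http://www.wikidata.org/entity/Q11573",  # metre
    }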
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognised as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hi Markus, everyone,
The project proposal is currently in active development.
I would now like to focus on disseminating the idea and engaging the
Wikidata community.
Hence, I would love to gather feedback on the following question:
Does StrepHit sound interesting and useful to you?
It would be great if you could report your thoughts on the project talk
page:
https://meta.wikimedia.org/wiki/Grants_talk:IEG/StrepHit:_Wikidata_Statemen…
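To make the core idea concrete (every extracted statement ships with at
least one reference URL), here is a purely illustrative sketch of an
output record; the item IDs and the URL below are invented:

    # Invented example of a StrepHit-style claim plus its reference.
    # P54 = "member of sports team", P854 = "reference URL";
    # the Q-ids and the URL are placeholders, not real data.
    claim = {
        "subject": "Q1000001",   # placeholder: a soccer player item
        "property": "P54",
        "object": "Q1000002",    # placeholder: a club item
        "references": [
            {"P854": "http://example.org/soccer/player-profile"},
        ],
    }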
Cheers!
On 9/8/15 2:02 PM, wikidata-request(a)lists.wikimedia.org wrote:
> Date: Mon, 07 Sep 2015 16:47:16 +0200
> From: Markus Krötzsch <markus(a)semantic-mediawiki.org>
> To: "Discussion list for the Wikidata project."
> <wikidata(a)lists.wikimedia.org>
> Subject: Re: [Wikidata] [ANNOUNCEMENT] first StrepHit dataset for the
> primary sources tool
> Message-ID: <55EDA374.2090901(a)semantic-mediawiki.org>
> Content-Type: text/plain; charset=utf-8; format=flowed
>
> Dear Marco,
>
> Sounds interesting, but the project page still has a lot of gaps. Will
> you notify us again when you are done? It is a bit tricky to endorse a
> proposal that is not finished yet ;-)
>
> Markus
>
> On 04.09.2015 17:01, Marco Fossati wrote:
>> [Begging pardon if you have already read this in the Wikidata project chat]
>>
>> Hi everyone,
>>
>> As Wikidatans, we all know how much data quality matters.
>> We all know what high quality stands for: statements need to be
>> validated via references to external, non-wiki, sources.
>>
>> That's why the primary sources tool is being developed:
>> https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool
>> And that's why I am preparing the StrepHit IEG proposal:
>> https://meta.wikimedia.org/wiki/Grants:IEG/StrepHit:_Wikidata_Statements_Va…
>>
>> StrepHit (pronounced "strep hit", means "Statement? repherence it!") is
>> a Natural Language Processing pipeline that understands human language,
>> extracts structured data from raw text and produces Wikidata statements
>> with reference URLs.
>>
>> As a demonstration to support the IEG proposal, you can find the
>> **FBK-strephit-soccer** dataset uploaded to the primary sources tool
>> backend.
>> It's a small dataset serving the soccer domain use case.
>> Please follow the instructions on the project page to activate it and
>> start playing with the data.
>>
>> What is the biggest difference that sets StrepHit datasets apart from
>> the currently uploaded ones?
>> At least one reference URL is always guaranteed for each statement.
>> This means that if StrepHit finds some new statement that was not there
>> in Wikidata before, it will always propose its external references.
>> We do not want to manually reject all the new statements with no
>> reference, right?
>>
>> If you like the idea, please endorse the StrepHit IEG proposal!
--
Marco Fossati
http://about.me/marco.fossati
Twitter: @hjfocs
Skype: hell_j
Hoi,
...
What is the status of the import of Freebase information into Wikidata?
Why is nothing observable happening? Why is it a black box that is not to
be talked about? Why are we suffering from this lack of data, which is
arguably of similar or better quality than what we already have?
Thanks,
GerardM
---------- Forwarded message ----------
From: Pine W <wiki.pine(a)gmail.com>
Date: 20 September 2015 at 06:11
Subject: [Wikimedia-l] Upcoming database purge on Freenode
To: "wikitech-l(a)lists.wikimedia.org" <wikitech-l(a)lists.wikimedia.org>,
Wikimedia Mailing List <Wikimedia-l(a)lists.wikimedia.org>
Freenode occasionally purges its databases of "expired nicks, channels and
accounts". If you have a nick, channel or account that you want to keep but
haven't used recently, now is a good time to log in and use it. See
https://blog.freenode.net/2015/09/services-database-purge/ for more info.
Please send any questions to Freenode staff.
Pine
_______________________________________________
Wikimedia-l mailing list, guidelines at:
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines
Wikimedia-l(a)lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
<mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
Given the recent discussions on how to deal with person names in
Wikidata (e.g. how many properties to use, how to handle scripts,
automatic vs. manual labels/aliases/descriptions, ...) and the importance
of username display in MediaWiki (e.g. gendered namespaces, the log
system restructure since 1.19, ...), it may be useful for someone to read
this thesis and summarise it for our benefit. :)
http://ulir.ul.ie/handle/10344/3450
«If a system does not possess the ability to capture, store, and
retrieve people names, according to their cultural requirements, it is
less likely to be acceptable on the international market.
Internationalisation of people names could reduce the probability of a
person’s name being lost in a system, avoiding frustration, saving time,
and possibly money. This study attempts to determine the extent to which
the human name can be internationalised, based upon published
anthroponymic data for 148 locales, by categorising them into eleven
distinctly autonomous parts: definite article, common title, honorific
title, nickname, by-name, particle, forename, patronymic or matronymic,
surname, community name, and generational marker. This paper provides an
evaluation of the effectiveness of internationalising people names;
examining the challenges of terminology conflicts, the impact of
subjectivity whilst pigeonholing personyms, and the consequences of
decisions made. It has demonstrated that the cultural variety of human
names can be expressed with the Locale Data Mark-up Language for 74% of
the world’s countries. This study, which spans 1,919 anthroponymic
syntactic structures, has also established, through the use of a unique
form of encoding, that the extent to which the human name can be
internationalised is 96.31% of the data published by Plassard (1996) and
Interpol (2006). Software developers, localisation engineers, and
database administrators may benefit from this paper, through recognition
of this problem and understanding the potential gains from accurately
handling people names within a system. The outcome of this study opens
up opportunities for future research into cultural name mapping that may
further enhance the Common Locale Data Repository.»
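As a quick illustration of how those eleven parts could be modelled in
code (a toy sketch of mine, not taken from the thesis):

    from dataclasses import dataclass
    from typing import Optional

    # Toy model of the eleven name parts the abstract lists; every part
    # is optional, since most cultures use only a subset of them.
    @dataclass
    class PersonName:
        definite_article: Optional[str] = None       # e.g. Arabic "al-"
        common_title: Optional[str] = None           # e.g. "Mr", "Dr"
        honorific_title: Optional[str] = None
        nickname: Optional[str] = None
        by_name: Optional[str] = None
        particle: Optional[str] = None               # e.g. "van", "de"
        forename: Optional[str] = None
        patronymic_or_matronymic: Optional[str] = None
        surname: Optional[str] = None
        community_name: Optional[str] = None
        generational_marker: Optional[str] = None    # e.g. "Jr", "III"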
Hi Wikidata crew,
the Gerrit Cleanup Day on Wed 23rd is approaching fast - only one week
left. More info: https://phabricator.wikimedia.org/T88531
Do you feel prepared for the day and know what to do?
If not, what are you missing and how can I help?
Some Gerrit queries for Wikidata are listed under "Gerrit queries per
team/area" in https://phabricator.wikimedia.org/T88531
Are they helpful and a good start? Or do they miss some areas (or do
you have existing Gerrit queries to use instead or to "integrate", e.g.
for parts of MediaWiki core if relevant)?
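(For example, I would picture a per-repository query along the lines of

    status:open project:mediawiki/extensions/Wikibase

assuming that is the right repository name for the Wikibase code.)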
Will there be a main Wikidata contact for the day, available in
#wikimedia-dev on IRC, who can help organize review work in Wikidata
areas in case other teams would like to reach out?
Thanks for your help and interest!
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/