[Begging pardon if you have already read this in the Wikidata project chat]
Hi everyone,
As Wikidatans, we all know how much data quality matters, and we all know what high quality stands for: statements need to be validated via references to external, non-wiki sources.
That's why the primary sources tool is being developed: https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool And that's why I am preparing the StrepHit IEG proposal: https://meta.wikimedia.org/wiki/Grants:IEG/StrepHit:_Wikidata_Statements_Val...
StrepHit (pronounced "strep hit", as in "Statement? repherence it!") is a Natural Language Processing pipeline that understands human language, extracts structured data from raw text, and produces Wikidata statements with reference URLs.
As a demonstration to support the IEG proposal, you can find the **FBK-strephit-soccer** dataset uploaded to the primary sources tool backend. It's a small dataset serving the soccer domain use case. Please follow the instructions on the project page to activate it and start playing with the data.
What is the biggest difference that sets StrepHit datasets apart from the currently uploaded ones? At least one reference URL is always guaranteed for each statement. This means that if StrepHit finds a new statement that was not in Wikidata before, it will always propose its external references along with it. We do not want to manually reject all the new unreferenced statements, right?
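As a rough illustration of that guarantee (a hypothetical sketch, not actual StrepHit code or output; the item IDs, property, and URL below are made up), a statement with its mandatory reference URL could be serialized as a QuickStatements-style TSV line, with the S-prefixed source property marking the reference URL, and the serializer simply refusing to emit a statement that has no reference:

```python
def serialize_statement(subject, prop, value, reference_urls):
    """Serialize one statement as a QuickStatements-style TSV line.

    Refuses to emit the statement if no reference URL is supplied,
    mirroring the "at least one reference per statement" guarantee.
    """
    if not reference_urls:
        raise ValueError("statement must carry at least one reference URL")
    # S854 is the source form of P854 (reference URL) in QuickStatements.
    refs = "\t".join('S854\t"{}"'.format(url) for url in reference_urls)
    return "\t".join([subject, prop, value, refs])

# Illustrative IDs only: a person item, an occupation claim, a source page.
line = serialize_statement(
    "Q937", "P106", "Q11303721", ["http://example.org/bio.html"]
)
print(line)
# → Q937	P106	Q11303721	S854	"http://example.org/bio.html"
```

The point of the sketch is the failure mode: a candidate statement without a source never makes it into the dataset, so curators are never asked to review an unreferenced claim.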
If you like the idea, please endorse the StrepHit IEG proposal!
Cheers,
Marco Fossati http://about.me/marco.fossati Twitter: @hjfocs Skype: hell_j
Hoi, The danger of blanket statements is that they are often easy to refute. No, quality is not determined by sources; sources do lie.
When you want quality, you seek sources where they matter most. It is not about going for "all" of them; that is where Wikidata differs from other sources.
Arguably, and I do make that argument, Wikidata is so underdeveloped in the statement department that having more data with a reasonable expectation of quality will trump higher quality in a much smaller dataset. Thanks, GerardM
On 4 September 2015 at 17:01, Marco Fossati hell.j.fox@gmail.com wrote:
Wikidata mailing list Wikidata@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata
Thank you for working on this, Marco. This is a great step forward. I wish you good luck for the IEG proposal!
Cheers, Lydia
Dear Marco,
Sounds interesting, but the project page still has a lot of gaps. Will you notify us again when you are done? It is a bit tricky to endorse a proposal that is not finished yet ;-)
Markus