I couldn't wait for a detailed description of the primary sources tool.
Thanks a lot to the authors for mentioning the StrepHit soccer dataset!
Cheers,
Marco
On 2/19/16 13:00, wikidata-request(a)lists.wikimedia.org wrote:
> Date: Thu, 18 Feb 2016 11:07:41 -0600
> From: Maximilian Klein<isalix(a)gmail.com>
> To: "Discussion list for the Wikidata project."
> <wikidata(a)lists.wikimedia.org>
> Subject: Re: [Wikidata] from Freebase to Wikidata: the great migration
> Message-ID:
> <CAKbmofgRwTN4BK9ga-=4TyKwDcwDk+31o5_J5Q+aDVp3B=NjdA(a)mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> Congratulations on a fantastic project and your acceptance at WWW 2016.
>
> Make a great day,
> Max Klein ‽http://notconfusing.com/
>
> On Thu, Feb 18, 2016 at 10:54 AM, Federico Leva (Nemo)<nemowiki(a)gmail.com>
> wrote:
>
>> Lydia Pintscher, 18/02/2016 15:59:
>>
>>> Thomas, Denny, Sebastian, Thomas, and I have published a paper which was
>>> accepted for the industry track at WWW 2016. It covers the migration
>>> from Freebase to Wikidata. You can now read it here:
>>> http://research.google.com/pubs/archive/44818.pdf
>>>
>> Nice!
>>
>>> Concluding, in a fairly short amount of time, we have been
>>> able to provide the Wikidata community with more than
>>> 14 million new Wikidata statements using a customizable
>>
>> I must admit that, despite knowing the context, I wasn't able to
>> understand whether this is the number of "mapped"/"translated" statements
>> or the number of statements actually added via the primary sources tool. I
>> assume the latter given paragraph 5.3:
>>
>>> after removing duplicates and facts already contained in Wikidata, we
>>> obtain 14 million new statements. If all these statements were added
>>> to Wikidata, we would see a 21% increase of the number of statements
>>> in Wikidata.
>>
> I was confused about that too. "the [Primary Sources] tool has been
> used by more than a hundred users who performed about
> 90,000 approval or rejection actions. More than 14 million
> statements have been uploaded in total." I think that means that ≤ 90,000
> items or statements were added, out of the 14 million made available
> through the Primary Sources tool.
>
>>
>> Nemo
>>
>> _______________________________________________
>> Wikidata mailing list
>> Wikidata(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
Dear all,
I don't mean to hijack the thread, but for federation purposes, you might be interested in a Triple Pattern Fragments interface [1]. TPF offers lower server cost to reach high availability, at the expense of slower queries and higher bandwidth [2]. This is possible because the client performs most of the query execution.
I noticed the Wikidata SPARQL endpoint has had an excellent track record so far (congratulations on this), so the TPF solution might not be necessary for server cost / availability reasons.
However, TPF is an excellent solution for federated queries. In (yet to be published) experiments, we have verified that the TPF client/server solution performs on par with state-of-the-art federation frameworks based on SPARQL endpoints for many simple and complex queries. Furthermore, there are no security problems (such as acting as an "open proxy"), because all federation is performed by the client.
You can see a couple of example queries here with other datasets:
– Works by writers born in Stockholm (VIAF and DBpedia – http://bit.ly/writers-stockholm)
– Books by Swedish Nobel prize winners that are in the Harvard Library (VIAF, DBpedia, Harvard – http://bit.ly/swedish-nobel-harvard)
It might be a quick win to set up a TPF interface on top of the existing SPARQL endpoint.
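To give a feel for why the server side stays cheap: a TPF request is just an HTTP GET whose query string carries one triple pattern, and the client composes these simple requests into full query execution. A minimal sketch of building such a request URL (the server address below is hypothetical, not a real Wikidata TPF endpoint):

```python
from urllib.parse import urlencode

def fragment_url(server, subject=None, predicate=None, obj=None):
    """Build a Triple Pattern Fragments request URL: a plain HTTP GET
    whose query parameters name the triple pattern to match.
    Unspecified positions act as wildcards."""
    params = {}
    if subject:
        params["subject"] = subject
    if predicate:
        params["predicate"] = predicate
    if obj:
        params["object"] = obj
    return server + "?" + urlencode(params)

# Request all triples with a given predicate (server name is made up):
url = fragment_url(
    "http://example.org/wikidata",
    predicate="http://www.wikidata.org/prop/direct/P31",
)
```

Because every request is this simple and cacheable, the server does no query planning at all; the client decomposes a SPARQL query into many such pattern requests.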
If you want any info, don't hesitate to ask.
Best,
Ruben
[1] http://linkeddatafragments.org/in-depth/
[2] http://linkeddatafragments.org/publications/iswc2014.pdf
Hey folks :)
We have enabled the new datatype for identifiers now. New properties will
start showing up at
https://www.wikidata.org/wiki/Special:ListProperties/external-id. As
announced we will also start converting existing properties with datatype
string that should be external identifiers. We will start with the first
eleven from https://www.wikidata.org/wiki/User:Addshore/Identifiers/0. We
will do more over the coming days. There are still a lot of properties to
go through so please help with that so we can get this over with quickly to
cause minimal disruption to 3rd parties relying on our data. You will also
see the identifiers that are already converted moving to a new section at
the bottom of the item pages. This might need a purge still though to clear
the cache.
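For third parties who want to detect converted properties programmatically, the datatype of a property can be read via the standard wbgetentities API action. A minimal sketch (the sample response body below is illustrative, not a live result):

```python
import json
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def datatype_query(property_id):
    """Build a wbgetentities request that returns a property's datatype."""
    return API + "?" + urlencode({
        "action": "wbgetentities",
        "ids": property_id,
        "props": "datatype",
        "format": "json",
    })

def read_datatype(response_text, property_id):
    """Extract the datatype field from a wbgetentities JSON response."""
    data = json.loads(response_text)
    return data["entities"][property_id]["datatype"]

# Illustrative response body (not fetched live; the property ID is only
# an example of what a converted property would report):
sample = '{"entities": {"P212": {"datatype": "external-id"}}}'
```

A converted property will report "external-id" here instead of "string", so consumers can poll this to track the migration.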
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Apologies if you have received multiple copies.
Kindly forward this email to interested Colleagues and Research Scholars
=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=
The International Conference on Computing Technology, Information Security and Risk Management (CTISRM2016)
Islamic Azad University, Academic City, Dubai, UAE
March 2-4, 2016
http://sdiwc.net/conferences/ctisrm2016/
=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*
The conference will be held at Islamic Azad University, Academic City, Dubai, UAE, from March 2-4, 2016. It aims to enable researchers to build connections between different digital applications.
The conference welcomes papers on the following (but not limited to) research topics:
**Computing Technology
- Face Recognition and High-Resolution Imaging
- Object Detection, Recognition and Categorization
- Adaptive Signal Processing
- Parallel Programming & Processing
- Coding and Modulation
- Mobile IP Networks/ Ad-hoc Networks
- Image Processing
- Data Modeling for Cloud-Based Networks
- Artificial Intelligence
- Expert Systems
- Modulation, Coding, and Channel Analysis
- Multimedia Signal Processing
- Video Compression and Streaming
- Data Mining for Social Network Analysis
- Mobile/ Wireless Network Modeling and Simulation
- Data Compression and Watermarking
- Speech Recognition, Analysis and Synthesis
- Energy Minimization in Cluster-Based Wireless Sensor Networks
- Data Cleaning and Processing
- Text and Web Mining
- Bluetooth and Personal Area Networks
- Wireless System Architecture
- Wireless Network Standard and Protocols
- Digital Right Management and Multimedia Protection
- Mobile Management in Wireless Networks
- Mobile Database Access and Design
**Information Security
- Multimedia and Document Security
- Operating System and Database Security
- Enterprise System Security
- Hardware-Based Security
- Social Network Security
- Wireless and Local Area Networks Security
- Network and Cyber Security
- Information Content Security
- Voice Over IP Security
- Wireless Communication Security
- E-Commerce Security
- Computer Forensics
- Security in Cloud Computing
- Security In Data Center
- Security of Web-Based Application and Services
- Threat, Vulnerability, and Risk Management
- Cryptography and Data Protection
- Computer Crime Prevention and Detection
- Surveillance Systems
- Security Policies and Trust Managements
**Risk Management
- Risk Handling Strategies
- Practical Risk Management
- Risk Management of Financial Information
- Risk Transfer Strategies
- Resource Risk Management
- Risk Management of IT
- Risk Management of Natural Disasters
- Enterprise Risk Management
- Medical Device Risk Management
- Risk Management in Petroleum and Gas
- Security Risk Management
- Risk Management Techniques for Active Trades
- Risk Management and Critical Infrastructure Protection
- Operational Risk Management
- Risk Management in Banking Industry
- Investment Risk Management
- Event Identification, Risk Assessment, and Risk response
- Risk Tolerance Evaluation Techniques
Researchers are encouraged to submit their work electronically. All papers will be fully refereed by a minimum of two specialized referees. Before final acceptance, all referees' comments must be addressed.
The best selected papers will be published in one of the following special issues, provided that the authors make major improvements and extensions within the time frame set by the conference and the paper is approved by the editor-in-chief:
International Journal of New Computer Architectures and their Applications (IJNCAA)
International Journal of Digital Information and Wireless Communications (IJDIWC)
International Journal of Cyber-Security and Digital Forensics (IJCSDF)
International Journal of E-Learning and Educational Technologies in the Digital Media (IJEETDM)
=*=*=*=*=*=*=*=*=*=*=*=
IMPORTANT DATES
Submission Dates: February 16, 2016
Notification of Acceptance: February 19, 2016
Camera Ready Submission: Open from now until February 23, 2016
Registration: Open from now until February 23, 2016
Conference Dates: March 2-4, 2016
Hi all,
I am happy to announce the release of Wikidata Toolkit 0.6.0 [1], the
Java library for programming with Wikidata and Wikibase.
The most prominent new feature of this release is improved support for
writing bots (full support for maxlag and edit throttling, simpler code
through convenience methods, and a fix for a previous issue with API
access). In addition, the new version introduces support for the new
Wikidata property types "external-id" and "math".
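For readers unfamiliar with maxlag: it is a standard MediaWiki API parameter, and a well-behaved bot backs off and retries when the server reports that database replication lag exceeds the requested limit. The sketch below illustrates that generic retry protocol in Python; it is not the Toolkit's actual (Java) implementation:

```python
import time

def call_with_maxlag(do_request, max_retries=5, wait_seconds=5):
    """Retry an API call while the server reports replication lag.

    `do_request` performs one API call (with a maxlag parameter set on
    the request) and returns the decoded JSON response. MediaWiki
    answers with an error code of "maxlag" when the lag exceeds the
    limit; the polite response is to wait and try again.
    """
    for _ in range(max_retries):
        response = do_request()
        error = response.get("error", {})
        if error.get("code") != "maxlag":
            return response  # success, or an unrelated error
        time.sleep(wait_seconds)
    raise RuntimeError("server stayed lagged; giving up")

# Example with canned responses: one lag error, then success.
responses = iter([{"error": {"code": "maxlag"}}, {"result": "success"}])
result = call_with_maxlag(lambda: next(responses), wait_seconds=0)
```

Handling this loop (plus edit throttling) inside the library is exactly what saves bot authors the boilerplate mentioned above.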
We have also improved our documentation by creating an example project
that shows how to use Wikidata Toolkit as a library in your own,
stand-alone Java code [2].
The bot code in the examples is used in actual bots, and was used for
thousands of edits on Wikidata (e.g., some may have noticed that the
annoying "+-1" after population numbers and the like has become quite
rare recently ;-).
Maven users can get the library directly from Maven Central (see [1]);
this is the preferred method of installation. There is also an
all-in-one JAR at github [3] and of course the sources [4] and updated
JavaDocs [5].
As usual, feedback is welcome. Developers are also invited to contribute
via github.
Cheers,
Markus
[1] https://www.mediawiki.org/wiki/Wikidata_Toolkit
[2] https://github.com/Wikidata/Wikidata-Toolkit-Examples
[3] https://github.com/Wikidata/Wikidata-Toolkit/releases
[4] https://github.com/Wikidata/Wikidata-Toolkit/
[5] http://wikidata.github.io/Wikidata-Toolkit/
--
Markus Kroetzsch
Faculty of Computer Science
Technische Universität Dresden
+49 351 463 38486
http://korrekt.org/