The Springer paywall is no longer really an obstacle for open science, thanks to a certain Russian website; but in this case we can find the full article on ResearchGate:
https://www.researchgate.net/profile/Alessandro_Piscopo/publication/319272942_What_makes_a_good_collaborative_knowledge_graph_Group_composition_and_quality_in_Wikidata/links/599fd3d2a6fdccf594266835/What-makes-a-good-collaborative-knowledge-graph-Group-composition-and-quality-in-Wikidata.pdf

2017-09-07 8:46 GMT+02:00 Gerard Meijssen <gerard.meijssen@gmail.com>:
Hoi,
Sorry, but with only the conclusions available, it is just that: hidden behind a paywall. Consequently it makes no difference; our community cannot comment. Please choose a different venue for publications.
Thanks,
     GerardM

On 7 September 2017 at 08:37, Ettore RIZZA <ettorerizza@gmail.com> wrote:
Well, here is a fresh paper that seems to have been written to answer the questions I had after this discussion.

"We performed a regression analysis to investigate how the contribution of different types of users, i.e. bots and human editors, registered or anonymous, influences outcome quality in Wikidata. Moreover, we looked at the effects of tenure and interest diversity among registered users. Our findings show that a balanced contribution of bots and human editors positively influence outcome quality, whereas higher numbers of anonymous edits may hinder performance. Tenure and interest diversity within groups also lead to higher quality."

2017-09-02 15:53 GMT+02:00 Jane Darnell <jane023@gmail.com>:
Thanks! That really made me laugh and I needed that. The wonderful story of Wikidata's history, set within the wonderful story of Wikipedia's history anno 2014, is truly amazing. Using that information to describe Wikidata today is like trying to imagine the "bot wars" that recently went viral on various social media websites. You could say Wikidata was born out of a need to end the "bot wars" between interwiki-link-updating bots. After that "bot war" ended, though, it looks like we created a new one, where Wikipedians became afraid of this new project because they might get bitten by a bot.

On Sat, Sep 2, 2017 at 3:05 PM, Ettore RIZZA <ettorerizza@gmail.com> wrote:
Hi Jane, 

I'm really sorry if my naïve comment made you sad. :/ To be clear, I never wanted to minimize the contribution of the volunteers! It's just that I still don't know the internal mechanics of Wikidata. I recently read in a paper, already a bit old*, that 90% of edits were made by bots. I just thought that the mapping between the Wikipedia language editions and Wikidata was part of these 90% automated tasks, after which the volunteers had to add the missing 10%, correct and enrich the automatic work, and so on. I'm sorry if I misunderstood.

* " Wikidata has grown significantly since its launch in October 2012; see the table here for key facts about its current content. It has also become the most edited Wikimedia project, with 150– 500 edits per minute, or a half million per day, about three times as many as the English Wikipedia. Approximately 90% of these edits are made by bots contributors create for automating tasks, yet almost one million edits per month are still made by humans." (VRANDEČIĆ, Denny et KRÖTZSCH, Markus. Wikidata: a free collaborative knowledgebase. Communications of the ACM, 2014, vol. 57, no 10, p. 78-85.)

2017-09-02 14:42 GMT+02:00 Ed Summers <ehs@pobox.com>:

> On Sep 2, 2017, at 7:47 AM, Jane Darnell <jane023@gmail.com> wrote:
>
> Your note really made me feel so sad. I try to motivate my Wikipedian friends into doing more on Wikidata and each time they react the way you did, with a sentence like "I imagined that the mapping between Wikipedia and Wikidata was ultra-automated." I guess there is something about the word "data" in the name that makes people assume it is technical, or that being "machine-readable" makes it impossible for humans to read, and that without "bot" knowledge there is no place for "normal contributors" to help out.

I appreciate this perspective a great deal. I think it's great that you are motivating users to edit Wikidata--it's really important. Wikidata is nothing (IMHO) without the human-in-the-loop.

But, as a practical matter, wouldn't it be useful if there were stubs in Wikidata that helped editors identify which entities need attention? Or would the sheer vastness of it cause a problem?

I can certainly see an argument for an embargo period to give counter-vandalism efforts a chance to triage the new pages. But after that point, wouldn't it be useful if a bot monitored the language Wikipedias for new entries and then added them to Wikidata so that people could fill them out?
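
Concretely, the detection half of that could be quite small. Here is a rough Python sketch; the two API calls (MediaWiki's recentchanges list and Wikidata's wbgetentities) are real, but the overall structure, limits, and helper names are just illustrative assumptions on my part, not an existing bot:

# Sketch: find newly created Wikipedia articles with no Wikidata item yet.
# The API endpoints and parameters below are real; everything else
# (limits, single-wiki scope, no rate limiting or login) is illustrative.
import requests

WIKI_API = "https://en.wikipedia.org/w/api.php"
WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def new_articles(limit=50):
    # Recently created main-namespace pages on one Wikipedia.
    params = {
        "action": "query",
        "list": "recentchanges",
        "rctype": "new",       # page creations only
        "rcnamespace": 0,      # main namespace (articles)
        "rclimit": limit,
        "format": "json",
    }
    data = requests.get(WIKI_API, params=params).json()
    return [rc["title"] for rc in data["query"]["recentchanges"]]

def has_wikidata_item(title, site="enwiki"):
    # True if a sitelinked Wikidata item already exists for this page.
    params = {
        "action": "wbgetentities",
        "sites": site,
        "titles": title,
        "props": "sitelinks",
        "format": "json",
    }
    entities = requests.get(WIKIDATA_API, params=params).json()["entities"]
    return "-1" not in entities  # key "-1" means no matching item was found

for title in new_articles():
    if not has_wikidata_item(title):
        print("Needs an item:", title)

The harder half, of course, is actually creating the stub items and surfacing them to editors in a useful way.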

I'm just throwing ideas around here, and am not trying to be critical of the current state of affairs. You all are doing amazing work.

//Ed

_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata