Hi Jane, 

I'm really sorry if my naïve comment made you sad. :/ To be clear, I never meant to minimize the contribution of the volunteers! It's just that I still don't know the internal mechanics of Wikidata. I recently read in a paper, already a bit old*, that 90% of edits were made by bots. I just assumed that the mapping between the Wikipedia language editions and Wikidata was part of those automated tasks, and that the volunteers then added the missing 10%, corrected and enriched the automatic work, etc. I'm sorry if I misunderstood.

* " Wikidata has grown significantly since its launch in October 2012; see the table here for key facts about its current content. It has also become the most edited Wikimedia project, with 150– 500 edits per minute, or a half million per day, about three times as many as the English Wikipedia. Approximately 90% of these edits are made by bots contributors create for automating tasks, yet almost one million edits per month are still made by humans." (VRANDEČIĆ, Denny et KRÖTZSCH, Markus. Wikidata: a free collaborative knowledgebase. Communications of the ACM, 2014, vol. 57, no 10, p. 78-85.)

2017-09-02 14:42 GMT+02:00 Ed Summers <ehs@pobox.com>:

> On Sep 2, 2017, at 7:47 AM, Jane Darnell <jane023@gmail.com> wrote:
>
> Your note really made me feel so sad. I try to motivate my Wikipedian friends into doing more on Wikidata, and each time they react the way you did, with a sentence like "I imagined that the mapping between Wikipedia and Wikidata was ultra-automated." I guess there is something about the "data" in the name that makes people assume it is technical, or that being "machine-readable" makes it impossible for humans to read, and that without "bot" knowledge there is no place for "normal contributors" to help out.

I appreciate this perspective a great deal. I think it's great that you are motivating users to edit Wikidata--it's really important. Wikidata is nothing (IMHO) without the human-in-the-loop.

But as a practical matter, wouldn't it be useful if there were stubs in Wikidata that would help editors identify which entities need attention? Or would the vastness of it cause a problem?

I can certainly see an argument for an embargo period to give counter-vandalism efforts a chance to triage the new pages. But after that point, wouldn't it be useful if a bot monitored the language Wikipedias for new entries and then added them to Wikidata so that people could fill them out?
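
Roughly what I'm picturing, as a very rough sketch in Python: poll a language Wikipedia's recent changes for newly created articles, wait out the embargo window, and then list the ones that still have no Wikidata item. The Action API calls (list=recentchanges, wbgetentities) are real, but the choice of wiki, the embargo length, and the helper names are just placeholders I made up, not a tested bot:

import requests
from datetime import datetime, timezone

WIKI_API = "https://fr.wikipedia.org/w/api.php"       # any language edition would do
WIKIDATA_API = "https://www.wikidata.org/w/api.php"
SITE_ID = "frwiki"                                    # sitelink id matching WIKI_API
EMBARGO_SECONDS = 24 * 60 * 60                        # made-up embargo: one day
HEADERS = {"User-Agent": "new-article-stub-sketch/0.1 (example only)"}

def recent_new_articles(limit=50):
    """Recently created main-namespace pages, newest first."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rctype": "new",
        "rcnamespace": 0,
        "rcprop": "title|timestamp",
        "rclimit": limit,
        "format": "json",
    }
    data = requests.get(WIKI_API, params=params, headers=HEADERS).json()
    return data["query"]["recentchanges"]

def has_wikidata_item(title):
    """True if some Wikidata item already has a sitelink to this article."""
    params = {
        "action": "wbgetentities",
        "sites": SITE_ID,
        "titles": title,
        "props": "info",
        "format": "json",
    }
    data = requests.get(WIKIDATA_API, params=params, headers=HEADERS).json()
    # A missing sitelink comes back under a "-1" key carrying a "missing" flag.
    return all("missing" not in entity for entity in data["entities"].values())

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    for change in recent_new_articles():
        created = datetime.strptime(change["timestamp"],
                                    "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
        if (now - created).total_seconds() < EMBARGO_SECONDS:
            continue  # still inside the embargo window, leave it to the patrollers
        if not has_wikidata_item(change["title"]):
            # A real bot would create a stub item here (e.g. via wbeditentity);
            # this sketch only lists candidates for humans to fill out.
            print("needs a Wikidata item:", change["title"])

A real bot would of course have to actually create the stub items, authenticate, and respect bot policy and rate limits, but even a plain list like this might help people see where the gaps are.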

I'm just throwing ideas around here, and am not trying to be critical of the current state of affairs. You all are doing amazing work.

//Ed

_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata