An example that happened just a few minutes ago: I made this kind of edit because the claims are wrong: https://www.wikidata.org/w/index.php?title=Q12191&diff=254099391&oldid=254099376 I think I have already done this in the past. After a chat with Yamaha5, it appears he added this to complete a symmetric relation, which means that if I just remove the claim from this item, it is likely to come back. But it might be a chain of edits: maybe a robot imported this from some Wikipedia, then Yamaha completed the symmetric relation. If I remove the symmetric claims, the robot might reimport them, so... with or without inferences and magic, we will have to trace the origin of the problem to solve it once and for all.
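For what it's worth, here is a minimal sketch (in Python, using the standard MediaWiki API) of how one could inspect an item's revision history to see which user or bot introduced a claim. The item ID is the one from the diff above; everything else is illustrative, not a description of any existing tool.

import requests

API = "https://www.wikidata.org/w/api.php"

def revision_history(item_id, limit=50):
    # Ask the MediaWiki API for the item's recent revisions, with the
    # user, timestamp and edit summary of each one.
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": item_id,
        "rvprop": "user|timestamp|comment",
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

for rev in revision_history("Q12191"):
    # Bot edits usually name the tool in the edit summary, which helps
    # trace where an imported claim originally came from.
    print(rev["timestamp"], rev["user"], rev.get("comment", ""))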

2015-09-28 16:43 GMT+02:00 Thomas Douillard <thomas.douillard@gmail.com>:
>
> (*) This follows the principle of "magic is bad, let people edit". Allowing
> inconsistencies means we can detect errors by finding such inconsistencies.
> Automatically enforcing consistency may lead to errors propagating out of view
> of the curation process. The QA process on wikis is centered around edits, so
> every change should be an edit. Using a bot to fill in missing "reverse" links
> follows this idea. The fact that you found an issue with the data because you
> saw a bot do an edit is an example of this principle working nicely.

That might prove to become an even worse nightmare than the magic one... It seems like refusing any kind of automation because it might surprise people, at the price of exhausting them with a lot of manual work.

2015-09-28 16:23 GMT+02:00 Daniel Kinzler <daniel.kinzler@wikimedia.de>:
On 27.09.2015 at 21:19, Thad Guidry wrote:
> Both sides? Wikidata has a true graph representation like FB? Didn't know
> that. Can you show me the other side you're referring to?

"Both sides" probably means that "sister city" is a reflexive property, so if
item A refers to item B as a sister city, item B should also refer to item A as
a sister city. This is not automatic, and it was a conscious design decision to
not make it automatic(*).
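For illustration, such one-sided links can be found with the SPARQL endpoint mentioned below. A minimal sketch in Python, assuming P190 ("twinned administrative body") is the sister-city property; the result limit is arbitrary:

import requests

ENDPOINT = "https://query.wikidata.org/sparql"

# Find pairs where A links to B as a sister city (P190) but B does not
# link back to A, i.e. the symmetric statement is missing.
QUERY = """
SELECT ?a ?b WHERE {
  ?a wdt:P190 ?b .
  FILTER NOT EXISTS { ?b wdt:P190 ?a . }
}
LIMIT 20
"""

resp = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"})
for row in resp.json()["results"]["bindings"]:
    print(row["a"]["value"], "->", row["b"]["value"])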

What do you mean by "true graph representation"? Wikidata internally uses JSON
structures to represent items, and items reference each other, forming a graph.
We have a linked data interface for traversing the graph[1]. We also have an RDF
mapping with a SPARQL endpoint[2] that allows queries against that graph.
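As a concrete illustration of the linked data interface, a short sketch in Python: it fetches an item's JSON and prints the item-to-item references in its statements (Q42 is an arbitrary example item):

import requests

# Special:EntityData serves the raw JSON representation of an item.
url = "https://www.wikidata.org/wiki/Special:EntityData/Q42.json"
entity = requests.get(url).json()["entities"]["Q42"]

for prop, claims in entity["claims"].items():
    for claim in claims:
        snak = claim["mainsnak"]
        value = snak.get("datavalue", {})
        if value.get("type") == "wikibase-entityid":
            # Each item-valued statement is an edge to another item,
            # which is what makes the data set a graph.
            print(prop, "->", value["value"]["id"])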

-- daniel


[1] https://www.wikidata.org/wiki/Wikidata:Data_access#Linked_Data_interface
[2] https://www.wikidata.org/wiki/Wikidata:Data_access#SPARQL_endpoints

(*) This follows the principle of "magic is bad, let people edit". Allowing
inconsistencies means we can detect errors by finding such inconsistencies.
Automatically enforcing consistency may lead to errors propagating out of view
of the curation process. The QA process on wikis is centered around edits, so
every change should be an edit. Using a bot to fill in missing "reverse" links
follows this idea. The fact that you found an issue with the data because you
saw a bot do an edit is an example of this principle working nicely.
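To make that concrete: a bot completing a missing reverse link would do it as an ordinary, attributable edit, e.g. with pywikibot. A minimal sketch; the property, item IDs, and edit summary are assumptions for illustration, not how any particular bot works:

import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

def add_reverse_link(source_id, target_id, prop="P190"):
    # Add "prop: source" to the target item as a normal edit, so it
    # shows up in watchlists and recent changes like any human edit.
    source = pywikibot.ItemPage(repo, source_id)
    target = pywikibot.ItemPage(repo, target_id)
    claim = pywikibot.Claim(repo, prop)
    claim.setTarget(source)
    target.addClaim(claim, summary="completing symmetric " + prop + " claim")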


--
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata