Hi all!
We have some breaking API changes that will soon be deployed to wikidata.org.
The deployment date should be 9 September 2015 (just under 2 weeks away).
The change introducing these breaks can be found at:
https://gerrit.wikimedia.org/r/#/c/227686/
The breaking changes are:
- XML output aliases are now grouped by language
- XML output may no longer give elements when they are empty
- In XML output, any claim, qualifier, reference or snak elements that
previously had an '_idx' element will no longer have it
- ALL output may now give empty elements, i.e. labels when an entity has none
If you want to see a wikipage explaining these changes take a look at:
https://www.wikidata.org/wiki/User:Addshore/API_Break_September_2015
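To illustrate what "aliases grouped by language" means for consumers, here is a minimal sketch of parsing such output with Python's ElementTree. The element and attribute names in the sample are assumptions for illustration only, not the exact Wikibase XML serialisation; check the wikipage above for the real element names.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of grouped-by-language alias XML (names are
# illustrative assumptions, not the exact Wikibase serialisation).
sample = """
<aliases>
  <language id="en">
    <alias>UK</alias>
    <alias>Britain</alias>
  </language>
  <language id="de">
    <alias>GB</alias>
  </language>
</aliases>
"""

root = ET.fromstring(sample)
# Build {language code: [aliases]} from the grouped structure.
grouped = {
    lang.get("id"): [a.text for a in lang.findall("alias")]
    for lang in root.findall("language")
}
print(grouped)
```

The practical upshot is that clients which previously iterated over a flat alias list need one extra level of iteration per language group.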
If you have any questions regarding these breaking changes please ask!
Addshore
Hi,
I am wondering how errors and warnings are reported through the API, and
which errors and warnings are possible. There is some documentation on
Wikidata errors [1], but I could not find documentation on how the
warning messages are communicated in JSON. I have seen structures like this:
{
  "warnings": {
    "wbeditentity": {
      "messages": [
        {
          "name": "wikibase-self-conflict-patched",
          "parameters": [],
          "html": {
            "*": "Your edit was patched into the latest version, overriding some of your own intermediate changes."
          }
        }
      ]
    }
  }
}
I don't know how to provoke more warnings, or multiple warnings in one
request, so I found it hard to guess how this pattern generalises. Some
questions:
* What is the purpose of the map with the "*" key? Which other keys but
"*" could this map have?
* The key "wbeditentity" points to a list. Is this supposed to encode
multiple warnings of this type?
* I guess the "name" is a message name, and "parameters" are message
"arguments" (as they are called in action="query") for the message?
* Is this the JSON pattern used in all warnings or can there also be
other responses from wbeditentity?
* Is this the JSON pattern used for warnings in all Wikibase actions or
can there also be other responses from other actions?
* Is there a list of relevant warning codes anywhere?
* Is there a list of relevant error codes anywhere? The docs in [1]
point to paraminfo (e.g.,
http://www.wikidata.org/w/api.php?action=paraminfo&modules=wbeditentity)
but there are no errors mentioned there.
Thanks,
Markus
[1] https://www.mediawiki.org/wiki/Wikibase/API#Errors
Hi,
how is the datetime value with precision of one year stored?
For example, for the birth date in https://www.wikidata.org/wiki/Q299687
the fine-grained value behind "1700" is "1.01.1700".
But for the population date field in https://www.wikidata.org/wiki/Q216
the fine-grained value behind "2014" is "30.11.2013".
Which is kind of unexpected.
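If it helps frame the question: my understanding (please correct me) is that a Wikibase "time" datavalue always stores a full timestamp plus a precision field, and for precision 9 (year) everything below the year is not significant and can be arbitrary, which would explain the "30.11.2013" oddity. A sketch of that reading:

```python
# A Wikibase "time" datavalue for "1700" at year precision, as I
# understand the serialisation (the exact timestamp padding is an
# assumption; the precision codes are from the Wikibase data model,
# where 9 = year).
value = {
    "time": "+1700-01-01T00:00:00Z",
    "timezone": 0,
    "before": 0,
    "after": 0,
    "precision": 9,
    "calendarmodel": "http://www.wikidata.org/entity/Q1985727",
}

PRECISION_YEAR = 9

def significant_part(v):
    """Return only the part of the timestamp covered by the precision.

    Note: this simple split only handles positive (CE) years.
    """
    year = v["time"].lstrip("+").split("-")[0]
    if v["precision"] == PRECISION_YEAR:
        return year
    return v["time"]

print(significant_part(value))  # prints "1700"
```

Under that reading, a consumer should never display the month/day of a year-precision value, whatever the stored timestamp happens to be.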
--
Raul
Hi,
What is the correct API call to find a property by its label (in a given
language)? As the label is unique, the function I am looking for would
return 1 or 0 answers. I only found wbsearchentities, which might return
any number of results since it also searches aliases.
The problem I am facing is rather simple: I want to program an example
that shows how to edit Wikidata using Wikidata Toolkit. To do this, I
need a valid property that I can use in statements. I want to work on
test.wikidata.org, of course, so I don't know which properties exist. I
can create my own, but I should only do this if I have not done so
already (otherwise I will get a "failed-save" error, which contains an
error message, but it does not seem safe to parse this string to find
the id I am looking for). I could also create new properties every time
someone runs the example program ;-)
Since I could use any property, it would also help me if I could
retrieve a list of properties by type. All I need is some (any) string
property ...
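In case there is no dedicated lookup, the workaround I can see is wbsearchentities with type=property plus a post-filter for exact label matches. A sketch (the post-filter is my own addition; the API parameters are as documented for wbsearchentities):

```python
import json
import urllib.parse
import urllib.request

API = "https://test.wikidata.org/w/api.php"

def exact_matches(results, label):
    """Keep only wbsearchentities results whose label matches exactly
    (the search itself also matches aliases and prefixes)."""
    return [r["id"] for r in results if r.get("label") == label]

def find_property_by_label(label, language="en"):
    """Return the id of the property with this exact label, or None."""
    params = urllib.parse.urlencode({
        "action": "wbsearchentities",
        "search": label,
        "language": language,
        "type": "property",
        "format": "json",
    })
    with urllib.request.urlopen(API + "?" + params) as resp:
        results = json.load(resp).get("search", [])
    ids = exact_matches(results, label)
    return ids[0] if ids else None
```

Since labels are unique per language, the post-filter should indeed yield 0 or 1 ids; it just feels like there ought to be a direct API for this.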
Thanks,
Markus
Hi,
I am doing some 'kicking the tyres' tests on Wikidata as Linked Data. I
like the SPARQL end-point, which is more helpful than most, and
successfully managed a query for "people with the surname Light" last
night. (Only five of them in the world, apparently, but that's another
matter. :-) )
What I do have an issue with is the content negotiation. I kept failing
to get an RDF rendition of my results, and as a last resort I read the
documentation [1].
This described a postfix pattern which delivers RDF XML (e.g. [2]).
However, this pattern is itself subject to content negotiation, and an
initial 303 response converts the URL to e.g. [3]. I am interested in
knowing what pattern of URL will deliver RDF/XML /without/ requiring
content negotiation, and the answer to that question is not [2] but
[3]. This matters, for example, in scenarios where one wants to use
XSLT's document() function to retrieve an RDF XML response directly.
The URL pattern [2] will fail. So the documentation is currently unhelpful.
In a similar vein, is there a syntax for running a SPARQL query on
Wikidata such that the response is delivered as RDF XML? In many
end-points there is a parameter you can add to specify the response
format, which allows you to submit searches as HTTP requests and include
the results directly in your (in my case XML-based) processing chain.
An HTML results page isn't very machine-processible!
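For comparison, the pattern I am used to from other endpoints is either a format query parameter or an Accept header on the query request. A sketch of the latter, assuming the endpoint URL and that it honours content negotiation on query results (both assumptions worth checking against the documentation):

```python
import urllib.parse
import urllib.request

# Assumed endpoint URL; the Accept header below is the standard SPARQL
# results media type for XML.
ENDPOINT = "https://query.wikidata.org/sparql"
query = "SELECT ?item WHERE { ?item ?p ?o } LIMIT 1"

req = urllib.request.Request(
    ENDPOINT + "?" + urllib.parse.urlencode({"query": query}),
    headers={"Accept": "application/sparql-results+xml"},
)
# To actually run the query:
# with urllib.request.urlopen(req) as resp:
#     xml_results = resp.read()
print(req.get_header("Accept"))
```

An XML response retrieved this way could then feed straight into an XSLT pipeline, which is exactly the use case above.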
Thanks,
Richard
[1] https://www.wikidata.org/wiki/Wikidata:Data_access
[2] https://www.wikidata.org/entity/Q3807415.rdf
[3] https://www.wikidata.org/wiki/Special:EntityData/Q3807415.rdf
--
*Richard Light*
Hi,
I wondered why wbeditentity has a parameter "bot". The documentation
says that this will mark the edit as a bot edit, but only if the user is
in the bot group. In other words, users in the bot group can use this
parameter to decide if they want to have their API-based edit flagged as
bot or not. Is there any reason why a user in bot group would *not* want
their API-based edit flagged as bot?
Cheers,
Markus
Hi,
How do you delete, say, the English label of an entity via wbeditentity?
I could not find documentation on this. Whatever the answer, I guess it
is the same for descriptions, right?
How about aliases? I know that writing one English alias will delete all
existing aliases, but how can you write "no English aliases"?
In either case, I do not want to use the "clear" flag, of course :-)
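My current guess at the conventions, written out so it can be confirmed or corrected: labels and descriptions are removed with a "remove" key in the data payload, and aliases by writing an empty list for the language. Both are my understanding of wbeditentity, not verified:

```python
import json

# Hedged sketch of a wbeditentity `data` payload. The "remove" key and
# the empty alias list are assumptions about the API's conventions;
# please verify against the wbeditentity documentation before relying
# on them.
data = {
    "labels": [
        {"language": "en", "remove": ""}   # delete the English label?
    ],
    "descriptions": [
        {"language": "en", "remove": ""}   # same convention, presumably
    ],
    "aliases": {
        "en": []                           # write "no English aliases"?
    },
}
print(json.dumps(data))
```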
Thanks,
Markus
Hi,
is there a decent documentation of the JavaScript exposed in the Wikidata
interface, so one can re-use it in user scripts?
To give a specific example: I want to add a reference to a statement. I can
do that in JavaScript via the API, but I can't update the changed statement
easily. I could try to re-implement the existing routines, but that would
be duplication of effort, and likely break with the next interface update.
Hi all,
(apologies if this is not the right place for raising the following
issue; if that is the case, then please redirect me to a more
appropriate place ;) )
we are currently evaluating Wikibase as storage for D:SWARM GDM data
(see [1,2]). Right now, we have a prototype client [3] that makes use
of a Wikidata Toolkit fork [4]. So far, we have been able to create
simple items and properties. However, we would also like to create new
items with a given set of statements. For this, we intended to
utilise the 'wbeditentity' HTTP API (and I had (and still have) a
lively conversation with Markus Krötzsch about this topic, see [5]).
We get (a kind of) item JSON serialisation (e.g. with help of the
JacksonObjectFactory-based DatamodelConverter (Wikidata Toolkit
code)). However, when sending this to the 'wbeditentity' API we
always receive an error response like this:
"
{
  "error": {
    "code": "modification-failed",
    "info": "array instead of string",
    "messages": [
      {
        "name": "wikibase-validator-bad-type",
        "parameters": [
          "string",
          "array"
        ],
        "html": {
          "*": "array instead of string"
        }
      }
    ],
    "*": "See http://[OUR WIKIBASE SERVICE IP]/api.php for API usage"
  }
}
"
We cannot really interpret what's wrong with the data model that we
send to this API (note: we make use of the POJOs below
'org.wikidata.wdtk.datamodel.json.jackson' package of Wikidata
Toolkit, see [6]). An example of a data JSON is attached to this e-mail.
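One observation that may help diagnose this: "wikibase-validator-bad-type" with parameters ["string", "array"] suggests that some datavalue was serialised as a JSON object or array where the property's datatype expects a plain string. For comparison, here is a sketch of a minimal `data` payload with one string-valued statement; "P123" is a placeholder id, and the shape assumes the target property has datatype string:

```python
import json

# Minimal wbeditentity `data` sketch with one statement. The key point:
# for a string-datatype property, "datavalue.value" must be a plain JSON
# string, not an object or array -- a wrong JSON shape here is one
# plausible cause of the "array instead of string" error.
data = {
    "claims": [
        {
            "mainsnak": {
                "snaktype": "value",
                "property": "P123",   # placeholder property id
                "datavalue": {
                    "value": "an example string",
                    "type": "string",
                },
            },
            "type": "statement",
            "rank": "normal",
        }
    ]
}
print(json.dumps(data, indent=2))
```

Diffing the attached JSON against a known-good shape like this might narrow down which field the validator is rejecting.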
Thanks a lot in advance for all your help.
Cheers,
Bo/T
[1] http://dswarm.org
[2] https://github.com/dswarm/dswarm-documentation/wiki/Graph-Data-Model
[3]
https://github.com/zazi/wikidata-d-swarm-importer/tree/own_mediawiki_api_cl…
[4]
https://github.com/zazi/Wikidata-Toolkit/tree/wikibase_api_write_modificati…
[5] https://github.com/Wikidata/Wikidata-Toolkit/issues/162
[6]
https://github.com/zazi/wikidata-d-swarm-importer/blob/own_mediawiki_api_cl…