Hello,
Recently I have been creating test wikibase projects on wikibase-cloud and
deleting them when the test was over. I noticed initially on the
wikibase-cloud console that I could have a maximum of 6 wikibases created
at a time. However today when I tried to create another wikibase from the
console I got a 'Creation Failed' message in red. Does this mean an account
can create a maximum of 6 wikibases in total, counting deleted ones? If so,
this was not made clear beforehand...
Hello,
I noticed today a major problem on my Wikibase-cloud project with respect
to a particular SPARQL query (see below). After using WikibaseIntegrator to
bulk-create about 300,000 items, all having a specific 'instance of X'
statement, a SPARQL query that tried to retrieve those items on the basis
of that 'instance of X' statement returned none of them. It isn't a problem
with the query syntax, since changing X to another item Y yields proper
results. The item X is retrievable via its QID and label in the main search
box. Does anyone have experience with this problem or know what is going
on?
Query:
https://framenet-akkadian257.wikibase.cloud/query/#PREFIX%20pr%3A%20%3Chttp…
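For reference, here is the kind of spot check I can run against the endpoint, sketched in Python. The `/query/sparql` path and the `Q12345` item id are my assumptions/placeholders, not confirmed values:

```python
# Sketch: ASK the query service whether a single one of the bulk-created
# items is visible with its 'instance of' statement. If the item's page
# shows the statement but this returns False, the query-service index
# simply has not caught up with the bulk edit yet.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://framenet-akkadian257.wikibase.cloud/query/sparql"  # assumed path
BASE = "https://framenet-akkadian257.wikibase.cloud"

def ask_statement(item: str, prop: str, value: str) -> str:
    """Build an ASK query testing one truthy statement item -> prop -> value."""
    return (
        f"PREFIX pr: <{BASE}/prop/direct/>\n"
        f"PREFIX it: <{BASE}/entity/>\n"
        f"ASK {{ it:{item} pr:{prop} it:{value} }}"
    )

def run_ask(query: str) -> bool:
    """POST the query and read back the JSON boolean (network call, not run here)."""
    data = urllib.parse.urlencode({"query": query, "format": "json"}).encode()
    with urllib.request.urlopen(urllib.request.Request(ENDPOINT, data=data)) as resp:
        return json.load(resp)["boolean"]

query = ask_statement("Q12345", "P3", "Q52")  # Q12345 is a placeholder item id
```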
Dear all,
I have just started using Wikibase.cloud and have set up this instance:
https://tdkiv.wikibase.cloud/
I have added one property
(https://tdkiv.wikibase.cloud/wiki/Property:P1) and imported some data
(e.g., https://tdkiv.wikibase.cloud/wiki/Item:Q10).
Now I would like to get a list of all items (their QIDs) with P1
present. I have tried the following query:
https://tdkiv.wikibase.cloud/query/embed.html#PREFIX%20ktd%3A%20%3Chttp%3A%…
A short URL to query results:
https://tinyurl.com/2dcy62zk
But, unfortunately, it gives no results.
I suppose it may be a configuration issue (I have just used the query
service as it comes out of the box with a new Wikibase.cloud instance).
Also, because the values assigned to P1 are longer texts, I have set the
String and Monolingual text maximum length to 2,000 characters. I hope this
isn't an underlying issue for the query, but I am
just noting it here as the config page mentions that “Longer than
default lengths (which are used on Wikidata) is generally untested and
might have some unexpected outcomes.”
What (and where) should I check or set up to make the query service work?
Or how should I change my query to make it return the results I hope for
:-)?
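In case it helps, the pattern I am trying to express is just "every item with any P1 value". Here it is sketched as a small Python helper that builds the query text; the prop/direct/ ("truthy") prefix is my assumption about the default wikibase.cloud RDF layout, and `tdkivt` is just a prefix label I chose:

```python
# Build the "all items with any P1 value" query. The prop/direct/ prefix is
# an assumption about the default wikibase.cloud RDF layout; tdkivt is an
# arbitrary prefix label.
BASE = "https://tdkiv.wikibase.cloud"

def items_with_property(prop: str) -> str:
    """SPARQL text selecting every item that has at least one value for prop."""
    return (
        f"PREFIX tdkivt: <{BASE}/prop/direct/>\n"
        "SELECT ?item ?value\n"
        "WHERE {\n"
        f"  ?item tdkivt:{prop} ?value .\n"
        "}"
    )

print(items_with_property("P1"))
```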
Thank you in advance for your help!
Linda
---
Linda Jansova
National Library of the Czech Republic
linda.jansova(a)nkp.cz
linda.jansova(a)gmail.com
Hello all again,
I was wondering if anyone knew why a straightforward SPARQL query on my
wikibase-cloud project would somehow miss a result it should normally find:
"
PREFIX pr: <https://framenet-akkadian.wikibase.cloud/prop/direct/>
PREFIX it: <https://framenet-akkadian.wikibase.cloud/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT DISTINCT ?qItem ?documentLabel ?idDocumentString
WHERE {
  ?qItem pr:P3 it:Q52 .              # P3 = instance of, Q52 = Document
  ?qItem rdfs:label ?documentLabel .
  ?qItem pr:P24 ?idDocumentString .  # P24 = has idDocument (DataType String)
}
"
The problem is that the query results (https://tinyurl.com/2dqmn6xe) are
missing a Qitem (https://framenet-akkadian.wikibase.cloud/wiki/Item:Q2483)
that should be listed with idDocumentString = '32'. But the page for that
Qitem shows a statement providing exactly this value. The query does return
most or all other Qitems that have an idDocument value.
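To confirm the statement really is stored on the wiki side (so the gap would be on the query-service side only), something like this sketch against Special:EntityData should work. Special:EntityData is a standard Wikibase endpoint; the helper names are mine:

```python
# Sketch: compare what the wiki itself stores for Q2483 against what the
# query service returns. Special:EntityData serves one entity's JSON
# straight from the wiki, bypassing the query service entirely.
import json
import urllib.request

BASE = "https://framenet-akkadian.wikibase.cloud"

def entity_claims_url(qid: str) -> str:
    """URL of the entity's JSON dump on the wiki itself."""
    return f"{BASE}/wiki/Special:EntityData/{qid}.json"

def has_claim(entity_json: dict, qid: str, prop: str) -> bool:
    """True if the stored entity carries at least one statement for prop."""
    return prop in entity_json["entities"][qid].get("claims", {})

# Offline check of the helper against a minimal stub of the EntityData shape:
stub = {"entities": {"Q2483": {"claims": {"P24": [{"mainsnak": {}}]}}}}
assert has_claim(stub, "Q2483", "P24")
```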
Hello all,
I was wondering if anyone knew the reason behind the following error, which
I get when using WikibaseIntegrator to import a Qitem that has a claim
whose value is a text string in Akkadian:
wikibaseintegrator.wbi_exceptions.ModificationFailed: 'Malformed input: ana
šarri bēlīya aradka Gimillu'
The key lines of code that generate this error should be straightforward:
sentence_text = str(row['text'])[:200]
new_sentence.claims.add(datatypes.String(prop_nr=has_text, value=sentence_text))
s_id = new_sentence.write().id
where sentence_text is something like 'ana šarri bēlīya aradka Gimillu'.
I don't understand what the problem is.
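My current (unconfirmed) guess is that the culprit is an invisible control or formatting character copied in with the text, rather than the visible diacritics themselves. A small sketch to inspect and clean a value before writing it:

```python
# Sketch: look for characters an API might reject in a value string.
# The visible Akkadian letters (š, ē, ī, ...) are ordinary letters and
# should be harmless; control (Cc) and format (Cf) characters are the
# usual invisible suspects.
import unicodedata

def suspicious_chars(text: str) -> list:
    """List (position, repr, Unicode category) for control/format characters."""
    return [
        (i, repr(ch), unicodedata.category(ch))
        for i, ch in enumerate(text)
        if unicodedata.category(ch) in ("Cc", "Cf")
    ]

def cleaned(text: str) -> str:
    """NFC-normalize and strip control/format characters before writing."""
    text = unicodedata.normalize("NFC", text)
    return "".join(ch for ch in text if unicodedata.category(ch) not in ("Cc", "Cf"))

# Demo: a zero-width space appended to the otherwise clean sentence.
sample = "ana šarri bēlīya aradka Gimillu\u200b"
print(suspicious_chars(sample))
```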
Hello,
I have been using WikibaseIntegrator to import a large amount of data to my
wikibase-cloud project. I recently started getting the following error when
importing certain forms/statements under the lexeme.write() command:
requests.exceptions.HTTPError: 413 Client Error: Request Entity Too Large
for url: https://framenet-akkadian.wikibase.cloud/w/api.php
Although it seems clear this is because the 'lexeme' is too big, I don't
know exactly why it should be considered big on my machine's end, since I
don't add a lot of information at any one time. It happens only for forms
of one particular lexeme, which has quite a lot of forms, relatively
speaking, as well as statements under those forms.
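To see whether the size really is coming from my end, I can at least estimate how large the JSON payload for the write would be. A sketch, where the dict is a stand-in for the serialized lexeme (the real one would come from WikibaseIntegrator); the key point is that if each write sends the full entity JSON, a lexeme with hundreds of forms grows roughly linearly with the form count:

```python
# Sketch: estimate the request body size of an entity write, to see whether
# it plausibly trips a server-side size limit (HTTP 413).
import json

def payload_bytes(entity_json: dict) -> int:
    """Rough size in bytes of the serialized entity as it would go over the wire."""
    return len(json.dumps(entity_json, ensure_ascii=False).encode("utf-8"))

# Stand-in lexeme: 500 forms, each with one 50-character representation.
fake_form = {"representations": {"en": {"language": "en", "value": "x" * 50}},
             "claims": []}
lexeme_json = {"forms": [fake_form] * 500}
print(payload_bytes(lexeme_json))
```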
Matt
Hello,
I was wondering if it is possible to use RaiseWikibase with a
wikibase.cloud instance? I am finding WikibaseIntegrator too slow for my purposes.
Matt
I also noticed today, after a 24-hour period, that one big block of mass
deletes no longer appeared in SPARQL queries, but an older, much smaller
block is still appearing in the queries. I don't understand the logic...