Hi,
I am trying to get alternative names of given names in Wikidata with the
following simple query:
PREFIX ps: <http://www.wikidata.org/prop/direct/>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
CONSTRUCT {?s rdfs:label ?o}
WHERE { ?s ps:P31 wd:Q202444. ?s rdfs:label ?o}
LIMIT 1000
Initially, the query was much more complex, but I was getting timeouts on
the public Wikidata SPARQL endpoint. I decided to use Linked Data Fragments
to offload some of the filtering from the server to the client:
comunica-sparql "https://query.wikidata.org/bigdata/ldf" -f query > given_names.n3
(where "query" is a file with the SPARQL query shown above). Unfortunately,
the client tries to get output from the 3rd page, I am getting
the following error:
Could not retrieve
https://query.wikidata.org/bigdata/ldf?subject=http%3A%2F%2Fwww.wikidata.or…
(500: unknown error)
Following the link does indeed return an HTTP 500 error with
Error details
java.lang.IllegalStateException
The link points to the 3rd page. It works if you try to go to the second page:
https://query.wikidata.org/bigdata/ldf?subject=http%3A%2F%2Fwww.wikidata.or…
Is this a bug or a limitation of the service?
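For reference, here is a minimal sketch (not using Comunica) that requests the fragment pages for the same pattern directly over HTTP, assuming the standard Triple Pattern Fragments subject/predicate/object/page query parameters, to see exactly which page starts returning the 500:

# Minimal sketch: fetch Triple Pattern Fragment pages for the pattern
# ?s wdt:P31 wd:Q202444 directly from the Wikidata LDF endpoint, to see
# which page starts failing. Assumes the standard TPF query parameters
# (subject/predicate/object/page); this is not an official Wikidata client.
import requests

ENDPOINT = "https://query.wikidata.org/bigdata/ldf"

for page in range(1, 6):
    resp = requests.get(
        ENDPOINT,
        params={
            "predicate": "http://www.wikidata.org/prop/direct/P31",
            "object": "http://www.wikidata.org/entity/Q202444",
            "page": page,
        },
        headers={"Accept": "text/turtle"},
    )
    print(page, resp.status_code, len(resp.content))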
With kind regards,
Maciej Gawinecki
Hello -
The tl;dr version of this post: On a blank Wikibase instance, I want to be
able to do:
api.php?action=wbeditentity&new=item&id=Q42&data={"labels":{"en":{"language":"en","value":"Douglas Adams"}}}
I do not want to do this on wikidata.org - I understand why it makes no
sense in that context. But I would like to be able to do this on my own
Wikibase instance.
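For concreteness, here is a minimal sketch of that call as a script against a hypothetical local Wikibase at http://localhost/w/api.php. The CSRF token fetch is standard MediaWiki; passing id together with new=item is the part the API rejects today and is exactly the behaviour being asked for:

# Sketch of the desired wbeditentity call against a local Wikibase.
# http://localhost/w/api.php is a placeholder; a real setup would also
# need to log in first. Passing id together with new=item is currently
# rejected by the API.
import json
import requests

API = "http://localhost/w/api.php"
session = requests.Session()

# Every write action needs a CSRF token.
token = session.get(API, params={
    "action": "query", "meta": "tokens", "type": "csrf", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

resp = session.post(API, data={
    "action": "wbeditentity",
    "new": "item",
    "id": "Q42",  # the hypothetical part: currently rejected with new=item
    "data": json.dumps({"labels": {"en": {"language": "en",
                                          "value": "Douglas Adams"}}}),
    "token": token,
    "format": "json",
})
print(resp.json())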
Beyond whimsical goals like ensuring that Douglas Adams gets to be Q42, the main
reason for this is data portability and identifier stability. As more
hosted Wikibase providers come online and start offering services, I want
to know that I have data portability if I need to change to a different
provider. Anyone who queries my Wikibase needs to know the identifiers my
Wikibase uses for instances and more importantly for classes, and if I
change providers, those identifiers cannot change without breaking those
queries.
I do not think that MySQL backups are a reliable way to transition between
providers. I am not confident that all providers will want to offer a
service where they accept a MySQL backup to load into their Wikibase
backend, and there are additional challenges moving between Wikibase
versions. (Though some may - I programmatically create the contents of my
Wikibase, so I don't care about edit history, but if one did care about
that history and other things like wiki users, I imagine MySQL dumps would
be the preferred way to migrate.)
One possible solution is to simply create blank items in a new Wikibase,
from 1 up to the maximum identifier used in my old Wikibase, and then
repopulate each item with the claims from my old Wikibase instance.
Unfortunately, this is not a reliable solution: while Wikibase guarantees
that item IDs will not be reused, it does not guarantee that every ID in
the sequence will be created, e.g. in rare cases Wikibase may go from Q41
to Q43 and skip/never create Q42.
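To make that failure mode concrete, here is a sketch of the blank-item approach, with create_blank_item as a hypothetical stand-in for a wbeditentity new=item request; as soon as the target Wikibase skips a number, every later item would land on the wrong identifier:

# Sketch of the "create blank items up to the old maximum ID" idea.
# create_blank_item is a hypothetical callable standing in for a
# wbeditentity new=item request; the target Wikibase assigns the next
# ID itself, and in rare cases may skip one.
def migrate_ids(max_old_id, create_blank_item):
    mapping = {}
    for n in range(1, max_old_id + 1):
        assigned = create_blank_item()  # e.g. "Q41", then maybe "Q43"...
        expected = f"Q{n}"
        if assigned != expected:
            # The target skipped an ID; all later items would get the
            # wrong identifiers, so the migration cannot proceed.
            raise RuntimeError(f"expected {expected}, got {assigned}")
        mapping[expected] = assigned
    return mapping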
I think Wikibase is awesome, but it is an odd database that does not allow
you to set the keys for the data you are managing :)
In reading through the Wikibase Repo code, it seems like this scenario was
considered, though perhaps it isn't fully implemented (or has been
disabled?). The code in EntitySavingHelper.php suggests there are (or were)
ways to call it by providing an ID while still asking for a new entity,
though there is logic earlier in the ModifyEntity code to look for and
explicitly reject the case where the API asks for 'new' and also provides
an ID, so I'm not sure how this code path would get called. There is also
code to ask the entityStores whether they 'canCreateWithCustomId', but
those all appear to just return 'false'?
However, if that logic were skipped in the API handler and a bit of code
reworked in ModifyEntity and EntitySavingHelper, along with ensuring that
the next available ID in the wb_id_counters table is always kept at 1
beyond the maximum ID in use, it looks like it might not be that hard to
enable creating entities with specific IDs?
So three questions:
Would the Wikibase development team ever be open to supporting something
like this, behind a flag like $wgWBRepoSettings['allowUserProvidedIds']
that defaulted to false?
Are there more complicated implications from allowing a change like this
that would need to be considered? I understand why the Wikidata.org repo
needs this codepath fast and can't allow users to provide IDs for new
entities anyway, but are there other reasons this isn't supported beyond
"Wikidata doesn't need it?"
Is this all moot with the eventual REST API? I see that there's a PUT
envisioned; could I use that to directly create an item or property and
give it an ID then, or does the ID have to already exist to replace it?
Thank you all for your work on Wikibase and have a nice end of 2020!
Thanks,
-Erik
Hi Community!
Over the last 2 weeks I've read various previous comments, questions, and
concerns about how to best handle Transliterations, which are different
from Translations (the consensus is to use Senses for Translations).
For Transliterations, it seems the general consensus is to use Forms (as
the Lemon model does) to handle this, which I guess also makes querying
easier.
I'd like to improve the documentation on the wiki pages for the
Lexicographical Data Model with a general page on Transliteration
concerns. As I understand it, there are already language-specific pages
dealing with Transliteration and Translation concerns (such as this one
for Bengali
<https://www.wikidata.org/wiki/Wikidata:Lexicographical_data/Documentation/L…>
).
My proposal to all here listening is to create a new subpage called
https://www.wikidata.org/wiki/Wikidata:Lexicographical_data/Documentation/T…
to hold the general documentation effort for that. It would have only a
few practical examples to help new users enter Transliterations as Forms,
taking an example from this section of the Lemon model cookbook on
Phonetics:
https://lemon-model.net/lemon-cookbook.pdf#subsubsection.2.1.2
The proposed Transliteration page would also be linked from the Forms
section of the main documentation page (see my placeholder in the Form
section now
<https://www.wikidata.org/wiki/Wikidata:Lexicographical_data/Documentation#F…>).
Thoughts?
Thad
https://www.linkedin.com/in/thadguidry/
FYI, the WMF community wishlist survey is now open for votes on proposals
until December 21st.
---------- Forwarded message ---------
From: Szymon Grabarczuk <sgrabarczuk-ctr(a)wikimedia.org>
Date: Thu, 10 Dec 2020 at 09:27
Subject: [Wikimedia-l] Community Wishlist Survey 2021
To:
We invite all registered users to vote on the 2021 Community Wishlist
Survey[1]. You can vote until 21 December for as many different wishes as
you want.
The Survey collects wishes for new and improved tools for experienced
editors. After the voting, we will do our best to grant your wishes,
starting with the most popular ones.
We, the Community Tech team[2], are one of the Wikimedia Foundation[3]
teams. We create and improve editing and wiki moderation tools. What we
work on is decided based on the results of the Community Wishlist Survey.
Once a year, you can submit wishes. After two weeks, you can vote on the
ones that you're most interested in. Next, we choose wishes from the
survey to work on. Some of the wishes may be granted by volunteer
developers or other teams.
We are waiting for your votes. Thank you!
[1]
https://meta.wikimedia.org/wiki/Special:MyLanguage/Community_Wishlist_Surve…
[2] https://meta.wikimedia.org/wiki/Special:MyLanguage/Community_Tech
[3] https://meta.wikimedia.org/wiki/Special:MyLanguage/Wikimedia_Foundation
Kind regards,
Szymon Grabarczuk (he/him)
Community Relations Specialist
Wikimedia Foundation <https://wikimediafoundation.org/>
--
Léa Lacroix
Community Engagement Coordinator
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/029/42207.