Hi,
I have a couple of questions regarding the wiki page ID. Does it always
stay unique to the page, where the page itself is just a placeholder for
content that may change over time?
Consider the following cases:
1. The first time someone creates page "Moon", it is assigned ID=1. If at
some point the page is renamed to "The_Moon", ID=1 remains intact. Is this
correct?
2. What if we have page "Moon" with ID=1, and someone creates a second page
"The_Moon" with ID=2? Is it possible that page "Moon" is then turned into a
redirect pointing to "The_Moon"?
3. Is it possible for page "Moon" to become a category "Category:Moon" with
the same ID=1?
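For reference, the current page ID for any title can be checked through the
standard MediaWiki API; here is a minimal Python sketch ("Moon" is just a
placeholder title):

import requests

# Fetch page info (including the numeric page ID) for a title.
API = "https://en.wikipedia.org/w/api.php"
params = {"action": "query", "prop": "info",
          "titles": "Moon", "format": "json"}
pages = requests.get(API, params=params).json()["query"]["pages"]
for pageid, info in pages.items():
    print(pageid, info["title"])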
Thanks,
Gintas
Hey,
I'm currently working on a bot for a data import about German judges. The initial goal was to find the QIDs of all existing Wikidata items that match our data and record them. We noticed that almost half of the data is missing on Wikidata, so we decided to try to build a bot that imports all missing entries.
I have already read the Wikidata:Bots page (https://www.wikidata.org/wiki/Wikidata:Bots), but I still have some questions about the bot requirements:
- The first section applies to all bots, thus also to ours. One of the requirements is to be able to set a limit on the maximum number of edits per minute. But what exactly does "edit" mean in that case? Do creating an item and adding a label each count as a separate edit?
- The second section lists requirements for "Langlink import bots". In which cases do these requirements apply to our bot? In addition, this section links to a full list of requirements for "import bots". Which of those entries are hard requirements, and which are merely recommendations?
- The third section, "Statement adding bot", includes the requirement "Monitor constraint violation reports for possible errors generated or propagated by your bot". Should that be implemented in the bot itself, or is it rather a task for the bot operator?
It would be very helpful to have some example code (preferably in Python with pywikibot) from a bot that currently has a bot flag and does imports. Does anyone know where to find some?
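To make the questions more concrete, this is roughly the shape of the import we have in mind - a minimal pywikibot sketch, where the labels and the P31 target are placeholders rather than our real data:

import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

# Create a new item and set its labels. pywikibot throttles writes
# via its put_throttle setting (seconds between saves), which seems
# related to the edits-per-minute requirement.
new_item = pywikibot.ItemPage(repo)
new_item.editLabels(
    labels={"de": "Max Mustermann", "en": "Max Mustermann"},
    summary="Importing German judge (placeholder data)")

# Add an "instance of" (P31) -> "human" (Q5) statement.
claim = pywikibot.Claim(repo, "P31")
claim.setTarget(pywikibot.ItemPage(repo, "Q5"))
new_item.addClaim(claim, summary="Adding P31 claim")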
Thanks in advance!
Best,
Marisa Nest
………………………………………………………………………
Student assistant at Human-Centered Computing (HCC) Lab
Freie Universität Berlin | Institute of Computer Science
https://www.mi.fu-berlin.de/inf/groups/hcc/
Hi!
In RDF exports of Wikidata[1] and in the Wikidata Query Service, sitelinks
have always been encoded by URL-encoding the sitelink text - e.g. a link to
"Category:Stuffed animals" was encoded as
/wiki/Category%3AStuffed%20animals.
While this encoding produces a working link, we have over time come to the
conclusion that it is very inconvenient: it does not match how titles are
encoded in MediaWiki, and that mismatch makes it harder to look up the
links. See more at
https://phabricator.wikimedia.org/T131960
We have decided to change the encoding, so that the sitelink above would be
encoded as /wiki/Category:Stuffed_animals. The new encoding will match how
titles are encoded in the MediaWiki codebase (non-ASCII characters that
MediaWiki encodes will still be encoded as before).
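For illustration, the new scheme can be approximated in Python as follows;
the exact set of characters MediaWiki leaves unescaped is defined in its
codebase, so the safe list below is an assumption rather than the
authoritative one:

from urllib.parse import quote

def sitelink_path(title):
    # Spaces become underscores, characters such as ':' stay literal,
    # and non-ASCII characters are still percent-encoded as before.
    return "/wiki/" + quote(title.replace(" ", "_"), safe=":/!$()*,;@~")

print(sitelink_path("Category:Stuffed animals"))
# -> /wiki/Category:Stuffed_animals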
Implementing this change will require a database reload, and during that
time inconsistent results may be returned (some entities may have the new
sitelink encoding and some the old one). I apologize in advance for any
inconvenience caused by that. I will send further announcements when the
switchover has started and when it is complete.
Thanks,
[1] https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format
--
Stas Malyshev
smalyshev(a)wikimedia.org
Hello,
For your kind attention: on 10-11 June, CIS-A2K (the Center for Internet
and Society - Access to Knowledge, a partner of the Wikimedia Foundation in
India) conducted a regional Wikidata workshop. This was the first
completely Wikidata-dedicated workshop by CIS-A2K.
The event was conducted in Bangalore, and 11 Wikimedians from 5 Indic
language communities participated in the workshop.
Please see a report of the event here:
https://cis-india.org/a2k/blogs/wikidata-workshop-south-india-conducted-in-…
Please feel free to ask if you want more details. Your suggestions are
welcome.
Thanks
Tito Dutta
Note: if I don't reply to your email within 2 days, please feel free to
remind me by email or phone.
Hi Wikidata-niks,
As part of the Metropolitan Museum of Art project, I am interested in
facilitating more public editing of Wikidata items for artworks through
external tools, including by relative newbies who might have an interest
in art history.
One basic property for artworks that is particularly suited for this field
is Depicts (P180), for example saying that a particular painting depicts a
particular person (or building, or mountain, or divinity, or type of
clothing).
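Concretely, the edit such a tool would make is a single statement. A
minimal pywikibot sketch (the QIDs are placeholders, not real examples):

import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

artwork = pywikibot.ItemPage(repo, "Q12345")   # the painting (placeholder)
depicted = pywikibot.ItemPage(repo, "Q67890")  # the subject (placeholder)

# Add a "depicts" (P180) statement to the artwork item.
claim = pywikibot.Claim(repo, "P180")
claim.setTarget(depicted)
artwork.addClaim(claim, summary="Adding depicts (P180)")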
We can do this to some extent now with Listeria and its 'wdedit' option,
but this requires JS customization and significant wiki background on the
user's part.
I was thinking something like the Wikidata Distributed Game might be
interesting and broadly accessible to the public, but that tool currently
only allows multiple-choice edits, and doesn't have a text entry box option.
Would it be possible to have some WiDaR-ish tool that could fill this niche
for artworks?
I think it could be of very broad usefulness and interest to art
communities.
Thanks,
Pharos
Hello,
Our next Wikidata IRC office hour will take place on June 28th, 18:00 UTC
(20:00 in Berlin), on the channel #wikimedia-office.
For one hour, you'll be able to chat with the development team about past,
current and future projects, and ask any questions you want.
See you there,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
I note that version 0.2.5 of the standalone Wikidata Query Service has a
new MediaWiki API service.
SELECT * WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:api "Categories" .
    bd:serviceParam wikibase:endpoint "en.wikipedia.org" .
    bd:serviceParam mwapi:titles "Albert Einstein" .
    ?category wikibase:apiOutput mwapi:category .
    ?title wikibase:apiOutput mwapi:title .
  }
}
This works great on https://query.wikidata.org/, but when I enter it in
my local installation, I get an error with this line in my stack trace:
java.util.concurrent.ExecutionException:
java.util.concurrent.ExecutionException:
java.lang.IllegalArgumentException: Service URI
http://wikiba.se/ontology#mwapi is not allowed
It's not clear to me from reading the link below whether and how this
can be made to work on my local installation:
https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual/MWAPI#Ind….
I would benefit greatly from an example.
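In the meantime, my understanding (an assumption on my part, not something
from the manual) is that the "Categories" service template wraps a plain
MediaWiki API query, which can be made directly, e.g. in Python:

import requests

# The same category lookup as a direct MediaWiki API call.
r = requests.get("https://en.wikipedia.org/w/api.php",
                 params={"action": "query", "prop": "categories",
                         "titles": "Albert Einstein", "format": "json"})
for page in r.json()["query"]["pages"].values():
    for cat in page.get("categories", []):
        print(cat["title"])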
This looks like a great feature!
Thanks,
- Eric Scott
Forwarding from Wikimedia-L, because Wikidata may help with this a bit.
Is it possible to get a list of items that are templates and have the
largest number of sitelinks?
This should probably be grouped by project. A template that is used in most
Wikivoyages (for example) will have far fewer sitelinks than a template
that is used in most Wikipedias, but it should still be counted, as it's
relevant for this list.
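As a starting point, assuming such items are instances of "Wikimedia
template" (Q11266439), a Python sketch against WDQS might look like this;
grouping by project would need the individual sitelinks (via
schema:isPartOf) rather than the total count:

import requests

# Items that are instances of "Wikimedia template", ordered by
# total sitelink count (Q11266439 is my assumption for the class).
QUERY = """
SELECT ?item ?itemLabel ?links WHERE {
  ?item wdt:P31 wd:Q11266439 ;
        wikibase:sitelinks ?links .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
ORDER BY DESC(?links)
LIMIT 50
"""

r = requests.get("https://query.wikidata.org/sparql",
                 params={"query": QUERY, "format": "json"})
for b in r.json()["results"]["bindings"]:
    print(b["links"]["value"], b.get("itemLabel", {}).get("value", ""))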
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
---------- Forwarded message ----------
From: Amir E. Aharoni <amir.aharoni(a)mail.huji.ac.il>
Date: 2017-06-28 9:32 GMT+03:00
Subject: Which templates should be global?
To: wikimedia-l <wikimedia-l(a)lists.wikimedia.org>
Hello,
TLDR: If you are an experienced editor on any Wikimedia project in any
language, please add your ideas here:
https://meta.wikimedia.org/wiki/Which_templates_should_be_global
In more detail:
Continuing some recent discussions from Phabricator[1], the Wikimedia
Hackathon, and the Wikimedia Developers Summit, I'd like to ask the wider
community of editors in all projects:
Which templates could be useful for all Wikimedia projects, or at least for
_many_ projects?
A lot of templates are replicated manually, and this is a problem well
known to all experienced editors. If there were a technology that allowed
templates to be managed globally and more conveniently, which templates
would you adapt to it first?
I started a list at
https://meta.wikimedia.org/wiki/Which_templates_should_be_global . Please
continue it! I'm very interested in hearing from all projects and
languages, not only the big Wikipedias, so please spread the word.
Thanks!
[1] For example https://phabricator.wikimedia.org/T159334
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
I am puzzled by a pattern I am seeing when submitting nested SELECT
statements to the WDQS.
In cases where a federated query to the WDQS times out, it helps to include
redundant SELECT statements for the same query to make it resolve.
The following query does not resolve:
source: https://gist.github.com/andrawaag/ba1b7d1b9adf06e9d45afa98c1d1e239
results: http://tinyurl.com/ybl55tk4
When lines 7 and 9 in the above example are duplicated, the query suddenly
does resolve:
source: https://gist.github.com/andrawaag/7e2e793342011f66bb6a3ea7bb5a1326
results: http://tinyurl.com/y9mjkejo
What is the added value of the double SELECT (lines 7-8 and 10-11)? Also,
lines 5 and 9 had to be separated by a SELECT statement for the query to
succeed.
Is this an implementation issue, or am I doing something wrong?
kind regards,
Andra Waagmeester
Hi,
While doing mappings for YSO places in Mix'n'match, my colleague noticed
that the descriptions shown for the Wikidata entities are mismatched and
come from completely different entities. See the attached screenshot.
Here are some examples of mismatches:
Mariehamn - Island group in Norway ??
Manchuria - Province of Finland ??
Menkijärvi - Lake in Stockholm County ??
Nordic countries - Island in Dodecanese, Greece ??
It appears that the descriptions come from entities other than those
shown. The problem seems to be related to the pagination functionality:
when switching between pages, sometimes only part of the information is
updated - either the Wikidata entity names or their descriptions, but
not always both.
The browser was Firefox 45.2.0 running on Windows 7. I could also see
the same problem on Firefox 54.0 running on Ubuntu 16.04. Just choosing
"Automatically matched" and then flipping between the pages a few times
seems to be enough to trigger the problem.
-Osma
--
Osma Suominen
D.Sc. (Tech), Information Systems Specialist
National Library of Finland
P.O. Box 26 (Kaikukatu 4)
00014 HELSINGIN YLIOPISTO
Tel. +358 50 3199529
osma.suominen(a)helsinki.fi
http://www.nationallibrary.fi