> Any feature requests?
Yes, I can think of several. To begin with, I would like a feature where the bot operator only needs to point the bot at a Wikipedia page. The bot would then fetch the interwiki links from that page and add them as sitelinks on Wikidata.
I am thinking of a command like this one: script.py -site:"Hermione Granger" -lang:fr
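A rough sketch of the fetching half of that idea, using only the standard MediaWiki API module prop=langlinks; the write step is left out, since that part of the Wikidata API is still in flux:

import requests

def fetch_langlinks(lang, title):
    """Fetch the interwiki (language) links of one Wikipedia page."""
    resp = requests.get(
        "https://%s.wikipedia.org/w/api.php" % lang,
        params={
            "action": "query",
            "titles": title,
            "prop": "langlinks",
            "lllimit": "max",
            "format": "json",
        },
    )
    page = next(iter(resp.json()["query"]["pages"].values()))
    # Each entry carries the target language code and the title there.
    return [(ll["lang"], ll["*"]) for ll in page.get("langlinks", [])]

# These pairs would then be added as sitelinks on the Wikidata item.
for lang, title in fetch_langlinks("fr", "Hermione Granger"):
    print(lang, title)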
Also, I have the same questions that Merlissimo has already raised on the "features needed by langlink bots" thread.
Cheers,
Snaevar
----- Original Message -----
From: Joan Creus
Sent: 10/24/12 05:08 PM
To: Discussion list for the Wikidata project.
Subject: [Wikidata-l] New release of Pywikidata: feature requests?
Hi all,
I've released a new version of Pywikidata @
https://github.com/jcreus/pywikidata (not really a release, since there is
no version numbering; I should begin, maybe...).
New features include:
- Only the properties which have changed are pushed to the server. This
makes it much faster, and it is the way it should be according to the
spec (see the sketch after this list).
- 'uselang' is no longer accepted, per spec.
- Bot flag allowed.
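Here is a minimal, made-up illustration of the first point, with flat keys standing in for Pywikidata's actual internal representation:

def changed_properties(original, edited):
    """Return only the key/value pairs of `edited` that differ from `original`."""
    return {k: v for k, v in edited.items() if original.get(k) != v}

before = {"label/en": "Tantalum", "sitelink/dewiki": "Tantal"}
after = {"label/en": "Tantalum", "sitelink/dewiki": "Tantal (Element)"}
print(changed_properties(before, after))
# -> {'sitelink/dewiki': 'Tantal (Element)'}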
Right now there's some more stuff left to do (including adapting to changes
to the API, and adding the claim system [still beta]), but are there any
feature requests for functionality that would be useful for bots?
Regarding Pywikipediabot integration, I think someone is working on it
(thanks!), but I think it would be better to wait. Pywikipedia is a mature
project, while Pywikidata is still evolving constantly (mostly due to API
changes, which break it). So I'd wait until Pywikidata has matured too, and
Wikidata is deployed to the WMF wikis.
By the way, pull requests will be gladly accepted :).
Joan Creus (joancreus@freenode, Joancreus@wikipedia)
Hi all,
I have a question, or rather a proposal, regarding the JSON representation [1].
The "Geo" example on the JSON page[1] implies that there won't be a
fixed representation for data types. Instead of the "value" key that
all the other examples use, the Geo example uses "longitude" and
"latitude". Wouldn't a representation like the following be more
appropriate?
"value": {
"latitude" : 32.233,
"longitude" : -2.233,
}
That is, if "snaktype": "value", then there has to be a "value" key
with a data type specific value object.
Something that imho would also be useful is a way to specify the data
type - this could be optional. For the Geo example, something like the
following would make sense:
"datatype": "geo"
Without such a definition, a consumer would have to derive the data
type from the keys and/or the lexical representation of the values,
which would usually be a tough task.
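To make both suggestions concrete, here is how a consumer could handle a snak under the proposed layout (the layout follows this mail's proposal, not the current spec):

import json

snak = json.loads("""
{
    "snaktype": "value",
    "datatype": "geo",
    "value": {"latitude": 32.233, "longitude": -2.233}
}
""")

if snak["snaktype"] == "value":
    if snak.get("datatype") == "geo":
        # Dispatch on the explicit data type instead of guessing it
        # from the keys of the value object.
        v = snak["value"]
        print("point at %(latitude)s, %(longitude)s" % v)
    else:
        print("unhandled datatype: %s" % snak.get("datatype"))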
Cheers,
Andreas
1. https://meta.wikimedia.org/wiki/Wikidata/Development/Phase_2_JSON
Hi,
Lydia mentioned in her summary a major discussion about Wikidata on
the Hebrew Wikipedia. The discussion was in Hebrew, of course, so I'll
give a short summary of it here.
Eleven people supported the installation of Wikidata. Nobody objected \o/
Despite the wide support, some issues and questions were raised:
1. How is the coordination with interwiki links bot operators progressing?
Will the bots be smart enough not to do anything to articles that are
already listed in the repository and have the correct links displayed?
Will the bots be smart enough to update the repo in the transition
period, when some Wikipedias have Wikidata and some don't?
Will the bots be smart enough not to do anything with articles that
have interwiki conflicts (multiple links, non-1-to-1 linking etc.)? A
toy version of that last check is sketched after this list.
2. What are the numbers after the Q in the titles on the repo site? -
I replied that they are just sequential identifiers without any
additional meaning. Maybe this can be added to the FAQ.
3. Several people complained about instability in the links editing
pages in the repo: They saw messages about network problems when they
tried to edit links. I experienced this a couple of times, too. I also
saw a complete crash with a "memory full" error once.
4. Somebody noticed that the testing sites don't support unified
accounts (CentralAuth). The production system will, right?
5. Somebody complained that it's too easy to remove a link from the repo
- clicking the "remove" link is enough. I mentioned it in a bug
report:
https://bugzilla.wikimedia.org/show_bug.cgi?id=40200
6. And this is probably the biggest issue: The workflow for adding an
interlanguage link is cumbersome and in some cases the interface
elements are undiscoverable.
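Regarding the interwiki-conflict case in point 1, here is a toy version of the check a bot could apply before touching anything (a made-up data structure, not actual bot code):

def links_are_consistent(langlinks):
    """langlinks: a mapping like {'en': ['Tantalum'], 'de': ['Tantal']}.
    A link group is safe only if every language points to exactly one page."""
    return all(len(titles) == 1 for titles in langlinks.values())

print(links_are_consistent({"en": ["Tantalum"], "de": ["Tantal"]}))  # True
print(links_are_consistent({"en": ["Foo", "Bar"], "de": ["Baz"]}))   # False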
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
Hi there, I have a number of questions. I hope I'm not a bother.
1) Has the testing of phase 1 already started on hu.wp?
2) Will phase 1 also be expanded to namespaces other than 0 (e.g. templates,
categories, help pages...) and to other projects (e.g. Wikisource,
Wikiquote...)?
3) When will it be possible to test phase 2 on Wikidata-test-repo?
4) Is there already a script for testing the automatic upload of
interwiki links? I'd like to run some tests with my bot.
Thanks in advance.
--
Luca "Sannita" Martinelli
http://it.wikipedia.org/wiki/Utente:Sannita
Sumana suggested that I forward this e-mail to wikidata-l, so here it is :)
---------- Forwarded message ----------
From: Amir Ladsgroup <ladsgroup(a)gmail.com>
Date: Fri, 19 Oct 2012 03:18:57 +0330
Subject: Wikidata bug (or semi-bug ;)
To: wikitech-l(a)lists.wikimedia.org
Hello, I'm working on running PWB on Wikidata, and I want to make some
edits via the API, so I tried this:
http://wikidata-test-repo.wikimedia.de/w/api.php?action=wbsetitem&id=392&da…
but it gives me this:
http://wikidata-test-repo.wikimedia.de/w/index.php?title=Q392&diff=105675&o…
As you can see, the label changed, but the change is not shown in the "List
of pages linked to this item (79 entries)" section. I think it's because
the two are not kept in sync, but they should be!
Besides that, does anybody know how I can get the page ID? I mean, I give
"Tantalum" and get back "392" or "Q392".
Best wishes
--
Amir
On 10/18/2012 04:48 PM, Amir Ladsgroup wrote:
> Hello, I'm working on running PWB on Wikidata, and I want to make some
> edits via the API [...]
Amir, thanks for the bug report! It would probably be best if you
continued this conversation on the wikidata-l list
https://lists.wikimedia.org/mailman/listinfo/wikidata-l (cc'd) or in
#wikimedia-wikidata on Freenode IRC.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Hi!
I have published a draft of how changes on the wikidata repository are going to
percolate to the client wikis:
https://meta.wikimedia.org/wiki/Wikidata/Notes/Percolation
Any feedback would be appreciated!
Of course, we are not starting this from scratch. We are currently implementing
a stripped-down, naive version of the draft. Basically, it works like this (a
toy sketch follows the list):
* Each change on the repository is recorded in the changes table.
* On each client wiki, a poll script periodically checks the changes table.
* The polling script maintains a local copy of the latest version of each data
entity on each cluster used by client wikis.
* If any page on the wiki is affected by the change, an entry representing that
change is injected into the client's recentchanges table.
* Wiki pages that are affected by a change are invalidated.
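As a toy sketch of that loop: names like fetch_changes_since and pages_using_entity are placeholders for illustration, not actual Wikibase code, which works against the wb_changes and recentchanges tables directly.

import time

def poll_loop(client, repo, interval=60):
    """Naive change propagation from the repository to one client wiki."""
    # `client` and `repo` are placeholder objects, not Wikibase classes.
    last_seen = client.load_checkpoint()  # id of the last change handled
    while True:
        for change in repo.fetch_changes_since(last_seen):
            # Keep the client's local copy of the entity up to date.
            client.update_local_entity_copy(change.entity_id, change.data)
            # Surface the change on every page that uses the entity.
            for page in client.pages_using_entity(change.entity_id):
                client.inject_recentchanges_entry(page, change)
                client.invalidate_page_cache(page)
            last_seen = change.id
        client.save_checkpoint(last_seen)
        time.sleep(interval)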
I think this will work for now, that is, for a small number of client wikis. The
new draft is an attempt to make this architecture scale up to several hundred
client wikis on multiple database clusters.
-- daniel
Hey,
The new site functionality needed by Wikibase has finally been merged into
MediaWiki core and has been removed from Wikibase itself. This means that if
you update Wikibase to master, you will also need to update core to master,
and vice versa.
The two relevant commits are:
* https://gerrit.wikimedia.org/r/#/c/23528/
* https://gerrit.wikimedia.org/r/#/c/27726/
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.