Thanks for the insight. The issue is rather pressing on our side because
merges are accumulating, so it would be great if a backport could be rolled
out as soon as possible.
Thx,
Sebastian
> Oh, it's a known problem:
> https://phabricator.wikimedia.org/T101694
>
> There is a fix already, which got stuck on a minor cosmetic issue. I'll
> merge it now; it should go live in no more than two weeks. But I can try to
> get it backported and deployed sooner.
>
>
> Am 11.11.2015 um 20:10 schrieb Sebastian Burgstaller:
> > Thanks for your reply. Just to complete this conversation, I get the
> > following error message:
> >
> > {'error': {'*': 'See https://www.wikidata.org/w/api.php for API usage',
> > 'code': 'no-direct-editing',
> > 'info': 'Direct editing via API is not supported for content '
> > 'model wikibase-item used by Q5972069'},
> > 'servedby': 'mw1119'}
> >
> >
> > Undo via the web interface works, both as a logged-in and as an anonymous
> > user. I will tinker around a little; if I do not find a solution, I will
> > file a bug report.
> >
> > Best,
> > Sebastian
> >
> >
> >>
> >> Am 11.11.2015 um 04:08 schrieb Sebastian Burgstaller:
> >> > The undo and undoafter API calls do not seem to work on Wikidata items.
> >>
> >> That should work. If it doesn't, please file a bug report, and list some
> >> revisions you tried to undo.
> >>
> >> Please check if you are able to undo the edit manually via the normal
> >> user interface, too.
> >>
> >>
> >> --
> >> Daniel Kinzler
> >> Senior Software Developer
> >>
> >> Wikimedia Deutschland
> >> Gesellschaft zur Förderung Freien Wissens e.V.
> >
> >
> >
> >
> >
> > _______________________________________________
> > Wikidata mailing list
> > Wikidata at lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikidata
> >
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
There are still a couple of days left to put yourself forward as a Wikimania
scholarship committee member, or as an ambassador for your community.
-------- Forwarded message --------
Subject: [Wikimania-l] Invitation to Wikimania 2016 Scholarship Committee
Date: Sun, 1 Nov 2015 21:11:05 +0000 (UTC)
Dear Wikimaniacs,
We are now calling for a scholarship review committee.
This year, we additionally want a scholarship ambassador for each major
geography or language, including a representative from each entity which
pays for additional scholarships; the ambassadors will be committee
observers.
The scholarship review committee is an important and diverse group of
volunteers who help to run the scholarship program in accordance with
the pillars
<https://meta.wikimedia.org/wiki/Wikimania_2016_bids/Esino_Lario/Pillars> of
this Wikimania. We encourage people from all Wikimedia wikis to apply
for this position so that the committee can handle applications in many
different languages.
The main duties of the committee members prior to Wikimania 2016 are:
* Participation in periodic online meetings with scholarships program
manager and other committee members.
* Review and edit communications material (e.g. Scholarship wiki,
application questions).
* Assistance in determination of scholarship applicant requirements.
* Assurance of due consideration and speedy response time to Wikimania
scholarship applications in multiple languages.
* Work with the local team.
For further information please visit m:Wikimania/Scholarships/Tasks
<https://meta.wikimedia.org/wiki/Wikimania/Scholarships/Tasks> (the
committee will also be polishing this page).
We are looking for Wikimedians from all over the world who:
* are fluent in written English and have good communication skills, or can
name local community member(s)/ambassadors who will help them with
English translations;
* are discreet, able to handle confidential applicant information, and
able to assess candidates objectively;
* are willing to review scholarship applications in late 2015/early 2016;
* have either or both of the following:
o previous attendance at a Wikimania;
o strong knowledge of the cross-project Wikimedia community.
You will be working remotely. While we hope that scholarship review
committee members will enjoy Wikimania 2016 by giving significant input
to it, the local team cannot guarantee financial support for the
committee members' travel expenses since scholarship review committee
members may not apply for a scholarship themselves.
If you're interested in serving on the scholarship reviewing committee
and/or as scholarship ambassador, *please send us an email at
domande.wikimania(a)wikimedia.it*. If you have any questions, please don't
hesitate to contact us.
Deadline to apply is *Sunday, November 15, 2015*. The local team will
contact all candidates and publish the list of scholarship committee
members right afterwards. Please help with translations on
wm2016:Scholarship committee
<https://wikimania2016.wikimedia.org/wiki/Scholarship_committee>.
Proposed timeline
* Nov 1–Nov 15: Call for Scholarship Committee
* Nov 16–Dec 6: Preparations of the Scholarship Committee
* Dec 6–Jan 9: Submission time
Thanks,
Federico Leva
<https://wikimania2016.wikimedia.org/wiki/User:Nemo_bis> and Martin
Rulsch <https://wikimania2016.wikimedia.org/wiki/User:DerHexer>
Wikimania 2016 team, scholarships subteam
Thanks for your reply. Just to complete this conversation, I get the
following error message:
{'error': {'*': 'See https://www.wikidata.org/w/api.php for API usage',
'code': 'no-direct-editing',
'info': 'Direct editing via API is not supported for content '
'model wikibase-item used by Q5972069'},
'servedby': 'mw1119'}
Undo via the web interface works, both as a logged-in and as an anonymous
user. I will tinker around a little; if I do not find a solution, I will
file a bug report.
Best,
Sebastian
>
> Am 11.11.2015 um 04:08 schrieb Sebastian Burgstaller:
> > The undo and undoafter API calls do not seem to work on Wikidata items.
>
> That should work. If it doesn't, please file a bug report, and list some
> revisions you tried to undo.
>
> Please check if you are able to undo the edit manually via the normal
> user interface, too.
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
The Gene Wiki team is experiencing a problem that may suggest some areas
for improvement in the general Wikidata experience.
When our project was getting started, we had some fairly long public
debates about how we should structure the data we wanted to load [1].
These resulted in a data model that, we think, remains pretty much true to
the semantics of the data, at the cost of distributing information about
closely related things (genes, proteins, orthologs) across multiple,
interlinked items. Now, as long as these semantic links between the
different item classes are maintained, this is working out great. However,
we are consistently seeing people merging items that our model needs to be
distinct. Most commonly, we see people merging items about genes with
items about the protein product of the gene (e.g. [2]). This happens
nearly every day, especially on items related to the more popular
Wikipedia articles. (More examples: [3].)
Merges like this, as well as other semantics-breaking edits, make it very
challenging to build downstream apps (like the Wikipedia infobox) that
depend on having certain structures in place. My question to the list is:
how can we best protect semantic models that span multiple entity types in
Wikidata? Related to this, is there an opportunity for some consistent way
of explaining these structures to the community where they exist?
I guess the immediate solutions are to (1) write another bot that watches
for model-breaking edits and reverts them, and (2) create an article on
Wikidata somewhere that succinctly explains the model and links back to the
discussions that went into its creation.
It seems that anyone who works across more than a single entity type will
face the same kind of problems, so I'm posting this here in the hope that
generalizable patterns (and perhaps even supporting code) can be realized
by this community.
[1]
https://www.wikidata.org/wiki/Wikidata_talk:WikiProject_Molecular_biology#D…
[2] https://www.wikidata.org/w/index.php?title=Q417782&oldid=262745370
[3]
https://s3.amazonaws.com/uploads.hipchat.com/25885/699742/rTrv5VgLm5yQg6z/m…
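The watch-and-revert bot in solution (1) needs a test for "this item looks like a bad gene/protein merge". A minimal sketch of such a check, assuming gene items and protein items are marked by disjoint identifier properties; the property sets below are illustrative placeholders, not the Gene Wiki project's actual model:

```python
# Sketch of a merge detector for a watch-and-revert bot. The idea: if gene
# items and protein items carry disjoint identifier properties, then an item
# carrying both kinds has probably been merged incorrectly.
# NOTE: these property sets are illustrative assumptions, not the Gene Wiki
# project's actual model.
GENE_PROPS = {"P351"}     # assumed marker of a gene item
PROTEIN_PROPS = {"P352"}  # assumed marker of a protein item

def looks_like_bad_merge(claims):
    """claims: dict keyed by property ID, as in the 'claims' field that
    wbgetentities returns for an item. True if the item mixes gene-level
    and protein-level identifiers."""
    props = set(claims)
    return bool(props & GENE_PROPS) and bool(props & PROTEIN_PROPS)
```

A bot could run this over the recent-changes feed and flag (or revert) any item where it fires.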
Hi everyone,
Alongside the data model protection discussion, I am now developing a bot
which should undo the merges we are facing in the Gene Wiki project.
In order for the Wikidata rollback API call to work, I guess our bot
account would need to be granted the special rollback permission?
The undo and undoafter API calls do not seem to work on Wikidata items.
Unfortunately, these restrictions are not documented in the Wikidata
specific API help:
https://www.wikidata.org/w/api.php?action=help&modules=main
Are there other ways to undo or roll back certain changes that I am not
aware of? Fetching the data of the last good revision and writing it back
as a fresh latest revision is probably not the way to go...
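For reference, the two API calls a rollback needs can be sketched as parameter builders; send them with any HTTP client over a logged-in session, assuming the bot account has been granted the rollback right (the function names are mine):

```python
# Sketch of the two requests a rollback needs: first fetch a rollback token,
# then call action=rollback. These only build the parameter dicts; GET/POST
# them with any HTTP client using a logged-in bot session.
API = "https://www.wikidata.org/w/api.php"

def token_params():
    """Parameters to fetch a rollback token (GET, action=query&meta=tokens)."""
    return {"action": "query", "meta": "tokens",
            "type": "rollback", "format": "json"}

def rollback_params(title, user, token):
    """Parameters for action=rollback (POST), which reverts the most recent
    consecutive edits by one user on one page."""
    return {"action": "rollback", "title": title, "user": user,
            "token": token, "format": "json"}
```

If the account lacks the right, the rollback response should contain an error making the missing permission obvious.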
Thanks!
Best,
Sebastian
In the article "Presenting Wikidata knowledge" [1], I've been a bit Bold
and specified a recipe:
1. Find existing interesting wiki pages in the domain of your application.
2. View the Wikidata information for those pages, choose interesting
properties.
3. Associate Wikidata entity IDs with entities of your application.
4. Display their Wikidata information in the user's language.
5. Use the Wikidata "sitelinks" information about the item to provide links
to the full Wikipedia (and Wikiquote, Wikivoyage, etc.) article about the
entity in the user's language.
But I realize for something like a reference app there won't be Wikidata
items for every entity in your app for step 3: not every book in print has
a Wikidata item, nor does every musical recording, etc. For those there are
already identifiers such as ISBNs and "MusicBrainz release group ID"s (mmm,
brains). I assume reference app developers already use these more complete
identifiers and so I'm inviting them to add Wikidata entity IDs where
available.
I think these other identifiers are all "Wikidata property representing a
unique identifier" and there are about 350 of them [2]. But surprisingly, I
couldn't find an easy way to look up a Wikidata item using these other
identifiers.
I found you can do it one-by-one in Wikidata Query [3] and in the Wikidata
Query Service [4], but neither seems amenable to doing a query on the fly
like "get me the Wikidata item for each of these 100 ISBNs: 2-7071-1620-3, ...".
Also, is this a temporary thing? Will Wikidata eventually have items for
every book published, every musical recording, etc. and become a superset
of all those unique identifiers?
Thanks!
[1] https://www.mediawiki.org/wiki/API:Presenting_Wikidata_knowledge
[2] https://www.wikidata.org/wiki/Special:WhatLinksHere/Q19847637?limit=500
[3] https://wdq.wmflabs.org/api?q=string%5B957:"2-7071-1620-3"%5D and
[4] https://query.wikidata.org with the SPARQL (mmm, sparkly)
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?book WHERE {
?book wdt:P957 "2-7071-1620-3"
}
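For the 100-ISBNs case, the SPARQL service does accept a whole list in one query via a VALUES block; a sketch that builds such a query as a string (whether 100 values stays within the service's limits is an assumption I haven't tested):

```python
# Build one SPARQL query that resolves many ISBNs at once using a VALUES
# block, instead of one query per ISBN (P957 as in the query above).
def batch_isbn_query(isbns):
    values = " ".join('"%s"' % isbn for isbn in isbns)
    return (
        "PREFIX wdt: <http://www.wikidata.org/prop/direct/>\n"
        "SELECT ?isbn ?book WHERE {\n"
        "  VALUES ?isbn { %s }\n"
        "  ?book wdt:P957 ?isbn .\n"
        "}" % values
    )
```

POST the result to the query service and each result row pairs an ISBN with its item, so missing ISBNs are simply absent.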
--
=S Page WMF Tech writer
I wrote an article about the Wikidata API for the Web APIs hub [1], which
encourages third-party developers to use Wikimedia data and APIs, <
https://www.mediawiki.org/wiki/API:Presenting_Wikidata_knowledge>
It shows Maxime Lathuilière's inventaire.io as it talks about associating
Wikidata entity IDs with your app and showing data. I welcome any feedback,
or you can fix obvious misteaks. Cast your mind back to before you knew how
to make a Wikidata API request. Would this have helped you? What's missing?
Thanks!
[1] https://www.mediawiki.org/wiki/API:Web_APIs_hub , I'll add it to the
"Inspire" section soon.
--
=S Page WMF Tech writer
Hi! We looked at the logs. 21,740,641 requests are coming from a single IP
without a user agent, which we can't geolocate because it's in the private
10.x.x.x range.
Looking into the actual queries revealed that it's probably a broken bot.
Stas said "the query makes no sense and is broken" and that it "looks like
somebody trying to download whole DB in very weird way but is doing it all
wrong."
We are investigating the issue.
– *Mikhail Popov* // Data Analyst, Discovery