Hello,
I’m doing some analyses in which I want to identify Wikidata edits made via editing tools (e.g., QuickStatements). To identify these edits, I first flagged and removed bot edits, and then generated a list of the 1000 most popular revision-comment words (ignoring case and some punctuation characters in the process). Within this list, I've identified 15 words that I believe indicate tool edits. I’ve included these 15 words below.
Does anyone know of tool edits that would be missed if I search for revisions that contain one of these 15 words in their comments? Put another way, are there editing tools not listed below? If so, can I identify edits from those tools from revision comments?
#quickstatements
#petscan
#autolist2
autoedit
nameguzzler
labellister
#itemcreator
#dragrefjs
[[useryms/lc|lcjs]]
#wikidatagame
[[wikidataprimary
#mix'n'match
mix'n'match
#distributedgame
[[userjitrixis/nameguzzlerjs|nameguzzler]]
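For concreteness, here is a minimal sketch (my own illustration, not part of Andrew's pipeline) of the matching he describes: flag a revision comment as a tool edit if it contains any of the 15 markers, with a case-insensitive substring match.

```python
# Hypothetical sketch of the matching described above; the marker list
# is the one from the email, the function name is my own invention.
TOOL_MARKERS = [
    "#quickstatements", "#petscan", "#autolist2", "autoedit",
    "nameguzzler", "labellister", "#itemcreator", "#dragrefjs",
    "[[useryms/lc|lcjs]]", "#wikidatagame", "[[wikidataprimary",
    "#mix'n'match", "mix'n'match", "#distributedgame",
    "[[userjitrixis/nameguzzlerjs|nameguzzler]]",
]

def is_tool_edit(comment):
    """Return True if the revision comment mentions any known tool marker."""
    lowered = comment.lower()
    return any(marker in lowered for marker in TOOL_MARKERS)

print(is_tool_edit("batch edit #quickstatements"))  # → True
print(is_tool_edit("Added [en] description"))       # → False
```

Note that any tool whose comments use none of these markers (or no comment at all) would slip through a filter like this, which is exactly the gap the question is about.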
Thanks in advance,
Andrew Hall
Hi all,
We would like to link from our pages to wikidata pages. Who should we
contact in this regard? We would need a contact other than the mailing
list, if possible.
I also want to make sure we will not disturb the SPARQL endpoint service
with our query. We want to retrieve all pages pointing to a UniProt
entry regardless of the taxon. So far we have this query
SELECT ?item ?itemLabel ?UniProt_ID ?taxonID WHERE {
  ?item wdt:P352 ?UniProt_ID ;
        wdt:P703 ?taxon .
  ?taxon wdt:P31 wd:Q16521 ;
         wdt:P685 ?taxonID .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
I tried it with a limit of 100 and it worked fine, but I'm wondering what the recommended way would be if we want all of them.
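One straightforward approach (an assumption on my part, not an official endpoint recommendation) is to page through the results: add a stable ORDER BY plus LIMIT/OFFSET, and rerun the query with a growing OFFSET until a page comes back with fewer than LIMIT rows.

```sparql
# Same query as above, fetched one page at a time.
# Increase OFFSET by the LIMIT value on each request.
SELECT ?item ?itemLabel ?UniProt_ID ?taxonID WHERE {
  ?item wdt:P352 ?UniProt_ID ;
        wdt:P703 ?taxon .
  ?taxon wdt:P31 wd:Q16521 ;
         wdt:P685 ?taxonID .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
ORDER BY ?item
LIMIT 10000
OFFSET 0
```

Large offsets can still be slow, so for a truly complete extraction a dump-based approach may be gentler on the endpoint than paging.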
By the way, how could I build that query using the query helper? I did not manage to, so I wrote it manually.
Regards,
Hey folks,
I wanted to draw your attention to a deletion nomination discussion for an
experimental template – {{Cite Q}}
<https://en.wikipedia.org/wiki/Template:Cite_Q> – pulling bibliographic
data from Wikidata:
https://en.wikipedia.org/wiki/Wikipedia:Templates_for_discussion/Log/2017_S…
As you'll see, there is significant resistance to the broader use of a template which exemplifies how structured bibliographic data in Wikidata could be reused across Wikimedia projects.
I personally think many of the concerns brought up by editors who support
the deletion request are legitimate. As the editor who nominated the
template for deletion notes: "The existence of the template is one thing;
the advocacy to use this systematically is another one altogether. Anybody
seeking that kind of systematic, radical change in Wikipedia must get
consensus for that in Wikipedia first. Being BOLD is fine but has its
limits, and this kind of thing is one of them."
I find myself in agreement with this statement, which I believe applies to
much more than just bibliographic data from Wikidata: it applies to virtually
any kind of data and content reused across projects governed by different
policies and expectations. I think what's happening is that an experimental
template – primarily meant to showcase how data reuse from Wikidata *might*
work – is perceived as a norm for how references *will* or *should* work in
the future.
If you're involved in the WikiCite initiative, and are considering
participating in the deletion discussion, I encourage you to keep a
constructive tone and understand the perspective of people who are
concerned about the use and misuse of this template.
As one of the WikiCite organizers, I see the success of the initiative as
coming from rich, highly curated data that other projects will want to
reuse, and from technical and usability advances for all contributors, not
from giving an impression that the goal is to use Wikidata to subvert how
other Wikimedia communities do their job. I'll post a note explaining my
perspective.
Dario
Has anyone done any work on how to encode statements as features for neural
nets? I'm mostly interested in sparse encoders for online training of live
networks.
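One common family of sparse encoders for this kind of setting (offered here as a generic sketch, not a known Wikidata-specific solution) is the hashing trick: each (property, value) statement pair is hashed into a fixed-size index space, so the feature dimensionality stays constant even as new properties and items appear, which suits online training. The function and names below are illustrative.

```python
# Illustrative sketch: hash (property, value) statement pairs into a
# fixed-size sparse index space (the "hashing trick"). All names here
# are hypothetical, not from the thread.
import hashlib

def statement_to_indices(statements, dim=2**20):
    """Map (property, value) pairs to sparse feature indices in [0, dim)."""
    indices = set()
    for prop, value in statements:
        key = f"{prop}={value}".encode("utf-8")
        digest = int(hashlib.md5(key).hexdigest(), 16)
        indices.add(digest % dim)
    return sorted(indices)

# Example: an item with two statements (instance of human; occupation politician)
features = statement_to_indices([("P31", "Q5"), ("P106", "Q82955")])
print(features)
```

The resulting index list can feed a sparse embedding lookup or a sparse linear layer; hash collisions trade a small amount of noise for a bounded, stable feature space.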