Hi,
I have a couple of questions regarding the wiki page ID. Does it always
stay unique to the page, where the page itself is just a placeholder for
any kind of information that might change over time?
Consider the following cases:
1. The first time someone creates page "Moon" it is assigned ID=1. If at
some point the page is renamed to "The_Moon", the ID=1 remains intact. Is
this correct?
2. What if we have page "Moon" with ID=1. Someone creates a second page
"The_Moon" with ID=2. Is it possible that page "Moon" is transformed into a
redirect? Then, "Moon" would be redirecting to page "The_Moon"?
3. Is it possible for page "Moon" to become a category "Category:Moon" with
the same ID=1?
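For what it's worth, my mental model of cases 1 and 2 looks like the following toy sketch (my own illustration, not MediaWiki's actual implementation; I am assuming a move leaves the redirect behind as a brand-new page):

```python
# Toy model of page-ID semantics: a page keeps its id across renames,
# while a redirect left behind is a brand-new page with its own id.

class Wiki:
    def __init__(self):
        self.pages = {}      # title -> page dict
        self.next_id = 1

    def create(self, title):
        page = {"id": self.next_id, "title": title, "redirect_to": None}
        self.next_id += 1
        self.pages[title] = page
        return page

    def move(self, old_title, new_title, leave_redirect=True):
        page = self.pages.pop(old_title)
        page["title"] = new_title          # the id is unchanged by the move
        self.pages[new_title] = page
        if leave_redirect:
            # the redirect is a new page and therefore gets a new id
            redirect = self.create(old_title)
            redirect["redirect_to"] = new_title
        return page

wiki = Wiki()
wiki.create("Moon")              # gets ID=1
wiki.move("Moon", "The_Moon")    # ID stays 1; "Moon" becomes a redirect (ID=2)
print(wiki.pages["The_Moon"]["id"])       # 1
print(wiki.pages["Moon"]["redirect_to"])  # The_Moon
```

Is this roughly how it works on the Wikimedia side as well?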
Thanks,
Gintas
Hi!
I would like to initiate a discussion about coordinate precision in
Wikidata and the Query Service. The reason is that right now we do not
have any limit on precision - coordinates are basically doubles - which
allows over-precise coordinates and makes it harder to compare them,
both between themselves within Wikidata and with outside services.
From the precision description in [1], we would rarely need more than
the third or fourth digit after the decimal point. However, we have
coordinates in the database like Point(13.366666666 41.766666666), which
pretends to sub-millimeter accuracy - for an entity that describes a
municipality[2]!
We do have precision on values - e.g. the above has a specified
precision of "arcseconds" - so it may be just a formatting issue, but
even an arcsecond looks somewhat over-precise for a city. And it may be
a bit challenging to convert DMS precision to DD precision.
But the bigger question is whether we should store over-precise
coordinates in the database at all, or round them on export or inside
the data. The formulae used to calculate distances have, for obvious
reasons, limited precision, and direct comparisons can't take precision
into account, which may make such coordinates very hard to work with.
Should we maybe just put a limit on how precisely we store coordinates
in RDF and in the query service? Would four decimals after the dot be
enough? According to [4], this is what commercial GPS devices can
provide. If not, why not, and what accuracy would be appropriate?
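For illustration, rounding a coordinate to the number of decimals implied by a given precision could look like this (a sketch of mine; the function name and scheme are my own, not Wikibase code):

```python
import math

# One arcsecond is 1/3600 of a degree (about 2.8e-4 degrees), so rounding
# to the fourth decimal roughly matches arcsecond precision.
ARCSECOND_DEG = 1.0 / 3600.0

def round_to_precision(value, precision_deg):
    """Round a coordinate to the number of decimals implied by a
    precision given in decimal degrees (illustrative sketch only)."""
    decimals = max(0, -math.floor(math.log10(precision_deg)))
    return round(value, decimals)

# The over-precise municipality point from above:
lon, lat = 13.366666666, 41.766666666
print(round_to_precision(lon, ARCSECOND_DEG),
      round_to_precision(lat, ARCSECOND_DEG))   # 13.3667 41.7667
```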
We do export precision of the coordinate as wikibase:geoPrecision[3] -
and we currently have 258060 distinct values for it. This is very weird.
I am not sure precision is useful in this form. Can anybody tell me any
use case for this number now? If not, maybe we should change how we
represent it. I'm also not sure where these come from as we only have
13 options in the UI. Bots?
[1] https://en.wikipedia.org/wiki/Decimal_degrees
[2] https://www.wikidata.org/wiki/Q116746
[3]
https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Globe_coor…
[4]
https://gis.stackexchange.com/questions/8650/measuring-accuracy-of-latitude…
--
Stas Malyshev
smalyshev(a)wikimedia.org
Hi folks,
is anyone using the Wikidata entity dump dcatap.rdf at
https://dumps.wikimedia.org/wikidatawiki/entities/dcatap.rdf?
It appears to be very rarely used and is thus causing us a (probably)
undue maintenance burden, so we plan to remove it.
If anyone is making use of it, please speak up so that we can keep it or
find a viable alternative.
Cheers,
Marius
Hello,
I’m doing some analyses in which I want to identify Wikidata edits made via editing tools (e.g. QuickStatements). To identify these edits, I first flagged and removed bot edits and then generated a list of the 1000 most popular revision-comment words (ignoring case and some punctuation characters as part of this process). Within this list of words, I've identified 15 words that I believe indicate tool edits. I’ve included these 15 words below.
Does anyone know of tool edits that would be missed if I search for revisions that contain one of these 15 words in their comments? Put another way, are there editing tools not listed below? If so, can I identify edits from those tools from revision comments?
#quickstatements
#petscan
#autolist2
autoedit
nameguzzler
labellister
#itemcreator
#dragrefjs
[[useryms/lc|lcjs]]
#wikidatagame
[[wikidataprimary
#mix'n'match
mix'n'match
#distributedgame
[[userjitrixis/nameguzzlerjs|nameguzzler]]
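For concreteness, the matching I have in mind is roughly the following (a sketch of my own heuristic, using the token list above, lowercased):

```python
# Tokens (lowercased) that I take to indicate a tool edit; this mirrors
# the list above and is my own heuristic, not an official classification.
TOOL_TOKENS = {
    "#quickstatements", "#petscan", "#autolist2", "autoedit",
    "nameguzzler", "labellister", "#itemcreator", "#dragrefjs",
    "[[useryms/lc|lcjs]]", "#wikidatagame", "[[wikidataprimary",
    "#mix'n'match", "mix'n'match", "#distributedgame",
    "[[userjitrixis/nameguzzlerjs|nameguzzler]]",
}

def looks_like_tool_edit(comment):
    """Return True if any known tool token appears in the revision comment."""
    text = comment.lower()
    return any(token in text for token in TOOL_TOKENS)

print(looks_like_tool_edit("batch import via #quickstatements"))  # True
print(looks_like_tool_edit("fixed a typo in the label"))          # False
```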
Thanks in advance,
Andrew Hall
Hey folks,
I wanted to draw your attention to a deletion nomination discussion for an
experimental template – {{Cite Q}}
<https://en.wikipedia.org/wiki/Template:Cite_Q> – pulling bibliographic
data from Wikidata:
https://en.wikipedia.org/wiki/Wikipedia:Templates_for_discussion/Log/2017_S…
As you'll see, there is significant resistance against the broader usage of
a template which exemplifies how structured bibliographic data in Wikidata
could be reused across Wikimedia projects.
I personally think many of the concerns brought up by editors who support
the deletion request are legitimate. As the editor who nominated the
template for deletion notes: "The existence of the template is one thing;
the advocacy to use this systematically is another one altogether. Anybody
seeking that kind of systematic, radical change in Wikipedia must get
consensus for that in Wikipedia first. Being BOLD is fine but has its
limits, and this kind of thing is one of them."
I find myself in agreement with this statement, which I believe applies to
much more than just bibliographic data from Wikidata: it's about virtually
any kind of data and contents reused across projects governed by different
policies and expectations. I think what's happening is that an experimental
template – primarily meant to showcase how data reuse from Wikidata
*might* work
– is perceived as a norm for how references *will* or *should* work in the
future.
If you're involved in the WikiCite initiative, and are considering
participating in the deletion discussion, I encourage you to keep a
constructive tone and understand the perspective of people who are
concerned about the use and misuse of this template.
As one of the WikiCite organizers, I see the success of the initiative as
coming from rich, highly curated data that other projects will want to
reuse, and from technical and usability advances for all contributors, not
from giving an impression that the goal is to use Wikidata to subvert how
other Wikimedia communities do their job. I'll post a note explaining my
perspective.
Dario
Is there anyone that has done any work on how to encode statements as
features for neural nets? I'm mostly interested in sparse encoders for
online training of live networks.
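To make the question concrete, one common trick for sparse inputs is feature hashing of (property, value) pairs into a fixed-width indicator vector. A minimal sketch (the hash choice and dimension are arbitrary picks of mine, not an established scheme for Wikidata):

```python
import hashlib

DIM = 2 ** 16   # feature-vector width; an arbitrary choice

def statement_index(prop, value, dim=DIM):
    """Hash a (property, value) statement into a stable feature index."""
    key = f"{prop}={value}".encode("utf-8")
    return int(hashlib.md5(key).hexdigest(), 16) % dim

def encode_entity(statements, dim=DIM):
    """Encode an entity's statements as a sparse set of active indices,
    suitable for a sparse embedding/input layer."""
    return sorted({statement_index(p, v, dim) for p, v in statements})

# e.g. Earth: instance-of planet (P31=Q634), part-of Solar System (P361=Q544)
features = encode_entity([("P31", "Q634"), ("P361", "Q544")])
print(features)   # the active feature indices for this entity
```

Is anyone doing something along these lines, or using a learned embedding instead?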
Hi everyone!
I have resumed my tests of Wikibase. I installed the latest version of
MediaWiki and Wikibase without problems, without errors. I just have
problems with the add statement button (+Add).
At first I thought it could be the same problem that I experienced the
last time I worked with Wikibase. However, this is different, because I
haven't duplicated the constants and I don't get the PHP error about the
constants. I checked the console and I read:
This page is using the deprecated ResourceLoader module
"jquery.ui.position". load.php:19:950
This page is using the deprecated ResourceLoader module
"jquery.ui.widget". load.php:30:794
This page is using the deprecated ResourceLoader module
"jquery.ui.core".
Please use "mediawiki.ui.button" or "oojs-ui" instead. load.php:1:80
This page is using the deprecated ResourceLoader module
"jquery.tipsy". load.php:69:171
Searching I found this topic in MediaWiki
<https://www.mediawiki.org/wiki/Topic:Ty4vs2ssccigib5s>, but I read in
the Phabricator task <https://phabricator.wikimedia.org/T175910> that it
is solved. When I installed Wikibase, I cloned the latest version from
GitHub.
Does anyone know what is happening in this case?
Thanks in advance!
Regards, Iván
--
Iván Hernández Cazorla
Miembro de *Wikimedia España*
Heiya,
cross posting for those who are interested and have not come across this
message yet.
Cheers Karsten
-------- Forwarded Message --------
Subject: [SMW-devel] SMWCon Fall 2017 news: The program and schedule
are now online
Date: Sat, 23 Sep 2017 17:25:37 +0200
From: Rutledge, Lloyd <Lloyd.Rutledge(a)ou.nl>
To: semediawiki-devel(a)lists.sourceforge.net
**News: The program and schedule are now online**
Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2017 - the 14th Semantic MediaWiki
Conference:
Dates: October 4th to October 6th 2017 (Wednesday to Friday).
Location: Rotterdam Zoo (Blijdorp), Rotterdam, the Netherlands
Conference page:
https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2017
Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g., users, developers, consultants, business
representatives, researchers.
SMWCon Fall 2017 will be supported by ArchiXL B.V. [0], Wikibase
Solutions [1], The Open University in the Netherlands [2] and Open
Semantic Data Association e. V. [3].
Following the success of this format, SMWCon will have one tutorial
and workshop day preceding two conference days.
Participating in the conference: To help us plan, you can already
register informally on the conference page, although formal registration
will be required later.
Contributing to the conference: If you want to present your work at the
conference, please go to the conference page and add your talk there. To
create an attractive program, we will later ask you to provide further
information about your proposals.
Among others, we encourage contributions on the following topics:
Applications of semantic wikis:
Semantic wikis for enterprise workflows and business intelligence
Semantic wikis for corporate or personal knowledge management
Exchange on business models with semantic wikis
Lessons learned (best/worst practices) from using semantic wikis or
their extensions
Semantic wikis in e-science, e-humanities, e-learning, e-health,
e-government
Semantic wikis for finding a common vocabulary among a group of people
Semantic wikis for teaching students about the Semantic Web
Offering incentives for users of semantic wikis
Challenges and obstacles for semantic wikis in business environments
Development of semantic wikis:
Semantic wikis as knowledge base backends / data integration platforms
Comparisons of semantic wiki concepts and technologies
Community building, feature wishlists, roadmapping of Semantic
MediaWiki
Improving user experience in a semantic wiki
Speeding up semantic wikis
Integrations and interoperability of semantic wikis with other
applications and mashups
Modeling of complex domains in semantic wikis, using rules, formulas
etc.
Access control and security aspects in semantic wikis
Multilingual semantic wikis
For questions about sponsorship opportunities, please do not hesitate to
contact Ad Strack van Schijndel <ad at wikibase.nl>.
Hope to see you in Rotterdam!
Remco de Boer, Toine Schijvenaars, Esther Greefhorst, Lloyd Rutledge,
Erwin Oord, Ad Strack van Schijndel
(The Organizing Committee)
[0] http://www.archixl.nl/en/
[1] http://www.wikibase.nl/
[2] https://www.ou.nl/
[3] https://opensemanticdata.org/