On 19.12.2012 18:13, Gregor Hagedorn wrote:
> On 19 December 2012 17:03, Daniel Kinzler daniel.kinzler@wikimedia.de wrote:
>> Indeed we do: https://wikidata.org/wiki/Wikidata:Glossary
>> I use "precision" exactly like that: significant digits when rendering output or parsing input. It can be used to *guess* at the value's accuracy, but it is not the same.
> (I cannot see that definition there; the word "precision" does not exist on that page.)
Oh, I didn't mean to imply it's there. But we do have a glossary, and that's where any definition should go, once we have them.
> There is an issue here: we can speak of the precision of a number in this sense (the number of significant digits).
I'd prefer a concept that is independent of representation; the number of significant digits depends on the unit and on the use of a base-10 representation.
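To illustrate the point (a hypothetical sketch, nothing to do with Wikidata's actual code): one and the same physical length, written in different units, yields a different significant-digit count, so "number of significant digits" is a property of the written representation, not of the value itself.

```python
def significant_digits(text):
    """Naively count significant digits in a plain decimal string:
    leading zeros don't count, all other digits do."""
    digits = [c for c in text if c.isdigit()]
    while digits and digits[0] == "0":
        digits.pop(0)  # strip leading zeros
    return len(digits)

# The same length, about 1.5 km, in three representations:
print(significant_digits("1.5"))     # in km -> 2
print(significant_digits("1500"))    # in m  -> 4 (trailing zeros are ambiguous)
print(significant_digits("0.0015"))  # in Mm -> 2
```

The middle case shows the ambiguity: "1500 m" could mean anywhere from two to four significant digits, which is exactly why a representation-independent notion of certainty would be preferable.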
> PS: I am still unsure how you define the term "accuracy" in "Wikidata speak"; Daniel gave a definition, but I could not follow it yet.
I was using "accuracy" to describe how sure we are about a given value, and "precision" to describe how exactly we represent (render) the value. I now see that this choice of terminology may be confusing, and that there are also multiple aspects of certainty.
So, please suggest terms to use for at least these two things:
1) value certainty (ideally not using "digits", but something that is independent of unit and rendering)
2) output exactness (here, the number of digits is actually what we want to talk about)
Perhaps we need additional distinctions, but I'm sure we at least need these two concepts.
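A minimal sketch of keeping the two concepts separate (hypothetical field names, not Wikidata's actual data model): the uncertainty travels with the value, while the digit count is purely a rendering parameter.

```python
from dataclasses import dataclass

@dataclass
class QuantityValue:
    # Hypothetical model for discussion, not the real Wikidata one:
    amount: float        # the value itself
    uncertainty: float   # (1) value certainty: an interval around the value
    render_digits: int   # (2) output exactness: decimal places to show

    def render(self):
        # Only here does a digit count enter the picture.
        return f"{self.amount:.{self.render_digits}f} ± {self.uncertainty:.{self.render_digits}f}"

height = QuantityValue(amount=8848.86, uncertainty=0.5, render_digits=1)
print(height.render())  # -> "8848.9 ± 0.5"
```

Changing `render_digits` alters only how the value is displayed; changing `uncertainty` alters what we claim to know about it, independent of any rendering.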
-- daniel