I believe the page at http://simia.net/valueparser/ makes a number of dangerous assumptions.
First: a number itself carries no indication that it is _not_ endlessly precise. Apostles = 12 has no uncertainty; representing it as 12 ± 1 is wrong, but so is 12 ± 0.5.
The same applies to a number like 12.2. The data source or author MAY intend to express significant digits, but we simply don't know. Wikidata should keep this at the "don't know" level and not force-convert a number of unknown measurement precision into one with an explicitly stated (but potentially completely wrong) precision or accuracy limit.
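To make the objection concrete, here is a minimal sketch of the kind of force-conversion being criticized: a parser that invents a ± interval from the position of the last written digit. This is a hypothetical illustration, not the actual valueparser code.

```python
# Hypothetical sketch of a parser that infers a +/- interval from the
# textual form of a number. NOT the actual simia.net/valueparser code.

def naive_parse(text: str):
    """Guess a precision from the last written digit position."""
    value = float(text)
    if "." in text:
        decimals = len(text.split(".")[1])
        precision = 0.5 * 10 ** -decimals
    else:
        # Assumes the integer was rounded -- which may simply be false.
        precision = 0.5
    return value, precision

# "Apostles = 12" is an exact count, yet the parser invents an uncertainty:
print(naive_parse("12"))    # (12.0, 0.5)  -- wrong: the true uncertainty is 0
print(naive_parse("12.2"))  # a guessed 0.05, not something the source stated
```

The parser cannot distinguish an exact count from a rounded measurement, because the text alone does not carry that information.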
For example, in light microscopy it is quite common to report measurements to one decimal place beyond the micrometre, even though the precision is only 0.2 µm. The latter is simply known and therefore not constantly repeated, unless specific circumstances justify it.
As discussed above: plus or minus one standard deviation does not give you a confidence interval for the mean; it gives you a measure of dispersion.
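The distinction is easy to show numerically (with a made-up sample): ±1 s.d. describes the spread of the individual measurements, while a confidence interval for the mean is built from the standard error s/√n and shrinks as n grows.

```python
import math
import statistics

# Hypothetical measurements, for illustration only.
sample = [9.8, 10.1, 9.9, 10.2, 10.0, 9.7, 10.3, 10.0]
n = len(sample)
mean = statistics.mean(sample)
sd = statistics.stdev(sample)    # dispersion of the individual values
sem = sd / math.sqrt(n)          # uncertainty of the mean itself

print(f"mean = {mean:.3f}")
print(f"+/- 1 s.d. band:  {mean - sd:.3f} .. {mean + sd:.3f}")
print(f"~95% CI for mean: {mean - 1.96 * sem:.3f} .. {mean + 1.96 * sem:.3f}")
# The CI is much narrower than the s.d. band: they answer different questions.
```

Reporting the s.d. band as if it were a confidence interval for the mean conflates the two.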
---------
My proposal: make the default "plus/minus values unknown, only significant digits known". The interpretation of significant digits is not machine-readable unless qualifiers say so. It can, however, be used to produce an estimate of the significant digits after unit conversion.
Make interval endpoints an option. If explicitly entered: excellent information. If not: don't try to create (false) knowledge out of nothing.
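The proposed default could be sketched roughly as follows. The class name, fields, and helper are hypothetical, not Wikibase code; the point is that the bounds default to "unknown" and only the written form is preserved.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Quantity:
    """Hypothetical value model: bounds are unknown unless explicitly given."""
    value: str                     # keep the source lexical form, e.g. "12.2"
    lower: Optional[float] = None  # None means "unknown", NOT an inferred bound
    upper: Optional[float] = None

    def decimals_written(self) -> int:
        """Digits after the decimal point as written -- a hint, not a claim."""
        return len(self.value.split(".")[1]) if "." in self.value else 0

apostles = Quantity("12")                          # exact count, no invented +/-
measured = Quantity("8.4", lower=8.2, upper=8.6)   # explicit: excellent info

print(apostles.lower)              # None -- "don't know", not 11.5
print(measured.decimals_written())
```

Keeping the lexical form means a later consumer can still estimate significant digits when qualifiers justify it, without the store ever asserting a precision nobody stated.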
Gregor