Hi!
I would like to initiate a discussion about coordinate precision in Wikidata and the Query Service. The reason is that right now we do not have any limit on precision - coordinates are basically doubles - which allows over-precise coordinates to be specified and makes them harder to compare, both among themselves within Wikidata and with outside services.
From the precision description in [1], we would rarely need more than the third or fourth digit after the decimal point. However, the database contains coordinates like Point(13.366666666 41.766666666), which pretends to sub-millimeter accuracy - for an entity that describes a municipality[2]!
We do have precision on values - e.g. the above has a specified precision of "arcseconds" - so it may be just a formatting issue, but even an arcsecond looks somewhat over-precise for a city. And it may be a bit challenging to convert DMS precision to DD precision.
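For illustration, here is one way the DMS-to-DD precision conversion could be sketched. The mapping and function names below are assumptions for this discussion, not Wikibase's actual implementation:

```python
import math

# Hypothetical mapping from DMS-style precision names to their size in degrees.
DMS_PRECISION_DEGREES = {
    "degree": 1.0,
    "arcminute": 1.0 / 60,    # ~0.0167 degrees
    "arcsecond": 1.0 / 3600,  # ~0.000278 degrees
}

def decimal_places_for(precision_name):
    """Fewest decimal places of a degree that still resolve the given DMS step."""
    step = DMS_PRECISION_DEGREES[precision_name]
    return max(0, math.ceil(-math.log10(step)))

# "arcsecond" precision needs 4 decimal places; "arcminute" needs only 2.
```

This suggests arcsecond precision is already fully expressible with four decimal places of a degree.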
But the bigger question is whether we should store over-precise coordinates in the database at all, or round them on export or inside the data. The formulae used to calculate distances have, for obvious reasons, limited precision, and direct comparisons can't take precision into account, which can make such coordinates very hard to work with. Should we maybe just put a limit on how precisely we put coordinates into RDF and into the query service? Would four decimals after the dot be enough? According to [4], this is what a commercial GPS device can provide. If not, why, and which accuracy would be appropriate?
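As a minimal sketch of what rounding on export could look like (the function name and the 4-decimal default are illustrative assumptions, not a concrete proposal for the Wikibase code):

```python
# Round an exported coordinate pair to a fixed number of decimal places.
# Four decimals is roughly 11 m at the equator.
def round_point(lon, lat, decimals=4):
    return (round(lon, decimals), round(lat, decimals))

# The over-precise municipality coordinate from the example above:
print(round_point(13.366666666, 41.766666666))  # (13.3667, 41.7667)
```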
We do export the precision of the coordinate as wikibase:geoPrecision[3] - and we currently have 258060 distinct values for it. This is very weird, and I am not sure precision is useful in this form. Can anybody tell me any use case for this number now? If not, maybe we should change how we represent it. I'm also not sure where these values come from, as we only have 13 options in the UI. Bots?
[1] https://en.wikipedia.org/wiki/Decimal_degrees
[2] https://www.wikidata.org/wiki/Q116746
[3] https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Globe_coordinate
[4] https://gis.stackexchange.com/questions/8650/measuring-accuracy-of-latitude-and-longitude
Stas and All,
Would the coordinate precision in Wikidata and the Query Service now allow us to move to the cellular, or even the atomic and subatomic, levels - say, when querying for differences between microscopic species in a piece of earth in a municipality that might be rich in the smallest life forms, such as nanobes https://en.wikipedia.org/wiki/Nanobe ? And would Wikidata want even further precision for knowledge generation based on what we might learn from expansion microscopy, e.g. Ed Boyden's work at MIT in brain science - https://www.youtube.com/watch?v=bPlr31LrT0g ?
Scott
Hi Stas and wikidatans,
the coordinates with (sub)millimeter precision might be imported from an external dataset. With precise geodetic measurements, it is technically possible to reach such precision, yet it often decreases significantly after transformation to WGS84. Some places like important peaks or astronomy observatories can be located with such precision.
So I am generally against limiting the number of digits too much, but limiting it to something like millimeter precision sounds reasonable.
Regards, Jan
Hi!
transformation to WGS84. Some places like important peaks or astronomy observatories can be located with such precision.
True, but this probably won't be kept up to date and would most likely be useless for Wikidata users, since random very precise data cannot be relied on unless there's a guarantee that a certain set of data (e.g. all observatories) has the same accuracy and is kept up to date. Which Wikidata does not have.
So I am generally against limiting the number of digits too much, but limit it to something like millimeter precision sounds reasonable.
I think we should at least limit it to specified precision (not the case today) but maybe even more in case precision is too high. I don't think anything beyond 1m is really useful - please provide use cases if you think otherwise.
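A sketch of what "limit to specified precision" could mean concretely: snap each coordinate to the nearest multiple of its stated precision (the function name is mine; this is purely illustrative):

```python
def snap_to_precision(value, precision):
    """Round a coordinate (in degrees) to the nearest multiple of its stated
    precision, also in degrees (e.g. 1/3600 for arcsecond precision)."""
    return round(value / precision) * precision

# 41.766666666 with arcsecond precision snaps back to exactly 41 deg 46' 00":
snapped = snap_to_precision(41.766666666, 1 / 3600)
```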
Also, wikibase:geoPrecision seems to be completely useless in the form it exists now. Is anybody actually using this value, and if so, how? If not, I'll probably change how it works.
Hi
On 30 August 2017 at 22:29, Stas Malyshev smalyshev@wikimedia.org wrote:
True, but this probably won't be kept up-to-date and most likely would be useless for Wikidata users since random very precise data can not be relied on unless there's a guarantee certain set of data (e.g. all observatories) have the same accuracy and it is kept up-to-date.
Like in all the wiki-world, this should be guaranteed through the reference claim, shouldn't it?
I think we should at least limit it to specified precision (not the case today) but maybe even more in case precision is too high. I don't think anything beyond 1m is really useful - please provide use cases if you think otherwise.
I can't think of anything really useful right now... What I'm afraid of is creating fake grids like happened in DBpedia with too much rounding (see [1], p. 44/45). Maybe with 1 m precision it won't happen today, but it might in the future for very POI-dense areas. (Might it?)
[1] https://zenodo.org/record/55381
Best regards, Jan
You'll want to get down to millimeter.
For one reason: there are already tons of public LIDAR datasets with point data down to the millimeter. I've even seen monuments and public art sculptures that have LIDAR datasets down to millimeter precision using WGS84 base references. (Think knowing the coordinates of George Washington's left nostril on Mount Rushmore against trilateration references of WGS84... crazy, I know, but it's out there.) https://tnris.org/data-catalog/entry/city-of-georgetown-2015-50-cm/ https://nationalmap.gov/elevation.html https://catalog.data.gov/dataset/lidar-point-cloud-usgs-national-map
Having it all queryable in Wikidata? Hmmm... not for me; other data catalogs and GIS systems handle that job. But having the datasets linked etc. to spatial references? You'll want to ensure that folks can populate against a spatial reference property, like the City of Georgetown example link above, which gives: Spatial Reference: EPSG 2277 https://epsg.io/2277
My thoughts,
-Thad +ThadGuidry https://plus.google.com/+ThadGuidry
On Tue, Aug 29, 2017 at 2:13 PM, Stas Malyshev smalyshev@wikimedia.org wrote:
[...] Would four decimals after the dot be enough? According to [4] this is what commercial GPS device can provide. If not, why and which accuracy would be appropriate?
I think that should be 5 decimals for commercial GPS, per that link? It also suggests that "The sixth decimal place is worth up to 0.11 m: you can use this for laying out structures in detail, for designing landscapes, building roads. It should be more than good enough for tracking movements of glaciers and rivers. This can be achieved by taking painstaking measures with GPS, such as differentially corrected GPS."
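The figures quoted above can be checked with a quick back-of-the-envelope computation (the ~111.32 km-per-degree constant is the usual approximation, not an exact value):

```python
import math

def meters_per_decimal_place(places, latitude_deg=0.0):
    """Approximate ground distance covered by one unit in the given decimal
    place of a degree, at the given latitude."""
    deg = 10.0 ** -places
    lat_m = deg * 111_320                           # metres per degree of latitude
    lon_m = lat_m * math.cos(math.radians(latitude_deg))
    return lat_m, lon_m

# 5 decimal places ~= 1.1 m; 6 decimal places ~= 0.11 m, matching the quote.
```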
Do we hope to store datasets around glacier movement? It seems possible. (We don't seem to currently https://www.wikidata.org/wiki/Q770424 )
I skimmed a few search results, and found 7 (or 15) decimals given in one standard, but the details are beyond my understanding:
http://resources.esri.com/help/9.3/arcgisengine/java/gp_toolref/geoprocessing_environments/about_coverage_precision.htm
https://stackoverflow.com/questions/1947481/how-many-significant-digits-should-i-store-in-my-database-for-a-gps-coordinate
https://stackoverflow.com/questions/7167604/how-accurately-should-i-store-latitude-and-longitude
[4] https://gis.stackexchange.com/questions/8650/measuring-accuracy-of-latitude-and-longitude
Hi!
For one reason...There are already tons of public LIDAR datasets with point data down to millimeter. I've even seen monuments and public art sculptures that have LIDAR datasets down to millimeter precision using WGS84 base references.
OK, but:
1. For objects we store in Wikidata, what exactly is known with this precision? E.g., if you have https://www.wikidata.org/wiki/Q5577191 - which part of it does the coordinate specify down to a millimeter? That is a big monument, lots of possibilities. No way to know.
2. For objects we store in Wikidata, this is probably not stable - even if we somehow defined that the point we know down to a millimeter is the tip of Schiller's nose in Q5577191, that would probably change with weather conditions, erosion, ground shifts, continental drift (on a millimeter scale, that becomes relevant), etc.
3. For any practical use - such as querying - having data for one statue only with millimeter accuracy is useless; to really do some calculation where it matters, all statues (or at least all statues, say, in New York - or at least most of them) should have such accuracy. That obviously won't be the case.
For these reasons, I don't think millimeter accuracy will ever be relevant for Wikidata.
Having it all queryable in Wikidata ? hmmm... not for me, other data catalogs and GIS systems handle that job.
Exactly.
Glad you agree. :) Once someone comes up with a REAL use case and not hyperbole, then we can get to work... until then, GIS systems do the job quite well, and many are publicly queryable and... FUNDED :) :)
-Thad
Now, the spatial reference... that's what folks should discuss. It's a real need, but I think it is already solved with an existing property? Or even an identifier system?
-Thad
Hi!
relied on unless there's a guarantee certain set of data (e.g. all observatories) have the same accuracy and it is kept up-to-date.
Like in all the wiki-world, this should be guaranteed through the reference claim, shouldn't it?
No, a reference only says where the data comes from and, at best, when it was retrieved (the vast majority now don't even have that). But there is no guarantee it is still true with the claimed precision.
I can't think of anything really useful right now... What I'm afraid of is creating fake grids like it happened in DBpedia with too much rounding (see [1], p. 44/45). Maybe with 1 m precision it won't happen today, but it might in the future for very POI-dense areas.. (might it?)
Thanks for the link! Indeed, if the precision is too low, there is danger, but from what I understood (my Czech is kinda weak ;) the grids there are also because of some data representation artifact.
Hi!
I think that should be 5 decimals for commercial GPS, per that link? It also suggests that "The sixth decimal place is worth up to 0.11 m: you can use this for laying out structures in detail, for designing landscapes, building roads. It should be more than good enough for tracking movements of glaciers and rivers. This can be achieved by taking painstaking measures with GPS, such as differentially corrected GPS."
This does not seem to be typical (or recommended) use case for Wikidata. If you need to build a road, you better have some GIS database beyond Wikidata I think :)
Do we hope to store datasets around glacier movement? It seems possible. (We don't seem to currently https://www.wikidata.org/wiki/Q770424 )
I skimmed a few search results, and found 7 (or 15) decimals given in one standard, but the details are beyond my understanding:
Note that there's a difference between what general GIS standard would require (which has much more use cases), what we want to store on Wikidata and what we want to use for RDF export and querying. The latter is of more concern to me - as overprecision there might actually make things a bit harder to work with (such as - "are these two things actually the same thing?" or "are they located in the same place?") Of course, all those problems are solvable, but why not make it easier?
The 9-digit precision was based on a survey of Wikipedia we did back then and on the most precise GPS coordinates in Wikipedia. Unfortunately, I don't remember anymore which article it was - it was some article listing a number of places that have, for whatever reason, really high precisions. If someone finds the article again, I would be thankful; it might help in this conversation. The 9 digits were not chosen arbitrarily, but based on the requirements from Wikipedia.
But this is just the most detailed precision. In most cases, as you note, we won't need such high precision; we will have a much lower one. Pinning down a municipality with 9-digit precision is obviously nonsense. For most countries, any precision beyond 0 decimal places seems quite nonsensical.
But that's also true for time. The time data model allows second precision, but obviously, for much of the data that does not make sense. Nevertheless, the data model supports saving it; we don't want to lose anything here compared to the base data.
I am not sure I understand the issue and what the suggestion is to solve it. If we decide to arbitrarily reduce the possible range for the precision, this still won't lead to any improvements for countries as compared to statues. As far as I can tell, the only way to actually solve this is to provide query patterns that take the precision into account and to have the system implement it correctly.
Hi!
I am not sure I understand the issue and what the suggestion is to solve it. If we decide to arbitrarily reduce the possible range for the
Well, there are actually several issues right now.
1. Our RDF output produces coordinates with more digits than the specified precision of the actual value.
2. Our precision values as specified in wikibase:geoPrecision seem to make little sense.
3. We may represent the same coordinates for objects located in the same place as different ones, because precision values are kinda chaotic.
4. We may differ from other databases because our coordinate is over-precise.
(1) is probably the easiest to fix. (2) is a bit harder, and I am still not sure how wikibase:geoPrecision is used, if at all. (3) and (4) are less important, but would be nice to improve, and maybe they will be mostly fixed once (1) and (2) are. But before approaching the fix, I wanted to understand what the expectations from precision are and whether there can or should be some limits. Technically, it doesn't matter too much - except that some formulae for distances do not work well at high precision because of the limited accuracy of a 64-bit double, but there are ways around that. So technically we can keep 9 digits, or however many we need, if we want to. I just wanted to see if we should.
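To illustrate issue (3): a toy precision-aware comparison (purely illustrative; this is not how the query service compares coordinates today):

```python
# Two items at "the same place" whose stored doubles differ below their
# stated precision: naive equality fails, precision-aware comparison matches.
def same_place(a, b, precision_deg):
    return (abs(a[0] - b[0]) <= precision_deg / 2
            and abs(a[1] - b[1]) <= precision_deg / 2)

p1 = (13.366666666, 41.766666666)    # entered via DMS, stored as doubles
p2 = (13.3667, 41.7667)              # entered as decimal degrees

print(p1 == p2)                      # False: naive comparison
print(same_place(p1, p2, 1 / 3600))  # True: equal within arcsecond precision
```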
The reason why we save the actual value with more digits than the precision (and why we keep the precision as an explicit value at all) is that the value could be entered and displayed either as decimal digits or in minutes and seconds. So internally one would save 20' as 0.333333333, but the precision is still just 2. This allows round-tripping.
I hope that makes sense?
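A sketch of that roundtrip (illustrative Python for non-negative coordinates, not the Wikibase implementation):

```python
# Store DMS input as a decimal double, then recover the DMS form on display.
def dms_to_dd(d, m, s):
    return d + m / 60 + s / 3600

def dd_to_dms(dd):
    total_seconds = round(dd * 3600)   # snap to whole arcseconds
    d, rem = divmod(total_seconds, 3600)
    m, s = divmod(rem, 60)
    return d, m, s

# 20 arcminutes stored as 0.3333333... still displays as 0 deg 20' 00"
print(dd_to_dms(dms_to_dd(0, 20, 0)))  # (0, 20, 0)
```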
Yes, that means that using the values for comparison without taking the precision into account will fail.
I don't think comparison and other operators were ever specified for the datatypes. This has bitten us before, and I think it would be valuable to do. That would resolve these issues, and some others.
Would there be people interested in doing that? I sure would love to get it right.
And thanks for the use cases. This helps a lot with thinking about this.
Hi!
The reason why we save the actual value with more digits than the precision (and why we keep the precision as an explicit value at all) is because the value could be entered and displayed either as decimal digits or in minutes and seconds. So internally one would save 20' as 0.333333333, but the precision is still just 2. This allows to roundtrip.
I hope that makes any sense?
Yes, for primary data storage (though roundtripping via limited-precision doubles is not ideal, but I guess good enough for now). But for secondary data/query interface, I am not sure 0.333333333 is that useful. What would one do with it, especially in SPARQL?
The GPS unit on my boat regularly claims an estimated position error of 4 feet after it has acquired its full complement of satellites. This is a fairly new mid-price GPS unit using up to nine satellites and WAAS. So my recreational GPS supposedly obtains fifth-decimal-place accuracy. It was running under an unobstructed sky, which is common when boating. Careful use of a good GPS unit should be able to achieve this level of accuracy on land as well.
From http://www.gps.gov/systems/gps/performance/accuracy/ the raw accuracy of the positioning information from a satellite is less than 2.4 feet 95% of the time. The accuracy reported by a GPS unit is degraded by atmospheric conditions; false signals, e.g., bounces; and the need to determine position by intersecting the raw data from several satellites. Accuracy can be improved by using more satellites and multiple frequencies and by comparing to a signal from a receiver at a known location.
The web page above claims that accuracy can be improved to a few centimeters in real time and down to the millimeter level if a device is left in the same place for a long period of time. I think that these last two accuracies require a close-by receiver at a known location and correspond to what is said in [4].
peter
Glittertinden, a mountain in Norway, has a geopos of 61.651222 N 8.557492 E, alternate geopos 6835406.62, 476558.22 (EU89, UTM32).
Some of the mountains are measured to within a millimeter in elevation. For example, Ørneflag is measured to be at 1242.808 meters, at position 6705530.826, 537607.272 (EU89, UTM32), alternate geopos 6717133.02, 208055.24 (EU89, UTM33). This is a bolt on the top of the mountain. There is an ongoing project to map the country within 1×1 meter, with elevation to about 0.2 meter.
One arcsecond is about 31 m, so five digits after the decimal point of a degree give roughly 1 m lateral precision.
Geopositions aren't a fixed thing; there are quite large Earth tides, and modelling and estimating them is an important research field. The tides can be as large as ~0.3 meter. (From long ago - ask someone working on this.) Estimation of where we are is good to less than 1 cm, but I have heard better numbers.
All geopositions should have a reference datum; without one, a position is pretty useless when the precision is high. An easy fix could be to use standard profiles with easy-to-recognize names, like "GPS", and in that case limit the precision to two digits after the decimal point of an arcsecond.
Note that precision in longitude will depend on actual latitude.
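That latitude dependence is easy to quantify (again using the ~111.32 km-per-degree approximation):

```python
import math

# Ground size of one arcsecond of longitude at a given latitude.
# One arcsecond of latitude stays roughly constant at ~31 m.
def arcsecond_of_longitude_m(latitude_deg):
    return (111_320 / 3600) * math.cos(math.radians(latitude_deg))

# ~31 m at the equator, about half that at 60 degrees N (southern Norway).
```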
Not sure if I would go for it, but…
"Precision for the location of the center should be one percent of the square root of the area covered."
Oslo covers nearly 1000 km²; one percent of the square root of that is 1 % of about 32 km, i.e. roughly 320 meters, or about 10 arc seconds of latitude.
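That heuristic can be sketched in a few lines of Python (the function names are mine, and the ~111 320 m-per-degree figure applies to latitude; longitude shrinks toward the poles):

```python
import math

METERS_PER_DEGREE_LAT = 111_320  # approximate north-south span of one degree

def suggested_precision_m(area_km2: float) -> float:
    """One percent of the square root of the covered area, in meters."""
    return 0.01 * math.sqrt(area_km2) * 1000

def as_arcseconds_lat(meters: float) -> float:
    """Express a north-south distance in arc seconds of latitude."""
    return meters / (METERS_PER_DEGREE_LAT / 3600)

oslo = suggested_precision_m(1000)  # ~316 m for ~1000 km^2
print(round(oslo), round(as_arcseconds_lat(oslo), 1))
```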
On Mon, Nov 6, 2017 at 2:50 AM, John Erling Blad <jeblad@gmail.com> wrote:
Glittertinden, a mountain in Norway, has the geoposition 61.651222 N 8.557492 E, or alternatively 6835406.62, 476558.22 (EU89, UTM32).
Some of the mountains are measured to within a millimeter in elevation. For example, Ørneflag is measured at 1242.808 meters, at position 6705530.826, 537607.272 (EU89, UTM32), alternatively 6717133.02, 208055.24 (EU89, UTM33). This refers to a bolt on the top of the mountain. There is an ongoing project to map the country to within 1×1 meter laterally and about 0.2 meter in elevation.
One arc second of latitude is about 31 meters, so five digits after the decimal point of a degree give roughly 1 m lateral precision; centimeter precision needs about seven.
Geopositions aren't fixed: there are quite large tidal deformations, and modelling and estimating them is an important research field. The deformation can be as large as ~0.3 meter. (This is from long ago; ask someone working on it.) Position estimation is good to less than 1 cm, though I have heard better numbers.
All geopositions should have a reference datum; without one, a position is pretty useless when the precision is high. An easy fix could be to use standard profiles with easy-to-recognize names, like "GPS", and in that case limit the precision to two digits after the decimal point of an arc second.
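That "two digits after the decimal point of an arc second" idea amounts to snapping values to a grid on export; a minimal sketch of the proposal (not existing Wikibase behaviour):

```python
def round_to_arcsec_fraction(deg: float, arcsec_decimals: int = 2) -> float:
    """Round a decimal-degree value to the nearest fraction of an arc second.

    With arcsec_decimals=2 the grid step is 0.01 arc second,
    i.e. roughly 0.3 m of latitude.
    """
    step = (1 / 3600) * 10 ** -arcsec_decimals  # grid step in degrees
    return round(deg / step) * step

# The over-precise 13.366666666 from the example item moves by
# well under a meter when snapped to this grid.
snapped = round_to_arcsec_fraction(13.366666666)
```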
Note that the linear precision of a longitude value depends on the actual latitude.
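That dependence is just the cosine of the latitude; a minimal illustration:

```python
import math

def meters_per_degree_lon(lat_deg: float) -> float:
    """Approximate east-west ground span of one degree of longitude."""
    return 111_320 * math.cos(math.radians(lat_deg))

# ~111 km at the equator but only ~56 km at 60 degrees north,
# so a fixed number of decimals buys more linear precision the
# further from the equator you are.
```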
On Fri, Sep 1, 2017 at 9:43 PM, Peter F. Patel-Schneider <pfpschneider@gmail.com> wrote:
The GPS unit on my boat regularly claims an estimated position error of 4 feet after it has acquired its full complement of satellites. This is a fairly new mid-price GPS unit using up to nine satellites and WAAS. So my recreational GPS supposedly obtains fifth-decimal-place accuracy. It was running under an unobstructed sky, which is common when boating. Careful use of a good GPS unit should be able to achieve this level of accuracy on land as well.
From http://www.gps.gov/systems/gps/performance/accuracy/ the raw accuracy of the positioning information from a satellite is less than 2.4 feet 95% of the time. The accuracy reported by a GPS unit is degraded by atmospheric conditions; false signals, e.g., bounces; and the need to determine position by intersecting the raw data from several satellites. Accuracy can be improved by using more satellites and multiple frequencies and by comparing to a signal from a receiver at a known location.
The web page above claims that accuracy can be improved to a few centimeters in real time and down to the millimeter level if a device is left in the same place for a long period of time. I think that these last two accuracies require a close-by receiver at a known location and correspond to what is said in [4].
peter
On 08/30/2017 06:53 PM, Nick Wilson (Quiddity) wrote:
On Tue, Aug 29, 2017 at 2:13 PM, Stas Malyshev <smalyshev@wikimedia.org> wrote:
[...] Would four decimals after the dot be enough? According to [4] this is what commercial GPS device can provide. If not, why and which accuracy would be appropriate?
I think that should be 5 decimals for commercial GPS, per that link? It also suggests that "The sixth decimal place is worth up to 0.11 m: you can use this for laying out structures in detail, for designing landscapes, building roads. It should be more than good enough for tracking movements of glaciers and rivers. This can be achieved by taking painstaking measures with GPS, such as differentially corrected GPS."
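The per-decimal-place figures quoted from [4] are easy to reproduce, assuming ~111 320 m per degree of latitude:

```python
def decimal_place_resolution_m(places: int) -> float:
    """Ground distance covered by one unit in the given decimal place
    of a degree of latitude (~111 320 m per degree)."""
    return 111_320 * 10 ** -places

# 4 decimals -> ~11 m, 5 -> ~1.1 m, 6 -> ~0.11 m
for p in (4, 5, 6):
    print(p, round(decimal_place_resolution_m(p), 3))
```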
Do we hope to store datasets around glacier movement? It seems possible. (We don't seem to currently https://www.wikidata.org/wiki/Q770424 )
I skimmed a few search results, and found 7 (or 15) decimals given in one standard, but the details are beyond my understanding: http://resources.esri.com/help/9.3/arcgisengine/java/gp_toolref/geoprocessing_environments/about_coverage_precision.htm
https://stackoverflow.com/questions/1947481/how-many-significant-digits-should-i-store-in-my-database-for-a-gps-coordinate
should-i-store-latitude-and-longitude
[4] https://gis.stackexchange.com/questions/8650/measuring-accuracy-of-latitude-and-longitude
Wikidata mailing list Wikidata@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata