The 9-digit precision was based on a survey of Wikipedia we did back then, looking for the most precise GPS coordinates used in Wikipedia. Unfortunately, I no longer remember which article it was - it was some article listing a number of places that, for whatever reason, have really high precisions. If someone finds the article again, I would be thankful; it might help in this conversation. The 9 digits were not chosen arbitrarily, but based on the requirements from Wikipedia.
But this is just the most detailed precision possible. In most cases, as you note, we won't need such high precision; a much lower precision will do. Pinning down a municipality with 9-digit precision is obviously nonsense. For most countries, any precision beyond 0 decimal places (whole degrees) seems quite nonsensical.
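For intuition, here is a minimal Python sketch (not part of any Wikidata API; the constant is the standard rough geodesy figure) that converts a number of decimal places into an approximate ground resolution:

```python
# Approximate ground distance covered by the least significant digit of a
# latitude value, given the number of decimal places retained. One degree
# of latitude is roughly 111.32 km; each extra decimal place divides that
# by 10. Rough figures, for intuition only.

EARTH_DEGREE_KM = 111.32  # approx. km per degree of latitude

def resolution_km(decimal_places: int) -> float:
    """Ground resolution (km) of the last retained decimal place."""
    return EARTH_DEGREE_KM / (10 ** decimal_places)

for d in (0, 2, 5, 9):
    print(f"{d} decimal places ~ {resolution_km(d):.10f} km")
# 0 decimal places is about 111 km (plenty for a country);
# 9 decimal places is about 0.0000001 km, i.e. roughly 0.1 mm.
```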
But that's also true for time. The time data model allows precision down to the second, but obviously that does not make sense for much of the data. Nevertheless, the data model supports saving it; we don't want to lose information here compared to the base data.
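A small sketch of that idea: the value is always stored at full resolution, and a separate precision field says how much of it is meaningful. The numeric codes here follow my understanding of the Wikidata time precision scale (9 = year, 11 = day, 14 = second); treat the exact codes and the class itself as assumptions of this sketch, not as the actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PrecisedTime:
    value: datetime   # always stored at full resolution
    precision: int    # how much of the value is actually meaningful

    def meaningful(self) -> str:
        """Render only the fields that the precision vouches for."""
        if self.precision >= 14:
            return self.value.strftime("%Y-%m-%d %H:%M:%S")
        if self.precision >= 11:
            return self.value.strftime("%Y-%m-%d")
        return self.value.strftime("%Y")

# The founding of a country: only the year is meaningful, but the full
# datetime is kept so that no base data is lost.
founding = PrecisedTime(datetime(1871, 1, 1), precision=9)
print(founding.meaningful())  # -> "1871"
```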
I am not sure I understand the issue or what the suggested solution is. If we decide to arbitrarily reduce the possible range for the precision, this still won't lead to any improvement for countries as compared to, say, statues. As far as I can tell, the only way to actually solve this is to provide query patterns that take the precision into account, and to have the system implement them correctly.
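To make concrete what such a precision-aware query pattern could look like, here is a hypothetical sketch: a stored coordinate with precision p (in degrees) is treated as a box of width p around the value rather than as an exact point, so a country recorded at whole-degree precision and a statue recorded at 9 decimal places are compared on their own terms. The field names and the box interpretation are inventions of this sketch, not the system's actual semantics.

```python
from dataclasses import dataclass

@dataclass
class Coordinate:
    lat: float
    lon: float
    precision: float  # in degrees, e.g. 1.0 (country) or 1e-9 (statue)

def may_match(stored: Coordinate, query_lat: float, query_lon: float) -> bool:
    """True if the query point falls inside the uncertainty box
    implied by the stored value's precision."""
    half = stored.precision / 2
    return (abs(stored.lat - query_lat) <= half and
            abs(stored.lon - query_lon) <= half)

# A city-level value still matches a nearby point, because its precision
# widens the comparison instead of demanding an exact hit.
berlin = Coordinate(lat=52.0, lon=13.0, precision=1.0)
print(may_match(berlin, 52.31, 13.24))  # -> True
```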