I was reading this question here:
What datatype to use when storing latitude and longitude data in SQL databases?
And the general consensus seems to be that Decimal(9,6) is the way to go. The question for me is: how much accuracy do I actually need?
For instance, Google's API returns a result like:
"lat": 37.4219720,
"lng": -122.0841430
Out of -122.0841430, how many digits do I need? I've read several guides but I can't make enough sense out of them to figure this out.
To be more precise in my question: if I want to be accurate within 50 feet of the exact location, how many decimal places do I need to store?
Perhaps a better (non-programming) question would be: how much more accuracy does each additional decimal place give you?
Is it this simple?
Accuracy versus decimal places at the equator

decimal   decimal       distance
places    degrees
----------------------------------
0         1.0           111   km
1         0.1           11.1  km
2         0.01          1.11  km
3         0.001         111   m
4         0.0001        11.1  m
5         0.00001       1.11  m
6         0.000001      0.111 m
7         0.0000001     1.11  cm
8         0.00000001    1.11  mm
ref : https://en.wikipedia.org/wiki/Decimal_degrees#Precision