I use HTML5's geolocation API, and the position object has an "accuracy" property whose value varies with the quality of the positioning.
So far so good. But the value is in an unspecified unit: sometimes it's 60, or 1250, or even 18,000.
I've read the W3C Geolocation API Specification [1]; it talks about this property but never mentions how it should be interpreted.
The only place I've found a possible answer is the Department of Defense World Geodetic System [2] technical report. It mentions accuracy in cm units at 1σ (one standard deviation).
But that doesn't tell me what unit the HTML5 geolocation API's accuracy value is in; I can only suppose it's in cm.
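Roughly, this is how I read the value (a minimal sketch using getCurrentPosition; watchPosition reports the same coords object):

    navigator.geolocation.getCurrentPosition(
      function (position) {
        // coords.accuracy is the number in question (60, 1250, 18,000, ...)
        console.log('lat:      ', position.coords.latitude);
        console.log('lon:      ', position.coords.longitude);
        console.log('accuracy: ', position.coords.accuracy);
      },
      function (error) {
        console.error('geolocation error: ' + error.message);
      },
      { enableHighAccuracy: true }
    );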
[1] http://dev.w3.org/geo/api/spec-source.html
[2] http://earth-info.nga.mil/GandG/publications/tr8350.2/wgs84fin.pdf
From the W3C specification [1]:
The accuracy attribute denotes the accuracy level of the latitude and longitude coordinates. It is specified in meters and must be supported by all implementations. The value of the accuracy attribute must be a non-negative real number.
So the unit is meters.
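For example, you can treat coords.accuracy as a radius in meters around the reported coordinates (a minimal sketch; the 100 m threshold is just an arbitrary example):

    navigator.geolocation.getCurrentPosition(function (position) {
      var radiusMeters = position.coords.accuracy; // already in meters, per the spec

      if (radiusMeters <= 100) { // arbitrary example threshold
        console.log('Good fix: uncertainty of about ' + radiusMeters + ' m');
      } else {
        console.log('Coarse fix: uncertainty of about ' + radiusMeters + ' m');
      }
    });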