In the Criteria class, there are two constants, ACCURACY_HIGH and ACCURACY_FINE, which are apparently used to require the LocationManager to return higher-accuracy location updates. Here is what the documentation says about each of these constants:
public static final int ACCURACY_FINE (Added in API level 1)
A constant indicating a finer location accuracy requirement. Constant Value: 1 (0x00000001)

public static final int ACCURACY_HIGH (Added in API level 9)
A constant indicating a high accuracy requirement - may be used for horizontal, altitude, speed or bearing accuracy. For horizontal and vertical position this corresponds roughly to an accuracy of less than 100 meters. Constant Value: 3 (0x00000003)
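For context, here is a minimal sketch of how either constant might be handed to the LocationManager through a Criteria object (the pickAccurateProvider helper is purely illustrative, not from the documentation):

import android.location.Criteria;
import android.location.LocationManager;

// Illustrative helper: ask LocationManager for the provider that best
// matches an accuracy requirement. Which constant belongs here is the question.
static String pickAccurateProvider(LocationManager locationManager) {
    Criteria criteria = new Criteria();
    criteria.setAccuracy(Criteria.ACCURACY_FINE);              // coarse/fine scale
    // ...or should it be the per-dimension setter instead?
    // criteria.setHorizontalAccuracy(Criteria.ACCURACY_HIGH); // low/medium/high scale
    return locationManager.getBestProvider(criteria, true);
}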
Does anyone know which of these two constants provides (i.e. requires) the highest level of accuracy?
From what I can see in the source code, ACCURACY_FINE is grouped with ACCURACY_COARSE, with constant values of 1 and 2 respectively, while ACCURACY_LOW, ACCURACY_MEDIUM and ACCURACY_HIGH are grouped together with constant values 1, 2 and 3.
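Laid out side by side (values as quoted in the docs and declared in android.location.Criteria):

// Scale used by setAccuracy/getAccuracy:
public static final int NO_REQUIREMENT  = 0;
public static final int ACCURACY_FINE   = 1;
public static final int ACCURACY_COARSE = 2;

// Scale used by the per-dimension setters (horizontal, vertical, speed, bearing):
public static final int ACCURACY_LOW    = 1;
public static final int ACCURACY_MEDIUM = 2;
public static final int ACCURACY_HIGH   = 3;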
It seems that setAccuracy (and its counterpart getAccuracy) works with either COARSE or FINE, while setVerticalAccuracy, setHorizontalAccuracy, setSpeedAccuracy and setBearingAccuracy expect LOW, MEDIUM or HIGH. Furthermore, when you call setAccuracy, it sets the horizontal accuracy like so:
public void setAccuracy(int accuracy) {
    // Only NO_REQUIREMENT (0), ACCURACY_FINE (1) and ACCURACY_COARSE (2) are accepted.
    if (accuracy < NO_REQUIREMENT || accuracy > ACCURACY_COARSE) {
        throw new IllegalArgumentException("accuracy=" + accuracy);
    }
    // FINE is translated onto the other scale: it simply means a HIGH
    // horizontal accuracy requirement.
    if (accuracy == ACCURACY_FINE) {
        mHorizontalAccuracy = ACCURACY_HIGH;
    } else {
        mHorizontalAccuracy = ACCURACY_LOW;
    }
}
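So, as far as horizontal accuracy is concerned, the two end up equivalent. A quick sketch to illustrate, assuming you inspect the result with the getHorizontalAccuracy getter (API 9+):

import android.location.Criteria;

// Both of these produce the same horizontal accuracy requirement:
Criteria fine = new Criteria();
fine.setAccuracy(Criteria.ACCURACY_FINE);

Criteria high = new Criteria();
high.setHorizontalAccuracy(Criteria.ACCURACY_HIGH);

// fine.getHorizontalAccuracy() == high.getHorizontalAccuracy()
//                              == Criteria.ACCURACY_HIGH (3)

In other words, ACCURACY_FINE and ACCURACY_HIGH don't compete on the same scale: FINE is the stricter of the two values setAccuracy accepts, and internally it is implemented as a HIGH horizontal accuracy requirement.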
It's really confusing, but I hope this clears it up for you a bit. Here's a link to the source on grepcode; you can download it and see for yourself if you don't have the source locally.