I am trying to use WifiManager to calculate the signal level of the access points found during a scan.
I am using the following method:
WifiManager.calculateSignalLevel(int, int)
But it appears to always return the same int no matter what the RSSI level is.
Here is my code:
public int calculateQoS(int aRSSI) {
    signalLevel = WifiManager.calculateSignalLevel(RSSI, 5);
    return signalLevel;
}
public void testCalculateQoS() {
    Log.d("signal", "signal = : "
            + connMonitor.calculateQoS(-44)
            + " " + connMonitor.calculateQoS(-80)
            + " " + connMonitor.calculateQoS(-120)
            + " " + connMonitor.calculateQoS(-20));
}
The log output is 1 for all of these calculateQoS(int) test cases.
Am I missing something simple here? Why is the signal level always 1?
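For context, in the real app the RSSI values passed to calculateQoS(int) come from scan results, roughly along these lines (a sketch rather than my exact code; the WifiManager is assumed to come from getSystemService(Context.WIFI_SERVICE)):

// Sketch: log the QoS for every access point found in the last scan.
private void logScanQoS(WifiManager wifiManager) {
    for (ScanResult result : wifiManager.getScanResults()) {
        // ScanResult.level is the RSSI of the access point in dBm.
        Log.d("signal", result.SSID + " rssi=" + result.level
                + " qos=" + calculateQoS(result.level));
    }
}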
It seems that calculateSignalLevel is implemented this way:
public static int calculateSignalLevel(int rssi, int numLevels) {
    if (rssi <= MIN_RSSI) {
        return 0;
    } else if (rssi >= MAX_RSSI) {
        return numLevels - 1;
    } else {
        int partitionSize = (MAX_RSSI - MIN_RSSI) / (numLevels - 1);
        return (rssi - MIN_RSSI) / partitionSize;
    }
}
Maybe this code snippet can explain your problem: in the Android sources MIN_RSSI is -100 and MAX_RSSI is -55, so with numLevels = 5 the integer partitionSize works out to (-55 - (-100)) / 4 = 11, and an RSSI of -80 maps to (-80 + 100) / 11 = 1. Notice, though, that -120 would clamp to 0 and both -44 and -20 would clamp to 4, so this logic by itself cannot return 1 for all four of your inputs. Also note: calculateQoS(int aRSSI) passes a variable named RSSI to calculateSignalLevel() instead of its aRSSI parameter, so every call is computing the level of the same stored value.
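If you implement the mapping yourself, you can also sidestep the coarse integer partitioning entirely; I believe newer Android releases switched to similar floating-point math. A minimal standalone sketch, assuming the same -100/-55 bounds as the AOSP constants above:

public final class SignalLevels {

    // Bounds assumed from the AOSP constants quoted above.
    private static final int MIN_RSSI = -100;
    private static final int MAX_RSSI = -55;

    // Maps an RSSI reading in dBm onto the range 0 .. numLevels - 1,
    // clamping anything outside [MIN_RSSI, MAX_RSSI].
    public static int calculateSignalLevel(int rssi, int numLevels) {
        if (rssi <= MIN_RSSI) {
            return 0;
        } else if (rssi >= MAX_RSSI) {
            return numLevels - 1;
        } else {
            float inputRange = MAX_RSSI - MIN_RSSI;   // 45 dB span
            float outputRange = numLevels - 1;
            return (int) ((rssi - MIN_RSSI) * outputRange / inputRange);
        }
    }

    public static void main(String[] args) {
        // The four readings from the question; expect 4, 1, 0, 4.
        for (int rssi : new int[] {-44, -80, -120, -20}) {
            System.out.println(rssi + " dBm -> level "
                    + calculateSignalLevel(rssi, 5));
        }
    }
}

With five levels this still yields 1 for -80 dBm, but the partitions are even, and passing a larger numLevels (for example 101, for a percentage-like scale) gives much finer granularity.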