I'm making a map app, including the location arrow that shows you which way you're facing.
I get the orientation directly from SensorManager.getOrientation(), using the first returned value: azimuth. When the phone is held in portrait with the screen pointing above the horizon, the arrow works fine. However, in other orientations it goes wrong.
The carefully constructed and scientific image below shows what I mean (where blue is the user's facing, red is the arrow direction, the screen is approximately facing the user's face, and Google Maps does exactly what I want).
(Note that Google Maps doesn't successfully handle the last two pictured cases if auto-rotate is off. But I'm not even at that stage yet. One thing at a time.)
It appears as though it's simply using the pointing direction of the Y axis, as defined here: http://developer.android.com/reference/android/hardware/SensorEvent.html, when I want it to use the reverse of the Z axis's pointing direction most of the time, and the Y axis only when the phone is flat. However, given the values that getOrientation() returns, I'd have to write complex special cases to fix some of the issues, and the case where the screen faces the horizon is unsolvable that way. I'm certain there's an easier way.
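To illustrate, the special-casing I'm imagining would look something like this (a sketch only, reusing the rotationMatrix from my code below; the 65-degree pitch threshold is arbitrary, and the vertical case has no fix at all):

float[] values = new float[3];
SensorManager.getOrientation(rotationMatrix, values);
float azimuthDeg = (float) Math.toDegrees(values[0]);
float pitchDeg = (float) Math.toDegrees(values[1]); // rotation about the device x axis

if (Math.abs(pitchDeg) < 65f) {
    // Reasonably flat: the y-axis azimuth roughly tracks the user's facing.
    orientation = azimuthDeg;
} else {
    // Near vertical: the y axis points at the sky, the azimuth is degenerate,
    // and nothing in getOrientation()'s output can recover the facing.
}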
Here's my code (where lastAcceleration and lastMagneticField both came from the internal sensors):
float[] rotationMatrix = new float[9];
if (SensorManager.getRotationMatrix(rotationMatrix, null, lastAcceleration, lastMagneticField)) {
    float[] orientMatrix = new float[3];
    SensorManager.getOrientation(rotationMatrix, orientMatrix);
    // orientMatrix[0] is the azimuth in radians; convert to degrees
    orientation = orientMatrix[0] * 180 / (float) Math.PI;
}
What am I doing wrong? Is there an easier way to do this?
Edit: Just to clarify, I'm assuming that the user is holding the device in front of them with the screen pointing towards them. Beyond that, I obviously can't tell whether the user or just the device has rotated. Also, I use the user's direction of motion when they are moving; this question is about when they are stationary.
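For reference, that moving/stationary split looks roughly like this (a sketch; the 1 m/s speed threshold is arbitrary, and getBearing() is relative to true north while the sensor azimuth is magnetic, so a real version would account for declination):

import android.location.Location;

// Prefer the GPS bearing while moving; fall back to the compass when stationary.
float headingDegrees(Location lastFix, float sensorAzimuthDeg) {
    if (lastFix != null && lastFix.hasBearing() && lastFix.getSpeed() > 1.0f) {
        return lastFix.getBearing(); // degrees east of true north
    }
    return sensorAzimuthDeg;
}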
Did you call remapCoordinateSystem? Without it, you only get the right facing value when the phone is held flat; the remap is what makes the vertical case work. When the phone is lying flat, so the screen faces the sky, there is no way to get the user's facing from the screen's direction: to get that facing you have to project the device's z axis into the xy plane of the world coordinate system, and that projection is zero when the device is held horizontally.
To be more precise, if you want to get the facing this way, the phone has to be inclined at least about 25 degrees from horizontal, and you have to call remapCoordinateSystem. The following code will give you what you want for the last two pictures above.
Code
float[] rotationMatrix = new float[9];
if (SensorManager.getRotationMatrix(rotationMatrix, null, lastAcceleration, lastMagneticField)) {
    float[] orientMatrix = new float[3];
    float[] remapMatrix = new float[9];
    // Re-express the rotation in a coordinate system whose y axis is the
    // device's z axis, so the azimuth reflects where the phone is pointing.
    SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, remapMatrix);
    SensorManager.getOrientation(remapMatrix, orientMatrix);
    orientation = orientMatrix[0] * 180 / (float) Math.PI;
}
getOrientation gives you the correct values assuming the phone is lying flat. Thus, if the phone is held vertically, you have to remap the coordinate system to recover an equivalent flat orientation. Geometrically, you project the phone's -z axis down onto the world xy plane and then calculate the angle between this projected vector and the world y axis.
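If it helps to see that projection spelled out, it can also be computed directly from the rotation matrix, which makes the near-flat degeneracy easy to detect. A sketch only, not part of the original answer (the 0.42 threshold is sin(25°), matching the inclination figure above):

// getRotationMatrix returns R mapping device coordinates to world
// coordinates (world x = east, y = north, z = up). Its third column,
// (R[2], R[5], R[8]), is the device z axis in world coordinates, so the
// back of the phone points along (-R[2], -R[5], -R[8]).
float east = -rotationMatrix[2];
float north = -rotationMatrix[5];

// Length of the horizontal projection: near zero when the phone is flat,
// in which case this method cannot determine the facing.
float horizontal = (float) Math.sqrt(east * east + north * north);
if (horizontal > 0.42f) { // sin(25 degrees) ≈ 0.42
    orientation = (float) Math.toDegrees(Math.atan2(east, north));
}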