I'm currently developing a simple 2D game for Android. I have a stationary object that's situated in the center of the screen and I'm trying to get that object to rotate and point to the area on the screen that the user touches. I have the constant coordinates that represent the center of the screen and I can get the coordinates of the point that the user taps on. I'm using the formula outlined in this forum: How to get angle between two points?
It says the following: "If you want the angle between the line defined by these two points and the horizontal axis:
double angle = atan2(y2 - y1, x2 - x1) * 180 / PI;".
I implemented this, but I think the fact that I'm working in screen coordinates is causing a miscalculation, since the y-axis is inverted (it increases from top to bottom rather than bottom to top). I'm not sure if this is the right way to go about it; any other thoughts or suggestions are appreciated.
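For example (a minimal sketch with made-up numbers, assuming a 480x800 screen): applying the formula directly to screen coordinates gives -90 degrees for a touch directly above the center, where I'd expect +90:

double centerX = 240, centerY = 400;   // center of a 480x800 screen
double touchX = 240, touchY = 100;     // touch directly above the center
// Formula from the forum post, applied naively to screen coordinates:
double angle = Math.atan2(touchY - centerY, touchX - centerX) * 180 / Math.PI;
// angle == -90.0, because screen y grows downward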
Assumptions:

- x is the horizontal axis, and increases when moving from left to right.
- y is the vertical axis, and increases from bottom to top.
- (touch_x, touch_y) is the point selected by the user.
- (center_x, center_y) is the point at the center of the screen.
- theta is measured counter-clockwise from the +x axis.

Then:
delta_x = touch_x - center_x
delta_y = touch_y - center_y
theta_radians = atan2(delta_y, delta_x)
Edit: you mentioned in a comment that y increases from top to bottom. In that case,
delta_y = center_y - touch_y
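Concretely, here is a minimal Java sketch (angleToTouchDegrees is just an illustrative name, not part of any Android API):

// Angle from the screen center to a touch point, in degrees,
// measured counter-clockwise from the +x axis (math convention).
static double angleToTouchDegrees(float centerX, float centerY,
                                  float touchX, float touchY) {
    double deltaX = touchX - centerX;
    // center_y - touch_y, not touch_y - center_y: Android screen
    // coordinates grow downward, while theta assumes y grows upward.
    double deltaY = centerY - touchY;
    return Math.toDegrees(Math.atan2(deltaY, deltaX));
}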
But it would be more correct to describe this as expressing (touch_x, touch_y) in polar coordinates relative to (center_x, center_y). As ChrisF mentioned, the idea of taking an "angle between two points" is not well defined.
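One usage note, under the assumption (not stated in the question) that the object is an Android View whose artwork points along +x at zero rotation: View.setRotation takes degrees measured clockwise, the opposite sense of theta, so negate it when applying:

// objectView is a hypothetical View centered on the screen.
float theta = (float) angleToTouchDegrees(centerX, centerY, touchX, touchY);
objectView.setRotation(-theta);  // setRotation is clockwise; theta is counter-clockwise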