For a square grid, the Euclidean distance between tiles A and B is:
distance = sqrt(sqr(x1-x2) + sqr(y1-y2))
For an actor constrained to move along a square grid, the Manhattan distance is a better measure of the actual distance we must travel:
manhattanDistance = abs(x1-x2) + abs(y1-y2)
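For concreteness, here are those two measures as quick C functions (the function names are just illustrative):

#include <math.h>
#include <stdlib.h>

/* Straight-line (Euclidean) distance between tiles (x1, y1) and (x2, y2). */
double euclideanDistance(int x1, int y1, int x2, int y2)
{
    double dx = x1 - x2;
    double dy = y1 - y2;
    return sqrt(dx * dx + dy * dy);
}

/* Manhattan distance: number of orthogonal steps on a square grid. */
int manhattanDistance(int x1, int y1, int x2, int y2)
{
    return abs(x1 - x2) + abs(y1 - y2);
}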
How do I get the Manhattan distance between two tiles in a hexagonal grid, as illustrated with the red and blue lines below?
I once set up a hexagonal coordinate system in a game so that the y-axis was at a 60-degree angle to the x-axis. This avoids the odd-even row distinction.
(illustration of the skewed hexagonal coordinate system; source: althenia.net)
The distance in this coordinate system is:
dx = x1 - x0
dy = y1 - y0

if sign(dx) == sign(dy)
    abs(dx + dy)
else
    max(abs(dx), abs(dy))
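A minimal C sketch of that pseudocode (my own rendering, not necessarily the original game code; when dx or dy is zero both branches give the same result, so folding zero into the "same sign" test is safe):

#include <stdlib.h>

/* Hex distance in the skewed system where the y-axis is at 60 degrees to the x-axis. */
int hexDistance(int x0, int y0, int x1, int y1)
{
    int dx = x1 - x0;
    int dy = y1 - y0;

    if ((dx >= 0) == (dy >= 0))   /* same sign: steps along the two axes add up */
        return abs(dx + dy);
    else                          /* opposite signs: a (+1,-1) style step covers both at once */
        return abs(dx) > abs(dy) ? abs(dx) : abs(dy);
}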
You can convert (x', y) from your coordinate system to (x, y) in this one using:
x = x' - floor(y/2)
So dx becomes:
dx = x1' - x0' - floor(y1/2) + floor(y0/2)
Careful with rounding when implementing this using integer division. In C, for int y, floor(y/2) is (y%2 ? y-1 : y)/2.
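Putting the conversion and the floor fix together, a sketch along these lines (hexDistance is the function sketched above; the helper names are mine):

/* floor(y/2) for possibly negative int y; plain y/2 truncates toward zero in C. */
int floorDiv2(int y)
{
    return (y % 2 ? y - 1 : y) / 2;
}

int hexDistance(int x0, int y0, int x1, int y1);  /* sketched above */

/* Distance between two tiles given as (x', y) in the original coordinate system,
   converted with x = x' - floor(y/2) before measuring. */
int hexDistanceOffset(int x0p, int y0, int x1p, int y1)
{
    return hexDistance(x0p - floorDiv2(y0), y0,
                       x1p - floorDiv2(y1), y1);
}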