I'm trying to calculate the maximum Manhattan distance over a large 2D input. The input consists of (x, y) pairs, and I want to compute the maximum distance between those coordinates in less than O(n^2) time. I can do it in O(n^2) by comparing all pairs of elements, something like:
*(The Manhattan distance between two points (X1, Y1) and (X2, Y2) is |X1 - X2| + |Y1 - Y2|.)
for ( i = 0 -> n )
    for ( j = 0 -> n )
    { // here I calculate |Xi - Xj| + |Yi - Yj| and keep the maximum }
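In Python, the brute-force version sketched above might look like this (assuming `points` is a list of (x, y) tuples):

```python
def max_manhattan_bruteforce(points):
    """O(n^2): compare every pair and keep the largest |dx| + |dy|."""
    best = 0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            x1, y1 = points[i]
            x2, y2 = points[j]
            best = max(best, abs(x1 - x2) + abs(y1 - y2))
    return best
```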
but it won't work efficiently for very large inputs :(
Does anyone have an idea for a better algorithm?
There are only two cases to consider. If we only look at pairs where Xi <= Xj, then:
if Yi <= Yj, the distance is (Xj + Yj) - (Xi + Yi);
if Yi > Yj, the distance is (Xj - Yj) - (Xi - Yi).
By breaking it down into these cases, I have gotten rid of the absolute value function, making it much easier to reason about the distances.
So, we simply pick the points with the minimum and maximum x + y and compute their distance, then pick the points with the minimum and maximum x - y and compute their distance. The larger of those two distances is your maximum.
This can be done in O(n), which is asymptotically optimal.
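A minimal sketch of that O(n) approach, again assuming `points` is a list of (x, y) tuples:

```python
def max_manhattan(points):
    """O(n): the answer is the larger of the spreads of x+y and x-y."""
    sums = [x + y for x, y in points]   # covers the case Yi <= Yj
    diffs = [x - y for x, y in points]  # covers the case Yi > Yj
    return max(max(sums) - min(sums), max(diffs) - min(diffs))
```

Each list is built in one pass, and min/max are linear scans, so the whole thing is O(n) with no pairwise comparisons.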