Calculating coordinates given a bearing and a distance

user106996 · May 18, 2009

I am having problems implementing the function described here.

This is my Java implementation:

private static double[] pointRadialDistance(double lat1, double lon1,
        double radianBearing, double radialDistance) {
    double lat = Math.asin(Math.sin(lat1)*Math.cos(radialDistance)
            + Math.cos(lat1)*Math.sin(radialDistance)*Math.cos(radianBearing));
    double lon;
    if (Math.cos(lat) == 0) {  // Endpoint a pole
        lon = lon1;
    } else {
        lon = ((lon1 - Math.asin(Math.sin(radianBearing)*Math.sin(radialDistance)/Math.cos(lat))
                + Math.PI) % (2*Math.PI)) - Math.PI;
    }
    return new double[]{lat, lon};
}

I convert the bearing from degrees to radians and convert the distance (km) into an angular distance in radians before calling the function, so that's not the problem.
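
For reference, the call currently looks roughly like this (the 6371 km radius shown here is only illustrative):

double lat = 49.25705, lon = -123.140259;  // degrees
double bearing = 225.0, distanceKm = 1.0;
double radianBearing = Math.toRadians(bearing);   // degrees -> radians
double radialDistance = distanceKm / 6371.0;      // km -> angular distance in radians
double[] result = pointRadialDistance(lat, lon, radianBearing, radialDistance);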

However, when I input coordinates such as lat = 49.25705, lon = -123.140259 with a bearing of 225 (south-west) and a distance of 1 km,

I get this returned: lat: -1.0085434360125864, lon: -3.7595299668539504

It's obviously not correct. Can anyone see what I am doing wrong?

Thanks

Answer

las3rjock · May 18, 2009

It seems like these are the issues in your code (a Java sketch applying the fixes follows the list):

  1. You need to convert lat1 and lon1 to radians before calling your function.
  2. You may be scaling radialDistance incorrectly.
  3. Testing a floating-point number for equality is dangerous. Two numbers that are equal after exact arithmetic might not be exactly equal after floating-point arithmetic. Thus abs(x-y) < threshold is safer than x == y for testing two floating-point numbers x and y for equality.
  4. I think you want to convert lat and lon from radians to degrees.
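
Put together, a degree/km wrapper around your existing Java method might look roughly like this (a sketch; the name pointRadialDistanceDeg and the 6371.01 km radius are my own choices, matching the Python below):

private static final double R_EARTH_KM = 6371.01;  // average Earth radius in km (assumed)

private static double[] pointRadialDistanceDeg(double lat1Deg, double lon1Deg,
        double bearingDeg, double distanceKm) {
    double[] r = pointRadialDistance(
            Math.toRadians(lat1Deg), Math.toRadians(lon1Deg),      // issue 1: degrees -> radians
            Math.toRadians(bearingDeg), distanceKm / R_EARTH_KM);  // issue 2: km -> angular distance
    return new double[]{Math.toDegrees(r[0]), Math.toDegrees(r[1])};  // issue 4: radians -> degrees
}

For issue 3, inside pointRadialDistance itself, a check like Math.abs(Math.cos(lat)) < 1e-6 is safer than Math.cos(lat) == 0.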

Here is my implementation of your code in Python:

#!/usr/bin/env python

from math import asin,cos,pi,sin

rEarth = 6371.01 # Earth's average radius in km
epsilon = 0.000001 # threshold for floating-point equality


def deg2rad(angle):
    return angle*pi/180


def rad2deg(angle):
    return angle*180/pi


def pointRadialDistance(lat1, lon1, bearing, distance):
    """
    Return final coordinates (lat2,lon2) [in degrees] given initial coordinates
    (lat1,lon1) [in degrees] and a bearing [in degrees] and distance [in km]
    """
    rlat1 = deg2rad(lat1)
    rlon1 = deg2rad(lon1)
    rbearing = deg2rad(bearing)
    rdistance = distance / rEarth # normalize linear distance to radian angle

    rlat = asin( sin(rlat1) * cos(rdistance) + cos(rlat1) * sin(rdistance) * cos(rbearing) )

    if abs(cos(rlat)) < epsilon: # Endpoint a pole
        rlon = rlon1
    else:
        rlon = ( (rlon1 - asin( sin(rbearing)* sin(rdistance) / cos(rlat) ) + pi ) % (2*pi) ) - pi

    lat = rad2deg(rlat)
    lon = rad2deg(rlon)
    return (lat, lon)


def main():
    print "lat1 \t lon1 \t\t bear \t dist \t\t lat2 \t\t lon2"
    testcases = []
    testcases.append((0,0,0,1))
    testcases.append((0,0,90,1))
    testcases.append((0,0,0,100))
    testcases.append((0,0,90,100))
    testcases.append((49.25705,-123.140259,225,1))
    testcases.append((49.25705,-123.140259,225,100))
    testcases.append((49.25705,-123.140259,225,1000))
    for lat1, lon1, bear, dist in testcases:
        (lat,lon) = pointRadialDistance(lat1,lon1,bear,dist)
        print "%6.2f \t %6.2f \t %4.1f \t %6.1f \t %6.2f \t %6.2f" % (lat1,lon1,bear,dist,lat,lon)


if __name__ == "__main__":
    main()

Here is the output:

lat1     lon1        bear    dist        lat2        lon2
  0.00     0.00       0.0       1.0        0.01        0.00
  0.00     0.00      90.0       1.0        0.00       -0.01
  0.00     0.00       0.0     100.0        0.90        0.00
  0.00     0.00      90.0     100.0        0.00       -0.90
 49.26   -123.14     225.0      1.0       49.25      -123.13
 49.26   -123.14     225.0    100.0       48.62      -122.18
 49.26   -123.14     225.0   1000.0       42.55      -114.51