Given two bearings, how do I find the smallest angle between them?
For example, if one heading is 340 degrees and the second is 10 degrees, the smallest angle between them is 30 degrees.
I've attached a picture to show what I mean. I've tried subtracting one from the other, but that didn't work because of the wrap-around effect of a circle. I've also tried using negative degrees (mapping 180 - 359 to -180 to 0), but that got messed up when calculating the angle between a positive and a negative number.
I'm sure there must be an easier way than having lots of if statements.
Thanks for your help. Adam
BTW. This is a navigation question so the radius of the circle is unknown.
I ended up using the following formula, found on this message board, because I needed the result to be signed based on the direction (clockwise or counterclockwise). The thread has a good explanation of exactly what's going on.
((((bearing - heading) % 360) + 540) % 360) - 180
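As a quick sketch of that formula in Python (function name is my own), with the unsigned smallest angle recovered by taking the absolute value:

```python
def signed_angle(heading, bearing):
    """Smallest signed angle from heading to bearing, in degrees.

    Positive means bearing is clockwise of heading, negative means
    counterclockwise. The result is always in the range [-180, 180).
    """
    return ((bearing - heading) % 360 + 540) % 360 - 180

# The example from the question: heading 340, bearing 10.
print(signed_angle(340, 10))       # clockwise, so positive: 30
print(signed_angle(10, 340))       # counterclockwise, so negative: -30
print(abs(signed_angle(340, 10)))  # unsigned smallest angle: 30
```

The `+ 540` looks odd at first, but it's what makes the formula portable: `(bearing - heading) % 360` can be negative in languages like C or Java where `%` follows the sign of the dividend, and adding 540 (i.e. 360 + 180) shifts the value into positive territory before the second `% 360` wraps it back, so the final `- 180` always lands in [-180, 180).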