I am trying to figure out the correct trig equation/function to determine the following: the angle change (in degrees) between two direction vectors (already determined) that represent two line segments. This is used in the context of shape recognition (shapes hand-drawn by the user on screen).
So basically:
a) If the user draws a (rough) shape, such as a circle, oval, or rectangle, the lines that make up that shape are broken down into, say, 20 points (x-y pairs).
b) I have the direction vector for each of these line segments.
c) The beginning of a line segment (x0, y0) will be the end point of the previous segment, so as to form a closed shape like a rectangle (see the sketch after this list).
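To make c) concrete, here is a minimal sketch (the four-point rectangle is just a made-up example; a real stroke would have ~20 points) of how each segment's direction vector falls out of consecutive sample points, with the last segment wrapping back to the first point to close the shape:

    # Hypothetical sample: 4 points of a rough rectangle.
    points = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]

    # Direction vector of each segment = end point minus start point;
    # indexing modulo len(points) closes the shape.
    direction_vectors = [
        (points[(i + 1) % len(points)][0] - x,
         points[(i + 1) % len(points)][1] - y)
        for i, (x, y) in enumerate(points)
    ]
    print(direction_vectors)  # [(4.0, 0.0), (0.0, 3.0), (-4.0, 0.0), (0.0, -3.0)]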
So, my question is: given the context (i.e. determining the type of a polygon), how does one find the angle change between two direction vectors (each available as two floating-point values, x and y)?
I have seen so many different trig equations, and I'm seeking clarity on this.
Thanks so much in advance, folks!
If (x1, y1) is the first direction vector and (x2, y2) is the second one, then:
cos(alpha) = (x1*x2 + y1*y2) / ( sqrt(x1*x1 + y1*y1) * sqrt(x2*x2 + y2*y2) )

where sqrt denotes the square root.
Look up http://en.wikipedia.org/wiki/Dot_product, especially the section "Geometric Representation".
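A minimal sketch in Python of the formula above (the function name and the zero-vector guard are my own choices), clamping the cosine so floating-point error cannot push it outside acos's domain:

    import math

    def angle_between_degrees(x1, y1, x2, y2):
        """Unsigned angle, in degrees (0..180), between two direction vectors."""
        dot = x1 * x2 + y1 * y2
        norm1 = math.hypot(x1, y1)   # sqrt(x1*x1 + y1*y1)
        norm2 = math.hypot(x2, y2)   # sqrt(x2*x2 + y2*y2)
        if norm1 == 0.0 or norm2 == 0.0:
            raise ValueError("zero-length direction vector")
        # Clamp to [-1, 1]: rounding can produce e.g. 1.0000000000000002,
        # which would make acos raise a domain error.
        cos_alpha = max(-1.0, min(1.0, dot / (norm1 * norm2)))
        return math.degrees(math.acos(cos_alpha))

    # Two perpendicular sides of a rectangle -> 90 degrees
    print(angle_between_degrees(1.0, 0.0, 0.0, 1.0))  # 90.0

Note that the dot product alone gives only the unsigned angle. If the polygon classification also needs the turn direction (clockwise vs. counter-clockwise), the signed angle can be obtained as math.degrees(math.atan2(x1*y2 - y1*x2, x1*x2 + y1*y2)).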