I am working with OpenCV, Android and OpenGL for an Augmented Reality project. As far as I know, the coordinate system in OpenGL is:
The OpenCV coordinate system is:
When combining these with the Android sensors, how can I do the coordinate system conversions and the [R|t] matrix conversion? Is there a good tutorial or documentation where all of this confusing stuff is explained?
If you look at the picture, you can see that both coordinate systems have the same handedness, but the OpenCV one is rotated by pi around the x axis. This can be represented by the following rotation matrix:
 1  0  0
 0 -1  0
 0  0 -1
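
To make this concrete, here is a minimal sketch (in Java, using the OpenCV Android bindings) of how one might apply that flip when turning a pose from Calib3d.solvePnP into a column-major modelview matrix for OpenGL ES. The method name cvPoseToGlModelView and the assumption that rvec/tvec are 3x1 CV_64F matrices are mine, not from the original post.

    import org.opencv.calib3d.Calib3d;
    import org.opencv.core.CvType;
    import org.opencv.core.Mat;

    public class PoseConverter {

        // Converts an OpenCV pose (rvec, tvec), e.g. from Calib3d.solvePnP,
        // into a column-major 4x4 modelview matrix suitable for OpenGL
        // (e.g. to pass to GLES20.glUniformMatrix4fv).
        public static float[] cvPoseToGlModelView(Mat rvec, Mat tvec) {
            // Rodrigues vector -> 3x3 rotation matrix.
            Mat rot = new Mat(3, 3, CvType.CV_64F);
            Calib3d.Rodrigues(rvec, rot);

            // Assumes tvec is a 3x1 CV_64F matrix.
            double[] t = new double[3];
            tvec.get(0, 0, t);

            // The flip matrix from the answer above: a rotation of pi about x,
            // taking OpenCV's camera frame (y down, looking along +z) to
            // OpenGL's (y up, looking along -z).
            double[][] flip = {
                { 1,  0,  0 },
                { 0, -1,  0 },
                { 0,  0, -1 }
            };

            float[] mv = new float[16]; // column-major, as OpenGL expects
            for (int row = 0; row < 3; row++) {
                for (int col = 0; col < 3; col++) {
                    // Rotation part: flip * R
                    double v = 0;
                    for (int k = 0; k < 3; k++) {
                        v += flip[row][k] * rot.get(k, col)[0];
                    }
                    mv[col * 4 + row] = (float) v;
                }
                // Translation part: flip * t (flip is diagonal, so this is a sign change)
                mv[12 + row] = (float) (flip[row][row] * t[row]);
            }
            mv[3] = mv[7] = mv[11] = 0f;
            mv[15] = 1f;
            return mv;
        }
    }

The same idea works the other way around: to go from an OpenGL pose back to an OpenCV [R|t], apply the same flip again, since the matrix is its own inverse.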