libgdx coordinate system differences between rendering and touch input

Brian Mains · May 13, 2013 · Viewed 15.5k times

I have a screen (BaseScreen implements the Screen interface) that renders a PNG image. On click of the screen, it moves the character to the position touched (for testing purposes).

public class DrawingSpriteScreen extends BaseScreen {
    private Texture _sourceTexture = null;
    float x = 0, y = 0;

    @Override
    public void create() {
        _sourceTexture = new Texture(Gdx.files.internal("data/character.png"));
    }

    // ...
}

During rendering of the screen, if the user touched the screen, I grab the coordinates of the touch, and then use these to render the character image.

@Override
public void render(float delta) {
    if (Gdx.input.justTouched()) {
        x = Gdx.input.getX();
        y = Gdx.input.getY();
    }

    super.getGame().batch.draw(_sourceTexture, x, y);
}

The issue is that the coordinates for drawing the image start from the bottom-left corner (as noted in the libGDX wiki), while the coordinates for touch input start from the top-left corner. So if I click on the bottom right, the image moves to the top right. My touch coordinates might be X 675, Y 13, which is near the top of the screen, but the character shows up near the bottom, since drawing coordinates start from the bottom left.
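One way to see the mismatch concretely (and the minimal fix, without involving a camera) is to flip the y value against the screen height. A self-contained sketch, using an assumed height of 480 for illustration; in libGDX itself the height would come from Gdx.graphics.getHeight():

```java
// Minimal sketch of the y-flip: touch input is y-down from the top-left,
// drawing is y-up from the bottom-left, so subtracting the touch y from
// the screen height converts between the two.
public final class FlipY {
    // Convert a y-down touch coordinate to a y-up drawing coordinate.
    static float toDrawY(float touchY, float screenHeight) {
        return screenHeight - touchY;
    }

    public static void main(String[] args) {
        float screenHeight = 480f; // assumed height, for illustration only
        // A touch near the top of the screen (y = 13 in input coordinates)...
        System.out.println(toDrawY(13f, screenHeight)); // 467.0, near the top in draw coordinates
    }
}
```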

Why is that? Why are the two coordinate systems flipped? Am I using the wrong objects to determine the touch position?

Answer

Pranav008 · May 13, 2013

To detect collision I use camera.unproject(vector3). I set the Vector3 as:

Vector3 touchPos = new Vector3();
touchPos.set(Gdx.input.getX(), Gdx.input.getY(), 0);

Now I pass this vector to camera.unproject(touchPos), which converts the y-down screen coordinates into the camera's y-up world coordinates in place. Use the x and y of this vector to draw your character.
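For intuition, here is a self-contained sketch of what camera.unproject computes for an OrthographicCamera whose viewport exactly matches the screen. This is an assumption made to keep the example runnable without libGDX: the real method also accounts for camera position, zoom, and custom viewports. The screen size of 800x480 is likewise assumed for illustration:

```java
// Stand-in for OrthographicCamera.unproject, assuming the camera's
// viewport exactly covers the screen (no zoom or camera translation).
public final class UnprojectSketch {
    // Convert y-down screen coordinates to y-up world coordinates.
    static float[] unproject(float screenX, float screenY,
                             float screenW, float screenH) {
        // Step 1: screen -> normalized device coordinates (-1..1, y flipped).
        float ndcX = 2f * screenX / screenW - 1f;
        float ndcY = 1f - 2f * screenY / screenH;
        // Step 2: NDC -> world via the inverse orthographic projection.
        float worldX = (ndcX + 1f) * screenW / 2f;
        float worldY = (ndcY + 1f) * screenH / 2f;
        return new float[] { worldX, worldY };
    }

    public static void main(String[] args) {
        // The touch from the question: X 675, Y 13 (near the top of the screen).
        float[] world = unproject(675f, 13f, 800f, 480f);
        System.out.println(world[0] + ", " + world[1]); // 675.0, 467.0
    }
}
```

For this simple full-screen case the result reduces to (x, screenHeight - y), which is exactly the flip the question is missing; unproject is preferable in real code because it keeps working when the camera moves or zooms.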