I am trying to use a sprite sheet in OpenGL, implemented as an array texture.
This is how I load my texture:
QImage image;
image.load("C:\\QtProjects\\project\\images\\spritesheet.png", "png");
const unsigned char* data = image.bits();
int twidth = image.width(), theight = image.height();
glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA, twidth / 3, theight / 4, 12);
glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, 0, twidth / 3, theight / 4, 12, GL_BGRA, GL_UNSIGNED_BYTE, data);
glUseProgram(scene_program);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
This is my fragment shader:
#version 330 core
in vec3 Color;
in vec2 Texture_coord;
out vec4 outColor;
uniform sampler2DArray tex;
void main()
{
vec2 coord;
coord.x = Texture_coord.x;
coord.y = Texture_coord.y;
vec4 color = texture(tex, vec3(coord, 0));
outColor = color;
}
I am drawing it onto a quad with texture coordinates (0.0, 0.0) in one corner and (1.0, 1.0) in the other, so I expect to get the first sprite from the first row on my quad.
But instead I get this corrupted piece of the image: http://i.stack.imgur.com/bfXgr.jpg
This is whole sprite sheet: http://i.stack.imgur.com/LaZKP.png
P.S.: The whole image is loaded correctly; I can display it, or part of it, through a traditional GL_TEXTURE_2D.
I don't believe the problem is directly related to array textures. The real issue is in how you are trying to load part of your input image into a texture:
glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA, twidth / 3, theight / 4, 12);
glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, 0, twidth / 3, theight / 4, 12,
GL_BGRA, GL_UNSIGNED_BYTE, data);
The glTexSubImage3D() call does not know how large your input image is. It simply reads your image data sequentially, based on the sizes you pass in as parameters. If you give it a width that is 1/3 of your original image width, it will read 3 texture rows from the first row of your input image, then another 3 texture rows from the second row of your image, and so on.
You can specify the size of your input rows (in pixels) with the GL_UNPACK_ROW_LENGTH pixel storage value. For your case, the following call specifies the width of your input image:
glPixelStorei(GL_UNPACK_ROW_LENGTH, twidth);
I don't think there's a reasonable way of loading your whole sprite sheet into the array texture with a single glTexSubImage3D() call. It's probably easiest to make a separate call for each sprite. The whole thing then looks like this:
GLuint texId = 0;
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D_ARRAY, texId);
glTexParameteri(GL_TEXTURE_2D_ARRAY, ...);
int subWidth = twidth / 3;
int subHeight = theight / 4;
glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA8, subWidth, subHeight, 12); // glTexStorage3D() requires a sized internal format like GL_RGBA8
glPixelStorei(GL_UNPACK_ROW_LENGTH, twidth);
for (int iRow = 0; iRow < 4; ++iRow) {
    for (int iCol = 0; iCol < 3; ++iCol) {
        glTexSubImage3D(
            GL_TEXTURE_2D_ARRAY, 0,
            0, 0, 3 * iRow + iCol,
            subWidth, subHeight, 1,
            GL_BGRA, GL_UNSIGNED_BYTE,
            data + (iRow * subHeight * twidth + iCol * subWidth) * 4);
    }
}
You could try to avoid the separate calls by using other unpack parameters like GL_UNPACK_IMAGE_HEIGHT, but I'm not convinced there are settings that would work for this case.