glDrawPixels RGB values from an array

Rana Tallal · Nov 18, 2013 · Viewed 7.4k times

I have a three-dimensional array:

unsigned int window_RGBData[3][640][480];
void display(){
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glDrawPixels(window_Width,window_Height,GL_RGB,GL_UNSIGNED_INT, window_RGBData);
    glutSwapBuffers();
}

It only shows me a black screen, no matter what values are in the array. The array is laid out so that the first dimension is red, the second is green, and the third is blue. When I use GL_UNSIGNED_BYTE instead of GL_UNSIGNED_INT I get lines of black and white (when the whole array is 255), so it is reading from the array, but I think I am not specifying the format of the array to OpenGL correctly. I could use

glBegin(GL_POINTS);
// code here
glEnd();

but for specific reasons that approach is not an option.

Any ideas on how to specify the correct format in the glDrawPixels call, or another way to do this?

Initialization etc.:

int Image::OpenGLShow() {
    // Display onscreen
    int argc = 1;
    char* argv[1];
    argv[0] = strdup("Helloguys");
    glutInit(&argc, argv);

    window_Height = Height;
    window_Width = Width;

    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(Width, Height);
    glutCreateWindow("OpenGL glDrawPixels demo");

    glutDisplayFunc(display);
    glEnable(GL_DEPTH_TEST);

    glutMainLoop();
    return 0;
}

EDIT: I believe the problem is with the values I am putting in the array. This is how the array is being populated:

for (int i = 0; i < 200; i++) {
    for (int j = 0; j < 200; j++) {
        // RGB values at the (i, j) point on screen.
        window_RGBData[0][i][j] = red_value;   // 0-255
        window_RGBData[1][i][j] = green_value;
        window_RGBData[2][i][j] = blue_value;
    }
}

So basically I was assuming that glDrawPixels would combine the three planes and display each pixel, depending on the format I provided, which is definitely not what glDrawPixels does. So should I change the way the array is populated, or change the format value?
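
For reference, this is roughly what I think an interleaved layout would look like (just a sketch, and window_RGBData2 is a placeholder name; red_value, green_value and blue_value are the same 0-255 values as above):

// Interleaved guess: each pixel's R, G and B bytes are stored next to each other.
unsigned char window_RGBData2[480][640][3]; // [row][column][channel]

for (int y = 0; y < 480; y++) {
    for (int x = 0; x < 640; x++) {
        window_RGBData2[y][x][0] = red_value;
        window_RGBData2[y][x][1] = green_value;
        window_RGBData2[y][x][2] = blue_value;
    }
}

// and then in display():
// glDrawPixels(window_Width, window_Height, GL_RGB, GL_UNSIGNED_BYTE, window_RGBData2);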

Answer

genpfault · Nov 18, 2013

That array is too big to put on the stack: 3 × 640 × 480 × sizeof(unsigned int) is roughly 3.7 MB, which exceeds the default stack size on many platforms.

Increase your stack size, allocate it on the heap (see the heap-based sketch at the end of this answer), or try something smaller:

#include <GL/glut.h>
#include <cstdlib>

const unsigned int W = 200;
const unsigned int H = 200;

void display()
{
    glClearColor( 0, 0, 0, 1 );
    glClear( GL_COLOR_BUFFER_BIT );

    unsigned int data[H][W][3];
    for( size_t y = 0; y < H; ++y )
    {
        for( size_t x = 0; x < W; ++x )
        {
            // GL_UNSIGNED_INT maps the full 32-bit range to [0.0, 1.0],
            // so shift the 8-bit random value into the most significant byte
            // (otherwise values like 255 are effectively zero and render black).
            data[y][x][0] = ( rand() % 256 ) * 256 * 256 * 256;
            data[y][x][1] = ( rand() % 256 ) * 256 * 256 * 256;
            data[y][x][2] = ( rand() % 256 ) * 256 * 256 * 256;
        }
    }

    glDrawPixels( W, H, GL_RGB, GL_UNSIGNED_INT, data );

    glutSwapBuffers();
}

int main( int argc, char **argv )
{
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE );
    glutInitWindowSize( W, H );
    glutCreateWindow( "GLUT" );
    glutDisplayFunc( display );
    glutMainLoop();
    return 0;
}
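
If you want the full 640×480 buffer, one option is to keep it on the heap with std::vector and use GL_UNSIGNED_BYTE, so plain 0-255 values map directly to full intensity. A minimal sketch under those assumptions (main() stays the same as above apart from the window size):

#include <GL/glut.h>
#include <vector>

const unsigned int W = 640;
const unsigned int H = 480;

// Heap-backed, interleaved pixel buffer: 3 bytes (R, G, B) per pixel.
std::vector<unsigned char> pixels(W * H * 3, 0);

void display()
{
    glClearColor( 0, 0, 0, 1 );
    glClear( GL_COLOR_BUFFER_BIT );

    // Fill with a simple gradient so something visible appears.
    for( size_t y = 0; y < H; ++y )
    {
        for( size_t x = 0; x < W; ++x )
        {
            unsigned char* px = &pixels[ ( y * W + x ) * 3 ];
            px[0] = (unsigned char)( x % 256 ); // R
            px[1] = (unsigned char)( y % 256 ); // G
            px[2] = 128;                        // B
        }
    }

    glDrawPixels( W, H, GL_RGB, GL_UNSIGNED_BYTE, &pixels[0] );

    glutSwapBuffers();
}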