Why is memset() incorrectly initializing int?

Ahmad · Aug 18, 2011

Why is the output of the following program 84215045?

#include <stdio.h>
#include <string.h>

int grid[110];

int main()
{
    memset(grid, 5, 100 * sizeof(int));
    printf("%d\n", grid[0]);
    return 0;
}

Answer

Stephen Canon · Aug 18, 2011

memset sets each byte of the destination buffer to the specified value. On your system, an int is four bytes, each of which is 5 after the call to memset. Thus, grid[0] has the value 0x05050505 (hexadecimal), which is 84215045 in decimal.
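To see where the number comes from, here is a minimal sketch (assuming a 4-byte int) that rebuilds the value byte by byte; because all four bytes are equal, endianness doesn't change the result:

    #include <stdio.h>

    int main(void)
    {
        /* Each of the four bytes of the int holds 0x05 after memset,
           so the value is 0x05050505 whatever the byte order. */
        unsigned int value = (5u << 24) | (5u << 16) | (5u << 8) | 5u;
        printf("%u\n", value);   /* prints 84215045 */
        return 0;
    }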

Some platforms provide alternative APIs to memset that write wider patterns to the destination buffer; for example, on OS X or iOS, you could use:

int pattern = 5;
memset_pattern4(grid, &pattern, sizeof grid);

to get the behavior that you seem to expect. What platform are you targeting?
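On platforms without such an API, a plain loop gives the same per-element fill in C; a minimal sketch with an illustrative helper (fill_int is just a name for this example):

    #include <stddef.h>

    /* Illustrative helper, not a standard function:
       fills the first n elements of an int array with value. */
    static void fill_int(int *a, size_t n, int value)
    {
        for (size_t i = 0; i < n; ++i)
            a[i] = value;
    }

    /* usage: fill_int(grid, 100, 5); */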

In C++, you should just use std::fill_n:

std::fill_n(grid, 100, 5);