"dereferencing type-punned pointer will break strict-aliasing rules" warning

petersohn · Nov 12, 2010

I have some code where I cast an enum* to an int*. Something like this:

enum foo { ... };
...
foo foobar;
int *pi = reinterpret_cast<int*>(&foobar);

When compiling the code (g++ 4.1.2), I get the following warning message:

dereferencing type-punned pointer will break strict-aliasing rules

I googled this message and found that it only appears when the strict-aliasing optimization is enabled. I have the following questions:

  • If I leave the code with this warning, will it generate potentially wrong code?
  • Is there any way to work around this problem?
  • If there isn't, is it possible to turn off strict aliasing from inside the source file (because I don't want to turn it off for all source files and I don't want to make a separate Makefile rule for this source file)?

And yes, I actually need this kind of aliasing.

Answer

moonshadow · Nov 12, 2010

In order:

  • Yes. GCC will assume that the pointers cannot alias. For instance, if you assign through one and then read through the other, GCC may, as an optimisation, reorder the read and the write (see the sketches after this list) - I have seen this happen in production code, and it is not pleasant to debug.

  • Several. You could use a union to represent the memory you need to reinterpret. You could use a reinterpret_cast. You could cast via char * at the point where you reinterpret the memory; char * is defined as being able to alias anything. You could use a type that has __attribute__((__may_alias__)). You could turn off the aliasing assumptions globally with -fno-strict-aliasing. (The union and may_alias options are sketched after this list.)

  • __attribute__((__may_alias__)) on the types used is probably the closest you can get to disabling the assumption for a particular section of code.
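
To make the first point concrete, here is a minimal sketch of the hazard. The enum and the function are hypothetical, not taken from the question; compile with -O2 (which implies -fstrict-aliasing) to see the assumption in action:

enum foo { A, B, C };

int read_after_write(foo *e, int *i)
{
    *i = 42;     // store through int*
    *e = B;      // store through foo* - may hit the same memory
    return *i;   // GCC may assume *e cannot alias *i and simply return 42
}

// If both pointers actually refer to the same object, for example
//   foo f;
//   read_after_write(&f, reinterpret_cast<int*>(&f));
// the function can return stale data once optimisation kicks in.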
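
The union and __attribute__((__may_alias__)) workarounds might look like the sketch below. The names are illustrative, and both versions still assume the enum occupies the same bytes as an int, which is exactly the caveat raised in the next paragraph:

enum foo { A, B, C };

// (a) punning through a union: GCC documents reading a member other than the
// one most recently written as supported behaviour.
int enum_bits_via_union(foo f)
{
    union { foo e; int i; } u;
    u.e = f;
    return u.i;
}

// (b) a pointer type marked may_alias is exempt from the strict-aliasing
// assumption, so dereferencing it produces neither the warning nor the
// problematic reordering.
typedef int __attribute__((__may_alias__)) aliasing_int;

int enum_bits_via_attribute(foo *pf)
{
    aliasing_int *pi = reinterpret_cast<aliasing_int*>(pf);
    return *pi;
}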

For your particular example, note that the size of an enum is implementation-defined; GCC generally uses the smallest integer size that can represent it, so reinterpreting a pointer to an enum as a pointer to int could leave you reading uninitialised bytes in the resulting integer. Don't do that. Why not just cast the value to a suitably large integer type?
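
A sketch of that last suggestion, with hypothetical names: convert the value instead of punning the pointer, and convert back if something genuinely needs an int* to write through.

enum foo { A, B, C };
foo foobar;

long as_integer(foo f)
{
    return static_cast<long>(f);     // plain value conversion, no aliasing involved
}

void update()
{
    int tmp = static_cast<int>(foobar);
    // mutate(&tmp);                 // hypothetical function that wants an int*
    foobar = static_cast<foo>(tmp);
}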