I'm reading the code of an iPhone sample project (Xcode IDE, Apple LLVM compiler 4.2). In a header file of an external library (written in C) for that iPhone sample project, there are some events declared in an enumeration type:
typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,
    EVENT_DRAG,
    EVENT_RELEASE_TOUCH,
    EVENT_ROTATE_0,
    EVENT_ROTATE_90,
    EVENT_ROTATE_180,
    EVENT_ROTATE_270
} Application_Events;
I don't understand what kind of values are assigned to those events. Is 0x80000000 supposed to be a big positive integer (2147483648), or negative zero, or a negative integer (-2147483648)?
I inspected them in the Xcode debugger, with the compiler being Apple LLVM compiler 4.2: EVENT_EXIT equals (int) -2147483648, EVENT_RELEASE_TOUCH equals (int) -2147483645, and so on.
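To double-check outside the debugger, here is a minimal program one could compile (it uses a trimmed-down copy of the enum above; the casts only choose which interpretation printf uses, and the values in the comments are what a two's-complement machine would typically print):

#include <stdio.h>

typedef enum _Application_Events
{
    EVENT_EXIT = 0x80000000,
    EVENT_TOUCH,
    EVENT_DRAG,
    EVENT_RELEASE_TOUCH
} Application_Events;

int main(void)
{
    /* Same bit pattern, two interpretations. */
    printf("EVENT_EXIT as int:           %d\n", (int)EVENT_EXIT);                  /* typically -2147483648 */
    printf("EVENT_EXIT as unsigned:      %u\n", (unsigned int)EVENT_EXIT);         /* 2147483648 */
    printf("EVENT_RELEASE_TOUCH as int:  %d\n", (int)EVENT_RELEASE_TOUCH);         /* typically -2147483645 */
    return 0;
}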
Apparently, they're treated in two's complement representation. A related post can be found here.
But what I'm not sure about now are these:
(1) Is the underlying data type for 0x80000000 always int, or can it be something else in other situations? Does this depend on the compiler or platform?
(2) If I assign a hexadecimal value to a signed integer like this, is it always interpreted as the two's complement representation? Does this depend on the compiler or platform? A related post can be found here. Another reference can be found here.
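As a side note, here is a small test program one could compile to see what type the compiler actually gives the constant (purely hypothetical code, not from the sample project; it assumes a 32-bit int and C11 _Generic support):

#include <stdio.h>

/* Map a few candidate types to printable names (C11 _Generic). */
#define TYPE_NAME(x) _Generic((x),              \
    int:                "int",                  \
    unsigned int:       "unsigned int",         \
    long:               "long",                 \
    unsigned long:      "unsigned long",        \
    long long:          "long long",            \
    unsigned long long: "unsigned long long",   \
    default:            "something else")

int main(void)
{
    /* With a 32-bit int, 0x80000000 does not fit in int, so as a
       hexadecimal constant it takes the next type that can hold it
       (int, unsigned int, long, ...), i.e. unsigned int here. */
    printf("type of 0x80000000: %s\n", TYPE_NAME(0x80000000));
    printf("as unsigned:        %u\n", 0x80000000);

    /* Converting it to a signed int is implementation-defined; on
       two's-complement machines it typically comes out as INT_MIN. */
    int as_int = (int)0x80000000;
    printf("stored in int:      %d\n", as_int);

    return 0;
}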
Please share some ideas. Thank you all :D
Like many things in C-like languages, an enumeration is just an integer. Setting the first value like this causes the compiler to increment from there, so all of these enumeration values end up less than 0 (as a signed integer in two's complement, the high bit being set indicates a negative number).
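For example (a hypothetical enum, not the one from the library), the compiler simply counts up from whatever you assign:

/* Hypothetical enum to illustrate auto-increment from an assigned value. */
enum Example
{
    FIRST = 0x80000000,   /* explicitly assigned            */
    SECOND,               /* FIRST + 1  == 0x80000001       */
    THIRD                 /* SECOND + 1 == 0x80000002       */
};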
Likely, the programmers chose this value so that various kinds of events can be sent without these codes colliding with other values.
In a nutshell, don't worry about the actual value; it's just a number. Use the name and understand that the name carries the meaning in the context of the calls that use or return those codes.
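In practice that just means writing code against the names, something like this sketch (the handleEvent function is made up for illustration, not part of the library's API):

/* Hypothetical event handler: the names matter, the raw values do not. */
void handleEvent(Application_Events event)
{
    switch (event)
    {
        case EVENT_TOUCH:
            /* begin tracking the touch */
            break;
        case EVENT_RELEASE_TOUCH:
            /* finish the gesture */
            break;
        case EVENT_EXIT:
            /* tear down and quit */
            break;
        default:
            /* other events (drag, rotations, ...) */
            break;
    }
}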