Why does this piece of code fail:

```java
Integer.parseInt("11000000000000000000000000000000", 2);
```

It throws:

```
Exception in thread "main" java.lang.NumberFormatException: For input string: "11000000000000000000000000000000"
```

As far as I understand, an `Integer` is a 32-bit value. The string in the code above consists of 32 zeros and ones; with 31 digits the code works. Why is that so?
Your code fails because it tries to parse a number that would require 33 bits to store as a signed integer.
A signed `int` is a 32-bit value in two's complement representation, where the first bit indicates the sign of the number and the remaining 31 bits its magnitude. (-ish.) Java only supports signed integers, and `parseInt()` and friends aren't supposed to parse two's complement bit patterns – and thus don't interpret the `1` or (possibly implied) `0` at the 32nd position from the right as the sign. They're meant to parse a human-readable representation: an optional `-` (or `+`) for the sign, followed by the absolute value of the number.
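To make that boundary concrete, here's a minimal sketch (the class name is just for illustration):

```java
public class ParseIntDemo {
    public static void main(String[] args) {
        // 31 binary digits fit into the 31 magnitude bits of a signed int.
        System.out.println(Integer.parseInt("1100000000000000000000000000000", 2));
        // prints 1610612736

        // The sign comes from an explicit '-' prefix, never from a leading bit.
        // This is the one 32-digit magnitude that still fits: Integer.MIN_VALUE.
        System.out.println(Integer.parseInt("-10000000000000000000000000000000", 2));
        // prints -2147483648

        // 32 binary digits read as a positive magnitude (3221225472) exceed
        // Integer.MAX_VALUE (2147483647), so this throws NumberFormatException:
        Integer.parseInt("11000000000000000000000000000000", 2);
    }
}
```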
In this context, it's a false intuition that leads you to expect the behaviour you describe: if you were parsing any other base besides base 2 (or maybe the other commonly used power-of-two bases), would you expect the first digit of the input to affect the sign? Obviously you wouldn't; having, say, `parseInt("2147483648")` return `-2147483648` by design would be PHP levels of crazy.
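The decimal case indeed fails the same way, which is quick to verify:

```java
// 2147483648 is Integer.MAX_VALUE + 1; parseInt refuses to wrap it around
// to a negative value and throws NumberFormatException instead.
Integer.parseInt("2147483648");
// Getting -2147483648 requires an explicit sign:
Integer.parseInt("-2147483648"); // Integer.MIN_VALUE
```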
Special-casing power-of-two bases would also feel odd. It's better to have a separate mechanism for handling bit patterns, for example the one in this answer, or the sketch below.
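One such mechanism, as a sketch (whether it matches the linked answer is an assumption on my part): parse all 32 digits as an unsigned value, then reinterpret the bits as a signed `int`.

```java
public class BitPatternDemo {
    public static void main(String[] args) {
        String bits = "11000000000000000000000000000000";

        // Java 8+: parseUnsignedInt accepts the full 32 bits and stores them
        // verbatim in an int, so the leading 1 lands in the sign bit.
        int viaUnsigned = Integer.parseUnsignedInt(bits, 2);
        System.out.println(viaUnsigned); // -1073741824

        // Pre-Java-8 alternative: parse into a long (plenty of headroom),
        // then narrow to int, which keeps only the low 32 bits.
        int viaLong = (int) Long.parseLong(bits, 2);
        System.out.println(viaLong); // -1073741824
    }
}
```

Both approaches print the same value because they produce the same bit pattern; the difference is only where the range check happens.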