Java signed zero and boxing

uhz · Feb 8, 2013 · Viewed 15.7k times

Lately I've been working on a project in Java and noticed a very strange feature of the double/Double implementation. The double type in Java has two zeros, i.e. 0.0 and -0.0 (signed zeros). The strange thing is that:

0.0 == -0.0

evaluates to true, but:

new Double(0.0).equals(new Double(-0.0))

evaluates to false. Does anyone know the reason behind this?
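
For reference, here is a self-contained version of the two checks (the class name is arbitrary):

public class SignedZeroQuestion {
    public static void main(String[] args) {
        System.out.println(0.0 == -0.0);                               // true
        System.out.println(new Double(0.0).equals(new Double(-0.0))); // false
    }
}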

Answer

assylias · Feb 8, 2013

It is all explained in the javadoc of Double.equals:

Note that in most cases, for two instances of class Double, d1 and d2, the value of d1.equals(d2) is true if and only if

   d1.doubleValue() == d2.doubleValue() 

also has the value true. However, there are two exceptions:

  • If d1 and d2 both represent Double.NaN, then the equals method returns true, even though Double.NaN==Double.NaN has the value false.
  • If d1 represents +0.0 while d2 represents -0.0, or vice versa, the equal test has the value false, even though +0.0==-0.0 has the value true.

This definition allows hash tables to operate properly.
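
To see what that last sentence means in practice, here is a minimal sketch (the class name is mine, not from the javadoc) showing that a HashSet keeps the two zeros as distinct elements, because equals and hashCode both distinguish them:

import java.util.HashSet;
import java.util.Set;

public class SignedZeroHashing {
    public static void main(String[] args) {
        Set<Double> values = new HashSet<>();
        values.add(0.0);   // autoboxed to a Double representing +0.0
        values.add(-0.0);  // autoboxed to a Double representing -0.0

        // equals() and hashCode() are both based on the bit pattern,
        // so the set stores two distinct elements.
        System.out.println(values.size());          // 2
        System.out.println(values.contains(-0.0));  // true

        // NaN works the other way around: equals() reports NaN equal to
        // itself, so adding it twice still yields a single element.
        values.add(Double.NaN);
        values.add(Double.NaN);
        System.out.println(values.size());          // 3
    }
}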


Now you might ask why 0.0 == -0.0 is true if equals can tell them apart. In fact the two zeros are not bit-for-bit identical. For example:

Double.doubleToRawLongBits(0.0) == Double.doubleToRawLongBits(-0.0); //false

is false. However, the JLS requires ("in accordance with the rules of the IEEE 754 standard") that:

Positive zero and negative zero are considered equal.

hence 0.0 == -0.0 is true.
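
For completeness, a short sketch (again, class name mine) putting the three comparisons side by side:

public class ZeroComparisons {
    public static void main(String[] args) {
        double pz = 0.0, nz = -0.0;

        System.out.println(pz == nz);                      // true: IEEE 754 numeric equality
        System.out.println(Double.valueOf(pz).equals(nz)); // false: bit-based equality

        // The sign bit is the only difference between the two encodings.
        System.out.println(Long.toHexString(Double.doubleToRawLongBits(pz))); // 0
        System.out.println(Long.toHexString(Double.doubleToRawLongBits(nz))); // 8000000000000000
    }
}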