Is code that uses the static Object.Equals to check for null more robust than code that uses the == operator or regular Object.Equals? Aren't the latter two vulnerable to being overridden in such a way that checking for null doesn't work as expected (e.g. returning false when the compared value is null)?
In other words, is this:
if (Equals(item, null)) { /* Do Something */ }
more robust than this:
if (item == null) { /* Do Something */ }
I personally find the latter syntax easier to read. Should it be avoided when writing code that will handle objects outside the author's control (e.g. libraries)? Should it always be avoided (when checking for null)? Is this just hair-splitting?
There's no simple answer for this question. Anyone who says always use one or the other is giving you poor advice, in my opinion.
There are actually several different methods you can call to compare object instances. Given two object instances a and b, you could write:
Object.Equals(a,b)
Object.ReferenceEquals(a,b)
a.Equals(b)
a == b
These could all do different things!
Object.Equals(a,b) will (by default) perform a reference equality comparison on reference types and a bitwise comparison on value types. From the MSDN documentation:
The default implementation of Equals supports reference equality for reference types, and bitwise equality for value types. Reference equality means the object references that are compared refer to the same object. Bitwise equality means the objects that are compared have the same binary representation.
Note that a derived type might override the Equals method to implement value equality. Value equality means the compared objects have the same value but different binary representations.
Note the last paragraph above ... we'll discuss this a bit later.
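To make the quoted behavior concrete, here is a minimal sketch in the same snippet style as the examples below (the Money struct, Person class, and PersonByName class are invented names for illustration):
struct Money { public int Cents; }                 // value type: compared field-by-field by default

class Person { public string Name; }               // reference type: compared by reference by default

class PersonByName : Person                        // hypothetical derived type opting into value equality
{
    public override bool Equals(object obj) =>
        obj is PersonByName other && other.Name == Name;
    public override int GetHashCode() => Name == null ? 0 : Name.GetHashCode();
}

var m1 = new Money { Cents = 100 };
var m2 = new Money { Cents = 100 };
Console.WriteLine(Object.Equals(m1, m2));          // True  - same binary representation

var p1 = new Person { Name = "Ann" };
var p2 = new Person { Name = "Ann" };
Console.WriteLine(Object.Equals(p1, p2));          // False - two distinct references

var n1 = new PersonByName { Name = "Ann" };
var n2 = new PersonByName { Name = "Ann" };
Console.WriteLine(Object.Equals(n1, n2));          // True  - the override implements value equality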
Object.ReferenceEquals(a,b) performs a reference equality comparison only. If the arguments passed in are boxed value types, the result is always false.
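For example, because every boxing conversion allocates a fresh object, ReferenceEquals reports false even when the same value is boxed twice:
int i = 42;
object a = i;                                        // boxing allocates a new object
object b = i;                                        // ...and so does this

Console.WriteLine(Object.ReferenceEquals(a, b));     // False - two distinct boxes
Console.WriteLine(Object.ReferenceEquals(42, 42));   // False - each argument is boxed separately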
a.Equals(b) calls the virtual instance method Equals of Object, which the type of a could override to do anything it wants. The call is performed using virtual dispatch, so the code that runs depends on the runtime type of a.
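A hypothetical type (Widget is an invented name) shows how the override wins regardless of the variable's declared type:
class Widget
{
    // Hypothetical override: claims equality with any other Widget.
    public override bool Equals(object obj) => obj is Widget;
    public override int GetHashCode() => 0;
}

object a = new Widget();
object b = new Widget();

Console.WriteLine(a.Equals(b));   // True  - virtual dispatch finds Widget.Equals at runtime
Console.WriteLine(a == b);        // False - object's == just compares the two references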
a == b invokes the static overloaded == operator of the **compile-time type** of a. If the implementation of that operator invokes instance methods on either a or b, the result may also depend on the runtime types of the arguments. Since overload resolution is based on the compile-time types in the expression, the following two comparisons may yield different results:
Frog aFrog = new Frog();
Frog bFrog = new Frog();
Animal aAnimal = aFrog;
Animal bAnimal = bFrog;
// These two comparisons may produce different results:
bool areEqualFrogs = aFrog == bFrog;       // uses Frog's == overload, if one exists
bool areEqualAnimals = aAnimal == bAnimal; // uses Animal's (or object's) ==, which compares references
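For the two results to actually differ, Frog would need its own == overload while Animal relied on the default reference comparison. A hypothetical definition (the Name field and the "equal if names match" rule are made up for illustration) might look like this:
class Animal { }                       // no == overload: falls back to reference comparison

class Frog : Animal
{
    public string Name;

    // Hypothetical overload: two Frogs are equal if they share a name.
    public static bool operator ==(Frog x, Frog y) =>
        ReferenceEquals(x, y) ||
        (!ReferenceEquals(x, null) && !ReferenceEquals(y, null) && x.Name == y.Name);
    public static bool operator !=(Frog x, Frog y) => !(x == y);

    public override bool Equals(object obj) => obj is Frog f && f.Name == Name;
    public override int GetHashCode() => Name == null ? 0 : Name.GetHashCode();
}

Frog aFrog = new Frog { Name = "Kermit" };
Frog bFrog = new Frog { Name = "Kermit" };
Animal aAnimal = aFrog;
Animal bAnimal = bFrog;

bool areEqualFrogs = aFrog == bFrog;       // True  - Frog's overload compares names
bool areEqualAnimals = aAnimal == bAnimal; // False - Animal/object == compares references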
So, yes, there is a vulnerability when checking for null with the == operator. In practice, most types do not overload == - but there's never a guarantee.
The instance method Equals() is no better here. While the default implementation performs a reference/bitwise equality check, a type can override the Equals() method, in which case that override is what gets called. A user-supplied implementation could return whatever it wants, even when comparing against null.
But what about the static version of Object.Equals(), you ask? Can this end up running user code? Well, it turns out that the answer is YES. The implementation of Object.Equals(a,b) expands to something along the lines of:
((object)a == (object)b) || (a != null && b != null && a.Equals(b))
You can try this for yourself:
class Foo
{
    public override bool Equals(object obj) { return true; }
    public override int GetHashCode() { return 0; }
}

var a = new Foo();
var b = new Foo();
Console.WriteLine( Object.Equals(a,b) );  // outputs "True"
As a consequence, it's possible for the expression Object.Equals(a,b) to run user code when neither of the arguments is null. Note that Object.Equals(a,b) does not call the instance Equals() when either of the arguments is null.
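Reusing the Foo class above, you can see both halves of that behavior:
var foo = new Foo();

Console.WriteLine(Object.Equals(foo, null));   // False - short-circuits on the null test;
                                               //         Foo.Equals is never invoked
Console.WriteLine(foo.Equals(null));           // True  - the override runs and lies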
In short, the kind of comparison behavior you get can vary significantly depending on which method you choose to call. One caveat here, however: Microsoft doesn't officially document the internal behavior of Object.Equals(a,b). If you need an ironclad guarantee of comparing a reference against null without any other code running, you want Object.ReferenceEquals():
Object.ReferenceEquals(item, null);
This method makes the intent extremely clear - you are specifically comparing two references for reference equality. The benefit over something like Object.Equals(a,null) is that it's less likely that someone will come along later and say, "Hey, this is awkward, let's replace it with a.Equals(null) or a == null" - which could behave differently.
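Continuing with the Foo class above, the difference is easy to demonstrate:
var item = new Foo();

Console.WriteLine(Object.ReferenceEquals(item, null));   // False - no user code can change this
Console.WriteLine(Object.Equals(item, null));            // False - also safe here, per the expansion above
Console.WriteLine(item.Equals(null));                    // True  - the override is free to lie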
Let's inject some pragmatism here, however. So far we've talked about the potential for different modes of comparison to yield different results. While this is certainly the case, there are types for which it's safe to write a == null. Built-in .NET types like String and Nullable<T> have well-defined comparison semantics; furthermore, String is sealed and Nullable<T> is a struct, so their behavior can't be changed through inheritance. The following is quite common (and correct):
string s = ...
if( s == null ) { ... }
It's unnecessary (and ugly) to write:
if( ReferenceEquals(s,null) ) { ... }
So in certain limited cases, using == is safe and appropriate.
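For example, a null test against a Nullable<T> is lowered by the compiler to a HasValue check, and string's == overload has well-defined null semantics, so neither can be hijacked by user code:
int? maybe = null;
if (maybe == null)          // compiled as !maybe.HasValue - no user-defined operator involved
{
    Console.WriteLine("no value");
}

string s = null;
if (s == null)              // string is sealed; its == overload handles null as expected
{
    Console.WriteLine("null string");
}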