Why don't 'ref' and 'out' support polymorphism?

Andreas Grech · Jul 30, 2009

Take the following:

class A {}

class B : A {}

class C
{
    C()
    {
        var b = new B();
        Foo(b);
        Foo2(ref b); // <= compile-time error: 
                     // "The 'ref' argument doesn't match the parameter type"
    }

    void Foo(A a) {}

    void Foo2(ref A a) {}  
}

Why does the above compile-time error occur? This happens with both ref and out arguments.

Answer

Eric Lippert · Jul 30, 2009

=============

UPDATE: I used this answer as the basis for this blog entry:

Why do ref and out parameters not allow type variation?

See the blog page for more commentary on this issue. Thanks for the great question.

=============

Let's suppose you have classes Animal, Mammal, Reptile, Giraffe, Turtle and Tiger, with the obvious subclassing relationships.

Now suppose you have a method void M(ref Mammal m). M can both read and write m.
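
In code, a minimal sketch of that setup might look like the following (only the animal hierarchy and the signature of M come from the text; the containing class Zoo and the body of M are illustrative assumptions):

class Animal {}
class Mammal : Animal {}
class Reptile : Animal {}
class Giraffe : Mammal {}
class Tiger : Mammal {}
class Turtle : Reptile {}

class Zoo
{
    // M may both read from and write to its ref parameter.
    void M(ref Mammal m)
    {
        Mammal current = m;  // read
        m = new Tiger();     // write
    }
}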


Can you pass a variable of type Animal to M?

No. That variable could contain a Turtle, but M will assume that it contains only Mammals. A Turtle is not a Mammal.

Conclusion 1: ref parameters cannot be made "bigger". (There are more animals than mammals, so the variable is getting "bigger" because it can contain more things.)
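
Concretely, this is the call the compiler must reject (a hypothetical sketch; the second line does not compile, which is the point):

Animal animal = new Turtle();  // legal on its own: a Turtle is an Animal
M(ref animal);                 // hypothetical: if this compiled, m would alias
                               // animal, and M could read a Turtle out of a
                               // variable it believes holds only Mammals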


Can you pass a variable of type Giraffe to M?

No. M can write to m, and M might want to write a Tiger into m. Now you've put a Tiger into a variable which is actually of type Giraffe.

Conclusion 2: ref parameters cannot be made "smaller".
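
Again as a hypothetical, non-compiling sketch:

Giraffe g = new Giraffe();
M(ref g);   // hypothetical: if this compiled, M could execute m = new Tiger();
            // and g, a variable declared as Giraffe, would afterwards hold a Tiger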


Now consider N(out Mammal n).

Can you pass a variable of type Giraffe to N?

No. N can write to n, and N might want to write a Tiger.

Conclusion 3: out parameters cannot be made "smaller".
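
The same write-side problem, sketched hypothetically; note that with out the variable need not even be initialized first:

Giraffe g;
N(out g);   // hypothetical: if this compiled, N could assign n = new Tiger(),
            // leaving a Tiger in the Giraffe-typed g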


Can you pass a variable of type Animal to N?

Hmm.

Well, why not? N cannot read from n; it can only write to it, right? You write a Tiger to a variable of type Animal and you're all set, right?

Wrong. The rule is not "N can only write to n".

The rules are, briefly:

1) N has to write to n before N returns normally. (If N throws, all bets are off.)

2) N has to write something to n before it reads something from n.

That permits this sequence of events:

  • Declare a field x of type Animal.
  • Pass x as an out parameter to N.
  • N writes a Tiger into n, which is an alias for x.
  • On another thread, someone writes a Turtle into x.
  • N attempts to read the contents of n, and discovers a Turtle in what it thinks is a variable of type Mammal.

Clearly we want to make that illegal.
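
Written out as code, under the hypothetical assumption that the out-to-Animal mismatch were allowed (Zoo and the method Danger are illustrative names, not from the text):

class Zoo
{
    Animal x;  // a field, so other threads can see and mutate it

    void Danger()
    {
        N(out x);  // hypothetical: suppose this compiled
    }

    void N(out Mammal n)
    {
        n = new Tiger();   // rule 1: assign before returning normally
        // ...meanwhile, another thread runs: x = new Turtle();
        Mammal m = n;      // rule 2 is respected (we wrote before reading),
                           // yet n aliases x and now holds a Turtle
    }
}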

Conclusion 4: out parameters cannot be made "larger".


Final conclusion: Neither ref nor out parameters may vary their types. To do otherwise is to break verifiable type safety.
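
Back in the original example, the usual way to satisfy the compiler (just one possible sketch) is to pass a variable whose declared type exactly matches the parameter:

var b = new B();
A a = b;       // a variable of exactly the parameter's type
Foo2(ref a);   // compiles: if Foo2 writes some other A into a,
               // no B-typed variable is affected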

If these issues in basic type theory interest you, consider reading my series on how covariance and contravariance work in C# 4.0.