I could not pin down the main difference, and I am confused about when we should use inheritance and when we should use subtyping. The definitions I found are not very clear.
What is the difference between subtyping and inheritance in object-oriented programming?
In addition to the answers already given, here's a link to an article I think is relevant. Excerpts:
In the object-oriented framework, inheritance is usually presented as a feature that goes hand in hand with subtyping when one organizes abstract datatypes in a hierarchy of classes. However, the two are orthogonal ideas.
- Subtyping refers to compatibility of interfaces. A type `B` is a subtype of `A` if every function that can be invoked on an object of type `A` can also be invoked on an object of type `B`.
- Inheritance refers to reuse of implementations. A type `B` inherits from another type `A` if some functions for `B` are written in terms of functions of `A`.

However, subtyping and inheritance need not go hand in hand. Consider the data structure deque, a double-ended queue. A deque supports insertion and deletion at both ends, so it has four functions `insert-front`, `delete-front`, `insert-rear` and `delete-rear`. If we use just `insert-rear` and `delete-front` we get a normal queue. On the other hand, if we use just `insert-front` and `delete-front`, we get a stack. In other words, we can implement queues and stacks in terms of deques, so as datatypes, `Stack` and `Queue` inherit from `Deque`. On the other hand, neither `Stack` nor `Queue` are subtypes of `Deque` since they do not support all the functions provided by `Deque`. In fact, in this case, `Deque` is a subtype of both `Stack` and `Queue`!
I think that Java, C++, C# and their ilk have contributed to the confusion, as already noted, by consolidating both ideas into a single class hierarchy. However, I think the example given above does justice to the ideas in a rather language-agnostic way. I'm sure others can give more examples.
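To make the deque example a bit more concrete, here is a rough Java sketch (my own code, not from the article; the class and method names are adapted from its `insert-front`/`delete-front` vocabulary). Java has no private inheritance, so the implementation-reuse half is shown with composition, which plays the same role; the subtyping half is expressed with interfaces.

```java
import java.util.ArrayDeque;

// Subtyping side: an interface names the operations a client may rely on.
interface Stack<T> {
    void insertFront(T x);
    T deleteFront();
}

interface Queue<T> {
    void insertRear(T x);
    T deleteFront();
}

// A deque supports all four operations, so it satisfies both interfaces:
// Deque is a subtype of Stack and of Queue, as the article observes.
class Deque<T> implements Stack<T>, Queue<T> {
    private final ArrayDeque<T> items = new ArrayDeque<>();
    public void insertFront(T x) { items.addFirst(x); }
    public void insertRear(T x)  { items.addLast(x); }
    public T deleteFront()       { return items.pollFirst(); }
    public T deleteRear()        { return items.pollLast(); }
}

// Implementation reuse without subtyping: SimpleQueue is written in terms
// of Deque but exposes only two of its operations, so it is not a subtype
// of Deque even though it reuses Deque's implementation.
class SimpleQueue<T> implements Queue<T> {
    private final Deque<T> impl = new Deque<>();  // reuse, hidden from clients
    public void insertRear(T x) { impl.insertRear(x); }
    public T deleteFront()      { return impl.deleteFront(); }
}
```

In C++ the same reuse could be expressed with private inheritance, which also shares an implementation without creating a subtype relation; the point either way is that "reuses the code of" and "can stand in for" are separate questions.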