Beyond the improved readability, is there any advantage to `includes` over `indexOf`? They seem identical to me.
What is the difference between this

```js
var x = [1, 2, 3].indexOf(1) > -1; // true
```

and this?

```js
var y = [1, 2, 3].includes(1); // true
```
tl;dr: `NaN` is treated differently:

- `[NaN].indexOf(NaN) > -1` is false
- `[NaN].includes(NaN)` is true
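You can see the difference directly in a console (results noted in the comments):

```js
// indexOf uses Strict Equality Comparison (===), and NaN === NaN is false,
// so indexOf can never find NaN. includes uses SameValueZero, which treats
// NaN as matching itself.
var arr = [1, NaN, 3];

console.log(arr.indexOf(NaN));      // -1 (never found)
console.log(arr.indexOf(NaN) > -1); // false
console.log(arr.includes(NaN));     // true
```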
From the proposal:
Motivation
When using ECMAScript arrays, it is commonly desired to determine if the array includes an element. The prevailing pattern for this is

```js
if (arr.indexOf(el) !== -1) { ... }
```

with various other possibilities, e.g. `arr.indexOf(el) >= 0`, or even `~arr.indexOf(el)`.

These patterns exhibit two problems:
- They fail to "say what you mean": instead of asking about whether the array includes an element, you ask what the index of the first occurrence of that element in the array is, and then compare it or bit-twiddle it, to determine the answer to your actual question.
- They fail for `NaN`, as `indexOf` uses Strict Equality Comparison and thus `[NaN].indexOf(NaN) === -1`.

Proposed Solution
We propose the addition of an `Array.prototype.includes` method, such that the above patterns can be rewritten as

```js
if (arr.includes(el)) { ... }
```

This has almost the same semantics as the above, except that it uses the SameValueZero comparison algorithm instead of Strict Equality Comparison, thus making `[NaN].includes(NaN)` true.

Thus, this proposal solves both problems seen in existing code.
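(SameValueZero isn't exposed as a standalone function; roughly speaking, it behaves like `===` except that `NaN` matches `NaN`, while `+0` and `-0` are still treated as equal. The `sameValueZero` helper below is just a hypothetical sketch for illustration.)

```js
// Hypothetical approximation of the SameValueZero algorithm:
// like ===, but NaN matches NaN; +0 and -0 remain equal (unlike Object.is).
function sameValueZero(a, b) {
  return a === b || (Number.isNaN(a) && Number.isNaN(b));
}

sameValueZero(NaN, NaN); // true  (NaN === NaN is false)
sameValueZero(0, -0);    // true  (Object.is(0, -0) is false)
```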
We additionally add a `fromIndex` parameter, similar to `Array.prototype.indexOf` and `String.prototype.includes`, for consistency.
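For example, both methods accept `fromIndex` as a second argument, and a negative value counts back from the end of the array:

```js
var letters = ['a', 'b', 'c', 'a'];

letters.includes('a', 1);     // true  (search starts at index 1, finds the 'a' at index 3)
letters.indexOf('a', 1) > -1; // true  (same search window)
letters.includes('c', -1);    // false (starts at index 3, so 'c' is never reached)
```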
Further information: