Why does the `in` operator in JavaScript return true when testing whether 0 exists in an array, even when the array doesn't appear to contain 0?
For example, this returns true, and makes sense:
var x = [1,2];
1 in x; // true
This returns false, and makes sense:
var x = [1,2];
3 in x; // false
However this returns true, and I don't understand why:
var x = [1,2];
0 in x; // true
The `in` operator refers to the index (key), not the value. `0` and `1` are the valid indices for that array. There are also other valid keys, including `"length"` and `"toString"`. Try `2 in x`: that will be false, because a two-element array only has indices `0` and `1` (JavaScript arrays are zero-indexed).

See the MDN documentation for the `in` operator.
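To make the distinction concrete, here is a short sketch contrasting `in` (which tests property names, including inherited ones) with `Array.prototype.includes` (which tests values):

```javascript
const x = [1, 2];

// `in` checks property names (array indices are string keys),
// and it also walks the prototype chain.
console.log(0 in x);          // true  (index 0 exists)
console.log(2 in x);          // false (no index 2 in a two-element array)
console.log("length" in x);   // true  (own property of every array)
console.log("toString" in x); // true  (inherited from Array.prototype)

// To test whether a *value* is in the array, use includes() or indexOf().
console.log(x.includes(0));   // false (no element equals 0)
console.log(x.includes(1));   // true  (1 is an element)
console.log(x.indexOf(2));    // 1     (value 2 is at index 1)
```

So `0 in x` is true for any non-empty array, regardless of its contents.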