Why does javascript's "in" operator return true when testing if 0 exists in an array that doesn't contain 0?

Mariano Peterson · Jun 18, 2010

Why does the "in" operator in JavaScript return true when testing whether 0 exists in an array, even when the array doesn't appear to contain 0?

For example, this returns true, and makes sense:

var x = [1,2];
1 in x; // true

This returns false, and makes sense:

var x = [1,2];
3 in x; // false

However this returns true, and I don't understand why:

var x = [1,2];
0 in x; // true

Answer

Matthew Flaschen · Jun 18, 2010

The "in" operator tests the index (key), not the value. 0 and 1 are the valid indices for that array. There are other valid keys as well, including "length" and "toString". Try 2 in x. That will be false, since the array has only two elements and JavaScript arrays are 0-indexed, so its indices are 0 and 1.
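A short sketch illustrating the key-vs-value distinction, and how to test for a value instead (using indexOf, which the original answer doesn't mention but is a standard alternative):

```javascript
var x = [1, 2];

// "in" checks whether a key (index or property name) exists:
console.log(0 in x);        // true  — index 0 exists (it holds the value 1)
console.log(1 in x);        // true  — index 1 exists (it holds the value 2)
console.log(2 in x);        // false — a 2-element array has no index 2
console.log("length" in x); // true  — arrays have a "length" property

// To check whether a *value* is in the array, use indexOf:
console.log(x.indexOf(0) !== -1); // false — the value 0 is not present
console.log(x.indexOf(1) !== -1); // true  — the value 1 is present
```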

See the MDN documentation.