Why does instanceof return false for some literals?

Jonathan Lonowski · Oct 15, 2008
"foo" instanceof String //=> false
"foo" instanceof Object //=> false

true instanceof Boolean //=> false
true instanceof Object //=> false
false instanceof Boolean //=> false
false instanceof Object //=> false

12.21 instanceof Number //=> false
/foo/ instanceof RegExp //=> true

// the tests against Object really don't make sense

Array literals and Object literals match...

[0,1] instanceof Array //=> true
({0:1}) instanceof Object //=> true (parens keep the literal from parsing as a block)

Why don't all of them? Or, why don't they all not?
And, what are they an instance of, then?
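
For reference (easy to verify in any console), typeof reports the first three as primitives and the last two as plain objects:

typeof "foo"   //=> "string"
typeof true    //=> "boolean"
typeof 12.21   //=> "number"
typeof [0,1]   //=> "object"
typeof {0:1}   //=> "object"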

It's the same in FF3, IE7, Opera, and Chrome. So, at least it's consistent.

Answer

John Millikin · Oct 15, 2008

Primitives are a different kind of type from objects created within Javascript. From the Mozilla API docs:

var color1 = new String("green");
color1 instanceof String; // returns true
var color2 = "coral";
color2 instanceof String; // returns false (color2 is not a String object)
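
One wrinkle worth noting, easy to verify though not part of the Mozilla excerpt above: reading a property off a primitive boxes it in a temporary wrapper object, and Object(...) performs the same boxing explicitly:

"foo".length                    //=> 3 (boxed in a temporary String object)
Object("foo") instanceof String //=> true
Object(true) instanceof Boolean //=> true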

You can't construct a primitive with new: new String("foo") produces a wrapper object, while calling String("foo") without new just hands back the primitive. This mismatch is probably why people use typeof "foo" === "string" instead of instanceof.
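
A minimal sketch of a check that accepts both forms (isString is a hypothetical helper, not a built-in):

function isString(value) {
  // covers both the primitive "foo" and the wrapper new String("foo")
  return typeof value === "string" || value instanceof String;
}

isString("foo");             //=> true
isString(new String("foo")); //=> true
isString(42);                //=> false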

An easy way to remember things like this is to ask yourself, "What would be sane and easy to learn?" Whatever the answer is, Javascript does the other thing.