#5 – null is not an object?
Object? Shouldn't null be the absence of a meaningful value? Well, yes. Despite the above result, null is not considered an instance of an object:
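Both checks are easy to reproduce in any console: typeof reports "object", yet the instanceof test disagrees.

```javascript
// typeof null reports "object" – a well-known historical quirk
console.log(typeof null); // "object"

// ...but null is not an instance of Object
console.log(null instanceof Object); // false
```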
Looking at the language specification, you can see that:
4.3.9 *undefined value*: primitive value used when a variable has not been assigned a value.
4.3.11 *null value*: primitive value that represents the intentional absence of any object value.
So, it's a case of misinterpretation: the specification defines null as a primitive value, and typeof reporting "object" is just a long-standing quirk of the language.
#4 – NaN is a Number
What's "Not a Number"? Well... a number!
Funny as this might look, there's more. Not a Number has "identity issues" and is not equal to itself:
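The identity issue is easy to reproduce:

```javascript
// NaN's type is "number"...
console.log(typeof NaN); // "number"

// ...yet NaN is the only value in the language that is not equal to itself
console.log(NaN === NaN); // false
console.log(NaN == NaN);  // false
```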
Well, the technical explanation for this is complex and is related to the types of NaN (quiet NaN and signaling NaN). You can read more here, but the reliable way to check for NaN is the function isNaN():
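A short sketch of the difference between the global isNaN() and the stricter Number.isNaN() added in ES2015:

```javascript
// The global isNaN() coerces its argument first,
// so non-numeric strings also report true
console.log(isNaN(NaN));     // true
console.log(isNaN("hello")); // true ("hello" coerces to NaN)

// Number.isNaN() (ES2015) checks without coercion
console.log(Number.isNaN(NaN));     // true
console.log(Number.isNaN("hello")); // false
```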
#3 – Math.min() > Math.max()
Hmm... So, the minimum value is higher than the maximum?
Let's look at what they "represent":
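Called with no arguments, the results look reversed:

```javascript
console.log(Math.min()); // Infinity
console.log(Math.max()); // -Infinity

// Hence the odd-looking comparison
console.log(Math.min() > Math.max()); // true
```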
Well, that seems wrong. These don't represent the maximum or minimum values a number can hold; they are functions that, given a list of numbers, return the largest or the smallest of the provided arguments.
But why Infinity? And, apparently, in the reverse order? Well, looking at min(): it starts from positive Infinity, and any argument lower than that becomes the new smallest; with no arguments at all, the starting value itself is returned. So, the below makes sense:
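For example:

```javascript
// min() starts from +Infinity and keeps anything smaller
console.log(Math.min(5)); // 5
console.log(Math.min(5, Infinity)); // 5
```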
Number 5 is the minimum between 5 and positive Infinity.
Source: Math.max and Math.min
#2 – true + true === 2
Let's do the math:
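The addition behaves as if both operands were numbers:

```javascript
console.log(true + true);       // 2
console.log(true + true === 2); // true
```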
So, someone new to the language might think that true === 1. Let's check:
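Strict equality tells a different story:

```javascript
// === compares type as well, and a boolean is not a number
console.log(true === 1); // false

// Loose equality coerces true to 1 first
console.log(true == 1);  // true
```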
As odd as it might look, it actually makes sense. "EcmaScript standard specifies that unless either of the arguments is a string, the + operator is assumed to mean numeric addition and not string concatenation." So each true was converted to the number 1 and the two were summed. As for the second part, true === 1 yields false because strict equality also compares type, and a boolean is not a number.
Source: Adding Booleans
#1 – 0.1 + 0.2 !== 0.3
This is the coolest one, and it's not a bug. It actually happens in several other programming languages, like C# (.NET), and I've written about it in the past:
"Computers can only natively store integers, so they need some way of representing decimal numbers. This representation comes with some degree of inaccuracy. That’s why, more often than not, 0.1 + 0.2 !== 0.3."
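A quick demonstration, plus one common workaround: comparing within a small tolerance instead of testing exact equality. The nearlyEqual helper below is an illustrative sketch, not a standard API.

```javascript
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// Illustrative workaround: compare within a tolerance (Number.EPSILON)
const nearlyEqual = (a, b) => Math.abs(a - b) < Number.EPSILON;
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```

Note that a fixed Number.EPSILON tolerance only suits values near 1; larger magnitudes need a scaled tolerance.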