Here’s a bit of code I’ve been looking at as part of an exercise. It works if x is left undefined as presented below (value === 100), and it works if x is set to, say, 10 (value === 10). But if I purposely set var x = null, value ends up set to null as well, when it should be 100.
// JavaScript exercise
var x, value;
if ( x === undefined || null ) {
    value = 100;
}
else {
    value = x;
}
console.log( 'X is ' + x );
console.log( 'Value is ' + value );
// end of exercise
I’ve tried altering the evaluation step in a couple of ways:
if( x === (undefined || null) )
and…
if( typeof(x) === undefined || null )
but neither of those worked either, and both broke the parts that had previously worked into the bargain. What am I missing?
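For what it’s worth, here’s a small sketch I put together while poking at this (my own illustration, not part of the original exercise), tracing what the condition actually evaluates to when x is null. The variable names cond and fixedValue are mine:

```javascript
// `x === undefined || null` parses as `(x === undefined) || null`.
// When x is null, the left side is false, so `||` yields its right
// operand, null — which is falsy, so the else branch runs.

var x = null;
var cond = (x === undefined) || null; // evaluates to null (falsy)

var value;
if (x === undefined || null) {
    value = 100;       // not taken when x is null
}
else {
    value = x;         // taken: value becomes null
}

// One way to test for both null and undefined in a single check is
// loose equality: `x == null` is true only for null and undefined.
var fixedValue = (x == null) ? 100 : x;
```

With x set to null, cond is null, value comes out null (matching the behaviour I’m seeing), and fixedValue comes out 100.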