> They're not the same kind of thing at all, if you insist that we should be able to compare them then they're not equal, and since programmers are human and make mistakes probably comparing them is itself a mistake.
It is literally encoded in the spec.
You're saying that the design makes sense because there's a definition, and the definition is what makes it make sense.
There is value in having defined behaviors, but those behaviors can't be immune from criticism. That's letting the tail wag the dog. The purpose of a program is not to execute the rules of the programming language; it's to perform a real and useful task. If those real and useful tasks are made harder by synthetic and arbitrary behaviors of the language, then the language is wrong. The tool exists to do work. The work does not exist to provide academic examples for the language.
And yes, sometimes it's impossible to determine a reasonable behavior, but that doesn't mean we can't have reasonable behavior whenever one exists.
The tasks are not complicated because of this; it is literally the default behavior in mainstream languages. And no, these are neither synthetic nor arbitrary limitations. The rule is based on types, not on whatever one specific value might mean.
And if you were to define “exceptions”, where do you draw the line? Is “F41S3” false? No? What if I’m a l33t h4x0r? What about 0xF4153? Looks false enough to me.
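To make that concrete, here's a quick sketch of the type-based rule (runnable in any modern JS engine): exactly these values are falsy, and every other value, whatever its spelling, is truthy.

```
// The complete set of falsy values in JavaScript; truthiness is decided
// by type plus these special cases, never by what a string "says".
const falsy = [false, 0, -0, 0n, "", null, undefined, NaN];
console.log(falsy.every(v => !v)); // true
console.log(Boolean("F41S3"));     // true: non-empty string
console.log(Boolean("0xF4153"));   // true: still just a non-empty string
```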
And then all of a sudden, code that is expecting to get an array or undefined gets handed a zero or an empty string, because someone called it with `x && [x]` or `x.length && x`.
And that’s how you end up with a zero showing up on a webpage instead of a list of to-do items when the list is empty.

Programming languages aren't for the machine, they're for humans, and humans make mistakes, so we need to design the language with that in mind. "Truthiness" is a footgun: it increases the chance you'll write something you did not mean without even realising it.
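A minimal sketch of that empty-list failure mode (the `todos` array and the string-building are made up for illustration):

```
const todos = [];

// Intended: produce markup only when there are items.
// Actual: when the array is empty, `todos.length && ...` short-circuits
// to the number 0, and UI libraries such as React render a bare 0 as the
// literal text "0" on the page.
const markup = todos.length && todos.map(t => "<li>" + t + "</li>").join("");
console.log(markup); // 0 (a number), not "" or false

// One fix: make the condition an honest boolean.
const safe = todos.length > 0 && todos.map(t => "<li>" + t + "</li>").join("");
console.log(safe); // false
```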
There are plenty of JavaScript examples that are actually weird though, especially when JavaScript DOES apply meaning to strings, e.g. when attempting implicit numeric parsing.
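For instance, a few of the implicit-parsing cases (each comment is the value a JS engine produces):

```
console.log("10" - 1);   // 9: "-" forces the string to be parsed as a number
console.log("10" + 1);   // "101": "+" prefers string concatenation instead
console.log("10" * "2"); // 20: both strings are parsed as numbers
console.log("ten" - 1);  // NaN: parsing fails, so the arithmetic yields NaN
```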
Because "0" is false. In a logical world, a non-empty string being truthy is fine even if the value is "false". Javascript isn't logical.
```
> "0" == false
true
```
Excuse me?
> In a logical world, a non-empty string being truthy is fine even if the value is "false". Javascript isn't logical.
You must hate our illogical world built on C, because it has the same behavior.
Should "0" also === 0? How about "{}" === {}? When does it stop?
Python does the same thing. I don’t like it there either, but at least it’s more consistent about it.
I think that after the 1970s "Worse is better" languages vanish from the Earth, the last shadow of that thinking left might be JavaScript. Hopefully by then humans won't be writing it, but of course, for "compatibility", it'll still be implemented by web browsers.