
Disagree... for example from the link:

  In JavaScript:

  '5' + 3 gives '53'
  Whereas

  '5' - 3 gives 2
That's just logical and obvious, since "+" is both numerical addition and string concatenation, but "-" is only numerical subtraction, and the '5' starts out as a string.
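Both expressions can be checked in any browser console or Node; a minimal sketch:

```javascript
// '+' has a string operand, so it concatenates:
console.log('5' + 3); // '53'

// '-' only means numeric subtraction, so '5' is coerced to a number:
console.log('5' - 3); // 2
```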

Every language has 'gotchas'. It doesn't really matter which one you pick to learn first. The more important thing is that you don't give up. Possibly there are languages that just make people want to give up, but I'd say if that's the case, they're perhaps not motivated enough to learn.

I started out on BASIC, and after a while I decided it was a piece of shit language and learnt assembly. But it taught me programming which is what I wanted to learn. I'm really glad I learnt BASIC first... essentially I learnt to swim really fast through syrup, and then switched to swimming in water.

The good thing about JavaScript as a first language is that people can start programming in it immediately, in their browser. They have a built-in REPL to help them, as well as a debugger, profiler, etc. They have numerous docs to look at, and if they go to any website they can check the source to see how it works. That's a big win.



Calling your example logical and obvious is only logical and obvious if you're completely trapped within the JS mindset. It's the same as how people defend the absurd semantics of Visual Basic and PHP. It's fine if you like it, but to claim that it's objectively okay is just not supported by fact.

It's also worth considering that even if every language has 'gotchas', some of them have far worse gotchas than others. It is worthwhile to choose a starting language that teaches the fewest bad habits and the fewest bizarre rules so that people can easily learn new languages.


See my comment below. It's not about being trapped in any mindset. It's about thinking logically.

Either '5' - 3 results in an error of some kind, or it evaluates to 2. There is no other logical outcome. Javascript chooses to do the latter.

I also disagree that the starting language matters. It's like saying you need a Steinway grand to learn the piano properly.

Learning is about learning what not to do just as much, if not more, than what to do.

Also if you learn to swim through syrup, imagine how fast you'll be when you try swimming through water...


Your ideas about what is "logical" are completely arbitrary. Look, I can make up rules too:

* '5' - 3 should return '5', since it's the string '5' minus all the instances of the character '3' in it. There is no other logical outcome!

* '5' - 3 should return an empty string, since it's the string '5' with the last three characters removed. There is no other logical outcome!

* '5' - 3 should return '2' -- since we started with a string, the result should turn back into a string. There is no other logical outcome!

* '5' - 3 should return 50, since the only logical way to do math on a character is to take the UTF-8/ASCII value of it and then do the math. There is no other logical outcome!

* '5' - 3 should return undefined, since subtracting from a string typically doesn't produce a reasonable result. There is no other logical outcome!

You have provided absolutely no rational basis for discriminating between the merits of these choices, so your claim that Javascript's choice is one of exactly two "logical" ones is bizarre. If you think Javascript's choice is better, give a reason why it's better, don't just say that it's better.


The time wasted learning the quirky semantics and special rules covering each operator and data type in JavaScript could be better spent learning to reason about algorithms and data structures in a less confusing language. Not all learning is equal, and not all challenges are exactly the same.

The claim that hindering a newbie programmer's attempts to express themselves will make them more effective in a better language is ridiculous. Even if it's true, it misses the point: your goal when teaching newbie programmers should be to teach them good habits and generally applicable skills, and most importantly, to make them love programming.

Having to memorize arcane minutiae and spend tons of time debugging problems caused by stupid design decisions is not going to make people love programming. JavaScript is tremendously accessible by virtue of its ubiquity, but that does NOT implicitly make it a good language for learning to program. Its numerous flaws and divergent implementations will drive away beginning programmers that might otherwise learn to love programming if presented with a better environment.


Please do not downvote maximusprime's comments just because you disagree with them.


It's logical and obvious if you already know the language; otherwise, it's perverse. I had to read your comment (the initial edit of it) twice to be sure you weren't being sarcastic.


No it's not.

Logically, the only other thing that could happen is for an "error" or "exception" to be thrown when you do

  '5' - 3
There are really only two choices. Either convert the '5' to a number and subtract, or throw a hissy fit because it's a string.

That's not perverse. It's very logical.


You're looking at the wrong side of it. The subtraction does behave in a perfectly logical manner. The problem is that, given the behavior of the subtraction operator, the addition operator's actions are illogical. Specifically, I'd argue it's perverse because it breaks commutativity:

'5' + 3 - 3 != '5' - 3 + 3

The logical approach would be to only assume that '+' is a string concatenation if both operands are strings and otherwise type coerce into numbers. Then:

'5' + 3 = 8

'5' - 3 = 2

To someone who doesn't code Javascript for a living, the above seems like a far more consistent and useful behavior.
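The asymmetry in the expression above is easy to verify, evaluating left to right:

```javascript
// '+' concatenates first, then '-' coerces the result back to a number:
console.log('5' + 3 - 3); // '53' - 3 → 50

// '-' coerces immediately, so the arithmetic stays numeric throughout:
console.log('5' - 3 + 3); // 2 + 3 → 5
```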


That would be ridiculous because then what would 'hello' + 1 equal? Everyone would expect it to equal 'hello1'.


Except, not "everyone" would expect that.

I'd expect "hello" - 1 to return NaN, since you can't perform numerical subtraction on something that isn't a number. That's exactly what Javascript does.

In the same way, I'd expect "hello" + 1 to return NaN, since you can't perform string concatenation on something that isn't a string, and you can't perform numerical addition on something that isn't a number.

Edited for clarity
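For reference, what JavaScript actually does with these two expressions:

```javascript
// '-' coerces 'hello' to a number (NaN), so the result is NaN:
console.log('hello' - 1); // NaN

// '+' with a string operand concatenates rather than returning NaN:
console.log('hello' + 1); // 'hello1'
```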


"hello" - 1 doesn't return NaN out of some special-case handling of subtraction from a string.

It returns NaN because "-" coerces its operands with ToNumber: Number("hello") is NaN, and NaN - 1 = NaN.
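The coercion "-" performs is the spec's ToNumber conversion (roughly Number(), not parseFloat), which shows up with a string that parseFloat would partially parse:

```javascript
console.log(Number('hello'));   // NaN
console.log(NaN - 1);           // NaN

// parseFloat stops at trailing junk, but ToNumber does not:
console.log(parseFloat('5px')); // 5
console.log('5px' - 1);         // NaN, because Number('5px') is NaN
```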


I would expect 'hello' + 1 to throw an exception. I would similarly expect '5' - 3 to throw an exception. Why? Because you can't add a string to an integer, nor can you subtract an integer from a string. Doing anything else is arbitrary and unpredictable, IMO, leading to subtle type errors. You want to find type errors as early as possible, rather than letting bogus values flow through the program.

I think Javascript is broken here, and Python has it right.


You're conveniently ignoring the fact that the + operator also means string concatenation.

JavaScript reads it as the string 'hello' concatenated with the value 1, which gets coerced to a string.


Huh? Everyone? I'd expect it to be 'hellp', because I'm assuming it works like Ruby's 'hello'.succ


That's definitely not obvious. Some people think in patterns or generalizations.

String [binary_operator] Number = String

String [binary_operator] Number = Number

That's just begging for further explanation.
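The two patterns above can be seen directly with typeof:

```javascript
// Same operand types, opposite result types, depending only on the operator:
console.log(typeof ('5' + 3)); // 'string' ('53')
console.log(typeof ('5' - 3)); // 'number' (2)
```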

Explaining JavaScript is probably more challenging than, say, Java. Java has many keywords and usages, but explaining the concepts behind those keywords and semantics to students may be a little less confusing.


OK, here's a plain-English explanation...

Left hand side is a string, and "+" is string concatenation as well as addition. So '5' + 3 = '53' (String concatenation). Just as "Hello" + 1 would equal "Hello1". The right hand side is converted to a string.

Left hand side is a string, but "-" is subtraction for numbers. JS converts the '5' to a number, then subtracts. So '5' - 3 = 2 (Numerical subtraction).

That's the explanation, and it's not terribly hard to get past.


The issue isn't explaining it. The issue is it needing an explanation at all.


You picked one of a dozen or more examples from one list of problems with JavaScript. Of course this particular example can be understood by those of us who already have programming experience. JavaScript's type coercion is likely to be a big stumbling block for many beginners and Resig specifically points this out.


I hate this behavior myself, though it's probably because I learned Perl first, where there's a separate string concatenation operator ("." in Perl 5, "~" in Perl 6). I do wish JavaScript had that same separation, so that "+" was always addition and never string concatenation.



