Hacker News

> Most of the Node.js vs Go arguments are weak. It's surprising that Node.js is still outpacing Go in popularity in spite of all this slander.

That's a rather defensive position for Node in what is a very rare use case for the language. It's unsurprising Node.js is still outpacing Go, given the large number of JS developers and the fact that Go is pretty much worthless for hosting front-end web applications (you won't find your favorite asset pipeline in Go).

It's not surprising at all that they switched to a different language for data processing and pipelines, though it is somewhat surprising that they chose Go, given that most teams in a situation like this would switch to the even more popular JVM/Spark/Storm/Kafka stack.

Finally, your statement that "The real underlying issue here is that one CPU core is given too much work while others are more or less idle" isn't accurate - the issue is that one thread has too much work. No modern OS built in the last 20 years would allow a single process to hog all the CPU time unless you explicitly turned off the kernel's scheduling. The root of the problem is event-loop specific, and it's even more Node-specific, since the event loop is pretty much the only way to achieve concurrency in Node. Other languages (like Go and Java) at least offer other models of concurrency.

Consider the following: what if both processes got tied up? Do you just start another process? Would it be feasible or wise to run 1,000 processes? (No, it wouldn't.) This is a problem you won't run into in Go: by using goroutines and taking advantage of its scheduler, you can easily run thousands of goroutines performantly.

That said, this is a rather narrow use case on which to judge that one language is better than the other - it's just the case that Go is likely better suited for these kinds of services.



> the issue is that one thread has too much work

The underlying issue is that when you have a consumer-facing API which accepts HTTP requests with a body, the first thing you should think about is limits.

> Consider the following: what if both processes got tied up? Do you just start another process? Would it be feasible or wise to run 1,000 processes? (No, it wouldn't.) This is a problem you won't run into in Go: by using goroutines and taking advantage of its scheduler, you can easily run thousands of goroutines performantly.

I have no experience with Go, but my understanding is that goroutines are green threads multiplexed over a small thread pool. If you get 5 MB of JSON in N different requests (N = number of cores) at the same time, I don't see Go generating free CPU time out of thin air. The usual way to go about these things in a language without multithreading is to have a queue and a process pool, but this also won't magically solve the issue if all cores are busy.


> If you get 5 MB of JSON in N different requests (N = number of cores) at the same time, I don't see Go generating free CPU time out of thin air.

You don't, but the scheduler normally won't allow one thread to completely starve the CPU. Of course, it's clear they should be using limits; however, the JVM, glibc's thread scheduler, or Go's runtime scheduler likely wouldn't let a single thread starve the CPU completely - eventually the scheduler steps in and diverts resources to another thread.

Without limits, in a threaded solution you would see latency increase, but you wouldn't see the application stop taking requests altogether.

However, there are real benefits to event-loop concurrency, so this shouldn't be taken as a reason one model is strictly better than another.


Every language requires you to be careful.

C makes you do your own array bounds checks. JavaScript makes you worry about tying up your event loop with, e.g., massive string processing.
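A small sketch of that event-loop hazard, using an assumed payload size for illustration: a synchronous `JSON.parse` runs to completion before any timer (or any other request's callback) can fire.

```javascript
// Build a large-ish JSON payload (size chosen for illustration).
const payload = JSON.stringify({
  items: Array.from({ length: 200000 }, (_, i) => ({ i })),
});

const order = [];
setTimeout(() => order.push('timer'), 0); // scheduled before the parse...

const parsed = JSON.parse(payload); // ...but the parse blocks the loop
order.push('parse');

// The 0 ms timer only fires after this whole tick ends, so `order`
// ends up as ['parse', 'timer'] - the parse always wins.
console.log(parsed.items.length); // → 200000
```

Every other callback queued on the same loop - including new HTTP requests - waits exactly the same way the timer does.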

Just get a friggin' asynchronous JSON parser if you're running it on untrusted client input (i.e., any client input). It's not that hard.

Maybe Node should provide a "tainted" feature for modules to mark variables as "untrusted" and emit warnings when functions like JSON.parse are run on them.

The upside of JS is massive - it's easier to reason about control flow than with threads, and much easier to build something FASTER and more efficient than with threads.


I think good languages require you to care about things that matter for your domain. For example, C's manual bounds checks are a consequence of demanding fine-grained control.

The problem for me with Node here is that the whole cooperative-multitasking thing doesn't directly buy you anything. It's a historical accident, not a necessary downside of an otherwise-positive choice. That's distinct from a browser or a GUI environment, where letting a single thread control the display and events really does buy you things you care about.


I care about single-threadedness and the evented paradigm because they provide guarantees and simplify my reasoning about things. I know that if I call a function, it runs to completion and returns synchronously unless it explicitly takes a callback. I know that my objects won't be clobbered by other threads, etc.
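That run-to-completion guarantee can be sketched in a few lines; the function name here is illustrative:

```javascript
// Run-to-completion: between the check and the update no other callback
// can run, so this read-modify-write needs no lock or atomic operation.
let counter = 0;

function incrementIfBelow(limit) {
  if (counter < limit) { // check
    counter += 1;        // update: nothing can interleave on one thread
    return true;
  }
  return false;
}

// Even if these calls were triggered from many queued callbacks,
// each runs whole, so counter can never exceed the limit.
for (let i = 0; i < 5; i++) incrementIfBelow(3);
```

In a preemptively threaded runtime, the same check-then-update pattern is a classic race and would need a mutex or an atomic compare-and-swap.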



