> maybe because I don't like nested callbacks and I don't like Javascript.
I'm an assembly hacker, but I recently found that JavaScript has really improved over the past five years (I suspect, to a great extent, because of V8). The newer ES5 and proposed ES6 standards add a bunch of features geared toward raw data manipulation (like Buffers and Views -- one way to expose contiguous blocks of memory).
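For a concrete taste of what I mean by Buffers and Views (here I'm using the ES typed-array API: ArrayBuffer, typed views, and DataView), you can alias the same raw memory through different element types, something that used to be impossible in JavaScript:

```javascript
// Sketch: an ArrayBuffer is a raw contiguous block of memory; typed
// "views" reinterpret the same bytes without copying.
const buf = new ArrayBuffer(8);       // 8 raw bytes
const asBytes = new Uint8Array(buf);  // byte-level view
const asInts = new Uint32Array(buf);  // 32-bit view over the same memory

asInts[0] = 0xdeadbeef;               // write through one view...
console.log(asBytes[0].toString(16)); // ...read through another
                                      // ("ef" on a little-endian machine)

// DataView lets you control endianness explicitly:
const dv = new DataView(buf);
dv.setUint32(4, 0xcafebabe, false);   // big-endian write
console.log(dv.getUint8(4).toString(16)); // "ca"
```

That's the kind of low-level control an assembly hacker can appreciate.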
As for callback hell, there are tools to help you out: there are Node modules implementing fibers, and utilities like async.waterfall (https://github.com/caolan/async) that flatten the callback pyramid.
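To illustrate the pattern (this is a toy stand-in for async.waterfall, not the library's actual implementation): each task gets the previous task's results plus a callback, errors short-circuit the chain, and the tasks read as a flat list instead of a nested pyramid.

```javascript
// Toy sketch of the waterfall pattern. Note: tasks.shift() mutates the
// caller's array -- fine for a demo, not for library code.
function waterfall(tasks, done) {
  function next(err, ...results) {
    if (err) return done(err);                // any error aborts the chain
    const task = tasks.shift();
    if (!task) return done(null, ...results); // chain exhausted
    task(...results, next);                   // pass results forward
  }
  next(null);
}

// Three steps listed flat, instead of three levels of nesting:
waterfall([
  (cb) => cb(null, 2),
  (x, cb) => cb(null, x * 3),
  (y, cb) => cb(null, y + 1),
], (err, result) => {
  if (err) throw err;
  console.log(result); // 7
});
```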
> most serious back end servers are not written in node either.
Define "serious back end servers". Most architectures involve a plethora of software servers (we passed the age of a single Apache server handling all requests years ago), and many companies use Node in their stack (I remember an HN post about LinkedIn seeing Node.js performance far exceed that of their older Ruby on Rails platform).
As for Google and Amazon, at that scale I'm sure they're running custom server code (possibly in a proprietary language) on custom hardware.
> Not sure if arithmetic exercises on v8 show anything.
You can run a similar test with `ab` against custom servers, but the results are fairly inconclusive, due to the fact that the dominating factor is the network stack (and not the event reactor model or implementation). Arithmetic benchmarks are a proxy for measuring the distance to bare metal.
> due to the fact that the dominating factor is the network stack (and not the event reactor model or implementation)
Good point. Sometimes it is necessary to tweak OS network settings (increase the number of ephemeral ports, raise file descriptor limits, etc.).
> Arithmetic benchmarks are a proxy to measure the distance to bare metal.
Presumably Node is meant for heavily I/O-concurrent environments, with multiple connections coming and going. Isolated arithmetic benchmarks don't matter much, nor does how close to the metal they run; that is not what Node is mostly used for (and if it is, those computations block the whole process and make for a pretty bad server). A better test would be to see how Node handles 100,000 connections, so maybe a mix of concurrent connections and arithmetic: 100,000 users connect and each performs some arithmetic operations.