JavaScript Promises: There and back again (html5rocks.com)
217 points by sacado2 on Dec 17, 2013 | 74 comments



Broader adoption of JS on the server will only begin after ES6 generators land in Node. Due to the complexity of async code, writing JS on the server is not easy for the average team.

As of now, you could use Promises but it is still kind of messy without generators. Here is a simple example:

  #Without generators
  Credentials.get({ token })
      .then (credentials) =>
          User.get({ username: credentials.username })

  #With generators
  credentials = yield Credentials.get({ token })
  User.get({ username: credentials.username })
We use Facebook's regenerator project to compile ES6 features down to the JavaScript features Node currently supports. If you don't want to use the --harmony flag, you should really try it out. https://github.com/facebook/regenerator


Whilst JavaScript is so important on the browser side thanks to the lack of choice, I would hope the Node infatuation eventually passes on the server, where you do have a choice. Async calls in Node are done via explicit passing of closures. Callbacks, Promises, etc. all look bad compared to simple sequential instructions like:

   credentials := NewCredentials(token)
   User.findByName(credentials.userName)   
The above is in Go, and it is just as fast, if not faster. While all requests run on their own goroutines, the runtime schedules them using the same efficient async I/O, which does not need to be made visible the way it is in JavaScript.


  // withThis(obj, f): call f with `this` bound to obj, forwarding all arguments.
  var withThis = (object, f) => function bound() {
    return f.apply(object, arguments);
  };

  function getCredentialsByToken(token) {
    return newCredentials(token)
    .then(withThis(User, User.findByName));
  }


That reinforces zsombor's point nicely.


I was hoping it shows that it really depends on your style (procedural or functional). Thanks to generators and arrow functions, ES6 lets you write code using either.

When you're writing in functional style, you'd use arrows, which work nicely both for small lambdas and for writing combinators that can be partially applied. When you're writing in a more procedural style, generators are your "do syntax" equivalent(-ish):

  async(function* getCredentialsByToken(token) {
    var credentials = yield NewCredentials(token);
    return User.findByName(credentials.userName);
  });
ES6 is a lot more powerful than it looks :)


This isn't a question of "my style" -- zsombor's notation is objectively better. It achieves the same results with fewer tokens and less intellectual overhead.

The difference between "functional" and "procedural" style here is nothing more than the monadic bind operator vs. do-notation, but what you have above, or in the initial response, is nothing like that.

Also I'm not sure "arrows" -- http://www.haskell.org/haskellwiki/Arrow -- means what you think it means in the context of functional programming? It's not sugar for `function(){}`...


Whoops, yes I mean "arrow function syntax", not arrows.

As for style, using combinators is usually considered to be specific to functional programming, right?

IMO the interesting part is how the terse arrow function syntax in ES6 puts combinators on the same level of expressiveness as everything else. Contrast with ES5, where you have to use the awkward `return function() {}` syntax, and you have to do something similar in Go - `return func (param type) type { ... }`.

Both styles are now almost equally terse, and you can pick either depending on which feels more appropriate in the given situation.
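
For instance, here's a rough before/after sketch (`prop` is just an illustrative helper, not anything from ES6 itself):

    // ES5: the combinator drowns in `function` keywords.
    function prop(name) {
      return function (object) {
        return object[name];
      };
    }

    // ES6: the same combinator reads like the expression it is.
    var prop = name => object => object[name];

    // Either version is used the same way:
    Credentials.get({ token })
      .then(prop('username'))
      .then(username => User.get({ username: username }));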

Also: https://gist.github.com/Gozala/7242467


Those aren't goroutines; there's a yield sitting right there. And there's yield/return in your second example, and there's the equivalent in the promises example.

This isn't a terseness issue; it's an error issue. Favoring yield/return and equivalents over sequential code in CSP is exactly the same as favoring goto over while, or goto over exceptions. Necessary sometimes? Yes. Something to be celebrated and preferred? No.

The trouble is, of course, that in Javascript runtimes the "yield" or equivalent is how scheduling is achieved, because there is no capacity for preemption.

Thus, Node.js returns us to the bad old days of Windows 3.1 and System 6, where one faulty bit of code grinds the entire system to a halt.

Edit: Go goroutines of course also have a cooperative scheduler currently, but at least the result of having it isn't baked into the idioms and language constructs.


True. Though I suppose that writing an async generator runner that utilizes multiple workers (e.g. via web workers in the browser, via isolates in Node - if they ever happen) is also possibly doable at some point in the future. Those generators won't have any access to the current scope, but that might be better[1].

Still, scheduling would continue to be cooperative via yield (just split amongst multiple threads). Go is clearly much further ahead in that regard :D

[1]: https://groups.google.com/forum/#!topic/golang-nuts/hZjvk-EP...

note: I'm curious, doesn't shared state from those closures let you shoot yourself in the foot in Go? You still have to be careful not to use anything non-thread-safe that can potentially change...


Sure, but you always opt in to using variables that are scoped outside of the current thread. You don't opt in to the explicit control operators encouraged by the style where you don't use sequential statements -- they are required in order to program correctly. This is why I think the goto analogy is appropriate.


As of Go 1.2 (released a few weeks ago) goroutines are preempted. Sure, currently this only happens in the presence of function calls, but it is hard to accidentally create a non-preemptible infinite loop without function calls.


Completely agree!

This isn't a new thing anyway; there are currently a lot of implementations. I even wrote one myself[1].

By the way, most (sane) libraries that use yield/await perform a lot better than vanilla promises, because when you call .then() on an already-resolved promise you have to wait for the next tick, or setImmediate, or whatever. Yield-based implementations can continue right away in that case.

[1]: https://github.com/omgtehlion/asjs
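
A rough sketch of why that is, assuming a promise library that exposes synchronous inspection (e.g. Bluebird's isFulfilled()/value()) and ignoring error propagation:

    // Drive a generator; continue synchronously when the yielded promise
    // has already settled instead of always deferring to the next tick.
    function step(gen, input) {
      var result = gen.next(input);
      if (result.done) return;
      var promise = result.value;
      if (promise.isFulfilled && promise.isFulfilled()) {
        step(gen, promise.value());            // no setImmediate needed
      } else {
        promise.then(function (value) { step(gen, value); });
      }
    }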


> Because when you call .then() on an already-resolved promise you have to wait for the next tick, or setImmediate, or whatever. Yield-based implementations can continue right away in that case.

Actually, I learned to embrace the next-tickness of promises in Q, because I don't have to think twice about different possible execution orders depending on the promise's current state.

I would honestly prefer immediate continuation to be opt-in. This worked well in .NET (its Task's ContinueWith has a flag called TaskContinuationOptions.ExecuteSynchronously that tells the scheduler “you can run me right away if you like”).


If you use yield in JS you don't have to think about execution order either, because the execution sequence will always be the same, and JS lacks real threading, so you won't notice it anyway.


What about this kind of code?

    var x;

    promise().then(function (result) {
      x = result;
    }).done();

    x.doStuff();
This is a contrived example, and pretty stupid too. Still, if promises are always resolved on the next tick, it will always fail. If promises may be resolved in the same tick, then you may miss the mistake.

A more real-world example is synchronization code like this[1]. Can you say at a glance if this code works fine both with resolved and pending promises? You'd have to check every `then`. You don't get this kind of problem with always-asynchronous promises.

In other words, it's a tradeoff between performance and introducing the possibility of several different code paths, and it gets messier with non-trivial code.

  [1]: https://gist.github.com/gaearon/7930162


That is not true at all about Promises/A+.

The requirement is to always call handlers with an empty stack and in a certain order. In practice, only the naivest implementations would call the platform scheduler for every `.then()`. An implementation can call the handlers synchronously in all the cases where it would matter.


> In practice, this requirement ensures that onFulfilled and onRejected execute asynchronously, after the event loop turn in which then is called, and with a fresh stack.

http://promises-aplus.github.io/promises-spec/#notes (emphasis mine)


Notice that's in the explanatory Notes section, not the normative Requirements section.


That's just the specification. Doing exactly what the spec says is the definition of a naive implementation.


What is the point of having a specification if everyone decides not to follow it?


I did not say that one should not comply with a specification. There is a big difference between an implementation complying with the specification and an implementation whose source code is basically just a duplication of the spec (which obviously runs very slowly).

So my point is that even if the spec says "Do A, then B, and then C", that's not EXACTLY what an implementation has to do, as long as the outcome is exactly the same in both cases.

For example, a property load is a hugely complex operation in the ECMAScript spec that would take thousands of instructions to implement naively. In practice, V8 will in many cases get away with literally one machine instruction in the best case. None of the specified property-load steps are observably happening, yet somehow V8 is spec-compliant.


Good, but can you yield from the global scope? If not, you still need to write at least a function.

The hardest thing in JS is this:

    function F() { throw "error"; }

    try { setTimeout(F, 1000); } catch (error) { console.log("error"); }
We can't catch async errors outside the async callback. Will generators fix this? (By the way, that's why you should never throw any error yourself in JS.)


You cannot yield from the global scope; you have to explicitly be inside a generator function. This isn't really a problem in practice except for one-off scripts, since most modules don't have side effects but instead export functions (possibly generator functions). But for those one-off scripts it is indeed a bit awkward... you could imagine ways to get around it.
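
One workaround (a sketch, reusing the kind of `async` runner shown elsewhere in this thread and assuming it returns a callable) is to wrap the whole script in an immediately-invoked generator:

    // Hypothetical one-off script: everything lives inside one generator.
    async(function* main() {
      var credentials = yield Credentials.get({ token });
      var user = yield User.get({ username: credentials.username });
      console.log(user.name);
    })();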

You can catch async errors with generators and promises; see e.g. the linked post, or http://jlongster.com/A-Study-on-Solving-Callbacks-with-JavaS...


> We can't catch async errors outside the async callback. Will generators fix this?

You can inject an exception into a generator via the throw method[0], and it'll be raised at the yield, as if `yield` had thrown[1].

JS exception handling facilities are still crap, though.

[0] http://wiki.ecmascript.org/doku.php?id=harmony:generators#me...

[1] http://wiki.ecmascript.org/doku.php?id=harmony:generators#in...
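
Concretely, that's what generator-based runners do for you: when the promise rejects, they call gen.throw(), so a plain try/catch inside the generator catches it. A minimal sketch (the `run` driver and `fetchSomething` are illustrative, not any particular library; it assumes every yielded value is a promise):

    function run(generatorFn) {
      var gen = generatorFn();
      function step(result) {
        if (result.done) return;
        result.value.then(
          function (value) { step(gen.next(value)); },
          function (error) { step(gen.throw(error)); }  // re-raised at the yield
        );
      }
      step(gen.next());
    }

    run(function* () {
      try {
        var data = yield fetchSomething();          // assumed to return a promise
      } catch (err) {
        console.log('caught async error:', err);    // lands here on rejection
      }
    });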


That's why node modules generally don't throw exceptions, but rather pass them as the first argument to the callback.

I think you can catch async errors if you call the throw method on the generator.

http://wiki.ecmascript.org/doku.php?id=harmony:generators#me...


  var just = property => object => {
    var o = {}; 
    o[property] = object[property]; 
    return o;
  }

  Credentials.get({token})
    .then(just('username'))
    .then(withThis(User, User.get));
withThis available in a comment below.


Cool trick with the redundant indentation which can be applied to generators too:

    #Without generators
    Credentials.get({ token })
        .then(credentials => User.get({ username: credentials.username }))

    #With generators

    var credentials = 
        yield Credentials.get({ token })
            User.get({ username: credentials.username })


This might not be a very good example, but generators have more going for them than just cleaner code.

1. You write code almost as if it were synchronous. You can use any control-flow statements you are familiar with. Just don't forget to turn a promise into a real value with the `yield` keyword.

2. How many times have you written statements like these?

    X.prototype.foo = function() {
        var self = this;
        // or
        var that = this;
        this.bar().then(function(x) {
            that.baz(x);
        });
    };
or }.bind(this));

With yield you can just write your logic, not boilerplate:

    X.prototype.foo = async(function*() {
        var x = yield this.bar();
        this.baz(x);
    });


1. Yes this is the problem. It is very easy to write something like this:

    var a = yield A();
    var b = yield B();
This is massively less performant than:

    var a = A();
    var b = B();
    a = yield a;
    b = yield b;
In promises you naturally do `Promise.all([A(), B()]).spread((a, b) => /* use a and b */);`

2. Why did you stop using arrow functions?

    X.prototype.foo = function() {
        this.bar().then(x => this.baz(x));
    };
3. You are overvaluing control-flow constructs. Try-catch is always a catch-all error silencer, whereas promises can extend `.catch()` and `.finally()` to overcome those flaws. Hand-written loops are not that common; one mostly uses .filter, .map, and .reduce in these kinds of workflows.


1. You can easily write the same crappy code with promises anyway. And if you do think that `Promise.all` will save you, then you can use it with yield (see the sketch at the end of this comment). With destructuring you don't even need `.spread`.

Even more, with yield you can easily achieve parallel execution just by moving the `yield` keyword to the place where you need the actual value. Good luck doing that with promises.

2. Unfortunately, traceur-compiler does not support the fat arrow yet.

3. If you use try-catch just as a silencer, man, I have bad news for you. And if you provide real error handling in `.fail()`, then what's the problem with writing the same code inside a catch clause?
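
To make point 1 concrete, the yield + Promise.all + destructuring combination looks roughly like this (a sketch in the same `async` runner notation used earlier in the thread; `A`, `B`, and `combine` are placeholders):

    var foo = async(function* () {
      // Start both operations first, then yield: they run concurrently.
      var [a, b] = yield Promise.all([A(), B()]);
      return combine(a, b);
    });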


Well yes you can achieve better concurrent execution if you essentially duplicate the code. I also showed you how to do it with promises already (in 1 line) so not sure why you are wishing me good luck :)

There is just no way to use yield without it either being superfluous or sacrificing concurrency. Of course it is great when you have a sequence I guess, but promise code with arrows is not messy in comparison, maybe slightly more verbose at best.

Consider something like this which is far from optimal:

    let suspend = require('suspend'),
        request = require('request');

    let getParsed = suspend(function* (urls) {
        urls.forEach((url) => request(url, suspend.fork()));
        return (yield suspend.join()).map((r) => parseBody(r.body));
    });
This is again very easy to write inadvertently.

Maximizing concurrency:

    let getParsed = Promise.coroutine(function* (urls) {
        return yield urls.map((url) =>
            request(url).spread((response, body) => parseBody(body)));
    });
However, now that we have maximum concurrency, it is entirely pointless to even have a generator:

    let getParsed = urls =>
        urls.map((url) =>
            request(url).spread((response, body) => parseBody(body)));
I said try-catch silences all errors, even ones that are not meant to be handled but are bugs in the code and should never have been thrown in the first place. For example, if you have a typo in your code, you will not know about it; instead you will handle it like an expected error, such as a network error.

Since promise `.catch()` is not limited by the language, you can do e.g. `.catch(NetworkError, e => ...)` or whatever.
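
For example, with a library that supports predicate catches (Bluebird does), expected failures get handled while programmer bugs still surface (`NetworkError`, `getUser`, and the other names here are hypothetical):

    getUser(id)
      .then(user => renderProfile(user))
      .catch(NetworkError, e => showRetryBanner(e))   // expected, recoverable failure
      .catch(e => { reportBug(e); throw e; });        // typos and bugs still blow up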


While I agree that using callback parameters instead of variable assignment is a bit fugly, I think it's not that bad and just takes some time to get used to. For me, the thing that really benefits the most from generator syntax is structured control flow, especially loops: if you use generators you can use the existing JS for and while loops, which is much better than doing the same with raw callbacks (manual recursion is basically gotos) or an async library (promise-based or not, it's going to be very verbose and still won't play well with "break" and "return" statements).
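
For example, sequential fetching with early exit is just a plain loop inside a generator (a sketch; `async` and `fetchPage` are assumed promise-returning helpers, not a specific library):

    var fetchAll = async(function* (urls) {
      var results = [];
      for (var i = 0; i < urls.length; i++) {
        var page = yield fetchPage(urls[i]);  // ordinary for loop, one request at a time
        if (page.isLast) break;               // `break` and `return` just work
        results.push(page);
      }
      return results;
    });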


In the submitted article, two callback methods are usually provided (one for success and another for failure). Is that possible when using generators? If so, what does that look like?


In generator code you'd use try/catch to handle failure instead.

    try {
      var user = yield Users.get(1234);
      return user.name;
    } catch(ex) {
      // do whatever
    }
rather than..

    return Users.get(1234).then(
      function (u) { return u.name; },
      function (ex) { /* do whatever */ }
    );


Those are not entirely equivalent. For example, if you make a typo when writing "user" or "u", the first one will catch that typo (cannot get property name of undefined) while the second one will not.

  Users.get(1234)
    .then(u => u.name)
    .catch(ex => { /* do whatever */ });


What are the advantages of compiling to ES5 instead of using --harmony?


"Cross-platformness". Someone who wants to use your library and doesnt use harmony won't have to start his node process up with `--harmony` which is sort of a win I guess


I recently found that AngularJS has support for promises. I replaced some of the callback code to use promises and it's a lot more readable.

Callbacks:

    var aDone = false, bDone = false;
    Backend.CallA(param, function() {
      aDone = true;
      if (aDone && bDone) { doC(); }
    });
    Backend.CallB(param, function() {
      bDone = true;
      if (aDone && bDone) { doC(); }
    });
Promises:

    var promiseA = Backend.CallA(param);
    var promiseB = Backend.CallB(param);
    $q.all([promiseA, promiseB]).then(doC);


IIRC Angular's promises are based on a stripped-down version of Kris Kowal's Q. You may want to check it out:

https://github.com/kriskowal/q


As of AngularJS version 1.2, the promises provider ($q) is fully Promises/A+ compliant.

http://blog.angularjs.org/2013/11/angularjs-120-timely-deliv...


What version of JavaScript has Promises, and does anyone know the estimated time it might take for them to make it into V8 and subsequently Node.js?

The biggest issue I had with promises is interfacing with non-promise code. There are methods to 'lift' Node.js code that uses callbacks into promises, but the resulting code is much noisier.

I'm a bit disappointed that the official JS version didn't cut down on the visual noise of the piping so the actual work being done stands out. Maybe there needs to be a different type of syntax highlighter that mutes the promises piping and highlights the actual calls.


Promises are in ES6. There is a prototype version of them in V8 behind the usual --harmony flag, although it is very bad and deviates from the standard quite a lot. E.g. it passes 221/879 of the Promises/A+ tests [1], which only test the then method; it also deviates on other methods, as noted in the article [2].

Node tends to keep up with V8 versions pretty well, so I anticipate by the time Node 0.12 is released the --harmony flag will enable the Promise global. (Hopefully by then the V8 team will have gotten their act together on standards compliance.) Node core though is pretty committed to the error-first callback pattern; that will be a longer shift, I think.

[1]: https://twitter.com/promisesaplus/status/407721467778846720 [2]: http://www.html5rocks.com/en/tutorials/es6/promises/#toc-api


The implementation in Blink is now passing the Promises/A+ tests at 100% [1], and we'll be updating the V8 implementation accordingly. We want to be shipping the one in V8 ASAP, but as you note it needs some work, so we'll be addressing those issues and then getting it out the door very soon.

[1]: https://twitter.com/ChromiumDev/status/413035115665575936


I've just been using the Q promise library and it works fine in NodeJS. The Q wrappers for NodeJS-style functions are a bit ugly, but it works very well.

The real problem is the core NodeJS libraries aren't promise compliant. I don't know if there's a module that cleanly wraps all of these in their promise analogs but that'd be the best thing here.

Calling Q.nsend() on everything is messy, but is a lot better than dealing with ridiculously nested callbacks.


Have you looked into bluebird[1]? It and most of the available promise libraries have a promisify-esque function where you can simply do this:

    var fs = Promise.promisifyAll(require("fs"));
And all of the regular methods in fs now have promise-returning counterparts. Q also seems to be one of the slowest[2] promise libraries there is; I'm not sure why so many people use it. That guy has a rather large benchmark table from a couple of months ago as well, and Q seems to be shockingly slow compared to the dozen or so other promise libraries out there.

1. https://github.com/petkaantonov/bluebird

2. http://spion.github.io/posts/why-i-am-switching-to-promises....


> The real problem is the core NodeJS libraries aren't promise compliant.

This isn't a problem at all in practice.

To say that node core is not "promise compliant" gives people the idea that there is something that prevents them from using promises in node. In reality, most promise libraries have a "denodeify" function that will convert any function that expects a node-style callback into one that returns a promise instead.

  var fs = require('fs');
  var rsvp = require('rsvp');
  
  // Create a version of fs.readFile that returns
  // a promise instead of expecting a callback.
  var readFile = rsvp.denodeify(fs.readFile);
  
  readFile('/etc/passwd', 'utf8').then(function (contents) {
    // do something with the file contents
  }, function (error) {
    // handle the error
  });
It is similarly easy to convert functions that expect other types of callbacks (read: not node-style callbacks where the first argument is always an error) to functions that return promises using deferreds.
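
For the non-node-style case, a deferred-based wrapper looks something like this (a sketch; rsvp.defer() is real RSVP API, but `geo.lookup(address, cb)` is a made-up callback API used only for illustration):

  function lookupLocation(address) {
    var deferred = rsvp.defer();
    geo.lookup(address, function (result) {
      // Translate the ad-hoc callback contract into resolve/reject.
      if (result.ok) deferred.resolve(result.coords);
      else deferred.reject(new Error(result.message));
    });
    return deferred.promise;
  }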


That's a good pattern, but obnoxious to apply on large numbers of functions. If something just did this for you in advance, that'd be what I'm looking for.

Trouble is, NodeJS set a convention and now everyone adheres to it. I'm not complaining that there are standards; in fact that's a good thing. Part of me is just disappointed that promise standardization happened too late in the game for NodeJS to build around it.


It really isn't much of a problem. A lot of the time it's as easy as replacing

  var lib = require('lib');
with (given var Promise = require('bluebird'))

  var lib = require('lib');
  Promise.promisifyAll(lib);
And in the case of classes, you'd use promisifyAll on about 2-3 prototypes instead.


I assume that some future major revision of node will move its API across and that in the meantime, promise versions of core libraries will start appearing on npm.


> The real problem is the core NodeJS libraries aren't promise compliant. I don't know if there's a module that cleanly wraps all of these in their promise analogs but that'd be the best thing here.

There are a few of these on npm, but I think the best one is probably the pr package [1], which lets you do e.g. var fs = require("pr/fs") and then use Bluebird-promise-returning versions of the fs methods.

[1]: https://npmjs.org/package/pr


Interesting. I found that TameJS or Iced CoffeeScript mostly solved the async terribleness problem in node, but you basically are using a non-standard toolchain at that point, and setting that up for a whole team is sort of absurd.

Also, I'm not fully convinced that node is worth the code complexity that all of these solutions bring with them when go or scala can give you better performance with much cleaner code.


I've been really enjoying ToffeeScript, which provides implicit continuations. It was trivial to implement coroutines on top of it: https://gist.github.com/luciangames/7776345.

It works very well if you bear in mind what's happening to the call stack. In my case, every call to pause! or wait! is scheduling a continuation to be called later by the game loop. If I'm not mistaken, the overhead of each continuation is just that of a closure.

It's very comfortable, coming from a background in Stackless Python, Lua, and Unity.

I wish more people would take a look at ToffeeScript.


Fantastic! I don't know if promises are the ultimate solution to async programming, but they're a hell of a lot better than plain old callbacks.


Unrelated to the article content, I just noticed the table of contents on the left side. The slow fade in/out when you mouse over it is perfect! It's fixed position so you always see it, yet it's totally unobtrusive while reading the article. Brilliant!


Slightly off-topic but the new HTML5Rocks site is much more responsive and easier to read.


Agreed, it looks great. I was poking around the site hoping to find a link to a Bootstrap-esque code repository of the base HTML and CSS.



I didn't realize the entire site was on github (obviously I didn't realize any of the site was on github to be honest). Thanks so much for the links.



Call me crazy, but I fail to see the improvement in this weird syntax and I've had enough. Go, there we go.


Composability and error handling.


I have that already with libraries like caolan/async and the Node.js convention of error-first callbacks. Now, of course I would like to have those features natively, but not at the cost of more (ugly) syntax. My point is that if I have to learn new syntax, I would rather use the time to learn a modern and efficient new programming language.


Does this mean anything for NodeJS programmers that rely on an existing promise library (e.g. Q)?


Well, presumably this makes its way into V8 and therefore Node. I'd assume the performance is better than Q.

The biggest issue for me is Q's habit of hiding errors despite me defining a Q.onerror handler. Hopefully a native implementation would fail a little louder.


I've had this issue with Q and when.js. Sometimes things would fail inside a then chain, and I couldn't figure out where exactly it failed: it fell through a couple of thens, and I'm not sure where the error came from or how it got that way.


Have you tried setting

    Q.longStackSupport = true
? This is invaluable for debugging (although it does slow things down a lot, so don't use it in production)


Yep, and this is exactly why it's so nice to have promises built into the language. That way, the implementing engine could provide long stack traces for them, as well as extra inspection and tooling for all currently pending async operations - and it could do that at the lowest possible performance cost.


Q contains some helper functions for creating generators, tying them to a promise, and automatically starting the generator, e.g. Q.async(). I've found that I'd rather continue using Q with Q.async()/Q.spawn() than deal directly with the generators and manage them myself.
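
Roughly, that usage looks like this (a sketch; `Users.get` is a placeholder for anything that returns a promise):

    var Q = require('q');

    // Q.async wraps a generator into a function that returns a promise.
    var getUserName = Q.async(function* (id) {
      var user = yield Users.get(id);
      return user.name;
    });

    // Q.spawn runs a generator immediately; errors surface instead of being swallowed.
    Q.spawn(function* () {
      console.log(yield getUserName(1234));
    });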


Are there tools that visualize code as a flowchart, like he did halfway through the tutorial?


Does the latest NodeJS have this?


npm install es6-promise for a shim


you can always `npm install <your-favorite-promises-library>`


nodewebkit has Promises.


not yet



