Cramp is a fully asynchronous web framework that's very good at dealing with thousands of open connections. It supports WebSockets (latest protocol) and Server-Sent Events out of the box. Optionally, it supports using Ruby 1.9 fibers to prevent spaghetti evented code, which means seamless Active Record integration.
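For anyone who hasn't tried it: a Cramp action is a small class riding on EventMachine, and even a plain response goes through the same render/finish calls used for streaming. A rough hello-world sketch from memory of the README, so treat the exact class and method names as approximate:

    # config.ru -- Cramp needs an async Rack server such as Thin or Rainbows!
    require 'cramp'

    class HomeAction < Cramp::Action
      # start is invoked once the connection is open
      def start
        render "Hello World"  # push data down the open connection
        finish                # close the connection when done
      end
    end

    run HomeAction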
Goliath has a websockets branch and also supports EventSource and streaming pretty easily. If that's something you need, definitely look into Goliath as well.
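For comparison, streaming in Goliath means returning a streaming response and pushing chunks from EventMachine timers. This is a sketch from memory of Goliath's streaming example; the helper and method names (streaming_response, env.stream_send, env.stream_close) are as I remember them, so double-check against the repo:

    require 'goliath'

    class Clock < Goliath::API
      def response(env)
        # push a chunk down the held-open connection every second
        timer = EM.add_periodic_timer(1) { env.stream_send("tick #{Time.now}\n") }

        # stop after 10 seconds and close the stream
        EM.add_timer(10) do
          EM.cancel_timer(timer)
          env.stream_close
        end

        # tell Goliath the body will be streamed rather than returned
        streaming_response(200, { 'Content-Type' => 'text/plain' })
      end
    end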
Personally, I think Cramp's interface is a lot better than Goliath's when it comes to streaming, because everything is streaming in Cramp. YMMV. https://github.com/lifo/cramp/blob/master/examples/sse/serve... in case you want to compare. Also, Goliath's websockets branch looks outdated.
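To give an idea of what that SSE example boils down to (a sketch from memory, not the actual file; in particular the transport setting and timer syntax may differ from the linked serve script):

    require 'cramp'

    class TimeFeed < Cramp::Action
      # same Action class as any other Cramp endpoint, just a different transport
      self.transport = :sse

      # push an event down the open connection every 2 seconds
      periodic_timer :send_time, :every => 2

      def send_time
        render "Current time: #{Time.now}"
      end
    end

    run TimeFeed  # in config.ru, served by Thin or Rainbows!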
FWIW, I do like Goliath. The more async solutions in Ruby, the better!
HN meta point here. This link was posted 18 days ago too, but with a forward slash at the end: http://news.ycombinator.com/item?id=2874982 (nonetheless it gets an upvote again for being cool). Potential fix/update for news.arc on this issue or just a useful trick to let slide? :-)
Ha! I didn't know about the '/' trick. I asked in #startups about giving my old thread a bump, but no one responded, so I tried making a new post and it worked!
It's unfortunate, but there should be better ways for people to repost URLs like this one over time. This seems like an appropriate repost in any case, considering the first one didn't get its due props. It's the links that do crazy well and then somehow get reposted days later that are really annoying ;-)
I didn't know about the "/" approach, but I've used the "?" suffix "trick" a couple of times over the years when resubmitting things that other people had given boring/inappropriate titles that didn't catch on. I believe "#" will also work, since AJAXified URLs need to be supported.
I'd love to see it too! However, I do think the comparisons should be about the ability to handle a large number of open connections seamlessly, not req/sec, along with the relevant memory/CPU usage.