
I really liked the turkey, and I thought the solution looked elegant, so I had a go at it in CoffeeScript and node-canvas: https://gist.github.com/1475034

All requests completed in under 66 ms, with a mean under 19 ms.




AppEngine instances, unless they're running as special backend instances, live in a containerized machine. These machines are very slow and don't have much RAM: 600 MHz and 128 MB. Based on the source posted on the blog, this app was running on a normal instance, since the source did not contain a backends.yaml file.

So it makes sense that a Core 2 Duo running at 2.13 GHz would be ~2x as fast, even with slow CoffeeScript. My personal experience running Go code locally (on a Core i5 2500K) versus on a normal AppEngine instance showed a slowdown of around 6x or more.


This isn't true any more. They recently announced faster frontend instances with more memory (http://googleappengine.blogspot.com/2011/12/app-engine-161-r...), and I'm willing to bet this has been available internally since long before Thanksgiving.

I'd bet money that this app used the beefiest class of frontend instances.


Nope, it's just a standard App Engine setup.


I can't argue with that, though I'm running this on just one core (single process).

If I were better with the v8 profiler, I would check how much time it spends in JS land vs. C, since node-canvas is all C, as is the http parser.


Go runs on a single core unless you start using channels and/or threading libraries. So it should be apples-to-apples.


True, but the base http package handles each request on a different goroutine, so your requests could be handled concurrently:

http://golang.org/pkg/http/#Server.Serve

Though adgar is right: for now, I believe you need to set GOMAXPROCS to utilize multiple cores.

http://golang.org/pkg/runtime/#GOMAXPROCS
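
A minimal sketch of the goroutine-per-request behavior described above, using the modern import path net/http (the linked docs predate Go 1, when the package was just "http"); the handler name and port here are made up for illustration, not taken from the doodle's source:

    package main

    import (
        "fmt"
        "net/http"
    )

    // turkeyHandler is a stand-in for the doodle's real handler.
    // The standard server accepts each connection and then runs the
    // handler in a fresh goroutine, so requests are handled concurrently
    // even though, at the time, they shared a single OS thread by default.
    func turkeyHandler(w http.ResponseWriter, r *http.Request) {
        fmt.Fprintf(w, "gobble gobble: %s\n", r.URL.Path)
    }

    func main() {
        http.HandleFunc("/", turkeyHandler)
        http.ListenAndServe(":8080", nil)
    }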


Go runs on a single core even if you use channels, unless you specify an environment variable (something like GO_MAX_PROCS). Not sure about threading libraries, since I haven't really considered using them in Go. I'm actually curious what people are using them for in Go - maybe games?


GOMAXPROCS it is :-).

It's a quick fix; eventually the runtime will schedule across multiple threads without the user needing to specify anything.
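
For reference, a hedged sketch of the knob itself: the call below is the programmatic equivalent of setting the GOMAXPROCS environment variable (the default was 1 at the time; later Go releases made it default to the number of CPUs):

    package main

    import (
        "fmt"
        "runtime"
    )

    func main() {
        // Allow goroutines to be scheduled across all available cores.
        // runtime.GOMAXPROCS returns the previous setting.
        prev := runtime.GOMAXPROCS(runtime.NumCPU())
        fmt.Printf("GOMAXPROCS raised from %d to %d\n", prev, runtime.NumCPU())
    }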


I have added the v8 profiler output to the gist. It seems to spend most of its time in zlib.


To be fair, you load-tested with ab, whereas the author launched his feature on the Google homepage.


Sadly I don't have the opportunity. :-)


What was the median response time?



