Does Ruby 1.9 have a web page or something? Ruby's site has very little information on how the project is going, and I'm not involved enough to follow the mailing lists.
Questions like: when is Rails support coming? Will native threads be supported?
With Ruby threads you get the worst of both worlds. They are preemptive and user-level.
1.9 will introduce native threads, which aren't much better. Each native thread requires megabytes of memory for its stack. Co-routines require only 64k of memory.
Concurrency in Rubinius should require even less memory overhead as it is stackless.
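To make the coroutine comparison concrete, here's a minimal sketch using Ruby 1.9's Fiber, which is cooperatively scheduled and needs only a small stack compared to a native thread (exact stack sizes vary by platform and implementation, so the 64k figure is indicative, not guaranteed):

```ruby
# A Fiber is a coroutine: it runs only when resumed, and suspends
# itself with Fiber.yield. No OS thread or large stack is allocated.
counter = Fiber.new do
  n = 0
  loop do
    Fiber.yield n   # hand control (and a value) back to the caller
    n += 1
  end
end

first  = counter.resume   # => 0
second = counter.resume   # => 1
```

Because fibers never preempt each other, you get cheap concurrency without locks, at the cost of having to yield explicitly.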
1.9 will introduce native threads, which aren't much better.
What? They're not just "better", they're actually _threads_, i.e. they're able to run in parallel, you know? What are Ruby 1.8 threads good for, except for sitting on sockets?
Producer/consumer where there are multiple I/O bound producers (examples that come to mind: RSS reader, web spider, multiple-file search). They can also be a useful abstraction for things like waiting for events from multiple sources, or running quasi-realtime simulations.
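A hedged sketch of that producer/consumer shape: several I/O-bound producers (the sleep stands in for a blocking network read, and the source names are placeholders) feed a queue drained by one consumer. Even Ruby 1.8's green threads help here, since each producer yields while blocked on I/O:

```ruby
# Three "producer" threads simulate fetching from remote sources and
# push results onto a shared queue; a consumer thread drains it.
queue   = Queue.new
sources = ['feed-a', 'feed-b', 'feed-c']  # hypothetical source names

producers = sources.map do |src|
  Thread.new do
    sleep 0.01                  # stands in for a blocking I/O wait
    queue << "item from #{src}"
  end
end

results  = []
consumer = Thread.new do
  sources.size.times { results << queue.pop }  # pop blocks until data arrives
end

producers.each(&:join)
consumer.join
```

The same pattern generalizes to waiting on events from multiple sources: each source gets a thread, and the consumer multiplexes over the queue.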
I agree, though, that real threads would be a significant improvement. Or better yet, MxN threads like GHC.
I'm not sure how using 20 threads tests threading performance, but there's probably some issue that Ruby used to have that I'm not aware of.
It is impressive, though, that to beat the given Ruby time on my machine in SBCL, I actually had to add optimize declarations. Of course, I have no idea what kind of machine the published numbers were from, and I'm too lazy to try and install Ruby 1.9 myself, but it seems that the old 'Ruby use -> slowness' implication no longer holds.
[edit] Hold on, the Ruby 1.8 test, which takes 22 seconds in their figures, takes 4.3 on my machine. So that would mean Ruby 1.9 is like super-sonic ultra fast. At least on this benchmark. Which is mostly testing the speed of the sorting routine which I suppose is written in C. So what are we talking about, anyway? I'll shut up now.
it seems that the old 'Ruby use -> slowness' implication no longer holds.
That's a fallacy in the first place (at least for web apps)... Most web application code spends most of its time waiting for the db to respond, not number-crunching.
That does not mean performance is irrelevant, does it? Heavily used web apps benefit a lot from a fast runtime (it pushes back the point where you have to split across servers quite a bit), and being able to do CPU-intensive stuff in the same environment when you need to, instead of having to bust out C or Java or whatever, is very pleasant.
Those numbers do look a little funny; 22 seconds seems way too high for Ruby 1.8. On my 2.6 GHz MacBook it's around 3.2 seconds.
Out of curiosity, what were you doing that made your SBCL numbers so high? I've got 1.8.6 (running natively on OSX) clocked at 1.3 seconds and SBCL (linux on VMWare) at 78 ms.
(These times are with the random number generation removed; with it, the SBCL time jumps to 90 ms and 1.8.6 jumps to 4 seconds.)
http://www.skitoy.com/p/python-vs-ruby-performance/172
Bottom line: well over 70% of the time goes to rand() and 25% to list overhead, so threading is drowned out by that noise.
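I don't have the benchmark's exact source, but a guess at its shape, assuming each of the 20 threads builds a list of random numbers and sorts it, would look something like this. Profiling such a loop shows rand and Array allocation dominating, with thread scheduling barely registering:

```ruby
# Hypothetical reconstruction of the benchmark shape: 20 threads, each
# generating SIZE random floats and sorting them. The work per thread
# is pure rand() calls plus Array/sort overhead.
N    = 20        # thread count mentioned in the discussion
SIZE = 10_000    # per-thread list size (assumed)

threads = (1..N).map do
  Thread.new do
    list = Array.new(SIZE) { rand }  # rand() dominates the runtime
    list.sort                        # sort is implemented in C
  end
end
sorted_lists = threads.map(&:value)  # join and collect each result
```

With work like this, any difference between green and native thread scheduling is lost in the cost of the number generation itself.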