I wish Crystal would take off. It has so many things going for it (many of them mentioned in the article): performance, useful tooling such as an opinionated formatter, an integrated RSpec-like test framework, a powerful standard library, an awesome type system that gets out of the way most of the time, a familiar syntax.
So far I have been building some smaller personal CLI tools and a few web apps (with the Lucky framework). I’ve also tinkered with running it in AWS lambda functions in a custom (albeit unfinished) runtime.
Coming from a decade of Ruby, and thanks to the similar syntax and mindset, Crystal is my go-to for cases where I need performance or runtime-less execution (e.g. in from-scratch containers that contain only the binary and, if needed, its dependencies).
Crystal's standard library has provided enough functionality in the past for me to get away with only a few dependencies per project, which is great for supply chain security and complexity. Some of its highlights are:
- an ergonomic HTTP::Server and client
- OAuth / OAuth2 clients with token refresh
- JSON/YAML/XML parsing/generation/mapping
- JSON/YAML mapping to classes
- native templating similar to ERB
I also wished for a long time for Crystal to take off, but now that we have Go and Rust I don't see Crystal gaining much relevance in the future. It might be appealing to Ruby devs, but Ruby is getting better every day, and Ruby devs tend to stay in Ruby because of the ecosystem. The company where I work has already decided that no new project will be written in C, C++, Ruby, or Java, and will use Go or Rust instead; I suppose other companies are doing the same.
As a Ruby dev, that Crystal is "close but different" means I know I won't be able to use my Ruby code but at the same time I get too little in return to make it feel worthwhile.
For the very rare circumstances where Ruby is too slow for what I need to do, there's usually code in some other language I can wrap in an extension, and that code is far more likely to be C or some other more established language than Crystal.
To me Crystal is actually a more polished Ruby implementation. There were some annoying inconsistencies in the Ruby stdlib that were corrected in Crystal, for example in the modules for file/directory interaction.
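To illustrate the kind of scatter meant here, in Ruby the file and directory operations are spread across File, Dir, and FileUtils with overlapping responsibilities (a minimal sketch using only the standard library):

```ruby
require "fileutils"
require "tmpdir"

Dir.mktmpdir do |tmp|
  nested = File.join(tmp, "a", "b")
  FileUtils.mkdir_p(nested)                     # recursive mkdir lives in FileUtils
  puts Dir.exist?(nested)                       # directory existence check lives in Dir
  File.write(File.join(nested, "x.txt"), "hi")  # but file I/O lives in File
  puts File.read(File.join(nested, "x.txt"))    # => "hi"
end
```

Crystal consolidates these kinds of operations more consistently under its File and Dir types.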
Ruby was my main language for scripting, but I switched to Crystal completely after getting comfortable with it. Now it is really hard to switch back to Ruby.
Also, I think a compiled language is the greatest friend of older programmers. Over time I'm getting more and more sloppy and very often make typing errors. Crystal being a compiled language has saved me countless hours by refusing to compile when I made a mistake. Meanwhile in Ruby such an error can only be found after the script has run for a while; sometimes it became a disaster because the action was destructive.
If you'd go for Go or Rust, you would never go for something like Crystal anyway, as they have different focuses. Just like Ruby, Crystal optimizes for code legibility and expressiveness, which neither Go nor Rust ever really cared about: they are about stability + performance (in the case of Go) or safety + performance (in the case of Rust). Both Go and Rust have horrible syntax compared to Ruby/Crystal, which is a fine tradeoff to make in some cases.
That must be subjective because Go's syntax is pretty fine to me.
To illustrate, when I started learning Go, I implemented Concurrent Hash Array Mapped Tries from some PhD thesis and the pseudo code algorithm actually looked pretty much like the Go code. A testament to how legible it is.
Then again, I have never tried Ruby so I can't compare.
Apples and oranges. Ruby is very, very expressive compared to Go. Go's legibility comes from the simplicity of its syntax, Ruby's legibility comes from its expressive syntax.
Ruby was one of my first languages and for a while I described it as 'programming in English', especially Rails with the ActiveSupport gem. I often just fired up IRB (the REPL) and guessed what method names I could use on a given object, and it worked. I think this is why a lot of DSLs ended up in Ruby (Chef, etc.); plus, the method_missing idiom allowed for a lot of this sort of expressivity, at the expense of sometimes being thoroughly confused as to where a method was defined.
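A toy sketch of that method_missing idiom (the Registry class and the find_by_* naming here are made up, but the pattern is the one behind old-style dynamic finders):

```ruby
# Intercept unknown "find_by_<attr>" calls and resolve them dynamically.
class Registry
  def initialize(records)
    @records = records
  end

  def method_missing(name, *args)
    if name.to_s.start_with?("find_by_")
      attr = name.to_s.delete_prefix("find_by_").to_sym
      @records.find { |r| r[attr] == args.first }
    else
      super
    end
  end

  def respond_to_missing?(name, include_private = false)
    name.to_s.start_with?("find_by_") || super
  end
end

r = Registry.new([{ name: "chef", lang: "ruby" }])
p r.find_by_name("chef")  # => {:name=>"chef", :lang=>"ruby"}
```

The flip side is exactly the confusion mentioned above: `find_by_name` is not defined anywhere you can grep for.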
Overall I ended up writing more performance-oriented applications, and Ruby (at least at the time) wasn't worth the tradeoff of 'it's pleasant to read and write' vs 'this Ruby script that used to take a day now takes 15 minutes written in a language more suited for the task'. The other language I'd consider less expressive, just in a less optimized-for-the-human-while-writing kind of way.
Crystal seemed nice in the respect that it might be the best of both worlds, but like mentioned elsewhere here it's still fairly niche.
I looked at the ActiveSupport gem and I think maybe the things we call general purpose languages are actually just really broad domain specific languages. From the gem description: "Rich support for multibyte strings, internationalization, time zones, and testing." I do testing, and occasionally I have a string, but mostly I only do numeric programming, and I don't think of that as a niche. Fortran was/is for numeric programming; I think of numeric programming as the default, normal sort of programming. That probably isn't so much the case anymore, but we probably all think of our domain as the normal one. So expressiveness is probably highly relative to the domain. Julia seems very expressive to me. So do Rust and modern C++. But I don't do much string handling, and I don't really even know what internationalization means.
Ruby's extreme expressiveness comes from a few things. First, everything is an object. Second, Ruby has lots of functional features that make it exceptionally concise to express certain things.
Here's a simple example I'll do in JavaScript first since most people can read it. You have an array of the numbers 1 through 10 and want to get a new array with only the odd numbers.
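In JavaScript, using the mod operator, it might look like this:

```javascript
const arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const odds = arr.filter(num => num % 2 === 1);
console.log(odds); // [1, 3, 5, 7, 9]
```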
But there's a better way. Since everything is an object, including integers, you can just ask the numbers if they are odd instead of using the mod operator.
arr.filter { |num| num.odd? }
=> [1, 3, 5, 7, 9]
(Yes, method names can end in a question mark (?). By convention, methods that end in a question mark are predicate methods that return a boolean. Methods can also end in an exclamation point/bang (!). By convention, these bang methods are "dangerous" – they might do something destructive like mutate an object in-place instead of returning a new copy.)
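A quick illustration of both conventions on a plain String:

```ruby
s = "hello"
puts s.empty?   # predicate method: returns a boolean (false here)
puts s.upcase   # returns a new, upcased copy; s itself is unchanged
puts s          # still "hello"
s.upcase!       # bang method: mutates s in place
puts s          # => "HELLO"
```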
We can make this even more concise though.
arr.filter(&:odd?)
=> [1, 3, 5, 7, 9]
There's a bit to unpack here.
:odd? is a Symbol
The & operator calls the #to_proc method on the symbol :odd?, then converts the result into a block, which is passed to the filter method. In other words, the #odd? method is called on each element in the array, no differently than in the previous example.
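You can see Symbol#to_proc at work directly:

```ruby
odd_proc = :odd?.to_proc   # a Proc that calls #odd? on its argument
puts odd_proc.call(3)      # => true
puts odd_proc.call(4)      # => false

# So these two spellings are equivalent:
arr = (1..10).to_a
puts arr.filter(&:odd?) == arr.filter { |num| num.odd? }  # => true
```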
Here's another way to express the exact same thing:
arr.reject(&:even?)
=> [1, 3, 5, 7, 9]
This is just one simple example. Ruby can express many things on one line that would take several lines in other languages. This is what makes it expressive – it's the density and readability of operations.
Another one of my favorite language features is that parentheses on method calls are optional, so a lot of syntax that looks like it would be language-level operators are actually just regular method calls to objects in Ruby. An example of this is ==, which is an operator in many other languages. In Ruby it is a method call. Consider this expression:
true == false
This is calling the #== instance method on the object `true` (which is an instance of TrueClass), passing `false` as the first argument. Written identically:
true.==(false)
This is wild if you're coming from another language where things like this are operators, but it is completely natural in Ruby. It is extremely powerful.
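Because == is just a method, any class can define its own; here's a small sketch with a made-up Point class:

```ruby
class Point
  attr_reader :x, :y

  def initialize(x, y)
    @x, @y = x, y
  end

  # == is an ordinary instance method, so we can override it
  def ==(other)
    other.is_a?(Point) && x == other.x && y == other.y
  end
end

puts Point.new(1, 2) == Point.new(1, 2)    # => true
puts Point.new(1, 2).==(Point.new(3, 4))   # same call, written explicitly; => false
```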
Rust is pretty expressive. Horrible syntax... that's just a subjective measure. I wish programmers would get over it, because it's a huge hindrance to trying a large number of languages out there with lots of nice things. But people are gonna be people, so it won't happen.
Yeah, I agree that it's highly subjective. And I don't mean that people shouldn't choose something just based on one variable; it obviously depends. No language is perfect for every use case. Personally I use Rust for a lot of things, even though I'm more productive in other languages, but sometimes it's just a really good fit for the problem.
Pseudo-code, but this is how I see the difference in expressiveness and syntax:
3.times { puts "Foobar" }

for _ in 0..3 {
    println!("Foobar");
}
Both of them make sense though, so it's not a huge difference, but it's like this in lots of places across Rust in general.
> because it's a huge hindrance to trying a large number of languages out there with lots of nice things
I know exactly what you're talking about, as most of the time I work in Clojure, and trying to show Clojure to other programmers who are used to C-like languages and have never heard of any Lisp-like language is a constant struggle, as their first reaction is always "eww, parentheses everywhere!" even though their favorite language usually has the same amount or even more.
Ruby is fine until your large application starts to include more and more dependencies. There is no coherent API across them; every library has its own mini DSL and the mess starts to build up. Then come method_missing, monkey patching, and strings and dictionaries being passed all over the place. True horror. All of Ruby's readability disappears. Small apps look beautiful, but large ones quickly become a mess. On top of that, resource utilization is atrocious.
But there are places where Ruby shines like RSpec, Rails, pry in runtime (I miss that one).
I've been learning Rust over the last few months, because it seems like a good language to have on my CV.
It's certainly a taste thing, but I am not enjoying it at all. I find it unpleasant. I think it's going to join the short list of "languages I can use but would prefer not to".
I've never learned Crystal, but if it's like Ruby, then I think I'd enjoy it just fine.
Exactly. I was a Crystal enthusiast, but because of its slow development pace, small community, and half-baked documentation (all understandable, no criticism here), plus the rise of Go (great documentation, fast-paced development, huge community), it was hard to stay focused on it. Go became my alternative to Ruby, and I'm doing fine with that.
I think Windows is the main thing holding Crystal back. I make little games for jams and I'm always trying out new languages. I submitted a bug related to running processes on Windows last November; it was just recently closed as completed. Although I spend almost all of my at-home time on Linux, I know that I have to provide Windows executables for jams. I plan to come back to Crystal when I can write once and compile the same code everywhere.
I stay in Ruby because my customers want Rails and because my own scripts don't need to be fast. Edit and run is one step faster than edit, compile, run.
As a very, very long-time member of the Ruby community:
> an integrated RSpec-like test framework
Please no. Please, please stop blindly cargo-culting RSpec into your projects. Minitest is included with the language and has all the assertions you might generally want. The only thing RSpec really "adds" is an insane and quirky DSL for the sake of "reading like English", which is a terrible misfeature. Most of the rest are terrible, terrible misfeatures as well.
As an example, one of these misfeatures is built-in mocking. Mocking is great, you might think. Yes, mocking is a necessary evil in certain cases, but it should be used very sparingly. What happens instead is that people don't bother to design their classes for easy testing and they mock the hell out of everything in order to make their code testable. So what's the problem? Tests are supposed to do two things: 1) discover bugs you didn't realize you wrote, and 2) allow refactoring, where if the tests pass, your code is probably alright. Widespread mocking absolutely tanks both of these goals. Instead of testing the effects of code, 90% of the Rails tests I see in the wild mock everything to the point that the tests simply confirm that the code is implemented the way it is currently implemented. No bugs can be unearthed by this method because none of the actual effects are tested, just that certain methods are called in a certain order. And refactoring is now insanely difficult because any change to the logic causes the tests to fail, even if the effects are the same.
This is just one example.
So please, I beg you, stop reflexively reaching for RSpec. Minitest is great, it has most of the things you need out of the box. And Rails has a test framework built on top of it that similarly already does all the things you need. And it's all just plain Ruby.
I wonder if the lack of libraries in a new language is just a few ChatGPT-integrations away. After all, Crystal is so similar to Ruby. I bet as soon as context/environment aware ChatGPT tools/agents that can compile / run tests / apply fixes until it works are available, this will become the reality: "Create this Ruby library but for Crystal" and in a few hours you can include it in your Crystal project.
While it may be possible to onboard Ruby developers fairly easily, it is still easier to find talent for an "established" language.
For my personal projects, I gladly use Crystal, but in my professional role as a technical decision maker I need to factor aspects like this in.
Don't get your hopes up, Crystal is doomed to never become mainstream.
We can divide programmers into two camps, those that enjoy programming itself and those who use it as a means to an end. The latter greatly outnumber the former, let's say 9:1. That massive disparity in numbers is why only the languages that enable the latter group thrive.
Ruby is the perfect example, the language got a massive exposure boost due to Rails, but once the hype died down, everyone left. That's because beyond Rails, Ruby has nothing to offer to programmers who want to get stuff done. Nothing besides pain, of course.
To those who enjoy playing with languages, "did you know there are 10 different ways you can filter an array in Ruby?" ([1]) is joyful to hear. But when you're woken up at 3am to find a bug in production `arr.reject(&:even?)` is the last thing you want to see.
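For the curious, a few of those equivalent spellings, all producing the same array:

```ruby
arr = (1..10).to_a

puts arr.select(&:odd?).inspect               # => [1, 3, 5, 7, 9]
puts arr.filter(&:odd?).inspect               # filter is an alias of select
puts arr.find_all { |n| n % 2 == 1 }.inspect  # another alias, explicit block
puts arr.reject(&:even?).inspect              # the inverse predicate
puts arr.partition(&:odd?).first.inspect      # partition returns [odds, evens]
```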
This sort of cleverness, ambiguity and implicitness in language design repels 90% of programmers, and that is the reason why languages like Perl, Ruby, Scala, and now Crystal are either dead, dying, or destined to die.
"The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. In the case of a well-known conversational programming language I have been told from various sides that as soon as a programming community is equipped with a terminal for it, a specific phenomenon occurs that even has a well-established name: it is called “the one-liners”. It takes one of two different forms: one programmer places a one-line program on the desk of another and either he proudly tells what it does and adds the question “Can you code this in less symbols?” —as if this were of any conceptual relevance!— or he just asks “Guess what it does!”. From this observation we must conclude that this language as a tool is an open invitation for clever tricks; and while exactly this may be the explanation for some of its appeal, viz. to those who like to show how clever they are, I am sorry, but I must regard this as one of the most damning things that can be said about a programming language. Another lesson we should have learned from the recent past is that the development of “richer” or “more powerful” programming languages was a mistake in the sense that these baroque monstrosities, these conglomerations of idiosyncrasies, are really unmanageable, both mechanically and mentally. "
> when you're woken up at 3am to find a bug in production `arr.reject(&:even?)` is the last thing you want to see.
Really? Because even if you're seeing it for the very first time, it means exactly what you'd think it means. The only thing that's even potentially sort of tricky or non-obvious is the ampersand. And there's nothing implicit or ambiguous in the expression, either.
In a sense, all languages have implicit aspects. Can you characterize the problematic implicitness? (I guess python's TOOWTDI helps, when reading code).
I've been studying bidirectional transformations, and a survey paper notes that "relational" methods have died off (somewhat "declarative", where you specify how things should be connected, with the details implicitly worked out for you... something like set-builder notation), while various coding methods remain (where you code for forward transformation, and the backward transformation is automatically derived).
They hypothesise it's because relational tools (libraries and projects) are difficult to maintain - but I suspect it's because they are difficult to use.
Which is a shame: something that automatically "does what you want" seems like a great idea! But being difficult to predict and diagnose - not being in control - is not great. So, to rephrase my question, what characterizes "good" automation?
implicitness examples: precedence, type coercion, overflow, polymorphism.
But I suppose (e.g.) precedence doesn't arise that often; it "should" be familiar from high-school algebra; it can be made explicit (with parentheses or successive assignments).
"Explicit" would seem to be "obvious" - is that necessarily true? Too much code can obscure what is happening.
Familiarity helps implicitness be understood. Some would argue lack of familiarity is the only problem with the languages you mention. Certainly, it helps: as you become more expert in anything, you start using short-hand and jargon, and skipping details.
Adoption is easier when it's similar to the familiar. Hence, "C-like". And Python, pseudo-code-like.
But are some things easier to grasp, by nature? I think, "intuitive" often means familiar. And there are things that all human beings are familiar with, like space, trade and language. Perhaps this is where intrinsically obvious and familiarity coincide?
Crystal won't take off because it's an "amateur" language; to have a good and properly supported language you need a lot of people and money, which Crystal does not have.
I would not bet my business on a language with just one full-time dev working on it, relying on donations.
This isn't quite true. Python was certainly Guido's passion project, and it started that way. But early in its life, it was funded by the United States government via CNRI (under Bob Kahn).
At every job Guido had subsequently, he had at least some buy-in to work on Python, and some coworkers were also paid to work on Python.
There was a core "PythonLabs" group (I think including Jeremy Hylton?) that apparently moved from CNRI to a startup, BeOpen.
At Google from 2005 to ~2014, Guido nominally had 50% of his time to work on CPython, but that's pretty fuzzy because he was really productive in any case. (I was his officemate for a couple years during that time.)
Yes and Python (released in 1991) took a lot more time to become popular compared to Java (released in 1995) which was corporate backed.
Python still seems to have funding issues. For example, JS (backed by multiple big corporations) has seen its performance improve a lot compared to Python. The main Python implementation doesn't have a proper JIT as far as I know.
So it shows the difference between amateur languages and corporate backed languages.
No, it's quite possible for a language to have strong backing before it takes off (C#, JavaScript, Go, Swift, Dart, and Rust all did, to varying extents, from day one), and those tend to also be the languages that take off quickly. Other languages struggle until they attract either a sufficient mass of distributed backing or the one key backer that gives them the level of support and visibility that lets the adoption curve take off, and most never reach that point.
Crystal has been going for years and there is still no major support; it won't change. Basically, if you're a new language you have a couple of years, and after that you've just missed the train.
Not historically true. Python didn't take off for years after the initial release in 1991. Even in the mid 2000s people were evangelising Python like it was a new thing (https://xkcd.com/353/)
Or take Rust, which has gained adoption more recently. It took a while to get going after the 1.0 release in 2015. If you look at daily downloads of Rust crates (https://lib.rs/stats) as a proxy for adoption:
- 2015 to 2017 - not many downloads
- 2018 - first signs of real growth, reaches 1M per day.
- 2018 to 2021 - 10x growth, 10M per day
- 2021 to 2024 - 10x growth (projected)
So new languages shouldn't lose hope if they don't see adoption immediately. It's possible that they might be adopted later. But for language authors, it's the hope that kills you. You continue working on it even if maybe you shouldn't.
Sure, let me define "shouldn't". For those authors who hope to see their language become mainstream and achieve widespread adoption, they may want to cut their losses. If they're working on it for intrinsic reward they should continue, by all means.
I'm not sure that I agree with the poster you are replying to, that Crystal won't take off, but I'm pretty sure that the statement "Crystal has not taken off because Crystal has not taken off" is kind of useful/valid/worthwhile. I believe that this is how network effects work.
I also am pretty sure I read about someone here on HN who used Crystal to build the code for their profitable cookie baking/selling business (the baked goods). I can't recall the details exactly. Edit: Here it is: https://news.ycombinator.com/item?id=23433847
Their development team is slow ass. They live in an idealistic world where Windows users are not worth supporting. Bugs and features take too long to fix and ship. Feels more like a toy enthusiast project relying entirely on organic growth. It cannot survive in a world where the most successful programming projects have full-time corporate backed teams.
Your entire complaint appears to be based around the timeline of Windows support.

You are free to use Windows as your principal OS, but please inform us of your biases when totally trashing a team and product over what is essentially a technical issue of significant proportions, not a product of laziness or slowness, as you allege.
All those features are offered by Go too. The similarities between Ruby and Crystal are normally easy to spot in small projects, but their differences become really visible in big projects.
Crystal is expressive, 500 lines of Go commonly translate to 50 lines of Crystal.
The DX in Crystal is really closer to Ruby than to Go. It's essentially a very fast and type-safe version of Ruby.
> The DX in Crystal is really closer to Ruby than to Go.
As a polyglot, I can say it is much easier to switch completely between languages and paradigms than to write or speak in similar (but not identical) languages. So that may be a great reason for Rubyists and beginners to start playing with Crystal, but as soon as you work daily with the language, other factors like good documentation, tons of examples, a great standard lib, and an active community and development become much more important than whether it resembles Ruby or not.
I think this argument is less about the concrete syntactical and semantic similarities to Ruby, but the shared general idea to focus on developer happiness. For example, code is easy to read, yet still expressive.
Maybe with LLMs it won't matter much anymore? Maybe in the near future we'll have a Copilot for Crystal as a VS Code plugin, and the rest won't matter much; just the quality of that plugin.
> We're going to write a program in a "new" language that is a mix of Ruby and INTERCAL. We're going to take the "come from" statement and use it to allow "hijacking" the return of a function. Furthermore, we're going to do it conditionally. "come from <method> if <condition>" will execute the following block if <method> was executed and <condition> is true. In <condition>, "result" can be used to refer to the result of executing <method>.
And that was enough to get it to understand my example code and correctly infer what it was intended to return.
Given it took that little, I don't think you need much code in a new language before you have a reasonable starting point as long as you can give examples and document the differences, and then use that on a few large projects that has good test suites, and work through any breakage.
If the LLM understands the language it can aid in creation of the libraries and ecosystem because it can also translate code. I just tested it by having ChatGPT translate one of my Ruby scripts to Python for example.
I don't like Crystal all that much, but it's similar enough to Ruby that if ChatGPT can handle Ruby->Python, it can handle Ruby->Crystal with relatively little work.
But it doesn't need to handle it flawlessly, because every new library you translate and fix up gives you a new codebase you can use to finetune it.
At that point, why bother with high-level languages at all? A sufficiently-good AI should be able to read a specification / test suite and directly generate a binary which passes those tests.
Maybe. Or maybe it'll become validation/what-is-happening lowest denominator. Programming languages are also a good Intermediate Language between humans<->machines and machines<->machines apparently due to recent AI advancements.
IMO crystal's value is that I can do things normally associated with golang without the horrors of actually dealing with golang's absence of a good type system and other quirks like error handling.
I love Crystal, and have loved it for several years. It was my go-to for Advent of Code or small fun projects for a while.
Eventually, I got so tired of constantly jumping between browser docs and my editor because of the lack of a working and useful language server (yes, I know about Scry and Crystalline, but neither of them actually do autocomplete outside of maybe 2% of cases), and I got so used to an amazing tooling experience when writing F# (my other favorite language) that I just kinda dropped off using Crystal.
Things I would've written in Crystal, I wrote in F#. If I _really_ needed a static binary, I picked up Rust (though I love Crystal's syntax _so much more_ than Rust's).
If Crystal gets a good language server, heck yeah I'll roll with it. In the meantime, it just sorta feels like a missed opportunity.
I compile my F# application with --standalone and share a single exe file to my users. It still requires .NET to be executed (but note it works on many platforms thanks to Mono).
We've been happily using Crystal in production at Heii On-Call (free website / cron job monitoring, free critical alerting to iOS/Android apps, on-call scheduling) [1] specifically for higher-throughput components of the overall system, such as the API server [2] and the system that continuously does website monitoring [3]. For context, we do use Ruby on Rails for the main website frontend.
Really I find Crystal to be so much more ergonomic and pleasant to use than Go or Rust, though I could see that being a matter of opinion. Very concise, very readable, very fast once compiled.
Porting utility classes from Ruby is so easy too. I end up spending more time porting unit tests than porting the underlying code. And the testing story (Crystal's "spec" [4]) is really nice.
If you want to try Crystal without installing anything, I made a "crystal-docker-quickstart" project template you can clone [5]. You can safely try out Crystal and have your first static binary compiled in about 15 seconds via something like:
git clone https://github.com/compumike/crystal-docker-quickstart.git my_app
cd my_app
./d_dev
make && out/my_app
Lastly, I've found that the Crystal community [6] is small but friendly and helpful. It's at that magical stage of an open source community where people are extremely competent and responsive, but not (yet?) burned out from dealing with so many people.
Real world example for Crystal: our overnight sitemap generation (built using Ruby, as part of our Rails app) took 6 hours to generate 20 million links. At some point due to infra changes, it stopped running to completion. Since then, we ported it over to Crystal, with the main code virtually unchanged (library to help generate sitemaps has matching API). Just had to set up basic database mapping for a few tables. That version is now in production, and it now processes 30 million links in about 30 minutes.
Since a library is mentioned, it is possible that the Ruby library was particularly inefficient compared to the Crystal one. Crystal is definitely faster than Ruby, but I expect the differences to diminish in IO-bound operations (which sitemap generation sounds like it would be).
It's not specifically Ruby, but also using ActiveRecord, loading bloated models, associations, and some business logic. There is also a large amount of memory bloat with the sitemap library when working with large datasets. The Crystal sitemap library has a way to avoid that same bloat.
With Crystal, all of those things are massively faster.
Crystal performance is close to Go-lang. It's just very fast across the board.
Consequently, just like Go-lang, it beats Ruby by an order of magnitude on pretty much any benchmark.
Newcomers often assume Crystal must be sluggish like Ruby because "how could a language with such a convenient syntax be so fast", but it really is not.
In fact, Crystal's concurrency is similar to Go's; both use a coroutine technique (fibers in Crystal, goroutines in Go).

Go should still perform better there, though, because it was built around that feature.
The biggest thing is the massive speed improvement compared to working with bloated ActiveRecord models, associations, plus some memory bloat. The Crystal implementation is massively fast dealing with database data.
For friends from China who are interested in Crystal, I created the crystal-china GitHub org, https://github.com/crystal-china, and I also hold the https://crystal-china.org domain name, although there has been no time to build a website yet.

Let's discuss in Chinese on Telegram and Discord, and do something together.
Lucky core dev here! I've been using Crystal in production since about 2017. I love the language, but there's definitely some hard tradeoffs which are mentioned in this thread. Our larger app has some serious compile time issues. We deploy with github actions, and it takes about 30min to build the production binary, compile assets, and push the binary to the server. But overall, I've found my apps are way more stable, and really fast as well as running on much smaller machines. The tradeoffs for us are definitely worth it.
Absolutely concur. Build times are an issue, support is a problem, and I've fallen out of love with a lot of the "magic" you get from Crystal and Ruby. I'll take explicit imports over a globally shared namespace any day.
As a very anecdotal data point, I once created a pet project that interfaced with a CouchDB endpoint and inserted some keys with dates in a complex nested JSON structure and deployed it to Heroku. (It was ment to be called as a Apple Shortcut)
It took me half a day with Ruby, two days with Crystal and a little more than a week with Rust. The Crystal experience was very very smooth coming from Ruby. With Rust it felt like I was wrestling with the compiler.
And you had a ton of experience with Ruby heading into this comparative exercise?
I feel like a lot of us are hitting a point in our career where we have decades of experience doing it one way, and learning something new is slow, but make the mistake of thinking that's entirely a reflection on that new thing. (Myself included.)
Yes, I used to be a seasoned Ruby dev myself, though not so much these days.
I had of course no previous experience with Crystal or Rust whatsoever. The trickiest point with Crystal was mapping the JSON dynamic payload to a typed structure to be parsed, modified and sent back. On top of strong typing, Rust added its particular syntax and lifetime complexities.
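For context on that tricky point, Crystal's stdlib handles JSON-to-typed-structure mapping declaratively; a minimal sketch, with a hypothetical type and field names:

```crystal
require "json"

# Hypothetical document shape. Including JSON::Serializable
# generates from_json / to_json from the declared field types.
struct Entry
  include JSON::Serializable

  getter title : String
  getter updated_at : Time? # nullable JSON fields become union types
end

entry = Entry.from_json(%({"title": "note", "updated_at": null}))
entry.title # => "note"
```

Round-tripping is then just `entry.to_json`, so a parse-modify-send-back flow stays fully typed.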
It was a fun side project and, performance-wise, with it deployed on Heroku and so on, I didn't see any clear advantage between Crystal and Rust, because everything was just doing a couple of network calls to a CouchDB endpoint with minimal modifications to the payload.
I wanted to share this as anecdotal evidence for Ruby developers. Coming from other languages, the developer experience would arguably be different.
The minimum requirement for a language to take off: first-class Windows support. The current preview support is good enough to start playing. A high-performance (C-like, LLVM-based) yet very high-level language with Ruby-like expressiveness, coupled with static typing and compile-time (as well as runtime) safety, and with the ability to produce standalone native executables, makes it really one of a kind.
Which is a valid choice, but it doesn’t mean that it isn’t successful or that it hasn’t taken off. Swift has achieved exactly the goal it was supposed to: become the official language for Apple platforms, replacing Objective-C as the entry point.
> If I need to write a CLI app, why, of all languages, would I pick Swift?
You probably wouldn’t. Other people might, for a variety of reasons including enjoying the language itself. I write CLI apps regularly and use several languages; Swift is one of them.
The original comment claimed a language needs first class Windows support at a minimum to take off. Swift proves that is not a hard requirement, regardless of the reason for its popularity: that it’s the official vendor-supported language for a family of platforms that have been profitable for developers.
That Swift isn’t as popular outside the officially supported platforms isn’t surprising and doesn’t matter. The point is precisely that it didn’t have to cater to a specific one (Windows) to achieve success.
Yes, you don't need to be on Windows if you have a quasi-monopoly on a billion devices. But apart from Swift and Java, there is no language that can claim that.
If it weren't for the fact that M$ preloaded Windows on nearly all OEM computers thanks to some seriously mafia-esque business practices, we wouldn't have to consider that OS at all. Two can play at that metaphor game.
The comment I quoted claimed that for a language to take off it needs first class Windows support.
If we’re going to limit the discussion to languages which “can be used for writing Mac, Linux, Windows apps”, then the argument is pointless. By definition, any language which can be used to write Windows apps has Windows support. That’s circular logic.
To address the argument, it only makes sense to discuss languages which don’t have¹ first class Windows support. Everything else is irrelevant.
Almost there. From now on it's only a matter of developer experience.
Like providing an installer which does all the necessary setup for you, including installing the Visual C++ Build Tools, LLVM, and other stuff.
Then the community needs to update the libraries to make them run smoothly on Windows, too. But this is not that hard.
My major annoyance with using Crystal on Windows right now is native libraries, which need extra build steps that are sometimes not very straightforward. But I think we can overcome it with some extra effort from the community: provide the compiled lib/dll files beforehand so other people do not have to compile the native libraries all over again.
This will also take time, but unlike the previous challenges, it's mostly DX now.
> The minimum requirement for a language to take off: first class Windows support
Many open source languages “made it” without this qualification, and I’d suggest that Microsoft has a long history of primarily supporting its own ecosystem languages on its platform.
Bash? Ruby earlier on? Node earlier on? Erlang? Elixir? Perl, earlier on? Clojure? Go earlier on? All the standard Unix utilities and their DSL's? (sed, awk, bc, jq, etc.) Apple's Objective-C/Cocoa, and Swift? Rust (earlier on)? Scala? Haskell?
Every one of these languages' Windows support tends to lag their *nix support, and most are fairly widely deployed (my definition of "making it"). Many may support Windows now to some degree, but did not originally and not for their entire incubation period.
Early Python did not have good Windows support. Neither did PHP... which is exactly why Microsoft basically copied PHP to make ASP.
Hell, even C/C++ was basically fleshed out on Unixes long before Windows even existed.
Linux (and the Unix flavors before it, as well as things like BSD) is the premier developer OS, period. It enjoys, by far, the most languages to choose from. Windows is the developer OS only for the Windows ecosystem of technologies, which is far too limiting IMHO.
The fact that you've even asked this question makes me think you are living under a Redmond rock.
If you didn't have access to WSL, you wouldn't even be able to fully enjoy most of these languages. Which is exactly why WSL exists! No one would need it otherwise!
In fact, the only open-source language I can think of that DID premiere with first-class Windows support was Java.
Honestly, Windows isn’t something I can take seriously in 2023.
Developers should be using the same Linux distro their servers run, IMO. Does that sound harsh? So does dismissing a language due to “windows support”
There’s really a lot to be said for an easily readable relatively terse language that compiles to a tiny binary executable with low memory usage and can be run as a service, profiled, etc.
We use it extensively in production. I really would only ever consider Rust as an alternative, it’s that good.
Crystal should get more attention, from desktop application developers in particular. Crystal makes it much easier to maintain large performant code bases.
Basically, working with Crystal is much more like working with Python or Ruby, but with performance comparable to C. Unlike Rust or Go, where you get the performance but you don’t get the huge bump in productivity because you’re working at a lower level of abstraction.
Build support in general isn't great. The way external objects are linked is a bit hinky, cross compilation support is poor, and running multiple instances of the compiler at the same time on the same machine isn't supported.
> I remember when I first learned about type inference, one of my first thoughts was, why can’t we just create a statically typed language that does type inference for everything. The compiler would still have to figure out the types at compile time anyway, and I’d save myself a few keystrokes.
> Turns out this is what Crystal does. It does aggressive type inferencing, and only asks you to declare types when it can’t figure it out.
> Crystal does not have a global type inference engine
As the sibling comment points out, this has been a feature of many functional programming languages for a long time. If you infer everything, though, it becomes a little unfriendly in use.
When you make a change in one location and accidentally use the wrong type for something, it can affect global type inference and you get error messages in other locations. In essence, if you use only type inference, a change anywhere in the program can turn up as an error due to a type mismatch anywhere else in the program.
Most languages with powerful type inference aim for a middle ground where important types are declared and mostly-local types are inferred.
The underlying principle in Crystal is that data structures usually require explicit declaration, while types of values passed around on the stack usually don't need to be declared. I find it makes it easy to reason about the result while still being very streamlined.
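A minimal sketch of that split, with hypothetical names: instance variables carry explicit types, while everything passed around on the stack is inferred:

```crystal
# Instance variables (the data structure) get declared types...
class Point
  getter x : Int32
  getter y : Int32

  def initialize(@x : Int32, @y : Int32)
  end
end

# ...while locals, parameters, and return types are inferred.
def midpoint(a, b)
  Point.new((a.x + b.x) // 2, (a.y + b.y) // 2)
end

m = midpoint(Point.new(0, 0), Point.new(4, 6)) # m is inferred as Point
```

Change `midpoint` to return something else and the error shows up at the call site that misuses it, not somewhere across the program.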
I find myself getting pretty immediately turned off by IMO kind of silly and unverifiable catch phrases languages use as marketing tools. Crystal uses "a programming language for humans", much like EmberJS's "for _ambitious_ web developers". As I skim the Crystal page, it's not clear to me what is supposed to stand out as a language that makes it "for humans" any more than any other language? It just seems silly.
I think Crystal shines with these focus points, except 3. which needs some love (but it's on our radar).
1. C bindings are nice and easy
2. The integrated spec framework builds on the syntax of rspec and is quite capable. Of course it's opinionated, and there are alternatives in the ecosystem.
3. There are some IDE plugins with basic features. Context awareness in source code is an intricate problem because it requires a lot of semantic analysis. So it needs a bit more effort to iron out.
4. Crystal feels like a dynamic language, yet it's fully statically typed. Features like type inference and union types make this possible.
LSP is pretty well supported in editors; it has emacs and vim and vscode support (a lot of other support as well). It makes it possible for any editor that supports LSP to be language aware (supporting the use of code completion, syntax highlighting, warning and error messages, and refactoring tools, per the Wikipedia article). All that is needed is for someone to implement a language server for the language. I used to use Perl a few years ago and even that had some (slightly buggy) LS implementations, which I found very helpful.
I haven’t used Crystal but I did use Ruby extensively for some time, and having read this and some other articles about Crystal, I’m guessing (2) is well taken care of because of the aim of matching rspec functionality, and (4) by means of being Ruby-influenced. Not sure about (1).
I bet it's hard to be motivated to build numbers 2 or 3 when number 4 isn't getting attention. That's why most languages start there, that's the raison d'etre.
I wish they had used the headline to say something about the language rather than the useless tagline "for humans". Is it imperative, functional, object-oriented? For embedded, scripting, system programming?
Is it new? A dialect of something else?
Nope, all I know is it's "for humans", and possibly computers can run it too (?)
Just a tip. I scrolled multiple screenfuls and actually know less now about this language.
Put the recipe at the top.
We want to know what it will produce. If you have a heart-warming story about how you learned it under the tutelage of Chef van Rossum, tell that story below the fold.
I want a sense of the flavor before your whole life story :)
- types are non-nullable, with nullable types represented as a union of a type and nil
- macros built-in including templating, AST inspection, type inspection, ...
- green threads ("fibers") for concurrency
- C-bindings built-in
- package manager built-in ("shards")
(also it is compiled and cross-platform with Windows functionality in some kind of preview/beta that the article author found mostly usable [I think debuggers and the interpreter were exceptions here], and it has a large standard lib)
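The nil-union point from the list above, in a minimal sketch using only the stdlib:

```crystal
# ENV["EDITOR"]? returns String | Nil (plain ENV["EDITOR"] would
# raise if the variable is missing), so the nil case must be handled.
editor = ENV["EDITOR"]?

if editor
  # Inside this branch the compiler narrows editor to String,
  # so calling String methods is statically safe.
  puts editor.upcase
else
  puts "no editor configured"
end
```

Calling `editor.upcase` outside the `if` would be a compile-time error, since `Nil` has no `upcase`.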
Crystal has already "taken off" for me, in the sense that it does everything I need it to do. The standard library is replete with http client and server, as well as JSON, and a host of other useful bits, just have a look: https://crystal-lang.org/api/
I can use any/all command-line utilities with an IO buffer (image processing, other out-of-band processing)
The stock "DB" module supports Sqlite, MySql, and PostGres out of the box.
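For a sense of those stdlib ergonomics, the canonical `HTTP::Server` shape looks roughly like this (port and message are arbitrary):

```crystal
require "http/server"

# A handler block receives each request's context; write the
# response through it.
server = HTTP::Server.new do |context|
  context.response.content_type = "text/plain"
  context.response.print "Hello from Crystal!"
end

address = server.bind_tcp 8080
puts "Listening on http://#{address}"
server.listen
```

That is the whole program: compile it and you get a single static-ish binary that systemd can supervise directly.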
SPEED: anyone "sticking" with RoR is not paying attention here: Crystal compiles down to a binary executable with low memory usage compared to its interpreted cousin. This binary executable can be run by systemd as a normal system service, started with command-line options, etc.
Apples-and-oranges comparison: I've turned a Node.js-based app that used roughly 600 MB of memory into an executable using 12 MB. The latter has also not leaked any substantial amount of memory after running for 5 months straight now (still says 12 MB), so take that as you wish...
For me, Crystal has already arrived. I don't need group approval for this to be the case. I can happily use version 1.7.3 for the rest of eternity, so I don't see this rug being pulled out from under me in the near future.
Perhaps use private shard repos if this is your actual concern.
Compilation time was a huge pain point for me, too. Dev builds took 30s to finish, which completely killed productivity.
It's much better now though, after I compiled the compiler myself, using glibc instead of musl and LLVM 15 instead of LLVM 14, and disabled the GC (for the compiler).
Dev builds now take only 5~8s, and release builds take only ~1 minute instead of 2 minutes 30 seconds.
I’m honestly more interested in JRuby at this point. I think the coming combination of JRuby with fibers is going to become extremely compelling based on the early numbers that Charles Nutter was seeing. This is approaching BEAM levels of concurrency.
They seem to suffer from problems similar to Julia's: type inference for large projects seems to be extremely difficult and results in huge compile times, startup times, and/or runtimes.
There's been a lot of progress made over the past few releases for Julia's startup and compile times. With the upcoming release of version 1.9, we'll be caching native machine code from the precompilation phase of a package.
Packages will be able to specify precompilation workflows that are run when the package is installed, and then cache the results, leading to a huge improvement in loading time for most heavy packages.
For me, the pitch is performance: Crystal would be several times faster, while using a tiny fraction as much RAM.
In my use case for https://heiioncall.com/ I ported a part of our system from Ruby to Crystal and we're seeing about ~10x faster throughput while using ~1/10th as much memory.
That changes the operating economics drastically enough that, for example, Heii On-Call provides free website and API endpoint monitoring.
Genuinely wondering: why Ruby on Rails for the website and not Crystal? Considering you are using Crystal for totalreturn (big fan) and heiioncall as well.
Looks fine I guess, but (like Ruby) it's not as intuitive as Python, and I have a lot of hopes that Mojo will bring about a performant Python like programming experience
Ruby is Python minus whitespace mandates and having to try three times to get strings right. Also, backwards compatibility is not such a big deal. Of course, also minus the massive corporate support from the likes of Google.
Same. Python always seemed like it was cobbled together poorly. While to me, Ruby seemed to be elegant experience.
I used Ruby because I really don't like the "programming part" - but the "I'm creating a product ASAP, the language is the method of having to do so" part.
I drank the Go koolaid (hoping for a C++ variant) but felt like I was using C or Assembly (har), having to recreate practically everything from the ground up.
I could never stand Go. So much boilerplate. Literally designed to appeal to the most inexperienced developer. Write 4 times as much code to do the same amount of work as 1 line of Elixir or any functional language... Gross.
Python was released earlier than Ruby (1991 vs 1993/1995), and Matz was familiar with it. But Ruby is clearly an attempt at an object-oriented Perl rather than a Python mimic:
> I knew Python then. But I didn't like it, because I didn't think it was a true object-oriented language---OO features appeared to be add-on to the language. As a language manic and OO fan for 15 years, I really wanted a genuine object-oriented, easy-to-use scripting language. I looked for, but couldn't find one.
Except that the walrus operator “:=” is an assignment expression, whereas in Pascal it is an assignment statement, like “=” in many languages. In Pascal, the boolean comparison operator is “=” instead of “==” (in the same many languages).
And there is C where the assignment expression is just “=”, there is no assignment statement, and many forgotten double equals led to bugs …