lgreiv's comments | Hacker News

Slightly different approach, but appears to have the same overarching goals: https://github.com/bullet-train-co/magic_test

"Magic Test allows you to write Rails system tests interactively through a combination of trial-and-error in a debugger session and also just simple clicking around in the application being tested, all without the slowness of constantly restarting the testing environment."

I will keep an eye on both now; they will probably complement each other at some point.


Since trying it at a dev conference in Amsterdam, I am severely hooked on Tony's Chocolonely, which usually rates quite high in labor protection reviews [1] and has started to introduce traceability for the supply chains of its various ingredients. It also tastes unbelievably good.

[1] https://www.chocolatescorecard.com/


I'm actually quite happy to pay a bit more for Tony's chocolate because then I'll wind up enjoying it more and eating less of it.


It was mentioned in the show, in a good light, much as you describe.


A relevant (WIP) standard in this context could become 802.11bf "WiFi Sensing" [1]. If plenty of common consumer devices end up supporting it (maybe even some with an on-board camera), it may become hard to avoid.

[1] https://standards.ieee.org/beyond-standards/ieee-802-11bf-ai...


You should be able to organize accounts hierarchically using AWS Organizations, which lets you have cost centers and centralized billing (and some policies imposed across all accounts).


Very handy tool for sure. I really appreciate the effort you put into the inline, repo, and website documentation.


I wish Crystal would take off. It has so many things going for it (many of them mentioned in the article): performance, useful tooling such as an opinionated formatter, an integrated RSpec-like test framework, a powerful standard library, an awesome type system that gets out of the way most of the time, a familiar syntax.
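
To give a taste of the built-in spec framework, here's a trivial sketch (the expectation is made up; run it with `crystal spec` or directly with `crystal run`):

  require "spec"

  describe "Array#sum" do
    it "adds up the elements" do
      [1, 2, 3].sum.should eq(6)
    end
  end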

So far I have been building some smaller personal CLI tools and a few web apps (with the Lucky framework). I’ve also tinkered with running it in AWS lambda functions in a custom (albeit unfinished) runtime.

Coming from a decade of Ruby, due to the similar syntax and mindset Crystal is my go-to for cases where I need performance or runtime-less execution (e.g. in containers built from scratch that contain only the binary and, if needed, its dependencies).

Crystal's standard library has provided enough functionality for me in the past to get away with only a few dependencies per project, which is great for supply chain security and complexity. Some of its highlights (a quick sketch follows the list):

  - an ergonomic HTTP::Server and client
  - OAuth / OAuth2 clients with token refresh
  - JSON/YAML/XML parsing/generation/mapping
  - JSON/YAML mapping to classes
  - native templating similar to ERB
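
As a quick sketch of the first three points, here's a tiny JSON service; the Greeting class and the port are made up for illustration, everything else is plain stdlib:

  require "http/server"
  require "json"

  # Hypothetical payload type; including JSON::Serializable
  # derives #to_json and .from_json from the typed instance vars.
  class Greeting
    include JSON::Serializable
    getter message : String

    def initialize(@message : String)
    end
  end

  server = HTTP::Server.new do |context|
    context.response.content_type = "application/json"
    context.response.print Greeting.new("hello from the stdlib").to_json
  end

  address = server.bind_tcp 8080
  puts "Listening on http://#{address}"
  server.listen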


I also wished for a long time for Crystal to take off, but now having Go and Rust I don't see Crystal gaining much relevance in the future. While it might be appealing to Ruby devs, Ruby is becoming better every day, and Ruby devs stay in Ruby because of the ecosystem anyway. The company where I work has already decided that no new project will be written in C, C++, Ruby, or Java, and to use Go or Rust instead; I suppose other companies are doing the same.


As a Ruby dev, Crystal being "close but different" means I know I won't be able to use my Ruby code, but at the same time I get too little in return to make the switch feel worthwhile.

For the very rare circumstances where Ruby is too slow for what I need to do, there's usually code in some other language I can wrap in an extension, and that code is far more likely to be C or some other more established language than Crystal.


To me Crystal is actually a more polished Ruby implementation. There were some annoying inconsistencies in the Ruby stdlib that were corrected in Crystal, for example in the modules for file/directory interaction.

Ruby was my main language for scripting, but I switched to Crystal completely after getting comfortable with it. Now it is really hard to switch back to Ruby.

Also, I think a compiled language is the greatest friend of the older programmer. Over time I'm getting more and more sloppy and very often make some typing error. Crystal being a compiled language has saved me countless hours by refusing to compile when I make such a mistake. In Ruby the mistake can only be found after the script has run for a while; sometimes that became a disaster because the action was destructive.
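
A contrived sketch of the kind of slip I mean (the file names are made up):

  # Crystal rejects this at compile time with something like
  # "undefined method 'delte' for File.class", before anything runs.
  # The equivalent Ruby typo only raises NoMethodError at runtime,
  # after the loop has already started deleting files.
  Dir.glob("*.log").each do |path|
    File.delte(path) # typo for File.delete
  end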


The thing is, any annoyance in the stdlib you can paper over with gems if you want, and still be able to keep using all the libraries out there.

As for the type checking, Ruby is getting more and more opt-in type checking, so that reason is slowly diminishing as well.


> , but now having Go and Rust

If you go for Go or Rust you would never go for something like Crystal anyway, as they have different focuses. Just like Ruby, Crystal optimizes for code legibility and expressiveness, which neither Go nor Rust ever really cared about, as it's about stability + performance (in the case of Go) or safety + performance (in the case of Rust). Both Go and Rust have horrible syntax compared to Ruby/Crystal, which is a fine tradeoff to make in some cases.


That must be subjective because Go's syntax is pretty fine to me.

To illustrate, when I started learning Go, I implemented Concurrent Hash Array Mapped Tries from some PhD thesis and the pseudo code algorithm actually looked pretty much like the Go code. A testament to how legible it is.

Then again, I have never tried ruby so I can't compare.


Apples and oranges. Ruby is very, very expressive compared to Go. Go's legibility comes from the simplicity of its syntax, Ruby's legibility comes from its expressive syntax.


Can you give either an example or explanation of what you mean by expressive syntax? People talk about it but I don't really know what they mean.


Ruby was one of my first languages, and for a while I described it as 'programming in English', especially Rails with the ActiveSupport gem. I often just fired up IRB (the REPL) and guessed what method names I could use on a given object, and it worked. I think this is why a lot of DSLs ended up in Ruby (chef, etc.); plus the method_missing idiom allowed for a lot of this sort of expressivity, at the expense of sometimes being thoroughly confused as to where a method was defined.

Overall I ended up writing more performance-oriented applications, and Ruby (at least at the time) wasn't worth trading 'it's pleasant to read and write' for 'this Ruby script that used to take a day now takes 15 mins written in a language more suited for the task'. The other language I'd consider less expressive only in a less optimized-for-the-human-while-writing kind of way.

Crystal seemed nice in the respect that it might be the best of both worlds, but like mentioned elsewhere here it's still fairly niche.


I looked at the activesupport gem and I think maybe the things we call general purpose languages are actually just really broad domain-specific languages. From the gem description: "Rich support for multibyte strings, internationalization, time zones, and testing." I do testing, and occasionally I have a string, but mostly I only do numeric programming, and I don't think of that as a niche. Fortran was/is for numeric programming, and I think of numeric programming as the default, normal sort of programming. That probably isn't so much the case anymore, but we probably all think of our own domain as the normal one. So expressiveness is probably highly relative to the domain. Julia seems very expressive to me. So do Rust and modern C++. But I don't do much string handling, and I don't really even know what internationalization means.


Ruby's extreme expressiveness comes from a few things. First, everything is an object. Second, Ruby has lots of functional features that make it exceptionally concise to express certain things.

Here's a simple example I'll do in JavaScript first since most people can read it. You have an array of the numbers 1 through 10 and want to get a new array with only the odd numbers.

  arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
  arr.filter(num => num % 2)

  > [1, 3, 5, 7, 9]
In Ruby the direct translation would be a subtle trap: 0 is truthy in Ruby, so `arr.filter { |num| num % 2 }` would keep every element. You need an explicit comparison:

  arr.filter { |num| num % 2 == 1 }

  => [1, 3, 5, 7, 9]
But there's a better way. Since everything is an object, including integers, you can just ask the numbers if they are odd instead of using the mod operator.

  arr.filter { |num| num.odd? }
  
  => [1, 3, 5, 7, 9]
(Yes, method names can end in a question mark (?). By convention, methods that end in a question mark are predicate methods that return a boolean. Methods can also end in an exclamation point/bang (!). By convention, these bang methods are "dangerous" – they might do something destructive like mutate an object in-place instead of returning a new copy.)

We can make this even more concise though.

  arr.filter(&:odd?) 

  => [1, 3, 5, 7, 9]
There's a bit to unpack here.

:odd? is a Symbol

The & operator calls the #to_proc method on the symbol :odd? then converts the result into a block, which is passed to the filter method. In other words, the #odd? method is called on each element in the array no different than the previous example.

Here's another way to express the exact same thing:

  arr.reject(&:even?) 

  => [1, 3, 5, 7, 9]
This is just one simple example. Ruby can express many things on one line that would take several lines in other languages. This is what makes it expressive – it's the density and readability of operations.

Another one of my favorite language features is that parentheses on method calls are optional, so a lot of syntax that looks like it would be language-level operators are actually just regular method calls to objects in Ruby. An example of this is ==, which is an operator in many other languages. In Ruby it is a method call. Consider this expression:

  true == false
This is calling the #== instance method on the object `true` (which is an instance of TrueClass), passing `false` as the first argument. Written identically:

  true.==(false)
This is wild if you're coming from another language where things like this are operators, but it is completely natural in Ruby. It is extremely powerful.


Clojure says “hold my beer”.


Rust is pretty expressive. Horrible syntax... that's just a subjective measure. I wish programmers would get over it, because it's a huge hindrance to trying a large number of languages out there with lots of nice things. But people are gonna be people, so it won't happen.


Yeah, I agree that it's highly subjective. And I don't mean that people shouldn't choose something just based on one variable, it obviously depends. No language is perfect for every use case. Personally I use Rust for a lot of things, even though I'm more productive in other languages, but sometimes it's just a really good fit for the problem.

Pseudo-code, but this is how I see the difference in expressiveness and syntax:

   3.times {
      println('Foobar')
   }

   for i in 0..3 {
      println!("Foobar")
   }
Both of them make sense though, so not a huge difference, but it's like this in lots of places in Rust in general.

> because it's a huge hindrance to trying a large number of languages out there with lots of nice things

I know exactly what you're talking about, as most of the time I work in Clojure, and trying to show Clojure to other programmers who are used to C-like languages and have never heard of any Lisp-like language is a constant struggle, as their first reaction is always "eww, parentheses everywhere!" even though their favorite language usually has the same amount or even more.


Ruby is fine until your large application starts to include more and more dependencies. There is no coherent API across them, every library has its own mini DSL, and the mess starts to build up. Then come method_missing, monkey patching, and strings and dictionaries being passed around all over the place. True horror. All of Ruby's readability disappears. Small apps look beautiful, but large ones quickly become a mess. On top of that, resource utilization is atrocious. But there are places where Ruby shines, like RSpec, Rails, and pry at runtime (I miss that one).


That's one of the aspects in which Crystal shines.

It retains most of the flexibility and DSL capabilities, but the type- and nil-safety allow you to leverage them without nasty surprises at runtime.

Not having to guess what a method or block returns and instead having the compiler tell you when you get it wrong makes an enormous difference.
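
A minimal sketch of what that looks like in practice (the ENV lookup is just an illustrative example):

  # The compiler forces the nil case to be handled before use.
  name = ENV["USER"]?      # type is String | Nil

  # puts name.upcase       # Error: undefined method 'upcase' for Nil

  puts name.upcase if name # OK: inside the if, name is narrowed to String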


Rust is spiky, like a crab.


I've been learning Rust over the last few months, because it seems like a good language to have on my CV.

It's certainly a taste thing, but I am not enjoying it at all. I find it unpleasant. I think it's going to join the short list of "languages I can use but would prefer not to".

I've never learned Crystal, but if it's like Ruby, then I think I'd enjoy it just fine.


It's funny how taste works. I love writing Rust, to the point that almost all of my personal projects are now in Rust.


Exactly. I was a Crystal enthusiast, but because of the slow development pace, the small community, and the half-baked documentation (all understandable, no criticism here), plus the rise of Go (great documentation, fast-paced development, huge community), it was hard to keep my focus on it. Go became my alternative to Ruby, and I'm doing fine with that.


I can't suffer Go. C, Rust, C++, Ruby, Python, Java, C#, and VB.NET are all fine; F# looks cool too, but I've never used it.

I started to use Crystal in a small commercial project, as a web backend, and it was very fun and promising.

But the project got cancelled for completely unrelated reasons.

I long for an excuse to use Crystal again. Hard typing combined with Ruby's carefree style and EXEs without a runtime... really cool stuff.

I think Crystal will crack the Win32 nut eventually.


I think Windows is the main thing holding Crystal back. I make little games for jams and I'm always trying out new languages. I submitted a bug related to running processes on Windows last November; it was only recently closed as completed. Although I spend almost all of my at-home time on Linux, I know that I have to provide Windows executables for jams. I plan to come back to Crystal when I can write once and compile the same code everywhere.


I stay in Ruby because my customers want Rails and because my own scripts don't need to be fast. Edit and run is one step faster than edit, compile, run.


As a very, very long-time member of the Ruby community:

> an integrated RSpec-like test framework

Please no. Please, please stop blindly cargo-culting RSpec into your projects. Minitest is included with the language and has all the assertions you might generally want. The only thing RSpec really "adds" is an insane and quirky DSL for the sake of "reading like English", which is a terrible misfeature. Most of the rest are terrible, terrible misfeatures as well.

As an example, one of these misfeatures is built-in mocking. Mocking is great, you might think. Yes, mocking is a necessary evil in certain cases, but it should be used very sparingly. What happens instead is that people don't bother to design their classes for easy testing and they mock the hell out of everything in order to make their code testable. So what's the problem? Tests are supposed to do two things: 1) discover bugs you didn't realize you wrote, and 2) allow refactoring where, if the tests pass, your code is probably alright. Widespread mocking absolutely tanks both of these goals. Instead of testing the effects of code, 90% of the Rails tests I see in the wild mock everything to the point that the tests simply confirm that the code is implemented the way it is currently implemented. No bugs can be unearthed by this method because none of the actual effects are tested, just that certain methods are called in a certain order. And refactoring is now insanely difficult because any change to the logic causes the tests to fail, even if the effects are the same.

This is just one example.

So please, I beg you, stop reflexively reaching for RSpec. Minitest is great, it has most of the things you need out of the box. And Rails has a test framework built on top of it that similarly already does all the things you need. And it's all just plain Ruby.

</soapbox>


Counterpoint: why should I care if it "takes off"? It is a very much stable, fast, useful language that can be used today.


You get more and better libraries if the language is popular. The providers behind your favorite APIs (AWS, for example) ship clients for the language.


I wonder if the lack of libraries in a new language is just a few ChatGPT integrations away from being solved. After all, Crystal is so similar to Ruby. I bet as soon as context/environment-aware ChatGPT tools/agents are available that can compile, run tests, and apply fixes until things work, this will become the reality: "Create this Ruby library but for Crystal", and a few hours later you can include it in your Crystal project.


I would personally never use a ChatGPT-transliterated library.


The FFI wrapper for C libraries is great, and one can use imagemagick and pngquant and such via an IO buffer wrapping a call to a command.

Crystal works great TODAY and will blow Ruby out of the water on performance. I simply can’t justify RoR in 2023 given Crystal
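
For a flavor of that kind of wrapping, here's a minimal sketch (it assumes pngquant is on PATH; with "-" pngquant reads a PNG from stdin and writes the quantized result to stdout):

  # Pipe image data through an external command using IO buffers.
  output = IO::Memory.new

  File.open("in.png") do |input|
    Process.run("pngquant", ["--quality=65-80", "-"],
      input: input, output: output)
  end

  File.write("out.png", output.to_slice)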


More, of course. But better? Percentage-wise, unlikely.


Maybe not if you treat "good" as a threshold, but the quality of the good libraries will improve with a larger pool of skilled contributors.


The more people use it, the more jobs there will be for it, and the more the ecosystem will flourish.


While it may be possible to onboard Ruby developers fairly easily, it is still easier to find talent for an "established" language. For my personal projects I gladly use Crystal, but in my professional role as a technical decision maker I need to factor in aspects like this.


Don't get your hopes up, Crystal is doomed to never become mainstream.

We can divide programmers into two camps, those that enjoy programming itself and those who use it as a means to an end. The latter greatly outnumber the former, let's say 9:1. That massive disparity in numbers is why only the languages that enable the latter group thrive.

Ruby is the perfect example, the language got a massive exposure boost due to Rails, but once the hype died down, everyone left. That's because beyond Rails, Ruby has nothing to offer to programmers who want to get stuff done. Nothing besides pain, of course.

To those who enjoy playing with languages, "did you know there are 10 different ways you can filter an array in Ruby?" ([1]) is joyful to hear. But when you're woken up at 3am to find a bug in production `arr.reject(&:even?)` is the last thing you want to see.

This sort of cleverness, ambiguity and implicitness in language design repels 90% of programmers, and that is the reason why languages like Perl, Ruby, Scala, and now Crystal are either dead, dying, or destined to die.

[1]: https://news.ycombinator.com/item?id=35836570


(self-reply) Addendum:

EWD 340 (Prof. Edsger Wybe Dijkstra) [1]:

"The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. In the case of a well-known conversational programming language I have been told from various sides that as soon as a programming community is equipped with a terminal for it, a specific phenomenon occurs that even has a well-established name: it is called “the one-liners”. It takes one of two different forms: one programmer places a one-line program on the desk of another and either he proudly tells what it does and adds the question “Can you code this in less symbols?” —as if this were of any conceptual relevance!— or he just asks “Guess what it does!”. From this observation we must conclude that this language as a tool is an open invitation for clever tricks; and while exactly this may be the explanation for some of its appeal, viz. to those who like to show how clever they are, I am sorry, but I must regard this as one of the most damning things that can be said about a programming language. Another lesson we should have learned from the recent past is that the development of “richer” or “more powerful” programming languages was a mistake in the sense that these baroque monstrosities, these conglomerations of idiosyncrasies, are really unmanageable, both mechanically and mentally. "

[1] https://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340...


> when you're woken up at 3am to find a bug in production `arr.reject(&:even?)` is the last thing you want to see.

Really? Because even if you're seeing it for the very first time, it means exactly what you'd think it means. The only thing that's even potentially sort of tricky or non-obvious is the ampersand. And there's nothing implicit or ambiguous in the expression, either.


> implicitness in language design

In a sense, all languages have implicit aspects. Can you characterize the problematic implicitness? (I guess python's TOOWTDI helps, when reading code).

I've been studying bidirectional transformations, and a survey paper notes that "relational" methods have died off (somewhat "declarative", where you specify how things should be connected, with the details implicitly worked out for you... something like set-builder notation), while various coding methods remain (where you code the forward transformation, and the backward transformation is automatically derived). They hypothesise it's because relational tools (libraries and projects) are difficult to maintain - but I suspect it's because they are difficult to use.

Which is a shame: something that automatically "does what you want" seems like a great idea! But being difficult to predict and diagnose - not being in control - is not great. So, to rephrase my question, what characterizes "good" automation?


Implicitness examples: precedence, type coercion, overflow, polymorphism.

But I suppose (e.g.) precedence doesn't arise that often; it "should" be familiar from high-school algebra; it can be made explicit (with parentheses or successive assignments).


"Explicit" would seem to be "obvious" - is that necessarily true? Too much code can obscure what is happening.

Familiarity helps implicitness be understood. Some would argue lack of familiarity is the only problem with the languages you mention. Certainly, it helps: as you become more expert in anything, you start using short-hand and jargon, and skipping details.

Adoption is easier when it's similar to the familiar. Hence, "C-like". And Python, pseudo-code-like.

But are some things easier to grasp, by nature? I think, "intuitive" often means familiar. And there are things that all human beings are familiar with, like space, trade and language. Perhaps this is where intrinsically obvious and familiarity coincide?


It looks nice, I've been wanting to try it. Regarding tooling, how is the debugger/debugging support?


It’s a bit of legwork, but it’s there.

https://github.com/amberframework/docs/blob/master/examples/...


Thanks, bookmarked!


Crystal won't take off because it's an "amateur" language; to have a good and properly supported language you need a lot of people and money, which Crystal has not.

I would not bet my business on a language where there is just one full-time dev working on it, relying on donations.


> because it's an "amateur" language

Python was an amateur language until it was not.


This isn't quite true. Python was certainly Guido's passion project, and it started that way. But early in its life, it was funded by the United States government via CNRI (under Bob Kahn).

At every job Guido had subsequently, he had at least some buy-in to work on Python, and some coworkers were also paid to work on Python.

There was a core "PythonLabs" group (I think including Jeremy Hylton?) that apparently moved from CNRI to a startup, BeOpen:

https://www.computerhistory.org/collections/catalog/10273876... (I watched this whole video, there's also a couple Lex Fridman videos)

At Google from 2005 to ~2014, Guido nominally had 50% of his time to work on CPython, but that's pretty fuzzy because he was really productive in any case. (I was his officemate for a couple years during that time.)

https://en.wikipedia.org/wiki/History_of_Python


Yes and Python (released in 1991) took a lot more time to become popular compared to Java (released in 1995) which was corporate backed.

Python still seems to have funding issues. For example, JS (backed by multiple big corporations) performance has improved a lot compared to Python. The main Python implementation doesn't have a proper JIT as far as I know.

So it shows the difference between amateur languages and corporate backed languages.


> Yes and Python (released in 1991) took a lot more time to become popular compared to Java (released in 1995) which was corporate backed.

Being corporate backed is the real answer. Languages that became popular through grassroots support are the exception, not the rule.


Even Rust started as a "amateur" language until it was picked up by Rust.

Nothing is saying that Crystal won't be picked up just like Rust was, by some company.


> Even Rust started as a "amateur" language until it was picked up by Rust

?!?


Mozilla maybe?


What a brain fart. Yeah, I meant "picked up by Mozilla", thanks :)


This reads a bit like a tautology..."Crystal has not taken off because Crystal has not taken off".


No, it's quite possible for a language to have strong backing before it takes off (C#, JavaScript, Go, Swift, Dart, and Rust all did, to varying extents, from day one), and those tend to also be the languages that take off quickly. Other languages struggle until they attract either a sufficient mass of distributed backing or the one key backer that gives them the level of support and visibility that lets the adoption curve take off, and most never reach that point.


Crystal has been going for years and there is still no major support; that won't change. Basically, if you're a new language you have a couple of years; after that, you've just missed the train.


Not historically true. Python didn't take off for years after the initial release in 1991. Even in the mid 2000s people were evangelising Python like it was a new thing (https://xkcd.com/353/)

Or take Rust, which has gained adoption more recently. It took a while to get going after the 1.0 release in 2015. If you look at daily downloads of Rust crates (https://lib.rs/stats) as a proxy for adoption:

- 2015 to 2017 - not many downloads

- 2018 - first signs of real growth, reaches 1M per day.

- 2018 to 2021 - 10x growth, 10M per day

- 2021 to 2024 - 10x growth (projected)

So new languages shouldn't lose hope if they don't see adoption immediately. It's possible that they might be adopted later. But for language authors, it's the hope that kills you. You continue working on it even if maybe you shouldn't.


Another one: Nix started in 2003 but didn't really enter common usage/popularity until the last 4 (or so) years, as far as I can tell.


Define "shouldn't". Working on a language is a gift in itself.


Sure, let me define "shouldn't". For those authors who hope to see their language become mainstream and achieve widespread adoption, they may want to cut their losses. If they're working on it for intrinsic reward they should continue, by all means.


I'm not sure that I agree with the poster you are replying to, that Crystal won't take off, but I'm pretty sure that the statement "Crystal has not taken off because Crystal has not taken off" is kind of useful/valid/worthwhile. I believe that this is how network effects work.

I also am pretty sure I read about someone here on HN who used Crystal to build the code for their profitable cookie baking/selling business (the baked goods). I can't recall the details exactly. Edit: Here it is: https://news.ycombinator.com/item?id=23433847


Their development team is slow ass. They live in an idealistic world where Windows users are not worth supporting. Bugs and features take too long to fix and ship. Feels more like a toy enthusiast project relying entirely on organic growth. It cannot survive in a world where the most successful programming projects have full-time corporate backed teams.


Your entire complaint appears to be based on the timeline of Windows support.

You are free to use Windows as your principal OS, but please disclose your biases when totally trashing a team and product over what is essentially a technical issue of significant proportions, not a product of laziness or slowness, as you allege.


I wish Nim would take off...but alas the worst languages are always the most popular - JS, C++ ...


The most flexible languages with the most advanced development tools are always the most popular - JS, C++


Nim really looks awesome too! If I came from a Python background, I could see myself reaching for Nim rather than Crystal.


All those features are offered by Go too. The similarities between Ruby and Crystal are normally easy to spot in small projects, but their differences get really visible in big projects.


Go is not a good comparison.

Crystal is expressive, 500 lines of Go commonly translate to 50 lines of Crystal. The DX in Crystal is really closer to Ruby than to Go. It's essentially a very fast and type-safe version of Ruby.


> The DX in Crystal is really closer to Ruby than to Go.

As a polyglot I can say that it is much easier to switch completely between languages and paradigms than to write or speak similar (but not equal) languages. So the similarity is maybe a great reason for Rubyists to start playing with Crystal, but as soon as you start to work daily with the language, other factors like good documentation, tons of examples, a great standard lib, an active community and development, etc. are much more important than whether it resembles Ruby or not.


Crystal core developer speaking:

I think this argument is less about the concrete syntactical and semantic similarities to Ruby, but the shared general idea to focus on developer happiness. For example, code is easy to read, yet still expressive.


Maybe with LLMs it won't matter much anymore? Maybe in the near future we'll have a Copilot for Crystal as a VS Code plugin, and the rest won't matter much - just the quality of that plugin?


What will your LLM be trained on?

It only makes the problem even more acute I think.


Yesterday I told ChatGPT:

> We're going to write a program in a "new" language that is a mix of Ruby and INTERCAL. We're going to take the "come from" statement and use it to allow "hijacking" the return of a function. Furthermore, we're going to do it conditionally. "come from <method> if <condition>" will execute the following block if <method> was executed and <condition> is true. In <condition>, "result" can be used to refer to the result of executing <method>.

And that was enough to get it to understand my example code and correctly infer what it was intended to return.

Given it took that little, I don't think you need much code in a new language before you have a reasonable starting point, as long as you can give examples and document the differences, and then use that on a few large projects that have good test suites and work through any breakage.


You still need libraries and the ecosystem.

If it can't find those, then because there is no training data, best case scenario is that it will hallucinate APIs that don't exist.


If the LLM understands the language it can aid in creation of the libraries and ecosystem because it can also translate code. I just tested it by having ChatGPT translate one of my Ruby scripts to Python for example.

I don't like Crystal all that much, but it's similar enough to Ruby that if ChatGPT can handle Ruby->Python, it can handle Ruby->Crystal with relatively little work.

But it doesn't need to handle it flawlessly, because every new library you translate and fix up gives you a new codebase you can use to finetune it.


We need RLHF -> RLCF/RLIF/RLEF (Reinforcement Learning from Compiler/Interpreter/Execution Feedback).


But that's the thing: once the libraries are written, it should be able to just read/learn/train on them, that's all.


Ideally on the source code, so it knows the standard library and how to use it with enough comprehension to be useful.


At that point, why bother with high-level languages at all? A sufficiently-good AI should be able to read a specification / test suite and directly generate a binary which passes those tests.


Maybe. Or maybe it'll become the lowest common denominator for validation/seeing what is happening. Programming languages are also a good intermediate language between humans<->machines and machines<->machines, apparently, given recent AI advancements.


> 500 lines of Go commonly translate to 50 lines of Crystal

This is a vast exaggeration. And even if I grant it, it's still thousands of LOC that you don't have to write because of available libraries.


IMO crystal's value is that I can do things normally associated with golang without the horrors of actually dealing with golang's absence of a good type system and other quirks like error handling.


Although they do try to make it easy to use C libraries:

https://crystal-lang.org/reference/1.8/syntax_and_semantics/...
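
A complete binding can be just a handful of lines; here's a minimal sketch against libm (the function choice is arbitrary):

  # Declare only the C functions you need, then call them like class methods.
  @[Link("m")]
  lib LibM
    fun cos(x : Float64) : Float64
  end

  puts LibM.cos(0.0) # => 1.0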


Go's type system is anything but nice


I have some bad news for you. They are not updating their browser integrations for <= v7 and the newer ones will not work with local vaults. De facto, they are already deprecating standalone licenses/local vaults.


I'm stuck on macOS 10.15 for various reasons and so far 1password 6.x hasn't broken.

Taking your warning as a catalyst to find an alternative.


This is my stance as well. I have not chosen a successor yet, but I’ll have a look at Bitwarden, Keepass and the recently released Proton Pass.

Trusting Dropbox for sync (which I did) meant trusting a cloud service, too, but IMO it is a less lucrative target for hacks than a server that stores _nothing but_ credentials. Also, using Dropbox made me less dependent on connectivity (LAN sync) and would let me switch providers quite easily.


I'm going to try KeePassXC & Syncthing. I assume it's going to be nowhere near as good as 1P, but between no extension support, no local vaults, and secret security ops, I don't see a choice.


Also a recommendation for Strongbox if you're on a Mac. I've tried "all" since switching from 1Password a few years ago and it was the one I liked the most.


Thank you for the suggestion, I’ll check it out.


I've built a proof of concept myself in Ruby: an agent that extends itself with new abilities at runtime, based on code suggestions from ChatGPT, until it deems itself capable of fulfilling an arbitrary task given in natural language. It then persists that "specialized" version once the task has been completed successfully.

So far it was able to add an API to fetch me the local weather, return the capital of Venezuela, control brightness and volume of my MacBook and replicate itself at random locations (but tell me where, after).

That being said, I added multiple human-in-the-loop points in the assess/suggest/patch/execute cycle and (given the nature of LLMs) would never use it outside of a sandbox without these safety rails.


>fetch weather

>get capital

>control volume

>oh btw, it's also a self-replicating AI

https://tvtropes.org/pmwiki/pmwiki.php/Main/ArsonMurderAndJa...


I was just listing the goals I gave it in temporal order, but I’ll include a weak task for the giggles in the future when talking about the POC. Good suggestion!


History really rhymes. This is another step in Replit retracing the path of Heroku (which I mean as a compliment).

A next generation of IaaS with great DevX would be a relief to everyone burnt by AWS's complexity overhead (for a great share of customers' needs).

