A full rewrite, instead of finding a way to migrate incrementally, already feels like a lot of risk. Asking your core dev team to potentially pick up an entirely new skill set in Rust would pile too much on top and would probably cause people to flake from the project.
Totally agree something like Rust would be good in a vacuum, but the existing contributor base and ecosystem would present problems. Having tooling built in the same ecosystem as the end product makes it way easier to contribute.
"Pandas alternative" kind of undersells it -- it's drastically faster and supposedly has a much more intuitive interface. The limits of how long you can keep doing things entirely in-memory (and postpone the move to something like Spark) get higher and higher.
The installation experience and first five minutes of any kind of product are hugely make-or-break. I keep my resume in LaTeX via Overleaf, but I probably wouldn't bother with it if I had to get LaTeX running locally, which has always seemed fairly complex to me (though I'm admittedly no LaTeX expert and may be entirely wrong).
This surprises me. On most platforms it's just a package download and install. On Mac, it's MacTeX. On Linux, it's whatever your distro calls TeX Live, via the package manager. On Windows, it's MiKTeX. That's not exactly complex, nor does it require any sort of LaTeX expertise. Linux can be the one that requires the most thinking if your distro doesn't have one package that pulls in everything you need, but I can't remember it being more than a couple minutes of effort last time I did it on Ubuntu or Fedora.
The difficulty is getting multiple collaborators to install and pin the same packages, where everyone might be using a different platform/distro.
Example: I might commit a change that compiles perfectly fine with my version of amsmath, but conflicts with the version of amsmath in the style guide of some UC Berkeley department/lab.
It requires making choices and knowing what to install, and if things don't work, troubleshooting the install can be difficult. As a first-time task, "install LaTeX" is not the easiest, especially for newer users. I've done it half a dozen times and I'm still not quite sure I've ever gotten it right on my Mac on the first try.
I wasn’t aware of a brew package; I will definitely check that out. I have always used the texlive installer for macOS (MacTeX), which is very easy to use, though the install instructions can be a bit long, and they're important to read when Apple breaks things.
Is 10 gigs really that much nowadays? I have to think that if you're frequenting HN, you're likely to have at least a terabyte of storage on your personal computer?
It’s not about the HN visitor… it’s about the collaborator or grad student who might be on an entry-level computer with 8 GB of RAM and 256 GB of storage. The entire system needs to be easy for them to install and maintain. And even if I have 1 TB of storage, if I could avoid an extra 10 GB in my backups, I’d appreciate it.
The fact that you don't seem to realize that downloading 10 GB of stuff just to edit/generate PDF documents is completely bonkers just shows how out of touch LaTeX aficionados are.
As far as I'm concerned, the output is pretty good, but until somebody makes no-nonsense software that can produce it efficiently, it might as well not exist at all.
Fair enough. I have a Nix flake that handles this stuff for me now, so I just run `nix build`, but obviously that's getting into territory that is super geeky.
“AI” (as absurdly broad a term as it is) has legitimate use cases. It isn’t JUST hype. However, because of the buzz, it’s being shoehorned into so many places it just isn’t the proper fit for, and it’s hard to figure out where it does fit.
However, there ARE plenty of areas it IS the right fit for. Lots of “fuzzy” problems that would be hard to solve with rule-based, generalizable systems benefit hugely from LLMs and other intentionally broadly scoped tooling.
Source — I work at a “chat with your data” startup, and our product just categorically wouldn’t be worthwhile if the above weren’t true :)
I don’t closely follow the space, but Anker has always seemed like reasonable quality at a reasonable price. Maybe I’m easily marketed to or something. But I’ve had quite a few adapters and cables and such from them and have never had an issue.
I think Rust only really shines when you factor in the longer-term lifetime (pun unintended) of the code. If you’re just focused on how much time it takes to get something working — in other words, a PoC / MVP — it doesn’t seem surprising to me that it’s significantly slower.
The promise to me lies in the entire classes of errors you systematically prevent from happening (given no unsafe code) and, more generally, in how much easier it is to write maintainable and bug-free code — see the sketch after this comment.
These mechanisms are part of a very broad set of tooling that slows you down in the short term but pays off hugely over even a medium-term timeframe in a real business product that’s intended to be long-lived.
Granted, Rust has its tradeoffs just like any other language — from what I hear, refactoring and fighting the compiler in certain domains like gamedev gets annoying — but it seems much more positive than negative.
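To make “entire classes of errors” concrete, here’s a minimal sketch (hypothetical variable names) of two bugs that safe Rust refuses to compile; the commented-out lines are the ones the compiler rejects:

```rust
fn main() {
    // Use-after-move: ownership transfers, so any later use of the old
    // binding is a compile-time error rather than a use-after-free.
    let data = vec![1, 2, 3];
    let moved = data; // ownership of the Vec transfers to `moved`
    // println!("{:?}", data); // error[E0382]: borrow of moved value: `data`

    // Aliased mutation: you can't mutate while an immutable borrow is live,
    // which rules out iterator/pointer invalidation bugs at compile time.
    let mut scores = vec![10, 20];
    let first = &scores[0]; // immutable borrow held until the println below
    // scores.push(30); // error[E0502]: cannot borrow `scores` as mutable

    println!("{moved:?} {first}");
}
```

Both failure modes are runtime memory bugs (or data races, in the concurrent versions) in C/C++; here they simply never make it past `cargo build`.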
Totally agree. Maybe if OP had phrased it as “Urgency” instead of “Scarcity”, I would have agreed. But these are two distinct concepts that just happen to cause similar outcomes. A bit like the concept of an “incorrect” or “false” abstraction in SWE land.