In all fairness it's really hard to make money off of "reduce" and "reuse," and we've decided to use money as the system for allocating all of our resources.
I wonder about that - surely there must be a decent number of shoppers who'd prefer to buy products where you could reuse the containers, for a start. Recycling perfectly good glass, metal or even plastic containers always feels terribly wasteful to me. And it has to be economically more efficient, so it should be possible to sell the product at a lower cost.
Well I think you’ve hit the nail on the head. A functioning ecosystem and a functioning economy in our present contemplation of the term are not compatible.
It feels a little inevitable though, given the other rules of the system: Companies don't become more valuable by having a steady earnings forecast, they become more valuable by having an _increase_ in earnings that is forecast to continue increasing. How can everything always increase? Well when you reach market saturation on one product, start a new product in a different market. What if all the markets seem saturated? Find a new thing to commoditize, a new market to create... Any system where unlimited growth is rewarded is going to try to take over everything it possibly can.
In many ways, the growth obsession is a product of the Reagan-era deregulation craze, and the "Greed is the One True God" ethic that kinda took over Wall Street then. (Try reading some stories about GE before Jack Welch became CEO, and after.) Other big contributors:
- The huge sums of wealth which tens of millions of ordinary people feel they need to accumulate, to pay for their (grand)?children's college education, unexpected medical bills, many decades of retirement and elder care, etc. Compare that to (say) the 1950's.
- The huge structural and social barriers to creating small businesses in most economic sectors, compared to (say) 60 years ago. Back then, how many of the businesses in an average town were (at most) small-ish, family-owned companies?
- How d*mn much "money" is sloshing around the financial system these days, with the "at any sign of crisis, pour in more $trillions" monetary policies of the Fed.
> the growth obsession is a product of the Reagan-era
I hear you, but uncle Karl names and shames the growth-obsession in Das Kapital which predated Reagan by over 100 years. It's a part of the system that many people have incentive to deny, hide, or minimize, but endless growth seems to be a requirement of "the system of organizing our resources that prioritizes turning capital into more capital" aka capital-ism.
The continuing physical expansion of the universe eventually redshifts any photon so far that it becomes unobservable, so it actually goes the opposite way. One day, long from now, an alien civilization might look out into their night sky, see only their own galaxy, and be completely unable to tell the previous history of the universe from observing the sky.
There's also a possibility that a sizeable fraction of all stars exist in the intergalactic medium [1], having been ejected from the galaxies where they formed due to galactic collisions or encounters with their original galaxy's supermassive black hole. A civilization evolving around one of these stars in the far future would be totally unaware of the universe outside their own solar system due to cosmic expansion.
The other side to this is that given enough time, some civilisation will be oblivious to the rest of the universe. Their visible universe will just be their own galaxy.
If their physics is correct, they will figure it out. We did, because we verified that expansion is accelerating even though our physics is incomplete. Until the late 90s, physicists accepted a cosmological constant of 0, which meant no acceleration. It turns out that was wrong.
It may be possible that they figure it out via Quantum Mechanics, because the acceleration of the universe seems to be related to the energy density of the vacuum / empty space. The problem, I think, is that they won't be able to verify that empirically.
The problem is that at such timescales, the CMB will have shifted sooooooooo much that there’s nothing they will be able to deduce, all light has redshifted to absurd scales, and worst of all, everything will be so far away that light will never reach the galaxy.
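To put rough numbers on that: here's a back-of-the-envelope sketch (my own, not from the comment above), assuming a pure cosmological-constant expansion at a constant Hubble rate of about 70 km/s/Mpc. The real expansion history is more complicated, but it shows how quickly the CMB temperature drops toward undetectability.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // De Sitter approximation: with a constant Hubble rate H, the scale
    // factor grows as a(t) = exp(H * t), and the CMB temperature falls as 1/a.
    const double H       = 2.27e-18;   // ~70 km/s/Mpc converted to 1/s (assumed constant)
    const double T_now   = 2.725;      // CMB temperature today, in kelvin
    const double sec_gyr = 3.156e16;   // seconds per billion years

    for (double t_gyr = 100.0; t_gyr <= 1000.0; t_gyr += 300.0) {
        double a = std::exp(H * t_gyr * sec_gyr);      // growth of the scale factor
        std::printf("after %4.0f Gyr: scale factor up by %.2e, T_CMB ~ %.2e K\n",
                    t_gyr, a, T_now / a);
    }
    return 0;
}
```

Even at the short end of that range the CMB is already down in the millikelvin range; a few hundred billion years out it's colder than anything plausibly measurable, which is the point being made above.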
I could be wrong as I was young and not yet in the field, but my impression has always been that sometime in the 80s/90s as the whole "networking, world wide web, wowie!" moment happened, there was this idea that "maybe on a local computer everything is files, but on the network everything is streams. Hey, maybe everything is streams!?" and C++ just happened to be creating itself in that zeitgeist, trying to look modern and future-thinking, so somebody decided to see what would happen if all the i/o was "stream-native".
IDK, it'll probably make more sense in another 15 years as we clear away the cruft of all the things that tried to bring "cloud native" paradigms into a space where they didn't really fit...
I think it is simpler and more technical than that.
The big thing is that they wanted type-safe IO. Not like printf, where you can print an integer with %s, the compiler won't have a problem with it, and the program will crash at runtime.
Reusing bit shift operators for IO is quite clever actually. If you have operator overloading, you have type safe IO for free. Remember C++ came out in the 80s as a superset of C, these technical considerations mattered. std::println doesn't look like much, but it actually involves quite significant metaprogramming magic to work as intended, which is why it took so long to appear.
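A minimal sketch of that point (the specific snippet is mine, not from the comment): printf has to trust the format string, while operator<< picks the right overload at compile time, so a type mismatch simply can't be expressed.

```cpp
#include <cstdio>
#include <iostream>
#include <string>

int main() {
    int answer = 42;

    // printf is not type-checked by the language itself: the format string
    // claims a string but we pass an int, which is undefined behaviour.
    // (Modern compilers warn about this, but only as a special-cased check.)
    // std::printf("%s\n", answer);

    // The stream version selects operator<<(ostream&, int) at compile time,
    // so there is no format string to get wrong:
    std::cout << answer << '\n';

    std::string name = "world";
    std::cout << "hello, " << name << '\n';   // user-defined types can add their own overload
    return 0;
}
```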
> Reusing bit shift operators for IO is quite clever actually
It's a miserable trap. Operators should do something in particular, because of the Principle of Least Surprise. The reader who sees A + B should be assured we're adding A and B together; maybe they're matrices, or 3D volumes, or something quite different, but they must be things you'd add together, or else + isn't a sensible operator for them.
When you don't obey this requirement, precedence will bite you, as it does for streams. Because even though you're thinking of this as formatted streaming, to the compiler it's still just a bit shift: these operations keep bit-shift precedence and bind exactly where a bit shift would (see the small example below)...
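For instance (a hypothetical snippet of my own to illustrate the precedence point): anything with lower precedence than << has to be parenthesised, or the expression parses in a way you almost certainly didn't intend.

```cpp
#include <iostream>

int main() {
    int flags = 0b1010;
    int mask  = 0b0010;

    // Intended: print (flags & mask). But '<<' binds tighter than '&', so this
    // parses as (std::cout << flags) & mask and fails to compile, since there
    // is no operator& for an ostream and an int:
    // std::cout << flags & mask;

    std::cout << (flags & mask) << '\n';   // the parentheses are mandatory

    bool ok = true;
    // Intended: print "yes" or "no". Parses as (std::cout << ok) ? "yes" : "no",
    // so it prints "1" and quietly throws the strings away (compiles, perhaps
    // with a warning):
    std::cout << ok ? "yes" : "no";
    std::cout << '\n';
    return 0;
}
```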
Stream IO looks like something dumb you'd do to show off your new operator overloading feature, because that is in fact what it is. It should have existed like Mara's whichever_compiles! macro for Rust - as a live-fire warning, "Ha, this is possible, but for God's sake never use it" - but instead it was adopted for the C++ standard library.
I have no specialist knowledge in this subfield, but after reading the article's arguments that basically if you could sic the entire bitcoin network on 2048 RSA it would take 700+ years, I have to wonder about perverse incentives.
Another thing that's missing is the lifetime expectancy, e.g. "for how many years does something encrypted in 2030 need to be unbreakable?"
The author doesn't seem to be a big authority, so they have little to lose by staking their reputation on "you don't need it to be that good," whereas, by the very nature of their authority, anyone in the resource you link is going to be motivated to never be wrong under any circumstances. So if someone with reputation/authority/power to lose thinks there's a 0.001% chance that some new incremental improvement will allow fast-enough breaking of 2048-bit encryption created in 2030, within a window where that would be unacceptable, then they're motivated to guess high. The authority in this case doesn't directly bear the costs of too high a guess, whereas it could be very bad for, I dunno, some country's government, and by extension the org or people that made that country's standards recommendations, if some classified information became public 15 or 50 years earlier than intended just because it could be decrypted.
By "perverse incentives", do you mean something like: "it appears the cryptographic research department has hit a brick wall in terms of useful advancements, so we're reducing the budget and the department head will be taking a 75% pay cut"?
I mean, like, the incentives aren't aligned. So maybe you're giving an example, but I'm honestly not sure. :)
in the space of cve or malware detection, the user wants a safe/secure computing experience with minimal overhead, but the antivirus / cve-scan vendor wants to claim that they're _keeping_ you safe. so they're motivated to tell you all about the things they scanned and possible attacks / vectors they found. You probably would've been safe responding to only a subset of those alerts, but they have no incentive to minimize the things they show you, because if they ever missed one you would change vendors.
in the space of cryptography, the user wants secure communications that are unbreakable but with minimum hassle and overhead, but the advisory boards etc. are incentivized to act like they have important advice to give. So from the user perspective maybe it makes sense to use 2048-bit encryption for a few more decades, but from the "talking head" authority figure perspective, they can't afford to ever be wrong, and it's good if they have something new to recommend every so often, so the easiest thing for them to do is to keep upping the number of bits used to encrypt, even if there's a 99.99% chance that a smaller/shorter/simpler encryption would've been equally secure.
I assume you’re aware, but for clarity: it’s not possible to sic the bitcoin network on anything, even cracking sha256, which it uses internally, due to hard-coded ASICs that incorporate specific quirks of the proof-of-work.
It seemed like the reason to use the Bitcoin network in the discussion was to ground an estimate of what a theoretical nation-state actor might possibly be hiding in the traits of something that physically exists (instead of discussing things completely in imaginary-land).
I said this elsewhere too -- while in general I like things _like_ the poignant guide, and appreciate its existence, I never actually finished reading it.
To me it's an interesting touchstone work showing/reminding that the "invisibly neutral" tone we've collectively adopted is still an editorial choice and cultural moment.
I'm sure writing and communication styles will drift back and forth over the next few decades. Sometime soon (if not already) another generation of younger developers will coalesce around some document that feels authentically counter-cultural, like their own late-night jokes and dreams have been given just enough coherence to hook them in, and then I probably still won't get it because I'll be out of touch.... :)
Honestly at the time I remember getting weird twee/precious vibes from the Ruby community and I wasn't particularly interested in it or Rails. I only discovered _why's poignant guide later after I had to learn Rails on the job, and to be honest I never finished it.
I still strongly prefer the worldview, circumstances, mindset, etc. where that kind of content is written, read, and celebrated over today's focus on Influence and Professionalism.
I actually knew _why's real identity, but I refused to divulge it, partly because I didn't want his performance to end (even though, sadly, I knew it couldn't last forever) and partly because _why is a fellow native of my home state of Pennsylvania. And I totally agree that his whimsical style is far preferable to anything ever written by any LinkedIn programming "influencer."
I seem to remember multiple incidents where Rails programmers complained that everyone at the conference they went to was playing Werewolf instead of sticking to the topic. It made me wonder what kind of community would lead to that.
As for _why, there is nothing special about him. If you need an overly precious twee white guy, we have a natural source of them called the city of Portland. You can go find another ten of them there. They have waxed mustaches and are running combination coffee bars and thrift stores.
That's because the rest of them are working in other fields (combination coffee bars, thrift stores and barber shops.) I am proposing hiring them as tech writers instead.
Also, I don't think identity politics particularly have anything to do with it. (Japanese people are even more hipster than Portlanders and of course invented Ruby, so you could probably find some of them too for diversity. The aesthetic would be different though because of cultural differences.)
I don't think that's true at all. If you go back and read his stuff, it ages pretty well in my estimation, largely because _why was apolitical. And he was way more informed by good humanities sources than any average Portland woke hipster.
I mean, he is pretty expressly anticapitalist in this work. The name "why the lucky stiff" is found in Rand. _why was regularly political, though it wasn't always a focus of his.
Even in the terms of the false dichotomy you've constructed here, I would much rather participate in a community of professionals who've organized themselves around sufficiently overlapping shared intents, than one accreted around the kind of twee, precious narcissism that characterized the early days of Ruby and Rails.
That comparison is informed by direct experience with both, and is the precise reason why my professional experience with both the language and the platform will to my dying day consist of one successful project a few months long.
A good professional community supports a wide variety of learning styles and levels of engagement, and tends to make a lot of resources easy to find for anyone who's willing to put in a little effort of research. The early Ruby and Rails communities did the exact opposite of this. Between "_why"'s guide serving a primary-reference role to which it is manifestly ill suited, and nobody much bothering to document anything effectively outside that, figuring out how to work with their garbage software was like pulling teeth - especially because, in a language constructed as a farrago of the worst ideas from Perl, Smalltalk, and Common Lisp, even reading the source is anything but a guarantee of understanding.
The structural exclusivity alone was bad enough, but the behavior of community leaders quickly demonstrated that the exclusivity was the point. That the "Poignant Guide" should be considered acceptable as primary documentation implies that the function of selecting for people willing to put up with that kind of nonsense was intended - maybe not as a matter of explicit design, although this would not surprise me, but "the purpose of a system is what it does". This system was created by narcissists to select for acolytes, and while I wouldn't quite call it a cult, neither am I prepared to say those who have are entirely wrong. And if it was a cult, it was a stupid cult, because Ruby is a language that makes programmers worse and Rails was never good technology; its sole unqualified success lies in having inspired software engineers to do similar things in better ways.
I'm sure by now I've upset some folks; there are some for whom no criticism of Ruby, Rails, or their leading lights can fail to read as a personal attack. If that's you who's reading this now, all I can say is this shows the difference between us: if you're cut out to be an acolyte, fine, go to it! I would rather be a practitioner in a community of practice. Granted this means my technical documentation comes without cartoon foxes and a level of pretense suited to an early 2000s Abercrombie catalog, but for documentation that actually documents that's a trade I'm happy to make.
It must have been awful when you were forced to engage with a whimsical community like that. I mean...I'm assuming you were forced, because why else would you harbor so much bitterness over the fact that there were once some people who did things differently than you would have liked?
My main criticism is just that your whole point comes down to "they did things in a way I didn't like!"
So what? They had different personalities or different goals. When I found _why's guide back in 2004 or whatever, at a boring job that involved a lot of sitting around, I learned Ruby as a side effect of poking through this odd little text. That introduced me to concepts I hadn't come across at school learning Java and C, and ended up nudging me into a different direction, so that by the next year I was writing my class projects in Lisp when I could.
If I'd instead come across a sober-minded manual documenting a proper "serious" programming language, I'd have skipped it. That would have sucked. I enjoyed programming a lot more in my own post-_why timeline, even though I agree in retrospect that Ruby isn't a great programming language for serious projects and the twee-ness of the guide itself was a bit grating (to me) at times.
Criticism is easy and mostly useless. There's so much criticism on the internet, seemingly because it makes people feel superior, and I find it exhausting. Just don't use the thing.
Any criticism boiled down far enough ends up in "they did things in a way I didn't like". It's not a meaningful response.
In this case, the criticism was in response to a perspective on these past events that is in my view overly informed by nostalgia, and while I don't make a general habit of snatching rose-colored glasses off anyone's face, in a public setting and especially in this public setting it is not unreasonable to do so.
I'm glad you found some good in the Poignant Guide. I had a similar experience with the Camel Book, a decade or so earlier, with similarly favorable outcomes - had my mom not proven amenable to being wheedled on the subject of an expensive birthday gift in which she could see no immediate value, I would likely not even be a working programmer today, to say nothing of someone able to grasp some rudiments of what may in a few more decades develop into a recognizable engineering practice. But I don't still use Perl or defend it, because beyond the most ephemeral and exploratory of contexts it is objectively godawful, and even in that context it lacks compared with tools that at least don't falsely advertise themselves as production-worthy. When people criticize it, I understand that they are criticizing it and not me.
And finally, on the point of whimsy versus professionalism - I mentioned the development of engineering practice, and that's a desirable outcome already too long in coming, because when we're trusted to build systems on which people's lives and livelihoods depend, we bear a responsibility of which a whimsical approach to the work constitutes active dereliction. Would you trust your life to a whimsical airplane, or your retirement to a whimsical advisor? Of course not! No one would, nor should they.
Other preparadigmatic engineering disciplines had the advantage of us here; a whimsically designed boiler, for example, will most likely maim or kill its perpetrator well before the bad idea gets much chance to spread, while we easily fall prey to the sorts of delusions that come from working with ideas and materials whose potential danger is much less immediately obvious. I would like us to be better at recognizing and extirpating such errors, and criticizing them where the opportunity arises seems like a decent way to advance that goal. I don't mind if you don't like it, but I would like you not to like it for better thought through reasons than these.
I think better criticism is, well, constructive. "Here's a better way to accomplish what they're attempting", not "That's all crap because I don't like it". And before you can do that, I think you have to understand what it is they're attempting.
I'm a professional programmer now, and I much prefer simple, straightforward manuals with clear descriptions. If I was trying to get work done, a bunch of whimsical cartoons would only get in the way. I would have no use for _why's guide today. Even way back when, I was annoyed by the ratio of text and cartoons to actual code & lessons.
But the world isn't made up entirely of professional programmers trying to get work done. There are teenagers who are curious about programming but intimidated by thick volumes and spare language websites pointing at API references. There are workers sick of their current career and looking for something new, and bored stay-at-home spouses. Whimsical guides to quirky languages are for them, not you.
I don't think anybody is pointing to _why's guide as a reference for people who need to write software for self-driving cars or aircraft telemetry in the immediate future. Nobody is arguing that all programming material must be whimsical and quirky. But there was room in the world for a cute little guide to catch newbie or non programmers and teach them a bit. The guide was just an on-ramp, and the community that surrounded it was just a bunch of newcomers fooling around, not Serious Programmers™. That's all it ever was. And in spite of that: a lot of good ideas came from that community, as others have pointed out elsewhere in this thread. A lot of them went on to be Serious Programmers.
I'm not under the impression that you're criticizing me. I read a chunk of _why's guide before I got bored, and then used Ruby for some hobby projects for a couple years nearly two decades ago, before switching to more robust languages: my identity isn't tied up in either of them. I feel like a bystander sitting on a park bench sipping coffee, watching you yell at a group of kids dressed up as anime characters or whatever that they should put on a suit and find a real job, goddamn it. Like, dude, they're just having fun, and frankly it's none of your damn business. You don't like cosplay? Good for you. But maybe chill out a bit and stop yelling at strangers that they're living their life wrong.
> Any criticism boiled down far enough ends up in "they did things in a way I didn't like". It's not a meaningful response.
Objectively false. Although, granted, much criticism does fall in that category and it's generally worthless. However some criticism is "they did things in a way that was strictly worse than this other alternative that I will describe in actionable detail."
Interestingly enough, virtually all unsolicited criticism falls in the former category.
Your long rant is not actually interesting or factual enough to criticise in detail.
I’m only writing this reply to point out that you have already engaged in a lot of baseless inference about other people’s person, so please don’t be surprised if the same is done to you.
Your entire argument is that you personally don't like Ruby or the ways of the Ruby community, combined with a bunch of unsubstantiated assertions. There's nothing else there. What else should they address besides 'your person'?
They actually did a pretty good job of elaborating a critique of my criticism in replying to the same comment you did, so I chose to continue the dialogue there.
There was a lot of quirkiness and egotism then, the cult of personality was rife, not just with project leaders such as DHH but ancillary dramas from folks like Zed Shaw.
The stuff from _why was always wacky but some of their projects were also really inspiring for younger programmers, of which I was one. I could take or leave the cartoon foxes but I remember seeing the code for the Camping framework when it was released and being amazed at how simple and elegant it was.
The ruby and rails community set a path that is followed by many to this day. We're all python programmers now, but there'd be no Flask without Sinatra, or Django without Rails. The author of the excellent dependency management tool Bundler went on to work on Yarn for javascript and then created Cargo for rust. Mitchell Hashimoto wrote Vagrant in ruby before starting hashicorp; there's loads of examples like this.
I disagree that the Poignant Guide was the primary documentation for anything; it was a quirky guide for newcomers. The Pragmatic Bookshelf had the definitive guide to the language, as well as the rails book and then a whole series of other publications. I also remember spending a lot of time on the core ruby documentation, which does a fantastic job of covering the extensive standard library.
Ruby itself was tolerably well documented, but that was a rare bright spot, and of limited utility when Rails leveraged the malleability of the language into near incomprehensibility even by the language's own low standard. And while you're not wrong that by spending enough money it was possible to obtain halfway decent documentation for Rails that if you were very lucky might still be mostly applicable after a couple of point releases, not everyone had a lot of money to spend in that way. Pay-to-play access to knowledge required for basic competence is another example of structural exclusivity, and one that stood in stark contrast to most contemporaneous projects of similarly high profile, by comparison with which Rails had all the openness of a Freemason lodge in 18th-century Italy - oh, they advertised themselves very effectively, that I grant you, but in terms of actually fostering development among those so attracted, I think a secret society in which membership could be a death sentence would probably do a lot better.
> The ruby and rails community set a path that is followed by many to this day.
I've already acknowledged this, albeit without the unwarranted gloss. I might instead have said that Rails' one unqualified success was as a fount of good ideas badly implemented. But since on reflection I'm pretty sure Rails' use of ActiveRecord popularized ORMs in web application development, I suppose it isn't even fair to say that all the ideas were good.
Hibernate predates ActiveRecord, if I'm not mistaken, and to this date I can't decide which of the two I hate more. They both do different things very wrong. So I'm not sure if you can blame ORMs on Rails.
Also I'm not sure what you mean by the documentation criticism. I criticised Rails elsewhere in the comments, but I can't really fault it for the documentation, which IMHO was always excellent. And then you also had Michael Hartl's excellent Rails guide (which was available online freely, at least at the time) which was what basically taught me modern backend development (including what automated testing is).
I'd have loved to know about such a thing at the time! One wonders why none of the people in the various IRC channels and forums where I then sought Rails advice saw fit to mention it.
In entire, albeit mildly grudging, fairness, I do have to concede that Rails introduced me to the concept of unit testing. But I'm still glad I learned modern backend development in the years immediately after Rails peaked, and while the worthwhile was being sorted from the nonsense among the many concepts and approaches that Rails does, for better or worse, deserve credit for having made newly popular.
But rails was modern backend development back then; it offered an alternative to a world full of mod_perl, PHP4, Java servlets with JSP and XSLT.
Having had the displeasure of working on all those stacks it's hard to overstate how transformative rails was at the time for me.
Built in unit testing is one thing, not having to write a java class to expose a custom function to an xslt processor in order to format a string is something else.
> But since on reflection I'm pretty sure Rails' use of ActiveRecord popularized ORMs in web application development
As someone mired in enterprise web application development in the early oughts, I can tell you this is kind of backwards. ActiveRecord was actually a simplification of the more complex ORMs that enterprise software (mostly in Java at the time, but a little bit of Objective-C in the banking realm) was using. I _think_ the term "ActiveRecord" was first used in the 2003 book Patterns of Enterprise Application Architecture, and it was described as a pattern you could use when you didn't need the complexity of a full-blown ORM. For people who had wrestled with Hibernate or WebObjects, ActiveRecord felt like a light-weight sigh of relief.
That said - even as someone who still works on RoR apps - I'm glad we've mostly moved beyond ORMs (primarily by moving beyond objects, which were never a very good fit for representing data in the first place).
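For anyone who hasn't met the pattern: here's a very rough sketch of the shape being described above (not Rails' ActiveRecord and not Fowler's code, just an illustration in C++ with made-up names, and with the SQL printed instead of sent to a database). The point is simply that the row-object itself knows how to find and save itself, with no separate mapping layer.

```cpp
#include <iostream>
#include <string>

// One object wraps one table row, and the object itself knows how to
// load and persist that row.
struct User {
    long        id = 0;
    std::string name;

    // "Finder" that would normally run a query; here it just shows the SQL.
    static User find(long user_id) {
        std::cout << "SELECT id, name FROM users WHERE id = " << user_id << ";\n";
        User u;
        u.id   = user_id;
        u.name = "loaded-from-db";   // placeholder for the real column value
        return u;
    }

    // Persist the current in-memory state back to the row.
    void save() const {
        std::cout << "UPDATE users SET name = '" << name
                  << "' WHERE id = " << id << ";\n";
    }
};

int main() {
    User u = User::find(42);
    u.name = "why";
    u.save();
    return 0;
}
```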
Huh, okay, that's fair. ActiveRecord was the first ORM I worked with, and ActiveRecord in 2013 was easily poor enough to color my perspective on the category. I've heard Hibernate criticized before, of course, but I didn't realize it both predated ActiveRecord and was so much worse as to leave ActiveRecord even in its day looking good by comparison.
I still think Rails gets the blame for popularizing the concept, but I suppose that has to be mitigated by prior art making it so easy to popularize - "it's just like what you're used to, but won't make you want to kill yourself to use" is a pretty compelling pitch.
LOL. I feel like a geezer whenever I talk to people about what enterprise software was before RoR came along. XML. So much XML. Do you think that you should write XML in order to query data from a DB with a perfectly serviceable query language? People sure did think that in 2001!
You joke, but honestly, it's still the same class of problem. I don't think I should write TypeScript in order to query a DB with a perfectly serviceable query language, either, and I've seen so much time spent dealing with the headaches attendant upon leaky abstractions as to render the productivity benefits claimed by ORM proponents transparently nonsensical. And yet...
> I would much rather participate in a community of professionals who've organized themselves around sufficiently overlapping shared intents
I thoroughly enjoyed my involvement with early (US) Ruby and Rails folks from the first Rails conf to _why's unusual entertainment to Matz's calm and humble demeanor. People bounced ideas off each other and just enjoyed coding up interesting things. Dave Thomas and the Pragmatic Programmer group wrote what many of us used, not so much _why's guide which was still a fun read. I moderated a Ruby panel at the old Odeo HQ just before they pivoted. I didn't know the group gathering at that Ruby SF meeting would include not only Twitter but Github founders as well. At the time, tweets seemed pretty absurd to some of us but guess what happens when you try out ideas in a community that was into exploration?
I mean, it's usually preferable to be part of an ingroup than of its outgroup, sure. Otherwise, what value in the distinction? But the iron law applies here, too.
I wouldn't be so quick to claim Twitter, either, even among zero-interest-rate phenomena more generally. It might be easy to forget these days, but that's been harmful to society on net since long before Musk bought it.
I think you're missing my point. The early Ruby and Rails community I remember was a collection of very smart and explorative programmers who wanted to build cool stuff with this interesting language. People were trying out DSLs -- sure they could've used LISP -- but Ruby's metaprogramming was inviting and was a reason for the succinct Rails syntax which was a selling point compared to say Java's cumbersome approach.
The speed of trying stuff out (even if it wasn't super efficient) was why startups used it. So it was a community of highly productive people sharing their love of building new things. That's my memory of that time period.
I'm not missing your point, but I don't see where it constitutes the counterargument that context and framing suggest you mean it to be.
What you're describing here is your perspective from within the small and insular group busily developing and advocating new technology, always focused on the next new thing that was cool and interesting and succinct and powerful. What I'm describing is my perspective from well outside that pale. Both can be true at the same time.
Then maybe you shouldn’t generalize your one viewpoint to say that early Ruby/Rails was not a worthy community for professionals or of use to people who practice jointly valued skills like entrepreneurship.
It’s also odd to see your long rant of whimsical vs professional when some of the most well-known companies were built with that community of so-called non-professionals. We have a difference in opinion on what constitutes “professional”.
The entire point I'm making is that, however well that community may have fulfilled those roles for people embedded in the social context of its ingroup, it did a lousy-to-failing job of the same outside that circle.
Speaking as someone who started out with Ruby and Rails and has since migrated to other stacks...
I would never conflate Ruby and Rails.
Ruby is a language that took interesting ideas and developed them in a unique way. Today, it doesn't align with my goals 100% anymore (I do like Ruby's focus on expressivity, but not when it comes at the expense of predictability), but I don't think that's Ruby's fault.
Rails, by contrast, is software that did a few things right (things should work mostly out of the box, automated migrations, testing built in) and way too many things wrong. Design and architecture are dirty words for much of the Rails community (and this thought very explicitly originates with DHH who still maintains that you don't need anything besides models, views and controllers), autoloading is hot garbage, "magic" libraries that start monkeypatching your code just because you add them to your Gemfile are a thing, the biggest auth library, devise, is an opinionated mess, writing proper (fast) unit tests is difficult and goes against the framework, and so on.
I would say that most of the good ideas that came with Rails have now been incorporated by better frameworks (even in Ruby itself, but in particular also in other ecosystems), so it's good that it was there, but I wouldn't recommend it any more.
But Ruby didn't originate with Rails. To my knowledge it's still used today by Japanese developers who don't particularly care about Rails.
If I've conflated Ruby and Rails, it wasn't by intent.
I do think Rails was like it was in large part by direct derivation from, and enshrinement of, Ruby's allergy to rigor - which isn't to say they're the same thing, but that the latter directly predated and heavily informed the former.
Yes, Ruby has always prioritised readability and "developer happiness" over predictability and rigour, but that doesn't mean that it necessarily encouraged incomprehensible meta-frameworks. Sinatra or similar projects show how things can be done more decently in Ruby, while still utilising its malleability in order to create a nice DSL.
When a main goal is to maximize a subjective trait such as "developer happiness," you are almost certain to cause the opposite for a significant portion of your users. This is the curse of inherited ruby code.
Ruby gives incredible flexibility to individual developers writing code the way they prefer. For a large team, that can be a problem but I'm not sure that was Ruby's original target group.
it's probably a good starting position to assume that since we all share the same closed-ish system called planet earth, there are interconnections between different systems. Certainly the border areas between desert and not-desert aren't very crisply defined. Certainly (referenced in other threads) nutrients can be blown by the winds from desert into non-desert areas far away. Certainly there exist some animals who go in and out of desert regions (birds, butterflies, ...). It's a really good idea to assume that things on this planet are connected to each other.
Sure, I mean, that's my assumption, the question is more of "how important is that connection" in a given instance. And, I suppose down that line of questioning, do we have the knowledge/systems/etc to overcome any losses?
"animal models" is a fairly standard phrase in research: When people research depression, alzheimers, cancer, etc., they generally start with mice and work their way up through monkeys before coming to human trials. For many conditions there's specific "lines" of mice that have been bred or even genetically modified to exhibit those conditions in a reliable or extreme way. Depression is particularly challenging since you can't ask an animal how it's feeling, and frankly nearly all animals used in laboratories are understimulated, removed from their natural habitat, and probably a little "depressed". (see e.g. the "rat park" studies (https://www.psychiatrictimes.com/view/what-does-rat-park-tea...) that showed that rats were much less likely to self-administer cocaine if they were in an environment that let them have a more enjoyable/fulfilling/natural life otherwise.)
So anyway, "animal models" just means "an animal (mice/rats/monkeys/etc.) that we have decided has enough of the same symptoms as the human disease that we can use it to study treatments of that disease", and it's fairly common for something to work in mice but fail in monkeys, or even to work in both mice and monkeys but not work, or have very undesirable side effects, in humans. (Side note: one of the least discussed things in pharma is how they source the first humans for trialling a new treatment, which does carry non-trivial risk to the human "guinea pigs" - it's generally people who are poor and desperate.)