I don't disagree, but it doesn't only have to go in this one direction. One of the most interesting things about Rust, for example, is how it tackles experimental implementations: the compiler has first-class concepts for unstable language features, gated behind nightly feature flags. I'd say this yields better results than "well, a couple of people hacked around on some prototype forks of a compiler, and now we're stuck with the result".
Of course, they also make very impressive backwards compatibility guarantees for stable stuff (cf. Rust's "editions").
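For context, editions are opt-in per crate: each crate declares which edition it targets in its manifest, and crates on different editions still compile and link together, which is how Rust ships breaking syntax changes without breaking old code. A minimal sketch (the crate name here is made up):

```toml
# Cargo.toml -- hypothetical crate. The `edition` field selects which set of
# (possibly incompatible) language rules this crate is compiled under;
# dependencies may declare older editions and still interoperate.
[package]
name = "example"    # assumed name, for illustration only
version = "0.1.0"
edition = "2021"    # older crates can stay on "2015" or "2018" indefinitely
```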
Rust has corporate sponsorship and a very experienced team of developers.
You won't get that level of attention to detail, commitment to getting the design right up front, and willingness to maintain old APIs in the name of backwards compatibility from a single-person open source project, published to a package manager in someone's free time.
So you are probably arguing for very thick standard libraries maintained by a corporate-sponsored core language team, and a reduction in reliance on open source packages.
That also means we shouldn't tolerate "shaming" of projects for taking a long time to fix and merge features, since 95% of the work would have to be done up front, thinking about the right shape of the APIs.
I'm cool with all of that as long as the whole package comes along. The idea that a bunch of solo, unpaid open source maintainers are going to be doing good API design up front and maintaining perfect backcompat, while being incredibly responsive to PRs from the community is kind of "unicorn farts" levels of not going to happen in the real world. You sort of get what you pay for, and a bunch of unpaid solo volunteers are going to need to make breaking changes to fix their old mistakes and abandon maintaining their old tech debt. And if you paid nothing for it, really you're getting more than you deserve in that deal.
> So you are probably arguing for very thick standard libraries which are maintained by the core language team, […]
No, I'm not arguing that Nix should do anything in particular.
All I was saying is that the "we have to get it right the first time without much feedback" way obviously isn't the only one and there's empirical evidence of other working models.
As for PRs and so on, you're really putting words in my mouth, and frankly, I don't like it. Just so you know. I have never made PRs to core Nix, but with Nixpkgs I have only had good experiences so far.
---
Edit: Or are we really talking about Python? In that case, I'm even less able to comment on PRs. But: Python has a large stdlib (it's "batteries included", after all). And Python has often found good ways to deal with its warts.
And I hope I don't have to argue that Python 3000, the 2-to-3 break, wasn't worth the trouble, right?
And frankly, I'd argue that once a language grows to the point Python has, you'll have to reexamine how you gather data about community interest, for example.
From the outside, the process around the walrus operator and Guido leaving the BDFL post looks like a prime example of either not having enough "wild information" early on, or of the final decision ignoring a vocal part of a huge language community.
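For reference, the walrus operator (PEP 572) binds a name inside an expression, so a computed value can be tested and reused without a separate assignment statement. A minimal illustration (the variable names are made up):

```python
# ":=" (PEP 572, Python 3.8+) assigns and yields a value in one expression.
data = ["alpha", "beta", "gamma"]

# Bind len(name) to n inside the filter, then reuse n in the result.
long_names = [(name, n) for name in data if (n := len(name)) > 4]
print(long_names)  # -> [('alpha', 5), ('gamma', 5)]
```

Whatever one thinks of the syntax itself, the controversy was less about this mechanism and more about how the decision was reached.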