You're absolutely correct. IMO, the reason is, to a large degree, ineptness of software engineers. It simply doesn't pay to hire or train high-quality SEs if you're not Google or Amazon (and even they probably only hire and pay the best to keep that talent away from the competition).
Instead, the industry as a whole hands out more "productive" tools, i.e., the abstraction layers you mentioned, in order to build ever more trivial applications (that don't work properly in most cases anyway).
I think that we're in a downward spiral at this point: business doesn't make enough money per line of code to hire or train really good engineers, so it tries to push out more software, resulting in more crap. Software quality sinks. Business makes even less money per line of code. Engineers throw in the towel and are replaced by less experienced people. Rinse and repeat.
> You're absolutely correct. IMO, the reason is, to a large degree, ineptness of software engineers.
Hmm, software engineers? IMO the culprits are "greedy" managers who didn't like (and still don't like) common libraries, file systems, etc. Motto: "It must be my standard, or else ..." Where's the common (license-free, secure, encryptable) standardised file system for exchanging data between all platforms, including digital cameras and other gadgets? Where's the graphics API? Java tried to answer some of these questions, but failed miserably IMHO.
My experience: as soon as I learned to use some API or framework, at least two new APIs or frameworks showed up posing as the best thing since sliced bread. Sorry for the sarcasm, but I lost my optimism regarding software development some years ago.
I agree with 99% of what you wrote but draw a different conclusion.
The 1% where I don't agree: most developers aren't engineers. These days companies hire engineers for stuff that is hard and important. But most work doesn't need that.
And my impression is that that is a good thing. There's been a real democratization of development, which means there's a lot of software being written. Most applications I see are fundamentally CRUD apps, and while I'm a bit fearful of the security of the banking apps, the massive frameworks are designed (implicitly, at least) to reduce complexity and steer people towards some "best practices". Any kid today can build something that only a couple of decades ago (or less) really did need serious engineering.
We've followed the same path before. Rich people used to hire chauffeurs who didn't just drive but performed maintenance on the cars. Then people knew how to change tyres, gap their spark plugs, etc. Now people know how to turn the key and steer down the road, hopefully not killing too many people in the process. Next, they'll only need to know how to step into the vehicle.
We should be glad software is going that way. It's not like that will reduce employment for engineers.
I see so much growth & positivity, so much can-do-ism. Everyone's doing great things, and the web in particular has brought much greater access to systems that are safer for users and more sandboxed, with better tools/extensions/navigability/accessibility/hackability, and with far more open source activity than native (which has been a huge benefit in so many ways: to productivity, to learning, to exploring, to having fun, to creating social cultures & groups).
> I think that we're in a downward spiral at this point
I'm mainly writing a reply just to say: I have huge hope that we're on a great ascent & have made things increasingly better.
> IMO, the reason is, to a large degree, ineptness of software engineers.
I sometimes encounter less-than-optimal code or solutions, but the number of times it actually matters is unbelievably small. Focusing on really good engineers making really excellent decisions is not, to me, a critical issue in most places. I think there's a lot of value in just finding the budget to support open source engineers, in trying to support good community efforts, like Web Incubation specs, like the Bytecode Alliance. Time & care about doing good things for the community, over time, is the incomparable advantage, the prime requirement for turning acceptable/passable/maybe-a-little-defect-y code into something that really works well & serves & endures.
> Instead, the industry as a whole hands out more "productive" tools, i.e., the abstraction layers you mentioned,
I see so little harm in using the good, well-embraced, ultra-productive, mid-industrial toolset we've grown into, and so little benefit in trying to go any other way. We should respect resource consumption, and some apps (webapp and native both) do a bad job of it, but the layers mentioned seem like a vanishing point of concern, far, far down from where I think computing needs to be dwelling & caring & trying to do better. There are so few examples of real gains to be made from abandoning the layers we have; efforts like react-native show we can apply basically the same high-production techniques at a native layer, yet the rewards for going native have almost never materialized; just using the web stack continues to be good.
I think there are far more productive things to be debating & discussing as we decide where to orient ourselves & the next steps of computing. Worrying about these stacks has consumed too much of the time spent on online discourse; it has become a mind-virus.