As someone who has moved from native app dev to web dev, I just feel my productivity and satisfaction have plummeted. The stack (html, js, css, browser functionality) that makes up the web is just not fit for the purpose of rich client applications.
The development of the thick, leaky abstractions that make up web frameworks has consumed millennia of man-hours, and yet the experience of developing & using the resulting web applications is still trash.
> The stack (html, js, css, browser functionality) that makes up the web is just not fit for the purpose of rich client applications.
You get much better mileage from web pages than rich client apps. One thing I've learned and applied everywhere I've worked is to let the web be the web; don't force things that come from other platforms.
I’d say Flutter, a GUI framework backed by Google, and it’s open source.
I’ve recently ported a popular project called “llama.cpp” to Dart (the language behind Flutter) and I’ve made YT videos showing it running natively on macOS, Linux, Android, iOS and iPadOS; Windows is next.
The official Ubuntu installer is made with Flutter too nowadays. But to be fair, the last time I tried Qt was somewhere in 2018; it might be a good option too.
But please don’t make web apps with Flutter. It has determinedly taken the pure-canvas route which makes it very unpleasant to use for a significant fraction of users, and all kinds of things just don’t work the right way and can’t. (I’ve written about the problems here on HN quite a few times, search and you’ll find them.)
I see Electron, Tauri/Capacitor and friends as a viable route.
Second choice in terms of DX is Flutter. There's nothing else like it out there in terms of DX, although there are tons of other issues, notably around performance.
Third would be Qt but that's not a viable tool because of licensing.
The first two options have no such licensing issues.
Electron is just webdev, but you have to ship a whole freaking browser along with your app. I haven't heard of Tauri/Capacitor; maybe I'll check them out.
As someone still in native, I got a similar vibe when switching from UIKit to SwiftUI. While this happened simultaneously with a lot of other changes (including using VIPER instead of MVC/MVVM), I'm also trying to use SwiftUI for one of my personal projects and find it disappointingly more difficult than the time I learned C++ by diving into someone else's badly written project.
Conversely, another side project is a JS game done in an old-school OO pattern and vanilla JS (no libraries, no frameworks), and it's easy.
I want to like the automagical solutions that the new frameworks keep touting, but everywhere I'm just seeing things that make it harder to work out what's going on. Half the stuff on the web should be a thin bit of pure HTML/CSS/image data with no JS at all, built server-side, where the interactions are handled. Like, the HN comment form I'm typing into right now is basically just that.
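A rough sketch of what that kind of plain, server-rendered form amounts to (the action URL and field names here are illustrative, not HN's actual markup):

    <form method="post" action="/comment">
      <!-- hidden id of the comment being replied to -->
      <input type="hidden" name="parent" value="123456">
      <textarea name="text" rows="8" cols="80"></textarea>
      <br>
      <input type="submit" value="reply">
    </form>

No framework, no client-side state; the server renders it and handles the POST.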
I’m not the GP but have been building websites off and on since 1994.
Back in the 90s it was hacky, but the extent of those hacks was small enough that even hobbyists could memorise those edge cases. Whereas these days there are so many footguns and edge cases that even seasoned professionals find it impossible to memorise everything. The amount of trial and error it takes to build a modern site is immense. With native applications it is a lot easier to write unit tests to catch this stuff, but how do you unit test a CSS alignment property across 3 browser engines? The answer is almost always to hire QA experts, because setting up automated tests has become so complicated and burdensome that it’s now a profession in its own right.
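For what it's worth, those cross-engine checks usually end up as browser-automation tests rather than unit tests. A minimal sketch of one, assuming a tool like Playwright with chromium/firefox/webkit projects configured (my choice of tool, URL and selectors, not the commenter's):

    // layout.spec.js -- run with `npx playwright test`; the config is assumed
    // to define chromium, firefox and webkit projects so this runs three times.
    const { test, expect } = require('@playwright/test');

    test('nav stays a single row', async ({ page }) => {
      await page.goto('http://localhost:8080/');   // placeholder URL
      const nav = page.locator('nav');
      await expect(nav).toHaveCSS('display', 'flex');
      // crude alignment check: the bar should not have wrapped onto two lines
      const box = await nav.boundingBox();
      expect(box.height).toBeLessThan(80);
    });

Which rather underlines the point: even this "simple" check needs a headless browser farm behind it.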
It doesn’t help that “web” is really a plethora of different technologies: CSS, HTML, JS, HTTP, TLS, and image formats like SVG, JPEG, PNG and GIF. And a few of those have entire subcategories of crap to wade through, like cross-origin headers, subtle incompatibilities in ECMAScript, support for different web extensions, different HTTP protocols, CSS incompatibilities, differing viewport sizes, TLS ciphers, etc.
And that’s just user facing code. What about backend? SQL or No-SQL? Table indexes, web server solutions, load balancing, caching, CDNs, backend server side code, how you cluster your web services, where you host it…etc.
Then you have to secure the damn thing because it’s open to the public.
And all this just to produce interactive documents!!
But the worst thing of all is once you’ve finally wrapped your beard around everything, it’s all out of date again because the latest batch of 20-somethings got so fed up learning this shit as well, that they’ve just gone and rewritten half of the stack from scratch again.
It’s all massively over engineered because we keep on trying to polish this turd rather than accepting we have outgrown the web spec and need something more tailored.
And it’s only getting worse. With more and more software switching to Electron and a growing trend of native applications embracing WASM for portable modules, we keep doubling down on this turd — acting like if we spend enough time polishing it then one day it might turn to gold.
So as someone who’s multi-disciplined, I too find web development the least satisfying of all the software development domains I’ve worked in. It’s downright frustrating at times.
> Back in the 90s it was hacky, but the extent of those hacks was small enough that even hobbyists could memorise those edge cases.
You're either misremembering, or have some thick rose-tinted glasses on.
The late 90s were times of the Wild West Web. Every browser had custom behaviour and rendered pages differently. JavaScript and CSS were still new, and it took years for the implementations to be standardized. Websites had "Best viewed in Netscape" or "Built for IE" badges. Java applets, ActiveX, Flash, and a bunch of other technologies surfaced to make early versions of web "apps" actually possible. Even as late as 2005, projects like jQuery were needed to make web development easier.
I'd say that it wasn't until the late aughts, well after "web 2.0", that web development didn't feel hacky. By then, though, the JS ecosystem exploded, and frontend frameworks took off.
So I get the sentiment that modern web development is difficult, bloated and buggy, but the good news is that the web is fully mature now, and there are plenty of projects that make it possible to build simple and robust web sites. Web developers are spoiled with choice, so a hard task is simply selecting the right technologies to use. Unfortunately, sometimes this is out of our hands, or we inherit decade-old codebases that make maintenance a nightmare. But it's never been easier to write plain HTML/CSS/JS that will work across browsers, and deploy it as a static site that works reliably at massive scale. If you need more features, then complexity can creep in, but it is possible to minimize failure points with good engineering.
Agreed. The thing that has changed about the web is that expectations have increased. There is no way that people could make the websites users expect today with only technology from the 90s.
> There is no way that people could make the websites users expect today with only technology from the 90s.
That's my point though. The web should never have been bastardised into making applications. We should have seen the desires of web2.0 and built an entirely new software stack purpose built for online applications.
It wasn't fit for applications then and it still isn't now. It's just we keep trying to convince ourselves it is because we're too locked into the web now to ever move past it.
The expectations of apps have skyrocketed too which is why everyone is using Electron to cope. If GUI development were still as easy as the drag and drop builders in Delphi 7 and VB6 with a bunch of static 800x600 monitors and double-click to edit component code-behind, we’d be shoehorning fewer apps into HTML and making more desktop apps.
I remember how big of a pain in the ass Windows distribution (alone) was in the 2000s, with entire businesses like InstallShield built around it. Tack compatibility testing on top of that and a desktop app for a single OS became a massive effort. With the web you could just upload PHP and off to the races. That slowly evolved into what we have today via jQuery and Electron and friends, but what won was distribution and ease of development, despite all the handwringing people do about frontend. The grass isn’t greener on the other side and hasn’t been for decades.
It’s not even remotely competitive anymore. My favorite example is GoldenLayout: it takes me less than a few hours to read the docs and implement a full-blown tabbed and split-screen interface using a third-party library and combine it with any framework I want, like React, Svelte, or just vanilla JS. Each desktop framework has its own solutions to the problem, usually in the form of commercial components, but even in the best case they take a lot more time to wire together.
I was writing resizeable desktop applications in the 90s and it was easy then too. Differing screen resolutions is not a recent thing.
The reason for Electron's popularity is because:
1. web developers are cheaper and easier to hire than people with desktop software development experience
2. Electron is cross platform
Point 2 is a compelling reason. But it's not like other cross platform toolkits haven't existed. They've just been left to languish because of point 1.
If companies only hire web developers then of course everyone is going to invest time into writing Electron apps and banging on about the benefits of web applications. It becomes a self-fulfilling prophecy.
I say this with experience having watched the tides turn, first hand, as a software developer.
I was there too with a Sony GDM-FW900 monitor with 2300x1400 resolution and my experience was very negative. Both as a user and as a developer (though I was just a kid getting started in ‘99). Any software that wasn’t made by a major company was a crapshoot and Delphi layout was brittle for my own apps. Anything that got deployed to university computers with their bulk purchased monitors was mostly fine except for the deployment process.
There are more web developers because it was a lot easier, even before npm existed and frontend frameworks went out of control. That didn’t happen in a vacuum - Electron was really late on the scene as far as frontend tech goes. It was originally made to support an IDE and not regular business apps and still won.
I’m talking about the early to mid 90s of web development and you reply with a monitor released around 2000. That’s a completely different era.
> with 2300x1400 resolution and my experience was very negative.
That’s more an issue with OSs not doing scaling back then than it is a problem with desktop software.
You’d still have the same issue today if you run a super high resolution with zero scaling.
> There are more web developers because it was a lot easier
It has a lower barrier to entry but I wouldn’t call it easier.
> even before […] frontend frameworks went out of control.
So you’re basically agreeing with me then. The current state of affairs is out of control.
> Electron was really late on the scene as far as frontend tech goes. It was originally made to support an IDE and not regular business apps and still won.
Its origins don’t disprove my point. Lots of things start out as one thing and evolve into something else. That isn’t unique to the web nor Electron.
I thought we were talking about "resizeable desktop applications in the 90s"? The reason I bring up that monitor is that I had the misfortune of using dozens of desktop applications written in the 90s to control lab equipment and spent copious amounts of time manually tiling them in all kinds of positions and sizes on a high resolution monitor. Scaling was definitely not the issue thanks to great eyesight. I was lucky if they supported a non 4:3 aspect ratio. Anything that wasn't a graphical or CAD or other app that did its own rendering or was developed by a large dev team was a crapshoot.
Lots of absolutely positioned buttons clipped by a resize, toolbars that didn't hide their buttons behind a menu when the window was too small, uncollapsible panels with minimum widths that exceeded the main content, and so on. Most of them were about as resizable as a water balloon when you smash it on the ground.
Lab equipment software has hardly been the pinnacle of desktop software. Even native desktop evangelists moan about the quality of some of that software. So I wouldn't use that as your benchmark given they're already known to be generally below par. In fact I'd go further than that and say lab software is notorious for having terrible UX.
I can assure you that plenty of good software did exist. I know this because I wrote some of it ;)
> You're either misremembering, or have some thick rose-tinted glasses on.
> The late 90s were times of the Wild West Web.
Or we are talking about different eras. I'm on about early 90s.
> Every browser had custom behaviour and rendered pages differently.
They did. But there was less of a spec to have to concern yourself with. The biggest day-to-day differences were around frames (which were a pain in the arse) and tables (which were easy to work around).
> JavaScript and CSS were still new, and it took years for the implementations to be standardized.
You didn't need Javascript most of the time and CSS incompatibilities were easy to remember (I'm talking about cognitive overhead here)
> Even as late as 2005, projects like jQuery were needed to make web development easier.
That's in the region of 10 years after when I'm talking about. A completely different era. By that point the web had already turned into the shitshow it is now.
> I'd say that it wasn't until the late aughts, well after "web 2.0", that web development didn't feel hacky.
I couldn't disagree more. By that point people had Stockholm syndromed themselves into believing web technologies were actually good. But they weren't.
> So I get the sentiment that modern web development is difficult, bloated and buggy
It's not a sentiment. It's a fact.
> the good news is that the web is fully mature now,
No it's not. We're just Stockholm syndromed into thinking it's mature. But half the technologies we use are constantly churning. That's not the definition of mature.
> Web developers are spoiled with choice, so a hard task is simply selecting the right technologies to use.
The hard part is finding something that will still be around in 5 years time.
> or we inherit decade-old codebases that make maintenance a nightmare.
I've worked on plenty of decade-old codebases and I'd never rank web development in there precisely because of the aforementioned churn. Web tech goes out of date so quickly that it never gets to live past a decade...never mind multiple decades. It's simply not a mature platform to write software on despite what you claim.
> But it's never been easier to write plain HTML/CSS/JS that will work across browser
Who writes plain HTML and JS? There's so much bloat required to get anything to look modern that nobody writes plain web sites any longer (In fact I did for one of my open source projects and people hated it and rewrote it in Vue).
It was much easier to write plain HTML et al in the 90s. In fact that was exactly how web development back then was done.
> and deploy it as a static site that works reliably at massive scale
That's literally how sites were originally written. It's not a new invention ;)
The web was intended to be a static collection of documents. What we've since done is tried to turn it into an application framework. And that's what it sucks at.
> If you need more features, then complexity can creep in, but it is possible to minimize failure points with good engineering.
Sure, but again this isn't a new invention. This was the case right from the inception of the web. It's just gotten a hell of a lot harder to do good engineering for the web.
> Or we are talking about different eras. I'm on about early 90s.
I think you're misremembering then. There _was_ no web development to speak of in the early 90s. The web was largely a niche technology until the mid-90s. Mosaic released in January '93, Netscape in October '94, and IE in August '95. By the end of '93, there were a total of 130 websites[1], most of them from universities and research centers. By the end of '94, a whopping 2,278 websites. JavaScript first appeared in September '95 (Netscape), and CSS in August '96 (IE).
> You didn't need Javascript most of the time and CSS incompatibilities were easy to remember (I'm talking about cognitive overhead here)
Depending on what you're building, you still don't need JS most of the time today. The difference is that today all browser implementations are ECMAScript compliant, and the core functionality is much more capable than in the 90s, so you can get by with just sprinkling JS where and when you need it, without resorting to frameworks, build tools, libraries, and any of the complexities commonly associated with modern frontend web development. This is without a doubt, an objectively better state than what we had in the 90s.
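As a concrete (made-up) example of that sprinkling: the plain HTML works on its own, and a few lines of standard DOM API add the nice-to-have on top:

    <details id="faq">
      <summary>Shipping times</summary>
      <p>Orders ship within two business days.</p>
    </details>

    <script>
      // optional enhancement: remember whether the reader left the FAQ open
      const faq = document.querySelector('#faq');
      faq.open = localStorage.getItem('faq-open') === '1';
      faq.addEventListener('toggle', () => {
        localStorage.setItem('faq-open', faq.open ? '1' : '0');
      });
    </script>

Delete the script and the page still works; that's the whole point of the "sprinkle" approach.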
Of course, actually relying on external dependencies would make your life easier, so the difficult task is picking the right technology to use from a sea of poorly built and maintained software. This is the drawback of a platform exploding in popularity, but it doesn't say anything about the web itself.
As for CSS, how can you honestly say incompatibilities were easy to remember? Netscape was initially pushing for its own competing styling format, JSSS[2], and it didn't officially support CSS until version 4.0 (June '97). Even then, not all CSS properties were supported[3]. So it wasn't even a matter of remembering incompatibilities; developers literally had to target specific browsers, and even specific versions of browsers. Vendor prefixes were required for pretty much everything, and are still used today, though thankfully, core CSS features are widely supported, and they're only needed for advanced features. There's no way that all of these incompatibilities were easier to deal with in the 90s.
> That's in the region of 10 years after when I'm talking about. A completely different era. By that point the web had already turned into the shitshow it is now.
jQuery appeared precisely as a response to the lackluster state of JS in browsers, and to make development easier by not worrying about browser incompatibilities. My point is that up until then, web development _was_ a shitshow.
> It's not a sentiment. It's a fact.
Funny how I can disagree with a "fact" then...
> The hard part is finding something that will still be around in 5 years time.
It's really not, unless you're chasing the latest hype train. jQuery is 17, React is 10, Vue is 9, etc. And like I said, you don't strictly need any of it. If you write standards-compliant HTML/CSS/JS, it will serve you for decades to come with minimum maintenance. You've been able to do the same since arguably the late 2000s.
> Who writes plain HTML and JS?
Many people do.
> There's so much bloat required to get anything to look modern that nobody writes plain web sites any longer
That is factually not true.
> That's literally how sites were originally written. It's not a new invention
I'm not saying it is. My point is that you can still do that today.
> I think you're misremembering then. There _was_ no web development to speak of in the early 90s. The web was largely a niche technology until the mid-90s. Mosaic released in January '93, Netscape in October '94, and IE in August '95. By the end of '93, there were a total of 130 websites[1], most of them from universities and research centers. By the end of '94, a whopping 2,278 websites. JavaScript first appeared in September '95 (Netscape), and CSS in August '96 (IE).
My first website went public in 1994. Before then I was writing stuff purely for a private intranet. So I'm definitely not misremembering.
By 1995 I had released an online RPG (it was very rudimentary but it worked).
By around 1997 (give or take, this was a hobby project so cannot remember the exact year) I had a full 3D web site available via VRML. Wasn't much of a success because most people didn't have 3D capable graphics cards back then. I think it was a couple of years before 3D accelerators became the norm.
1998 I was experimenting with streaming HTML chatrooms (that required a lot of hacks to get working because we are talking pre-AJAX here) and bots written in Perl.
For most of the 90s I was on the cutting edge of web technologies. So I remember the era well.
> This is without a doubt, an objectively better state than what we had in the 90s
Is it though? Better capabilities don't always equate to something being objectively better. Particularly if those capabilities are a complete clusterfuck to code for, as the current web standards are.
True elegance of an ecosystem isn't about raw capabilities, else we'd still be writing everything in assembly. It's about the ease with which you can accomplish a task. I'd argue that the current web isn't elegant in the slightest. A polished turd is still a turd.
> Of course, actually relying on external dependencies would make your life easier, so the difficult task is picking the right technology to use from a sea of poorly built and maintained software. This is the drawback of a platform exploding in popularity, but it doesn't say anything about the web itself.
The problem isn't the choice. The problem is that "the right technology to use" is more about what's in vogue at the moment than it is about what's mature.
When you look at other popular technologies, you still have choice but there's also mature stacks to choose from. The moment anything web related becomes "mature" (and I used this term loosely here) the next generation of developers invent something new.
> jQuery appeared precisely as a response to the lackluster state of JS in browsers, and to make development easier by not worrying about browser incompatibilities. My point is that up until then, web development _was_ a shitshow.
It was. And it's a bigger shitshow now. Glad you finally understand the point I'm making.
> Funny how I can disagree with a "fact" then...
That doesn't mean I'm wrong ;)
> It's really not, unless you're chasing the latest hype train. jQuery is 17, React is 10, Vue is 9, etc. And like I said, you don't strictly need any of it. If you write standards-compliant HTML/CSS/JS, it will serve you for decades to come with minimum maintenance. You've been able to do the same since arguably the late 2000s.
jQuery isn't recommended any more. React isn't popular any more. Vue is probably the only item there that has merit and that's still less than a decade old.
You talk about "decades" and cannot list a single framework that is still in widespread use and more than 10 years old.
> Many people do.
Many people also solder their own CPUs. But that doesn't mean anyone does it for stuff that actually matters.
> That is factually not true.
Yes it is. Simply saying it isn't doesn't disprove my point.
> I'm not saying it is. My point is that you can still do that today.
And you can still hand-solder your own CPU today. But that doesn't mean anyone does that for professional sites.
The only reason people stick up for the current status quo is because they either don't know any better or have been Stockholm syndromed into living with the status quo.
As someone who's written software in more than a dozen different languages for well over 3 decades, every time I come back to writing websites I always feel disappointed that this is what we've decided to standardise on. Your points that it's capable aren't wrong. But that doesn't mean it's not still a shitshow. Raw capability alone simply isn't good enough -- else we'd still be writing all of our software in assembly.
So yours was one of the first 2,278 websites? Congrats.
I don't see how any of your accomplishments are relevant, but thanks for sharing.
So your point is that the web when JavaScript and CSS were in their infancy, before web standards existed and were widely adopted, before AJAX and when you had to use "a lot of hacks" to implement streaming... that _that_ web was somehow easier to work with than the modern web? That sounds delusional.
VRML, along with Java applets, ActiveX, Flash, and a myriad other technologies around that time were decidedly not web-native (i.e. a W3C standard, implemented by all browsers). They only existed because the primitive state of the early web was incapable of delivering advanced interactive UIs, so there were competing proposals from all sides. Nowadays all of these technologies are dead, replaced by native web alternatives.
> Better capabilities don't always equate to something being objectively better. Particularly if those capabilities are a complete clusterfuck to code for, as the current web standards are.
Which particular standards are you referring to? Native HTML5/CSS3/ES2015+ are stable and well supported standards, and you've been able to target them for nearly a decade now. Their capabilities are obviously much greater compared to the early web, but this is what happens when platforms evolve. If you dislike using them, then I can't convince you otherwise, but I'm arguing against your point that the state of the web was somehow better in the 90s.
> The problem isn't the choice. The problem is that "the right technology to use" is more about what's in vogue at the moment than it is about what's mature.
That's a problem caused by the surrounding ecosystem, not the web. How is this different from VRML being replaced by X3D in 3 years? The good thing is that today you can safely rely on native web technologies without fearing that they'll disappear in a few years. (For the most part. Standards still evolve, but once they're widely adopted by browsers, backwards compatibility is kept for a long time. E.g. HTML4/CSS2/ES5 are still supported.)
If you're talking about frontend frameworks and libraries, again: they're not a standard part of the web, and you don't have to use them. If you do, it's on you to manage whatever complexity and difficulty they bring to your workflow.
> True elegance of an ecosystem isn't about raw capabilities, else we'd still be writing everything in assembly. It's about the ease with which you can accomplish a task.
I fail to see how all the improvements of the past 20 years made things more difficult. The capabilities have evolved because user expectations have grown, and complexity arises from that. But if you were to build the same web sites you were building in the 90s with modern technologies, like your streaming HTML chatrooms site, you would find the experience vastly easier and more productive. This is an objective improvement.
> jQuery isn't recommended any more.
Because it's not needed anymore, because JS has evolved leaps and bounds since 2006, and implementations in all browsers are standardized. It's still the most popular JS library by far, and used by 77.3% of all websites[1].
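To make that concrete, here's a generic before/after, with the jQuery-era calls shown as comments and a throwaway render stub standing in for real code (none of it tied to any particular site):

    // roughly what jQuery-era code looked like:
    //   $('.note').addClass('highlight');
    //   $.getJSON('/api/notes', render);

    // the same thing with the standard DOM API today:
    const render = notes => console.log(notes);   // stand-in for real rendering
    document.querySelectorAll('.note').forEach(el => el.classList.add('highlight'));
    fetch('/api/notes').then(res => res.json()).then(render);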
> React isn't popular any more.
It's in the top 10 most popular JS libraries. And how come you're judging based on popularity anyhow? Above you were criticizing choosing technologies based on what's "in vogue at the moment" over "what's mature". React is a _mature_ UI library, and is a safe choice in 2023, unless you're chasing the latest hype train.
> You talk about "decades" and cannot list a single framework that is still in widespread use and more than 10 years old.
JavaScript frameworks as a concept are barely a decade old. React isn't a framework, it's a library. Like I said, jQuery is the most popular library and is 17 years old. Underscore (2009), Bootstrap (2011), Lodash (2012), and many more, are still in widespread use today.
But my point is that _today_ you don't strictly need any of them to build advanced interactive experiences. If you do want to, though, there are many to choose from that simplify development of modern UIs, without being a "clusterfuck" to work with IME. htmx, Lit and Tailwind are all lightweight, well maintained, and help with quickly iterating without resorting to full-blown frameworks. If you do want a framework, Svelte is now 7 years old, so quite mature, and is very pleasant to use.
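For a flavour of the htmx approach, a minimal sketch (the /vote endpoint is made up; the hx-* attributes are the library's real ones):

    <script src="https://unpkg.com/htmx.org"></script>

    <!-- POSTs to the server and swaps the response into #count; no build step -->
    <button hx-post="/vote" hx-target="#count" hx-swap="innerHTML">Upvote</button>
    <span id="count">0</span>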
> Yes it is. Simply saying it isn't doesn't disprove my point.
I thought the fact that you're reading and typing this on a forum built with simple HTML, CSS and minimal amounts of JS would make this self-evident. (The fact it uses a bespoke backend is irrelevant; this could just as well be served by a mainstream backend stack.)
But to save you a web search, here are other examples courtesy of ChatGPT[2].
> As someone who's written software in more than a dozen different languages for well over 3 decades, every time I come back to writing websites I always feel disappointed that this is what we've decided to standardise on.
Nice humblebrag again, but if you'd be willing to accept that the web has grown exponentially since the days you were building websites before JavaScript and CSS existed, that there are orders of magnitude more web developers and software now than back then, and that the core web technologies are more mature and stable than they've ever been, then you'd be able to see that the status quo is not so bad.
I have more issues with the modern state of centralized mega-corporations and advertising ruining the web than anything I can complain about the technology itself. But that's a separate topic.
> So your point is that the web when JavaScript and CSS were in their infancy, before web standards existed and were widely adopted, before AJAX and when you had to use "a lot of hacks" to implement streaming... that _that_ web was somehow easier to work with than the modern web? That sounds delusional.
My point was that the amount of hacks required these days has grown exponentially.
> VRML, along with Java applets, ActiveX, Flash, and a myriad other technologies around that time were decidedly not web-native
Of course they weren't. I never implied otherwise.
> Nowadays all of these technologies are dead, replaced by native web alternatives.
Indeed. Technologies that are exponentially harder to write the same code in. Hence my point: modern web tech is a shitshow.
> Which particular standards are you referring to? Native HTML5/CSS3/ES2015+ are stable and well supported standards, and you've been able to target them for nearly a decade now. Their capabilities are obviously much greater compared to the early web, but this is what happens when platforms evolve. If you dislike using them, then I can't convince you otherwise, but I'm arguing against your point that the state of the web was somehow better in the 90s.
You're fixated on that point and it's not what I said. I said it was easier to grok in the 90s and has just gotten worse over time. Which is a fact.
I also said the current web is an unfit clusterfuck that people are Stockholm syndromed into believing is good. Everything you've posted thus far reinforces that Stockholm syndrome point.
> > React isn't popular any more.
> It's in the top 10 most popular JS libraries. And how come you're judging based on popularity anyhow? Above you were criticizing choosing technologies based on what's "in vogue at the moment" over "what's mature". React is a _mature_ UI library, and is a safe choice in 2023, unless you're chasing the latest hype train.
I haven't worked with a single engineer who hasn't bitched and moaned about React. And I've managed a lot of engineering teams over the years.
Vue is a different matter.
> JavaScript frameworks as a concept are barely a decade old. React isn't a framework, it's a library.
It's both. The term "framework" has a pretty well-established meaning in software, and React falls under that heading quite comfortably. What's happened, and why you're confused, is that kids have overloaded the term with "web framework" to mean something more specific. React on its own isn't a "web framework" in the trendy web sense, but it's still 100% a "framework" in the stricter software development sense.
This is actually another great example of the lack of consistency in the web ecosystem.
> But my point is that _today_ you don't strictly need any of them to build advanced interactive experiences.
You never had to. You're making another strawman argument because you're not only claiming I'm saying you need these frameworks (you don't) but also making it sound like this is something that's only come about because of the modern web (which isn't true).
> I thought the fact that you're reading and typing this on a forum built with simple HTML, CSS and minimal amounts of JS would make this self-evident. (The fact it uses a bespoke backend is irrelevant; this could just as well be served by a mainstream backend stack.)
HN is far from your typical website. lol
> Nice humblebrag again, but if you'd be willing to accept that the web has grown exponentially since the days you were building websites before JavaScript and CSS existed, that there are orders of magnitude more web developers and software now than back then, and that the core web technologies are more mature and stable than they've ever been, then you'd be able to see that the status quo is not so bad.
It's not a "humblebrag", it's an illustration that my opinion comes from years of experience using a multitude of different technologies. Honestly, I think you need to diversify your experience too because your comments fall firmly into the Stockholm syndrome bracket I described by the fact that seem completely unwilling to accept that we could have all the same power of the current web but massively more simplified and elegant if we were to redesign things from the ground up. There are so many footguns that developers need to learn simply because of the way how the web has evolved. And all you keep harping on about is that "its powerful" -- sure. But so is assembly. Yet literally no-one advocates writing commercial desktop software in assembly.
The problem here is trying to convince someone that the domain which they earn their living from is a shitshow, is simply always going to be met with opposition because you have no impartiality. Whereas people like myself and the OP do. And that's why we make the comments we do when we say that the web is unsatisfying to develop against.
Writing GUIs can be easy. It’s just hard on the web. Some complexity is unavoidable, but the web stack is so broken it makes things 5x harder than it needs to be.
Do you have concrete examples that are much harder in the web than in native? When I started developing, I very much liked the web technologies for not being in my way - you have to learn their pitfalls and so on, but once you're up and running, you can do anything you want. Compared to that, every time I touch a native toolkit feels like an absolute nightmare the second you leave the trodden path. Even things like "changing the color of specific parts of a control" are often times either completely impossible without fully re-implementing it, or lead to bugs that make you wish you didn't try.
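For instance, restyling just a piece of a built-in control on the web is typically a couple of declarations, with no need to re-implement the widget (a generic sketch, not taken from the thread):

    button.danger {
      background: #fff;
      color: #b00020;            /* just the label text */
      border: 1px solid #b00020; /* just the outline */
    }
    button.danger:hover {
      background: #b00020;
      color: #fff;
    }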