This is false. Many websites use JavaScript to render the site in full to give users a better experience, such as rendering an SPA (single-page application) to avoid loading unneeded data on each page request and fetch only exactly what's needed, giving a faster, smoother, higher quality experience for the user.
Sure, the mom down the street who wants to blog about her kitchen recipes won't be needing it, but a lot of websites "do".
Websites are only bloated because of the megabytes of JavaScript used to turn them into applications (images don't count, as they need to be rendered either way and can be cached). Most of the websites I visit every day don't display that much content. Once gzipped, it's a tiny file.
Yeah, and if you visit a site every day, the JS is cached, too. After the initial download, it shouldn't be an issue. This is besides the fact that it should not take MBs of JS to make complex applications.
The issue is what people are choosing to do with the tools, not the tools themselves.
HTML can be cached, too. Even if JavaScript permitted some clever way to save network traffic — for example, delivering the site as a torrent that is then downloaded from other visitors [0] — we could standardize that and implement it as part of HTML or so.
Except that the current trend in webdev is to deploy small incremental changes multiple times per day and cache-bust the assets that change on each of those deploys. So the user is probably actually downloading the JS asset at least once per day Monday through Thursday (because no deploys on Fridays).
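To make the cache-busting point concrete, here is a minimal sketch (assumed, not tied to any particular build tool) of the content-hash technique most bundlers use: the asset's filename embeds a hash of its bytes, so any deploy that changes the file also changes its URL and forces a re-download, while unchanged assets stay cached indefinitely.

```typescript
// Sketch of content-hash cache busting; hashedName is a hypothetical
// helper, not any specific bundler's API.
import { createHash } from "node:crypto";

function hashedName(filename: string, contents: string): string {
  // Short content hash inserted before the extension, e.g. app.3a7f19c2.js
  const hash = createHash("sha256").update(contents).digest("hex").slice(0, 8);
  const dot = filename.lastIndexOf(".");
  return `${filename.slice(0, dot)}.${hash}${filename.slice(dot)}`;
}

const v1 = hashedName("app.js", "console.log('release 1')");
const v2 = hashedName("app.js", "console.log('release 2')");
console.log(v1, v2); // different names, so the changed file misses the cache
```

The upside is that unchanged files can be served with a far-future `Cache-Control` header; the downside, as noted above, is that a daily deploy touching the main bundle means a daily re-download of it.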
Well, right now the problem is that the tool is broken. We have a long-term, non-fixable issue (outside of replacing the hardware) with untrusted code running locally.
I would be absolutely astounded if there was any SPA which moved fewer bytes over the network than its plain HTML equivalent. HTML can be written compactly, and the repeated structures compress well; any increase in the size of the data over JSON will be more than made up for by not needing to load megabytes of javascript.
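The "repeated structures compress well" claim is easy to demonstrate. This is a rough illustration, not a benchmark — the row markup is invented for the example — showing that a large chunk of repetitive HTML gzips down to a small fraction of its raw size:

```typescript
// Repetitive HTML compresses extremely well under gzip, because each
// row differs from the previous one by only a few characters.
import { gzipSync } from "node:zlib";

// Hypothetical payload: 500 table rows sharing the same markup skeleton.
const rows = Array.from({ length: 500 }, (_, i) =>
  `<tr><td class="name">Item ${i}</td><td class="price">$${i}.00</td></tr>`
);
const html = `<table>${rows.join("")}</table>`;

const raw = Buffer.byteLength(html);
const zipped = gzipSync(Buffer.from(html)).byteLength;

console.log(`raw: ${raw} bytes, gzipped: ${zipped} bytes`);
```

On this kind of input the gzipped payload is a small fraction of the raw bytes, which is why compact server-rendered HTML can compete with (or beat) JSON-plus-bundle on the wire.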
I can't count how many times I have visited SPA websites that break the browser's URL history.
Also, if an SPA fails to load a request for any reason, try reloading it. Oops, you start over.
And I am not talking about people who don't know what they are doing. At times I've had issues with Google's new developer console, G Suite admin, Analytics, Product Hunt, and others.
Nonsense. Bad tools are bad tools, as evidenced by the fact that literally no one can use them properly at scale.
Evangelists of bad technologies always like to trot out the nonsense about "a good carpenter never blames their tools"... well, of course they don't: they buy the tools themselves and always buy the appropriate ones. Further, if they did complain, the customer would wonder why they bought bad tools in the first place.
A carpenter would never try to build a building with a "JavaScript" equivalent in the first place. They recognize the tool as garbage (perhaps after trying, and failing, to use it) and switch to something more appropriate.
If you took away JS from Facebook and Twitter, they'd literally be a better experience. They'd be forced to give up much of what makes them painful to use. At least Twitter had a character counter, that was a useful JS feature, but they broke even that.
And if you need a diagramming tool, maybe a website isn't the best solution.