Everything Easy is Hard Again (2018) (frankchimero.com)
336 points by todsacerdoti on April 17, 2021 | 258 comments



Confession: as a recovering programmer who made a career change to non-programming, it took at most 30 minutes to say "fuck it" and go with Squarespace.

I've never been on the web side of things, but I knew enough HTML in the early aughts to put up a basic informational website. After digging into some sites I admired, I decided that it was too much distraction from my actual work to roll my own.

Granted some of this complexity comes from the legitimate need to support mobile gracefully, but damn are there layers upon layers upon yet more layers of stuff just to get some pictures to show up nicely with some text and look consistent across devices.

Props to everyone who does this for reals and does a good job achieving that consistency. For my money, I'll hire it out.


FWIW, services like Squarespace are devouring the VPS and small-site design industry. Rolling your own services, managing them with cPanel, and paying local kids to build and design it is a quaint throwback.


It’s non-existent. It’s weird how people were obsessed with coal miners losing their jobs, but right here in tech we literally saw a profession vanish. There is no ‘website developer’ anymore, not really. It just ... went away.

That group had to shift overnight (in relative terms) to app development. This might be more foreboding than people realize. Aside from those who need to write complex queries stitching big data together or generating reports, will we need a typical backend engineer ever again? How complex are schemas ever going to get that a UI can’t do it (and generate the corresponding api)?

It happened to the first version of the frontend developer and I think backend is next. Then frontend again at some point, and Data people have another thing coming when the backend folks need to find another job when theirs goes away.

If tech is going to replace industries, it’s going to start with replacing parts of itself first. We’ll be the first to know how this is going to go down.


I'm not a web developer, but at work I run a couple small app servers for internal tools.

They have business logic that's specific to our product and owned by the company, so although I'm doing a lot of pointless "plug together HTML" work for my own learning, I'm also doing stuff that, AFAICT, has to be original code written by me.

Did I manage to successfully duck out the entire web trend already? When I was a kid, the web barely worked. When I was in college, web apps were a strange novelty Google was working on. When I started my career as a developer, I heard murmurs that web was the only kind of developer left. 10 years on, I'm still working steady and still not a web / mobile dev. I learned a little TS and started using Rust for backends, and that's about it. I knew SQL from college. The web trend came and went and I only learned how HTTP works in the last 5 years. Doesn't seem like the hardest thing in the world.

Is it just a combo of luck and privilege that I missed it?

> If tech is going to replace industries, it’s going to start with replacing parts of itself first. We’ll be the first to know how this is going to go down.

Probably. At first it was funny - Software was a coal vein that just kept going deeper and deeper the more we dug, and high-level languages kept lowering the barriers to entry, making more programmers who each made more money.

Some day the vein's either got to run out, or widen so much that everyone is a programmer getting paid nothing. Since I have enough money to save, I'm hoping to retire before the sigmoid inflects fully.


I’d pay close attention to it. We don’t talk about the death of website development as a career because app development was available for everyone to transition to, so it mostly got ignored. But it happened, and there was nothing anyone could do about it. It’s a dirty little secret that if we honestly told new people switching to web dev ‘hey, yeah, just so you know, our jobs as we knew it kind of vanished once’, they’d think twice. The other dirty little secret is we almost sent all the work to India once. The new dirty little secret is there’s going to be apps like Airtable which will be another rug pull when Enterprise stops giving a shit about custom branding.

If businesses hadn’t started moving their apps off the desktop and into the browser, what the hell were we all going to do? We’d have had to go to the Java bootcamps instead of the web app ones.

Pay attention because I don’t think business wants a fancy new UI and backend every few years (especially in the Enterprise space). I think we’re going to be done with this too.

They will one day just load all that crap into Airtable and standardize business tools industry wide.


Airtable is a sign of things to come, but we are not there yet. Our head of sales wants to use an Airtable-like product, but upper management will not yield; we need a real CMS. I think this sort of thinking will delay the adoption of such tools.


Yeah, not just devs: armies of designers, product people, and managers all distinguish themselves with minute product differentiation. A large majority of the industry runs on _not_ standardizing data and UI. There's more inertia here than just waiting for the accountants to notice the cost difference. You'll see people retry outsourcing before you see standardization, if I'm understanding this conversation correctly.


Will we not need some sort of interface for the ordinary user to perform perhaps-unordinary actions on said data (the same data that’s in Airtable)?

The command line can be daunting until you play with it a bit and I know many web devs try to avoid it as much as possible while working.


WordPress


App Development is the next to go; there's already a handful of competitive tools that will build an App, complete with cloud services, without any need for writing code.

And yah, there's only so many ways to write CRUD, auth, profile, and presence; backend is already commodified as B2B products.


I've been hunting for good low code CRUD tools. I'm not having a lot of luck finding any good ones.


Try Appsmith, I'm a founder of it. Easy to build CRUD apps using pre-built UI components that talk to any database or API. Check out our repo here: https://github.com/appsmithorg/appsmith


And yet I’ve never worked on a product that was just simple CRUD.


Then you aren’t thinking hard enough.


The difference between web devs and coal miners is that coal miners were losing their jobs due to the government regulating them out of business, web devs were losing their jobs due to underlying changes in technology.

We have a precedent for dealing with government intervention in established industries which usually involves providing support to transition to other industries. Fisheries management is full of examples of government paying fishermen to fish less for example. Yet somehow coal miners became a target of derision by a certain branch of American politicians who gleefully cackled about regulating them out of work.


Meanwhile the web development jobs market is booming like never before. WordPress and Squarespace aren't the answer to everything.


A lot of back-end people have already moved into ML or blockchain and I expect the trend to continue.

Devs have been so resilient because learning new things has been a part of the job description for decades.


TIL, I don’t have a job? Guess it’s an Office Space situation. Better get my own stapler.


Can you move your desk to the basement? And while you're there....


Web devs learned how to program, coal miners didn't.


This is an essential part of software design today in general. If a part of your system takes time to develop, that should be an instinctive signal to look for an alternative. When all of your time is spent on creating value for the customer instead of tweaking technology, you are on the right path. If you find yourself tweaking, you need to stop and look for an alternative approach. This does not mean hoarding more tools, as I have seen many do, because tools have a maintenance cost.


What kind of work did you get into after programming, out of curiosity? Rare to see people go the other way.


Not the OP, but I left professional programming to become a writer (in Czech, not in English; pardon my mistakes).

It actually has something in common with programming: you still write structured text for a living, though your target audience is now people, not machines. Some readers have commented that they find my style clear and understandable; maybe it is a carry-over from programming, where you cannot be ambiguous.

But I find writing for people much more enjoyable than writing for computers. As a computer programmer, you mostly receive negative feedback: something stopped working and the users are often angry or frustrated. As a writer for people, you receive more than a few thank-you e-mails from your readers, which brighten your day.


> As a computer programmer, you mostly receive negative feedback

FWIW there are different types of writing and different types of programming.

Writing copy for a bank will probably generate a lot of feedback about mistakes that need to be fixed.

Writing code to reduce annoying monotonous tasks for coworkers, so something takes ten minutes instead of two hours, will get you a lot of thank-yous from your coworkers, and will also brighten your day.

It's all about what your job is, and how close you are to the people you are affecting.


This is a good observation.

The software that I spent most of my time with was a communication tool. So whenever something broke down, people had problems calling or texting one another. This pissed them off, naturally.

OTOH now my bestsellers are about half-forgotten events from world history. Aside from a weird e-mail I received from an even weirder Stalin-apologist, people generally enjoy this kind of reading and do not feel angry about it.

If, on the other hand, I were an investigative journalist and dug up dirt on politicians and the mafia, the tone of the e-mails from readers would definitely get rougher, to say the least.


Writing in a human language and programming indeed have a lot of skill transfer; they're both writing, after all. But most importantly, both depend on the ability to clearly communicate ideas and organize thoughts.


Hi, I'm really curious, what do you write exactly for a living? Do you write fiction books, non-fiction books, blog posts, technical documentation?


1. Political commentary for a local newspaper. They pay reasonably well.

2. Popularization of science and technology for a few web outlets.

3. Books about interesting historical events; these are by far the most popular and earn me the majority of my writing income.

4. I dabbled a bit in fiction, but long forms like novels seem to be out of reach for me. My strength is in shorter texts that can be written in an afternoon.

5. A free blog that comes with an e-shop where my books can be bought.


Mind linking any of your work?


This is my profile at Goodreads.

https://www.goodreads.com/author/show/18339067.Marian_Kechli...

Everything I write is in Czech, though. Automatic translation from Czech to English sucks.


Thank you! And don't worry I'm Czech :)


> rare to see people go the other way

I suspect this is literally true, but only because we don't see them. I've heard second-hand the story of a top developer at a famous unicorn who just couldn't keep up with all the stimulants everyone was using to code more and burned out, bought a cabin in the woods, and became a hermit.


> all the stimulants everyone was using to code more

That's interesting. Do you mean Adderall/Ritalin?

That stuff has a relatively fixed useful half-life. It doesn't work for as long as people might think. It also doesn't actually make thinking clearer. It makes people feel as if they are thinking more clearly. I don't feel like looking for it, but this has been studied and reported, somewhere.

If you are referring to crank, well, the half-life of that stuff is even shorter...


Never tried any of them myself and don't know which they were referring to, but the other replies are probably in the right direction.


Cocaine, MDMA and LSD seem to be favourite drugs around town, here in BC. I mean, besides the usual marijuana, tobacco and beer.


Amphetamines, cocaine, and modafinil are the likely candidates. Other than that, look up stimulant nootropics.


OP: I build furniture now, but I still lurk here because the news is interesting. Sometimes there's even overlap :-)


Really? I see a consistent trend of friends and ex-coworkers exiting the technology and engineering space to go toward agriculture, manual crafts, and artisanship.

I thought it was obvious. It’s even become a type of cliché.

But... now that I’m writing this: all those folks are either French, German, or Central European. I currently live in the US, and here I see more folks doing bootcamps to achieve better income and address their debts.

Hmm. I would be curious to see numbers about that, because obviously it’s highly anecdotal.

Examples:

- starting a roof-painting company

- starting a small farm business (direct-to-farmers-market type)

- starting a goat-herding business

- becoming a woodworker

- becoming a car mechanic in a communal garage

- becoming a teacher (of non-tech things)

- becoming a bootcamp teacher

- becoming a kombucha brewer

I know that I myself will burn out (in a soft way) on tech roles.

To clarify: those folks were able to do it because:

- their state offers reconversion packages

- they have tech money

looks like OP got into woodworking based on his profile: https://www.longwalkwoodworking.com


Frank is right, of course. Staying up to date with the tooling, best practices, and user expectations of the web requires an unreasonable amount of attention if making websites is only a small part of your service offering.

One reason I prefer frontend libraries like Vue and Svelte is that they feel closer to the grain of the web (HTML-esque templates with JS and CSS sprinkled in) and provide a reasonable level of abstraction and magic. The learning curve and paradox of choice are also much easier to navigate than with React, especially for solo devs who aren't working on a huge app with a team.


Let’s not just pick on frontend. Has anyone seen what it takes to run something on AWS? Devops is nearing similar levels of insanity, and often for applications that won’t even have 50 users (seriously, all these companies that advertise for AWS experience for a tool that is going to be fucking internal with less than 50 users).

It all builds up in me to be honest. You get the frontend complicated, then the deploy/infra gets complicated for the needless cloud dependency, and then the backend people don’t want to get left out of the party so they manage to rewrite their shitty php app in Go.

It’s like a shitty third world country. Not only do the roads suck, but there’s no water or electricity half the time, and you need to bribe everyone to get something. Instead of it becoming a wasteland, the population somehow still increases (more people actually entering tech). Now you really can’t change anything because there’s too many people.

How the wicked live.


> Has anyone seen what it takes to run something on AWS?

I've given up every time I've tried it on personal projects. It's just a complete, poorly documented mess indeed.

At work we host our stuff on AWS but we have an ops team that deals with all the garbage. All I care about are APIs, endpoints and that one thing can talk to another.


Yeah, so much for the fantasy of cloud platforms eliminating the need for ops staff. The guy who used to set up and manage your Linux VPSs just switched jobs to managing your AWS infra, and charges extra for it.


Finally a voice of reason. The front end isn't the only thing that has become complicated. How many services are running on k8s? Now, how many of them actually need it?

The list goes on. The front end isn't that complicated relative to the back end; there is just one ecosystem vs. 10 for the back end.


I recently got asked to consult for a project. I warned them not to go with AWS, since the tools they wanted to put online were not well suited to it.

Then I got sent a 15-page deployment document outlining the AWS infrastructure they had gone with. They paid megabucks and got a Rube Goldberg machine, all to run a Zend-based PHP site developed 15 years ago that has never had any significant maintenance.


And this is exactly why any of my hobby projects that involve JS are done with plain JS, with as few libraries as possible. I might pull in specific libraries, but I don't want to pull in a giant framework that will be outdated the next time I decide to work on that particular project.


> ... are done with plain JS, with as few libraries as possible ...

I used to have the same opinion and practice. However, I kind of regret it as well.

Even if you do everything with plain JS, eventually, one day, some idea will float into your head, such as "Hey... I want to automatically compress the script file", "I wonder if I can polyfill all my scripts", "Automatically bundle assets?", or "Compress asset images?".

Then you start to learn Webpack/Gulp/Grunt (if the last two are still alive), and commit your entire soul to it a few minutes later.

I think the reason for the messy front-end tech is that the Web itself is messy. You have to consider a lot of things: networking, file management, cache management, cookies, user-side storage, security, etc. Some eyes saw the problems and built a framework to address them; then others discovered other issues in the framework... the circle of life.

I guess we'll eventually settle down on something when people finally figure out what they want from Web technology, or when the Web is "dead" (no dramatic changes anymore, like what happened to desktop applications today).


I have lots of tiny hobby projects. I never want to do any of those things you mentioned. Just a simple page, with the minimum amount of javascript to make it work.


> Hey... I want to automatically compress the script file

The trick is to write so little JS that its bandwidth usage does not even register. I'm recently going as far as to include licensing headers and extensive comments in my website JS, see e.g. <https://xyrillian.de/res/chapter-marks.js>.


I can't find a link at the moment, but there was an article on Hacker News a few years ago defending the use of plain text web pages with minimal styling. The page loaded amazingly fast, rendered well on both desktop and mobile, and had fast interactions with the page. The author had also included the full text of Moby Dick, more content than is ever served in a typical webpage. It was a fun and cheeky demonstration that it isn't the content of a webpage that results in poor performance, but all the frameworks, ad targeting, and client-side compiling of a webpage.

(Side rant: I refuse to use the term "client-side rendering", because dang it, "rendering" makes an image. "Client-side rendering" as web developers use the term doesn't actually render anything, but instead compiles the page down to HTML, which is then rendered by the browser.)


I'd put that as a difference between hobby projects and work projects. For work projects, there is some amount of time that can be allocated to continual development. The project itself has value, and needs to stay up to date, and so sure, it may use the framework of the day.

For a hobby project, if I personally don't have fun with it, then nothing gets done on it. If I decide after a few years that I want to pick a project up again in order to add a new feature, the last thing I want is to find out that framework X needs to be replaced with framework Y, and so the two weekends I had expected becomes two months of weekends to rewrite the whole thing.


I must say it's a valid point. After all it's a hobby project, having fun is the first priority :D

However, one could also say "Well, it's a hobby project, so I'll just slap this framework on to save time..." or "What? I have to manually deploy this hobby thing every time I make a change? I'll just put it on CI and let Webpack pack the thing for me"...

My method is this: I start the project with all the new tech/frameworks/libs/"display: grid", then I publish it. If it turns out to be a success, I'll maintain it (including upgrading the framework, bug fixes, etc.); if not, I'll just put it aside and start another one. This way, I can touch/learn lots of new stuff that might later benefit me in my day job, without any risk.

So, I believe it eventually boils down to personal preference (based on objective information on hand, of course) :D


Makes sense. For me, I build hobby projects for my own personal use. The "publishing" involves pushing to GitHub, then doing absolutely nothing additional with it. But because I use it myself, I may want to add more features to it later.

For frameworks, I'm sure they save time once you are familiar with them, but that also requires evaluating which frameworks to use and how to go about using them. That ends up taking a lot more time than throwing something together with plain JS.

That definitely makes sense, especially if you're using the hobby projects to learn about frameworks that you'll later use elsewhere. I tend to do the same thing on the compiled side, trying out different libraries and languages. (For example, picking up Rust for last December's Advent of Code challenges, then porting some of my other projects over to make some performance comparisons.)

Part of this is also that my dayjob has very little interaction with JS. If/when I use JS in my dayjob, I'm usually picking something minimal so that it can be used/expanded on by others without framework-specific JS knowledge. My goal is to make sure that I leave behind a codebase that can be read/understood by as many people as possible, which may mean picking a software stack that has a bit worse usability, in exchange for ease of finding developers.


Another thing I find weird is how bloated typical static site generator tools are. The delivery medium of static HTML is timeless. But the odds that a static site generator with dozens of dependencies will still work N years from now? Grim.


Just write HTML by hand. With HTML5's implicit and auto-closing tags, it's really not more difficult than Markdown. Then your "generator" is simply the cat program, concatenating a common header and footer around each page.


They all started off lean and mean. Then, after adding feature after feature, they become bloated. Then the next lean-and-mean static HTML generator becomes the vogue.


Bashblog[1] is pretty lean, and doesn't use any dependencies.

[1] https://github.com/cfenollosa/bashblog


One of the reasons I like Hugo: it's a single executable that will basically work forever. There's no need to update to a newer version if the one you have meets your needs.


'Typically' meaning...? Jekyll? Gatsby?


I have a couple of older static sites (one using Jekyll/Octopress, one using Middleman), and if I ever do a bundle update something is guaranteed to break. Not sure if that's the "bloat" the parent comment is talking about though.


I've been using Perl Template::Toolkit since around 2001 and it's even better since version 3 was released. Worth a try. You don't even need to know Perl to make use of it.


Yes. I was on Jekyll 2 years ago; I don’t think I’ll be able to compile it in two more. It’s already on an old version of Ruby if I remember correctly, if not an old Bundler or Gulp or... and it’s just a simple website.


I've had this exact issue with my Middleman sites.


That is why I've changed to Hugo (from Hexo & Gatsby). It does exactly what it's supposed to, it doesn't import any dependencies, and it is quite fast. Besides, no runtime is required.

I thought about plain HTML, but writing blog articles with it is a bit of a pain. Hugo was quite the right balance.


Agreed, but for me the appeal of static site generation is that the input is typically just markdown and some templates, so even if a particular generator ceases to be supported, it should be trivial to drop in a replacement in whatever the currently fashionable language is.


A micro-library like https://redom.js.org/ does make things easier though. I have also given up on mega-frameworks - just too many things to keep in mind as I get older.


Here's my "femto-library" for creating DOM dynamically from JS:

    // ce = "create element": builds a DOM element from a tag name,
    // a map of properties, and a list of child elements.
    function ce(tag, attrs = {}, children = []) {
      const el = document.createElement(tag);
      for (const name in attrs) {
        // Note: this sets DOM *properties* (className, textContent, onclick,
        // ...), not HTML attributes via setAttribute.
        el[name] = attrs[name];
      }
      for (const child of children) {
        el.appendChild(child);
      }
      return el;
    }
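A quick sketch of how such a helper composes. The `document` stub at the top is only there so the snippet runs outside a browser; in a real page you would drop it and use the DOM directly:

```javascript
// Stand-in for the browser's document, so this runs under plain Node.
const document = {
  createElement: (tag) => ({
    tag,
    children: [],
    appendChild(child) { this.children.push(child); },
  }),
};

// The "femto-library" from the parent comment, unchanged in behavior.
function ce(tag, attrs = {}, children = []) {
  const el = document.createElement(tag);
  for (const name in attrs) el[name] = attrs[name];
  for (const child of children) el.appendChild(child);
  return el;
}

// Nested calls read roughly like the markup they produce.
const list = ce("ul", { className: "links" }, [
  ce("li", { textContent: "first" }),
  ce("li", { textContent: "second" }),
]);

console.log(list.tag, list.children.length); // ul 2
```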


Easier through ignorance seems like cheating, IMO. You're basically saying "yes, things are harder, but I just ignore them". How would a newbie decide what's safe to skip?


While I agree with your conclusion, I don't agree with your argument for reaching it. Any software development sits on such a tall and wobbly stack of dependencies that no one human can even understand every layer, let alone build it. For most projects, I don't hand-solder transistors into the logic layer, write my own bootstrapping compiler out of hex assembly instructions, or directly interface with graphics drivers to display the program's output. That doesn't make these bad programs or bad projects just because they are focused on only one part of the program space.

My concern isn't in "ignorance" or "cheating", but in spending time to learn a technology or framework that may quickly become obsolete.


This should be industry standard practice.


I think one driver of this cycle is:

1) need for capability that old method doesn't meet, so new tool is made

2) dev working on both simpler and more complex sites, wants everything to use the same tools

3) even sites with simple needs now get done using tools powerful enough to satisfy needs of complex sites

4) experienced dev doesn't see a problem, because they are conversant with those more complex tools, and also don't often get assigned to do simple websites anyway, but anyone doing the (considerably more numerous) simpler websites now is getting their groceries by flying a SpaceX rocket from home to the grocery store.

5) now that websites can do more, and very occasionally do, someone thinks of something else they want a website to do, and the cycle repeats


I think you forgot one driver, which IMHO is a major one:

Imagine you are just getting into the business and are confronted with complexities that would bring you to a grinding halt for the next 6 months until you have at least a foggy idea of what is going on (Angular, Elm, PHP, React, ...). And once you've worked through your initial list of "need to know", 5 more have popped up, like a hydra where chopping off heads just results in... more heads staring at you.

So, in this situation, it is quite reasonable to try to "get ahead of the curve" instead of lagging behind the ever-changing world. You decide to write your own framework (based on your current and imperfect understanding), ignore what is already out there, and pass the ball to everyone else. They have to look at what you did and read your stuff and scratch their heads, while you can be productive. And it feels good. You feel like a messiah. You can even publish a book, maybe, and give some enlightened talks.

This is the result. New generations are literally forced into re-inventing the wheel to even stand a chance at getting something done.

And it's not only in web design/JavaScript/WebAssembly/call-it-what-you-will.

The same happened e.g. in the Microsoft ecosystem (desktop programming). MFC -> ATL -> VB6 -> .NET WinForms -> XAML -> Silverlight and back to XAML -> .NET core -> however it will go on (I detached at that point, having run 'too many laps').

And since someone mentioned Linux... what is the canonical way to write GUI apps on Linux these days? Tcl/Tk? Xlib? XCB? SDL2? Wayland over X? Gtk? That huge Google framework? Qt? Motif? Pure Wayland? Also a huge mess, and people "innovating" just to be ahead of the curve. Yes, there are always reasons, if you need them to be...


> And since someone mentioned Linux... what is the canonical way to write GUI apps on Linux these days? Tcl/Tk? Xlib? XCB? SDL2? Wayland over X? Gtk? That huge Google framework? Qt? Motif? Pure Wayland? Also a huge mess, and people "innovating" just to be ahead of the curve. Yes, there are always reasons, if you need them to be...

There isn't a canonical way. Tcl/Tk, Qt, Gtk, SDL, Java, and pure graphics backend calls all continue to work, as long as there's someone out in the world with enough dedication to keep them up to date. The Linux ecosystem is not at the mercy of Microsoft or Apple, thankfully.


Linux is still very much GTK, Qt, and to a lesser extent, Tk. There hasn't been nearly as much rotation of core libs as there has been on Windows or the Web.


This line of argument is what made me switch to Linux. On Windows, things that should have been getting easier were getting harder for me: software management was getting worse, updates were getting worse, and everything that should "just work" just wouldn't. My experience on MacOS was just as bad, with all of the 32-bit apps I used gone with the wind. Now the 64-bit apps are up on the chopping block, and I had to look for other options.

The learning curve for Linux was larger than Windows or MacOS, but I only learned the truth. By cutting away abstraction, I can actually understand what my system is doing, and be a better citizen of my operating system. Linux is arranged masterfully, and makes the Windows filesystem look like a joke in comparison. It's also faster than any of the Macs I own, with the freedom to plug in a modern Nvidia GPU and play whatever games I want.


I love Linux based operating systems and I've used them pretty much exclusively for at least a decade now, and I agree with a lot of your post, but

>By cutting away abstraction,

I don't think I agree with this take. You haven't cut away abstraction, you've changed to a different kind of abstraction. Linux's 'everything is a file' abstraction is just that, an abstraction.

>Linux is arranged masterfully, and makes the Windows filesystem look like a joke in comparison

It's arranged more simply and straightforwardly; masterfully, though... I dunno.

Every flavour of Linux uses a different layout for its root directories, stores configs in different places and generally is not consistent from distro to distro.

User configs are also scattered in random places at times. They could be in .config they could be in .local they could be in .$APP_NAME they could be in the app's directory, they could be in /usr/ somewhere. Maybe in a couple different places at once. Who knows? It's a mystery until you track it down.

Dependencies and libraries are a huge pain to deal with, especially when you start mixing and matching distro provided libraries with custom built ones.

I'm not trying to disagree with your post or anything. I love the way Linux lets you have control over your computer and doesn't try to stop you. But, it's definitely got some issues. I've only touched on a few related to your comment itself.


My early web dev experience was mostly just figuring out where the hell config and log files for apache, php and mysql were. If the docs told you the default location of the config file, you could be sure that on your distro they are somewhere completely different.


>My early web dev experience was mostly just figuring out where the hell config and log files for apache, php and mysql were

Yeah but once you track it all down you feel like a developer superstar genius...

Then you confidently sit down and proceed to repeat the process on another distro and immediately feel like a complete noob again as nothing is where it should be and you realize you mastered nothing.


There's also a lot of sub-ecosystem issues which are more prominent on Linux, like the long-standing surprise that NPM doesn't have a global rc file in /etc:

https://github.com/npm/npm/issues/533

The argument from the maintainer is reasonable that distros can just patch it, but realistically, everyone wants a newer version of things than what their distros ship—a similar tug-of-war ends up happening in the Python world, where distros like Ubuntu and Debian ship patched versions of pip/setuptools, and then users are encouraged to self-upgrade those tools to the upstream versions.

Anyway, this kind of thing isn't necessarily anyone's fault, but it is just a pretty different experience from Mac or Windows, where things tend to be more self-contained, like "here's where my Python installation lives because it's the path I put in the box when I ran setup.exe."


Node is definitely a bit of a mess; its misuse of the Linux system doesn't really come as a surprise.


Re: abstractions, I think what separates Windows/MacOS abstractions from Linux/UNIX abstractions is obfuscation. Because *nix is open, its abstractions, to a reasonable degree anyway, reflect the underlying structure in a way that aggressively closed-source abstractions can't. Of course the file abstraction is an abstraction is an abstraction, but it's also implemented transparently as a concept in the kernel --- whereas Windows manipulates its interface so that the interface _is_ the implementation, as far as the user can see and as far as the user should care. At least, this is my experience, and it fits well with what I'd expect from the operating systems' respective development models.


> Every flavour of Linux uses a different layout for its root directories, stores configs in different places and generally is not consistent from distro to distro.

As a very casual Linux user (macOS is my daily driver), are there any efforts to solve these "death by 1,000 cuts" arbitrary (I assume) differences?


Not really, no. The differences between distros are essentially rooted in philosophical beliefs defended with near-religious zealotry. Entire identities are defined around the differences between distros.

A bit of an exaggeration, but well...pretty much that.


These differences mostly come down to package manager variations, which should be one of the main considerations when picking out a distro in the first place. My personal advice is to use Arch/Manjaro with pacman, and avoid all snaps/flatpaks. The AUR is designed for people to integrate apps like Spotify and Bitwig into the proper locations, instead of snaps and flatpaks making their own folders and such.


See the Filesystem Hierarchy Standard[0], or `man hier`.

[0]: https://refspecs.linuxfoundation.org/fhs.shtml


> You haven't cut away abstraction, you've changed to a different kind of abstraction.

Yes, Linux depends upon abstractions. Everything in computing is an abstraction, including those 1's and 0's that tell the processor what to do. (Anything below that is electrical engineering.) The differentiators are the complexity and stability of that abstraction. When people talk about "everything being a file", they are referring to lower level abstractions that are relatively simple and stable. Likewise, when people talk about HTML/CSS/JS they are referring to lower level abstractions that are relatively simple and stable. This is less true (sometimes significantly less true) when people start discussing frameworks/libraries, whether that's in Linux or web development.


Simplicity is harder to do than it looks. When you see something simple that actually works, something made to be easy to understand, that can take a lot of work to arrive at. So while it may not be much to look at, few people can actually accomplish that. Unix is a good example. It looks easy, but when you see people try to improve on something the failure rate is quite high.

Let me give you an example of something that used to be clear and trivial in Unix that isn’t anymore because people «improved» it: telling your system about name servers.


i strongly disagree with this:

> Every flavour of Linux uses a different layout for its root directories, stores configs in different places and generally is not consistent from distro to distro.

there are differences, but it's 90% the same in the major distributions.

and while you do have a point about user configs, it is getting better. .config and .local are getting used more and more and are an improvement over every app dropping stuff in $HOME.


>there are differences, but it's 90% the same in the major distributions.

Yes, and that 10% can cause some unexpected, sometimes hard-to-fix problems. A poorly written build script or installer that expects a specific distro's directory layout can cause trouble. Especially if it's a large, convoluted build system that hard-codes the directories in multiple places.

I've actually had to deal with this, it sucks. That was as an end user trying to use an app, not as a developer.

You run into this a lot trying to build small libraries and stuff. There's unfortunately a lot of poor programming habits out there. Directories inevitably sometimes get hardcoded and this causes problems because of the inconsistencies between distros.

>and while you do have a point about user configs, it is getting better. .config and .local are getting used more and more

And yet there's hundreds of apps that will never be updated or changed to conform to standards.

>are an improvement over every app dropping stuff in $HOME.

Tell that to snap. It insists its folder must sit nicely in my home folder and nowhere else.


> And yet there's hundreds of apps that will never be updated or changed to conform to standards.

i suspect many of those are no longer actively maintained. at best they receive security patches from distributions.

> Tell that to snap. It insists its folder must sit nicely in my home folder and nowhere else.

my (low) opinion of snap aside, the bad here is that the location is not configurable. as a library of applications, i would want snap in $HOME. or i would put it in $HOME/lib or an equivalent. maybe .local would be appropriate too, but i understand it's for app specific user-data, so it doesn't look like whole apps should be in there. (edit: elsewhere i just read that .local is supposed to be like a user specific /usr/local, so snap should fit in there somewhere)


I don't want to turn this into a snap bad mouthing thread, but yeah. You can't even symlink it or it causes problems. It really bugs me because I don't tend to allocate too much space to my home partition and keep most of my stuff on data partitions. So if I get a bunch of snap apps, it starts eating up all the space in home.


Can you wrap its invocation in a script that sets a snap specific version of the $HOME environment variable? If it's perverse, it ignores $HOME in favor of reading the passwd entry itself, in which case I'm happy to have never encountered it.
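For what it's worth, the wrapper idea looks something like this (the path and command here are purely illustrative, and as noted it only helps if the program honors $HOME rather than re-reading the passwd entry):

```shell
# Run a command with a substitute $HOME so its data lands on a
# roomier partition. /data/snap-home is an illustrative path.
snap_home="/data/snap-home"

run_with_home() {
    HOME="$snap_home" "$@"
}

# the child process sees the substitute HOME
run_with_home sh -c 'echo "$HOME"'
```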


i hear you. in fact, i do use data partitions as well, and if my snap directory grew too much i would want to move it too.

as a workaround, try a bind-mount


I don't use Snap. The only real "bloat" in my home directory is my Bitwig folder, which isn't that bad considering I use the program on a near-daily basis.


> and while you do have a point about user configs, it is getting better

Are people finally following the XDG standard? [0]

[0] https://wiki.archlinux.org/index.php/XDG_Base_Directory


i have no hard data, but my impression is that more and more apps are


> It's arranged more simply and straightforwardly; masterfully, though... I dunno.

It's easy to make "complicated". It's very hard to make "simple and straightforward".

Whether that's an indication of mastery is not a closed question.


> I don't think I agree with this take. You haven't cut away abstraction, you've changed to a different kind of abstraction. Linux's 'everything is a file' abstraction is just that, an abstraction.

If I try to interpret the bit about abstraction, I lean more towards "less opaque" being the intended meaning. On Windows and Mac, it's harder to see what is happening. On Linux, it's all exposed and there to tinker with.

> It's arranged more simply and straightforwardly; masterfully, though... I dunno.

I would argue that simple and straightforward is masterful!


I don't know if you are talking about Linux the kernel, Linux the entire distro, or Linux the filesystem (out of tens of supported filesystems), but "arranged masterfully" is a big statement. From 2 days ago:

https://news.ycombinator.com/item?id=26821298


I wish wish wish I could move over to Linux, but its font rendering situation is terrible. [1] I've tried Manjaro and Ubuntu to no avail.

I have no idea what causes it, whether people on Linux just aren't as sensitive to it, or whether it's a specific hardware setup that causes the terrible fonts, but once you read text on Windows, fonts just look so much better to me. Apparently it's a hotly debated topic, but it's so obvious to some, it seems.

[1] https://pandasauce.org/post/linux-fonts/


Hah: I am very particular about fonts myself, but what bugs me most isn't sub-pixel hinting (which is "good enough" with the latest FreeType, IMHO); it's the size of fonts. Windows is so far off the mark on 99% of computers and it drives me crazy!

Repeat after me:

    72pt is one inch tall!
    72pt is one inch tall!
    72pt is one inch tall!
The "point" scale of fonts is actually a precise, real-world measurement like inches, millimeters, etc. Fonts may differ in how much space each glyph takes up but in general they should at least be close to the correct size.

Test it: Open up a word processor and load some reasonably normal font (e.g. Arial) and make some text that's 72pt. Then hold a ruler up to your screen: Is it about 1 inch tall? How far off the mark is it?

If you're using a modern, high-DPI display it's probably waaaaay TF off, haha. Even with regular displays it'll be off and there's actually no way to fix it because you can't tell Windows the precise DPI of your display! It tries to guess based on the DDC data given by your display but then averages the X/Y together to scale fonts. So if your X DPI is higher than your Y DPI it'll always get it wrong. Always!

Macs cheat in this area by always having displays of a fixed DPI. For the longest time all Apple displays were precisely 96 dpi (with perfectly square pixels) but recently changed it to 220-ish and made adjustments in their font rendering engine to match. I have no idea how they're handling non-Apple displays these days.

Linux, on the other hand does it right (at least X11 and KDE do) in that it scales fonts very accurately based on the DPI presented by your display via the DDC protocol (or if your monitor doesn't provide that you can set both X and Y DPI manually).

Gnome and GTK applications have the same problem as Windows though in that they only take one DPI setting and if the DDC protocol gives separate X/Y values it'll just average them and the results can be... Ridiculous. You don't really notice the weirdly-stretched fonts until you open a KDE application side-by-side with a GTK one.
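The arithmetic behind all this is easy to sanity-check: a point is 1/72 of an inch, so a glyph's pixel height is points / 72 × DPI. A quick sketch (integer math, so results truncate):

```shell
# pixel height = points * dots-per-inch / 72 points-per-inch
pt_to_px() {
    points="$1"; dpi="$2"
    echo $(( points * dpi / 72 ))
}

pt_to_px 72 96    # on a true 96-dpi display: 96 px, i.e. exactly one inch
pt_to_px 72 220   # the same 72pt glyph on a 220-dpi panel: 220 px
pt_to_px 12 141   # an averaged/mismeasured DPI skews every size proportionally
```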


On top of that mess you describe here lives the web designer pixel mess.

Web designers loved to do their web pages with pixel measurements instead of real lengths in millimeters or points, which is a horrible idea because of differing DPIs.

DPIs got bigger and bigger over time, and browser developers were caught between two fronts of bullshit. The OS told them a bullshit DPI and web pages gave them a bullshit pixel-length design. What did they do? They invented a third layer of bullshit to fix the other two (a little): the fake pixel. "Pixels" in web pages today are not the same size as the screen's real pixels; they are scaled somehow.

This is all broken beyond repair.

I wish browser developers had implemented it the right way. A pixel is a pixel and an inch is an inch. Many sites would have broken, and web designers would have been forced to fix them and use, for example, millimeters or points for lengths. And in a perfect world, the OS would then use the correct DPI to display the rendered result.


I have never seen a display with non-square pixels, except maybe on cameras. Where are those typically installed?


CRT monitors will naturally display non-square pixels, depending on the resolution. Sometimes LCDs are run at a non-native resolution due to GPU or driver issues, which results in a stretched display where the input pixels are not square.


CGA displays had non-square (logical) pixels. The ratio for 320×200 resolution is 1.6, but it is shown on a 4:3 (1.333) screen.


Where can I read more about how 72pt ~ 1 inch? Where does one go for a "course on typography"?


This was posted recently: https://tonsky.me/blog/font-size/


Wow, first time I hear that as the main and apparently only argument for using Windows over any Linux distro.

With HiDPI displays all over the place, is subpixel rendering really still a thing? Pixels have gotten so small, I can't see individual ones anymore even if I'm trying. They could leave out anti-aliasing altogether in fonts and I'd hardly notice.

Or I'm just getting old.


I think that people are just too fussy. I have anti-aliasing disabled on my 142 dpi laptop screen and it looks better in most applications provided that screen fonts are used. Yes, you can see the pixels and you get the odd rendering artifact. On the other hand, the fuzziness of anti-aliasing is a steep price for edges that appear smoother.


Most desktop environments will offer subpixel AA, and with GNOME and KDE, I think it's enabled by default.


FreeType and slight hinting + subpixel antialiasing looks great to me, really... Windows has strong pixel-fitting and at the same time softer outlines. That is after using the ClearType tuner to tweak it to my liking. I'm not particularly fond of that look. You talk about hardware. The gamma of your screen makes a difference - antialiasing relies on it. The ClearType tuner has a page or three about that. I guess on Linux you better set your display gamma to the standard value, there is no good reason (except visual impairment or something) to set it to anything else anyway.

By the way, font rendering in Chrome for Linux used to be really awful (no subpixel AA or something). It was fixed a few years ago - it looks like any other app now.

I omitted some detail that you can easily find in your favorite search engine.


I’ve changed entire distros, back in the day, based on font rendering. However, you’re on crack if you think Windows fonts are great. MacOS makes everything else look like children’s scrawl. I’ve been running on Macs for about 6 years now (with a PC connected to the same external monitor) and I still sometimes just stop and admire how nice the fonts are.


I have a 4K and 5k display, and I find it as pleasant as my 2019 MacBook. I’m not sure if it’s because I’m not sensitive to it, or what. (I’m using Fedora with Gnome 40.)


>> The learning curve for Linux was steeper than for Windows or MacOS, but it only taught me the truth. By cutting away abstraction, I can actually understand what my system is doing, and be a better citizen of my operating system. Linux is arranged masterfully, and makes the Windows filesystem look like a joke in comparison.

Which Windows filesystem (FAT, FAT32, exFAT, NTFS) looks like a joke, and compared to which Linux filesystem? Ext2? Ext3? Ext4? ReiserFS? ZFS? It’s pretty clear from your comment that you never used Linux in the 90’s. And the main problem with Linux at the time had nothing to do with the filesystem but mainly with the immaturity of the complete system. I am very curious to understand what great truth you understood by using Linux. Would you kindly teach me this great knowledge and illuminating experience that I somehow missed?


> Which Windows filesystem (FAT, FAT32, exFAT, NTFS) looks like a joke, and compared to which Linux filesystem?

I was referring more to the layout of the root directory (oftentimes referred to as "the filesystem"), but now that you bring it up, I also hate the Windows filesystems. FAT32 needs to be put out to pasture, and it wouldn't even exist if exFAT support weren't so broken on lower-end systems. NTFS is fine, but pathetically bare-bones. I'm a BTRFS evangelist, but even ext3 makes NTFS look old by comparison.

> It’s pretty clear that you never used Linux in the 90’s from your comment.

This is true.

> And the main problem with Linux at the time had nothing to do with the filesystem but mainly with the immaturity of the complete system.

"at the time" meaning 2018 or the 90s? Maybe 30 years ago Linux was immature, but now it's the benchmark of maturity. Very few *nix systems share its pedigree.

> I am very curious to understand what great truth you understood by using Linux. Would you kindly teach me this great knowledge and illuminating experience that I somehow missed?

Nope, probably not. If using Linux wasn't an enlightening experience because you last used it in the 90s or because you have a FAT fetish, I can't really help you.


Linux is also stable. The binary interface isn't going to change. It's possible to write Linux software that you can boot directly into and it will never stop working or do anything you don't want it to do.


Because there's a BDFL at the top. Linus understands that the whole point of a kernel is having a stable binary interface to run programs on, anytime somebody wants to break userspace he goes on a rant and rejects the idea, rightfully so.

Many other projects do not follow this way of thinking or don't have a BDFL. They break APIs because... someone felt like it? Every once in a while I dig into the changelogs or migration guides of projects (sometimes they don't even have a migration guide, which is just awful) and you'll see things like renamed functions, swapped argument orders or even outright removed functions. But why? We don't know, and my guess is it usually boils down to someone "feeling like it".


Yeah. I suppose without Linus's leadership Linux just wouldn't be the same project anymore. The world has come to trust him because he makes good decisions. We depend on his core values such as "don't break userspace". I wonder what it's gonna be like in the future...

> They break API's because... someone felt like it?

I empathize with them. Designing software is essentially philosophy. The interfaces reflect the developer's current understanding of the world. When this understanding improves, people are tempted to update software because they feel like the software is wrong and inelegant when it doesn't match what's in their minds.

It takes a really good engineer to reject this impulse and take responsibility for interfaces people came up with in the 90s.


I have been having a terrible time staying away from conda on Windows, but the ridiculous amount of time spent trying to install some Python modules using pip is just not worth it.

I’d be lost without WSL.


> My experience on MacOS was just as bad, with all of the 32-bit apps I used gone in the wind.

Were you forced to update to Catalina? I'm still on Mojave right now. My 32-bit apps still work, happily unaware of the fact that they're no longer supposed to.


I would love to use Linux as my daily driver, but keep needing to go back to Windows for Ableton and gaming.


I switched to Bitwig, which has a native version for Linux. The workflow is very similar to Ableton Live, with a little bit of extra configuration. As for gaming, it's a case-by-case situation, but Valve Proton is good enough for 80% of the games I like.


Yeah I've heard that Bitwig could be a good solution, and is made by the original developers of Ableton.

But what about 3rd party plugins? As far as I understand, most of these won't work on Linux either.


You are spot on. And now, try OpenBSD. It's the next step in removing useless abstractions.


I'm an avid Linux user, completely unfamiliar with BSD. Could you shed some light on how it further removes abstractions? I don't mind trying it out by booting it from a usb


> Could you shed some light on how it further removes abstractions?

By not running hundreds of useless programs, I guess. If you launch htop inside OpenBSD, it'll fill half of your terminal, and the rest is empty.

The easiest way to try it is to install it on a virtual machine. You'll see that not much is needed to get to a point to launch firefox from an xterm.


but that's the case for some Linux distros too, e.g. Void


Top itself is enough, you don't need htop.


I find htop's ability to view the tree structure of a process quite useful. Name your threads!


Totally agree. But I'm somewhat used to silly things about htop, like selecting a process to kill using the arrow keys, or a default refresh rate of less than 1 second.


Linux is a large ecosystem, supporting many different functions (embedded to some extent; desktop; server; supercomputer clusters, etc). For any one of these, the other 'stuff' represents cruft. The BSDs (I am partial to NetBSD) tend to give you a basic substrate of 'the system', and you add what you need via 'packages'. It's a different philosophy, but no less valid.


What I find frustrating is how fast a front-end project gets old.

Take an NPM+framework project that's 9 months old: I often have a low chance of install-and-run working the first time. There's always an update, an incompatibility, or something that breaks the build process, and I need a couple of hours to fix it again.

It's not possible to work like this in the long run. Front-end projects are investments of time and money and I want them to be durable and maintainable for a long period.

P.S. I started digging into clojurescript/webassembly for this reason


There's a dirty secret that's fallen out of favor in the Node community, but that helps dramatically with this problem: it is usually a good idea to vendor dependencies for front-end projects (basically any project you're making that isn't being published to NPM).

The trick with this is you have to avoid dependencies that compile C code or native binaries as part of their installation process, since those binaries might not be portable. I try when possible to avoid anything that's using node-gyp.

But if you are vendoring dependencies and they're compatible with the system you're compiling on, then you will never need to worry about libraries breaking or changing underneath your nose.

You will still need to worry about security vulnerabilities, but that's just the case with all software. At the very least, 10 years from now your project will still compile.


By “vendoring” dependencies, I’m guessing you are talking about pulling them out of node_modules (into a js/vendor dir), and checking them into your repo, repeating the process every time you want to update. I will make an offering to Cthulhu if it’ll ensure we won't be using webpack and node 10 years from now in order to build those old snapshots without considerable effort.


> pulling them out of node_modules

No, even further. Just commit your node_modules folder.

There's this idea that vendoring has to be really complicated, but committing node_modules was actually official advice early on in Node's history. And what that means is that if you clone your repo, you just run `build`, you don't have to change or configure anything at all. The only dependency you have in your project becomes a node binary -- and honestly, you can vendor that someplace as well, node is reasonably portable.

There are a lot of advantages to this, not the least being that if you're demoing or working on a project someplace without an internet connection, having an up-to-date Git repo is all you need, you just have your entire project. If you switch branches, your project is ready to go.

Of course, if you have something like 1G of dependencies, checking them into Git starts becoming kind of cumbersome. But if possible, avoid that, for the same reasons you would avoid JS dependencies that compile native binaries. Having gigabytes of dependencies is going to be a nightmare for debugging and security, no matter what language you're in.

But I have built some pretty complicated projects where vendoring is fine. It doesn't balloon the size of my git repo, it doesn't make the project less portable, and it regularly saves my butt when I forget to check out the right branch before getting on a train without an Internet connection.

And all you need to do is not add node_modules to your gitignore.
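To illustrate, here's a minimal simulation of the approach in a throwaway repo (the `leftpad` stand-in is obviously fake; with a real project you'd run `npm install` first and commit the result the same way):

```shell
# Vendoring just means node_modules is an ordinary tracked directory.
repo=$(mktemp -d)
cd "$repo"
git init -q

mkdir -p node_modules/leftpad            # stand-in for an installed dep
echo 'module.exports = s => s;' > node_modules/leftpad/index.js
echo '{ "dependencies": { "leftpad": "1.0.0" } }' > package.json

git add -A                               # works because .gitignore doesn't exclude node_modules
git -c user.name=me -c user.email=me@example.com commit -q -m "vendor deps"

git ls-files node_modules                # the dep travels with every clone
```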


Not a node dev myself, but why not simply reference dependencies with fixed versions? According to semver (which is probably what most node projects use anyway), you are free to use the latest minor and patch revisions and leave the major fixed. Shouldn't this prevent most compatibility problems?


> why not simply reference dependencies with fixed version?

This is the second best strategy, but:

- node makes it kind of annoying to do this, because unless you do a clean build every time (which is much slower) `package-lock.json` will get largely ignored.

- I have seen multiple projects break with semver. Just because people are supposed to avoid breaking changes in minor releases doesn't mean they actually do.

- Even if you're downloading the same version, there are scenarios where natively compiled binaries can bite you. I've run into issues where node-gyp just wouldn't work unless I downloaded a completely separate compile toolchain onto Windows and restarted the computer. If you can avoid that and have all of your dependencies be pure Javascript, you will eventually save yourself a lot of horrible dev-ops work when the "quick" install on a new VM turns into an hour long process of watching Visual Studio tools download (tools that to GP's point may not be available or compatible 10 years from now).

- And finally, sometimes you want to do work across multiple branches without an internet connection, and vendoring your dependencies allows you to check out a branch and know you have the correct dependencies even if you don't have a connection to run `npm install` with.
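For completeness, exact pinning itself is cheap when you do want it. The difference between a ranged and an exact dependency spec is just the prefix in package.json (with real npm you'd use the `--save-exact` flag or `save-exact=true` config rather than editing by hand); per the points above, though, it doesn't transitively pin your dependencies' dependencies:

```shell
# Work in a scratch directory; this only edits JSON, no npm needed.
cd "$(mktemp -d)"
cat > package.json <<'EOF'
{ "dependencies": { "leftpad": "^1.3.0" } }
EOF

# "^1.3.0" accepts any 1.x.y >= 1.3.0; bare "1.3.0" accepts only 1.3.0.
sed -i 's/"\^/"/g' package.json
cat package.json
```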


Ok I see; thank you for the explanation. So out of interest, I did some research and found that approx. 30% of packages depend on native implementations further down... so this seems to be a huge problem...

Additionally I looked at the sqlite3 package and not too surprisingly it uses node-gyp as well. So even checking in node_modules might cause the problems you described.

Out of interest: which backend language do you recommend to reduce such problems...?


I use Node. I don't think Node is wildly better than anything else, but I'm used to it and it's good enough, at least for now. I'm unaware of any back-end language that forces dependencies to stay in the same language, but something probably exists somewhere.

I agree that node-gyp is a problem. It's possible to avoid, but can take some work (particularly when you start dealing with databases that basically have to be in compiled languages). The one saving grace is that if you have a set hardware that you're running on, you can sometimes vendor node-gyp dependencies. Unfortunately, that comes with some weird edge cases, and sometimes your dependencies break because your OS changes. I don't recommend it, but I understand that sometimes it's unavoidable.

I am hopeful that native WASM will solve some of these problems, but I don't know how realistic that hope is.


Are you using package-lock.json/yarn.lock?


The simple page you could make 20 years ago is the simple page you can make today. With a few minor tweaks, it will work as well as it did 20 years ago.

If you want to make a complex web app today, that's easier than it was 20 years ago. The tools are infinitely better.

If for some reason you decide to use the tools meant for complex web apps to make your simple page, you're going to feel like everything has gone horribly wrong. But why are you doing that?


I agree with you, but the problem, the big problem, is that you're looked down on if you propose using simpler tools.

It is horribly difficult to not use these advanced tools for simpler problems. At work my team does just f**ng CRUD forms, and for doing this we have the most overcomplicated stack I've seen in my life. Go backend with React, Redux, RxJS, custom webpack craziness, and 32 tons of internal weird libraries no one would willingly use, ever, even if pointed at with a gun.

If I proposed to just use Rails for this and be done with it, I'd be declared a heretic and expelled from the frontend team.


Rails FTW. I’ve kept checking on other tools, but there’s no stack out there that is even on the same order of magnitude of productivity as Rails for doing simple LOB CRUD apps. Which, it turns out, is a LOT of apps. I think the problem is that there’s a common type of person drawn to programming which HATES being “told what to do” with Rails’ convention-over-configuration approach. Whatever. While they’re still writing wonderfully-typed class files in two different applications on the front and back ends, I’m done with my app, and writing the next one.


This. So much this. I miss rails after joining a huge company.


Without knowing your situation, that sounds like job security to me. "If we go the simple route I'm obsolete"


I think it is just the opposite. In my particular situation at work, if we went the simple route (it's just a 2-step CRUD wizard, so that'd be Rails/Django/etc.) the other 5 people might be redundant.

We need a team of 5 people to maintain this project (and a few smaller ones) because of the architecture (SPA + Go microservices) and the tools (React, RxJS, Redux, data loaders, GraphQL, TypeScript safe actions, whatever, an in-house-built crazy i18n system, custom webpack, some tens of other open source libraries, and some tens of internal custom libraries).

It's crazy. Really crazy.


It’s not job security. It’s insecurity and peer pressure and identity validation usually with a sprinkling of adhd medication (surprise, we all know what that kind of code looks like) powering people through intense coding sessions where they over engineer things obsessively.

I’m getting pretty sick of it myself.


I echo this. Currently working on CRUD app in golang. What an absolute mess and much slower than using established technologies.


I don't blame the language, tbh. I blame the frameworks and tools (or the lack thereof) for web development.

Look, I love node and javascript. But I can't deny it is a total mistake to use it in a business context for web dev when you have Django, Rails or Laravel. Same for Go and many other trendy tools... Reinventing the wheel is not good business unless you're in the wheel business.


The language is quite weak. It doesn't even have enums.

I blame the industry who fell for golang's marketing and started shoving golang into places where it doesn't belong.


This. I've already stated it a lot. Go is great if you're building a CLI application, a DNS server, a database, some core infrastructure service, Kubernetes extensions/plugins, etc.

But people learn it, get hyped, and want to use it for everything.

So now we're spending months doing things that already exist in popular frameworks.

Go is not for web development. You're not Google.


I'd argue that golang's application is even less than what you mentioned.

CLI app perhaps, and some simple use case here and there. Databases and infrastructure services need more expressive languages.

I agree with you otherwise.


By database I mean the actual database servers. Like postgresql or the actual thing storing data to disk, those things are usually implemented in C/C++.

By infrastructure services I mean things as DNS servers, network queues, etc.


> By database I mean the actual database servers.

Yes I understand :) golang is still a terrible language for them though. Rust, Java, and C# are strictly superior alternatives.


We have been using WordPress for 15 years and it's been wonderful and easy the whole time. Occasionally if a plug-in gets compromised, it's very easy to roll back and fix it. We now have a standard set of safe plugins anyways.


>The simple page you could make 20 years ago is the simple page you can make today. With a few minor tweaks, it will work as well as it did 20 years ago.

Which doesn't address the concerns of the author though, because the key is not that you can make the same page in 2021 (sure you can), but that nobody will pay you for making it in 2021.

Whereas if you knew how to make a pair of shoes in 1999, or to use a CS example, how to write good C in 1999, people would still pay you for the same knowledge...

>If for some reason you decide to use the tools meant for complex web apps to make your simple page, you're going to feel like everything has gone horribly wrong. But why are you doing that?

That's not the author's concern. His concern is that the fundamental technologies and best practices get thrown out every 5-8 years, and people in web frontend have to relearn tons of stuff and throw out hard-earned knowledge, plus add all kinds of stuff that was never a thing just to keep up with current practices...

One could chalk it up to "well, of course you need to read to stay ahead", but the complaint goes beyond that, to the volatility of front-end concepts and practices that makes this far more demanding than most areas of development (see also the related "JS fatigue").


I don't know if I really agree. I want to. But Flash was supported by much better tooling than what I see in most spots today. Dreamweaver, for all the hate we give it, seems laughably simple compared to common frameworks today.


You can still use your old copy of Dreamweaver if you really want to. I think Adobe still sells new versions too. I wouldn't advise using website builders in general (Dreamweaver was always kind of sketch), and I certainly wouldn't advise using Adobe software in general (they're an awful company with overpriced products), but I can't imagine modern Dreamweaver has gotten particularly worse than it was before.

You can't use Flash, that's fair. But my goodness, you wouldn't want to. The small amount of incidental complexity we've gained from forcing pizza shops to stop laying out their interface in Flash is worth it. And outside of the plugins that were absolutely the correct decision to remove, everything else you want to use is still available.

What we're getting at is that with a few exceptions (HTTPS, Java plugins, Flash), virtually all of the old APIs that you used to use on the web are still supported and are the exact same to use, if not better. This is not the case everywhere with every language and platform. But the web has done a reasonably good job at being backwards compatible.

You're worried about deploy processes, and you want to deploy a free site on Netlify? You don't need to learn Git, you can hand-code your site in static HTML or generate it using any program you want and just upload it as a zipped folder. Your Dreamweaver builds will still run today. Font faces? Still work, you don't need to worry about FOUT. You miss table layouts? Hackernews is using that crud right now. The complexity around build tools, processes, and frameworks is all optional. The browser doesn't care.

If you're missing the old experience of building websites, then just do that. I maintain https://anewdigitalmanifesto.com. It is hand-coded HTML that I wrote in a text editor. It has no Javascript, no build process, no minification, nothing. And it works fine; no one has ever complained about it to me. If you're adding complexity to your engineering process and it's making your job harder instead of easier, then take a step back and ask yourself why you're adding that complexity in the first place. Is it solving a problem? Or do you just feel like you need to do things "properly" based on the current development style? Because again, the browser doesn't care.


Oh, you would want to use it. The games and similar mini software just don't have anything comparable to it.

The loss of flash is very unfortunate.


The Ruffle project is trying to make a Flash player with WASM. Maybe that at least can return.


I loved Flash, deeply, I learned to program in Flash. But I am slowly in the process of doing a 180 on this, where before I felt like we had lost something with Flash that nobody had ever replicated. People have different things they liked about Flash. I liked the animation-centered workflow (and I still think there's room for innovation in this space with modern tools). I miss the ability to export vector animations.

There's a technical hole that I still think could be filled by other programs today. But often, I hear people lamenting the loss of community and the loss of universal publishing.

And on that note, aside from the animation-centered workflow, I'm less certain that the rest of that stuff is actually gone. Unity publishes to websites just as well as Flash did, if not better. Unity even lets you publish to Linux, which Flash never really supported well.

And honestly, I'm not sure the community is dead either, it's just moved on to platforms like Scratch, Pico 8, Itch.io, and Roblox. I see very similar energies when I interact with people on those platforms. It's a different community, it's younger kids with different norms, the old generation isn't as welcome anymore. But I'm not sure the reason for that is because Flash died. Sometimes generations grow up and younger generations go act creative in different spaces.

So yes, there is a very specific workflow for a very specific type of game that no longer works on the web. I've used Unity, and I don't really like Unity, I hate its resource management, I hate that we're still using proprietary software after so many years. But I think more people are making games with Unity today than were making them back then on Flash. It's more accessible.

I'm very, very slowly changing my opinion on this. Part of it is that Flash is still available today under Animate, and while it is still chained to Adobe I've still never heard a really good breakdown of how it's worse or what features it doesn't support anymore. As far as I can tell, you can still use Actionscript 3.0 with Animate. It is proprietary and expensive, but so was Flash -- so if it's purely a technical problem, then I'm not 100% sure that you couldn't still make a Flash website/game today.

I still think there's a hole here that could be better served by Open Source toolchains, but I'm becoming less convinced that it is as big of a deal as I previously thought. If you're trying to make games, now is a really good time to be alive. It could be better, but if you give me the option of making games in 1999 or making games today, I'm not going back to 1999.

And aside from that, even if Flash is a hole, it's still very specifically a hole for games. Even during its heyday, building static websites in Flash was a sin. Almost everything else you want to use to build a static website is still available. Just not that specific sinful part :)


I suspect some parts of the Flash and similar ecosystems are replaceable. My view is more that the "design first" graphical tools are basically no more.

This is more than just games. Basic form based workflows peaked with vba and access. Early 2000's dream of components that you could drag in and then wire up to the side seem to be gone.

Now, in fairness, I lauded their death, to an extent. JSX is a step up when you don't have a designer you are working with. CSS, as a concept, is basically void if you aren't collaborating with users on how to style things.

And I agree that the devops has improved. Within reason. Some things are easier, but that ease brought in a sort of Jevons paradox where, since it became easier to ship more and more, we now ship way more and more. But are we delivering more?

And I think that is hard to answer. I think the truth is we are hitting more and more of the tail. Which is good. But individually, nobody takes up more and more of the tail, so it is hard to see.

(Granted, I suspect accessibility is losing to this race. :( Basic affordances that you got with fieldsets, labels, and basic html are commonly a second thought on so many custom components.)


Roblox is not comparable to Flash in ease of development. You don't have kids creating mini games and sharing them either. Scratch is not comparable either.

The ease of creating something just isn't there in general. The current tools are harder for beginners to use, and beginners achieve less with more effort.

And open source has a bad track record when it comes to creating beginner-friendly tools.


I'm not sure what you mean by this. Roblox is literally a game engine, the entire point is for people to create mini games and share them. That's basically all that Roblox is, if you go to the homepage it's just a list of games you can join.

And you can validly complain about the technical quality of tools like Scratch, but as much as I personally hate to admit it, Scratch is more user friendly than Flash was. I personally know kids who are making stuff in Scratch that would not be able to handle Flash. It's extremely crude stuff, Scratch is ridiculously limited. But they're still making stuff and sharing it, and getting their friends involved in the process, and sharing hacks with each other to get around its limitations, and finding other projects that impress them and reading their source code.

I do get what you mean about Flash's technical side that made it interesting. I miss an animation-first workflow, I wish someone else would build a tool that experimented with that. There is definitely a hole in my game-dev process that Flash left behind. But in theory, you could still be using Animate today and exporting your games to HTML5. Nobody has explained to me what's broken about it.

But if everyone in the old Newgrounds scene moved to Animate, or if it was magically released for free tomorrow, would that bring back the community? I'm not sure it's that simple.

I don't think that kids stopped being creative when Flash went away. I think that people (myself included) who grew up on Flash got older, and the spaces are different now and we're not being invited to them. Which... I don't know if that's really a technological thing, or even necessarily a problem that needs to be solved. Were there really a ton of 35-40 year olds making stuff on Newgrounds back in the day? I am increasingly learning about community spaces from the kids I know that I would never stumble on by myself on the Internet. And they seem to be pretty vibrant, they're just not my spaces anymore.


> If you want to make a complex web app today, that's easier than it was 20 years ago. The tools are infinitely better.

That’s true if the web app is a SPA and uses React and doesn’t require much accessibility and uses Redux or doesn’t manage state and, and, and...

Most web developers are limited to those conditions and most currently posted front end jobs are limited to those requirements. Technically what you said is subjectively correct but it depends on a lot of factors and constraints for the average developer.


I'm able to make what people call a "web app", yet I honestly have no idea what React and Redux are, besides buzzwords, and what they are supposed to solve.

But then there's also the problem of people relying on JavaScript at all for websites that would've been fine as a bunch of fully static HTML pages. It's as if these days you can't do software development without overengineering everything into oblivion.


You haven't at all responded to your parent comment whose premise was that many frontend devs cannot utilize the freedom that you are apparently enjoying.

IMO it's not the topic here if JS should at all be used. You won't catch me arguing with that -- my answer is almost always "NO!".

The topic was: "but can you make web pages like 20 years ago in the current frontend dev jobs market?" -- the answer to that is "no" as well IMO.


What do you classify as a web app?

React and other frameworks help you manage the state of your UI without having to worry (As much) about efficiently re-rendering the UI.

Redux and other state libraries help you manage the global state of your JS app so your entire app can easily access and update common data.
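Stripped of the ecosystem, the core pattern Redux implements — one state value advanced by a pure reducer over a stream of actions — is language-agnostic. A rough sketch of just that idea, in Go (types and action names are hypothetical, not Redux's API):

```go
package main

import "fmt"

// One state value for the whole app.
type State struct {
	Count int
}

// Actions describe "what happened"; they carry no logic.
type Action struct {
	Type string
}

// The reducer is a pure function: (state, action) -> new state.
// It never mutates shared state in place from the caller's view,
// because State is passed and returned by value here.
func reduce(s State, a Action) State {
	switch a.Type {
	case "increment":
		s.Count++
	case "decrement":
		s.Count--
	}
	return s // unknown actions leave state unchanged
}

func main() {
	s := State{}
	for _, a := range []Action{{"increment"}, {"increment"}, {"decrement"}} {
		s = reduce(s, a)
	}
	fmt.Println(s.Count) // prints 1
}
```

Everything else in the Redux ecosystem (store, middleware, selectors) is machinery around this loop.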


Yes, many things are deliberately over engineered and that is largely due to limited capabilities of a specific tool or technique.


> If you want to make a complex web app today, that's easier than it was 20 years ago. The tools are infinitely better.

20 years ago, you could make a complex web app in Java, ActiveX or Flash. (There were also more obscure options.) People would install a plugin for your app.

Now, it all uses Javascript. There are a lot of advantages, but it's difficult for me to say the tooling there is infinitely better. I think a reasonable case could be made that the tooling is worse than any of the three I mentioned.


As an end user, I use "tools" from 20 or more years ago to retrieve and extract the needed information from today's "complex web apps". Networks and server software are so much faster today, "yesterday's" (smaller, simpler) software seems to work even better today.


Amen. If a static web page does what you need, one should not be unwilling to just make a static web page.


One tiny little extra requirement from the client could make you regret that.

Clients don't care about what tools you use. They care more about flexibility.


The difference is that now everybody wants complex web apps, a split backend/frontend, a SPA, PWA, done using the JavaScript framework of the week, etc…

Yes, doing complex things has gotten easier, mostly due to the improved tooling, but the number of things developers are asked to solve (for no real reason other than "it is now doable") has skyrocketed.


Because sometimes you want to learn the fancy tools but only need to make something simple. When the time comes to make the complex web app, you are now more prepared.


This attitude doesn’t have a ceiling. The same person compelled by that impulse also manages to use an unnecessary data structure that introduces complexity. I’ve seen todo-app-like functionality done with serious data structures that made the code needlessly complex.

Why am I here bitching all the time? Because I don’t have it in me to fight someone at work. The underlying tension can be reduced to me saying ‘you know this is crazy right?’, followed with ‘you don’t get it because you are not a real programmer’, and alas, those are both fighting words from both sides.


Learning fancy tools is outside the scope of building a simple website. Simple site is usually HTML/CSS/JS.


But... so are the complicated sites, once you've reduced them to their essence. Everything else is (excessive, complicated, arguably unnecessary) abstraction on top of that.


I'm not sure this is quite true. 99% of early 2000s websites can be made by an unskilled operator automatically, by using something like Squarespace or Wordpress. The other 1% are the hard projects -- desktop quality applications that need to run on 5 platforms and 3 Javascript engines. Most people that do frontend engineering are being paid to work on those hard problems, so the job is going to feel harder than it did 20 years ago. There is no money in the easy things; it's all been automated away.

To some extent, the tooling does make things harder than it needs to be; it's converged to a very strange local maximum that's very far away from the global maximum. The problem, I think, is a complete lack of integration along the entire stack.

You write your code in a programming language whose source code is sent to the user to compile, and there are hundreds of minor variants on how the user will interpret that code. You have to defensively handle it all. But developers want to reuse code, and those runtimes don't really support code reuse, so you have to bolt it on with a fake "compile" stage, where you concatenate all your dependencies together and split it up into chunks to be served to the end user's compiler at just the right time. The language that is used for these tools is a little on the outdated side, so this compile stage takes 30 seconds on one CPU core, leaving your commodity-grade 32-core workstation 96% idle while you sit around waiting. And people don't even like this language for writing their code, so they write it in a different language that is compiled to that first language.

After all that, you have code that can run on users' machines, but that's only like 30% of the problem. You have to serve that code to them, preferably from a datacenter that's physically close to their terminal. You have to serve them ancillary assets, like instructions for how to format the data your app interacts with, and images. Like the author mentions, there are a million different image formats, and you have to pick the right one for the end user, relying on a single line of text like "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.128 Safari/537.36". That's hard, so there is some service you can buy to do that for you, completely unrelated to that aforementioned "build process".

The TL;DR is that there are hacks on top of hacks, and moving parts on top of moving parts; it's a wonder it works at all. But we're in this state where we can't even fix it, because there is no one party responsible for the end-to-end experience. All you can do is bolt on more and more components and hope that you get closer to a local maximum.

The takeaway is that in a distributed system of glued-together components, no one entity is responsible for the success of your users. And, those that manage to build success for their users will do it all through very careful glue, that can come apart and blow up at any time. That means they have a never-ending job that consists only of unnecessary toil. At some point, the person that decides "FUCK IT" and throws this all away and builds some sort of integrated experience is going to make a lot of money. But there are many lifetimes of work ahead to achieve this goal, and you'll be dead before you finish, so why even try? That's where we are. Good luck.


20 years ago we were criticising the syntax in JSP and ASP pages that allowed embedding code inside an HTML template. Today it seems that you are crazy if you are not writing a JSX SPA embedding JavaScript inside a JSX template. We are doing "There and Back Again" for I don't know how many times. And believe me, I kind of like React and JSX, but I also liked embedding my dirty code in HTML ages ago.


> simply npm your webpack via grunt with vue babel or bower to react asdfjkl;lkdhgxdlciuhw

I am sorry to spam, but this is exactly the problem intercooler.js and now htmx were designed to solve:

https://intercoolerjs.org/2016/10/05/how-it-feels-to-learn-i...

90+% of the websites being built (and 90%+ of 99% of the websites being built) could use a much simpler, traditional HTML-oriented REST-ful model at a fraction of the complexity of frameworks being used today.


At some point in my organization's push toward Ansible, I came to the realization that Ansible the product is yet another layer of abstraction that is there for its own sake, and of dubious value. I converted a simple script to patch stuff, something that is trivial to write and run, and the yum module behaves differently enough from the command-line yum that I have to learn a different way to get and parse the output.

AWS logs behave the same way: impossible to pick up and read, quickly, trivially. Why do people read logs? To find out what happened, and do so quickly. Someone will argue that logs are so verbose we need to make them machine-friendly vs. people-friendly, so we can make tools to process them. Somewhere in the past 5 years we've made things so tool-friendly that humans can no longer interact with them in a meaningful way.

Some time in the future, when something else supplants Ansible, shell will still be there, and will still work. Meanwhile, I crib from Ansible docs and StackOverflow just to get things to work the way they did, and the pay-off is ... what exactly?

Edited to add:

Years ago Solaris 10 converted the rc boot scripts to a systemd precursor, SMF. I drank the koolaid: yes! we can build dependencies now, we have service-level kernel events now, I can get rid of daemons and watchdog scripts now! But the innards of SMF were indecipherable XML, dependencies grew, you could no longer get a good system view when you asked SMF, and you couldn't easily find and fix what was wrong with a service file. At the time, it was designed to be un-messed-around-with by keyboard-happy warrior greybeard sysadmins, who were judged to be a source of instability and inconsistency.


I agree Ansible is bad, but it is NOT the state of the art. Honestly it's kind of old and dead at this point, replaced by stuff like CDKs.

Programming in YAML sucks. YAML-based solutions will always come up short because you cannot develop proper abstractions, so you end up with a big bowl of copy-pasta and indecipherable work-arounds.

IMO the modern web is all based around adding types to systems, because we've realized the extra toil types require makes large systems less buggy and more maintainable.


when i evaluated ansible and saltstack a few years ago, i went with saltstack because the ansible example felt much more like programming in yaml, while the salt example was much more declarative.

i am still not happy with yaml, but i haven't seen any better alternative than saltstack yet.

cloud development kits seem to target cloud APIs only and don't look like they could work for just a bunch of computers


What are CDKs?


Cloud Development Kit, if I am not mistaken


> another layer of abstraction that is there for its own sake, and of dubious value

Wait until you see Spring Boot.


I'm with you on logs. The recent trend of "machine-readable" logs, encapsulated as JSON structs, adds so much complexity to the process, and makes them unscannable to the human eye. And yet for general use cases you're not getting anything a regex couldn't parse out of the log prefix.

In 2010 I could search a terabyte of logs with grep -F in under a minute. With a "modern" setup you can't even see your logs until you have Elasticsearch up and running.


> encapsulated as JSON structs, adds so much complexity to the process

Or XML, you need to have written a parser (or pay Splunk to do it for you), you have to know how deeply nested your param of interest is. AFAIU Splunk has problems with too-deeply nested JSON, it was written at the time of unstructured logs. I can say this with confidence, for most of my problem cases, a good start to finding the culprit was a good tail -f and grep. I doubt myself and frequently ask if I'm one of those idiots who'd rather have faster horses. I mean, those are brilliant people making money hand over fist with their log analytics and event management tools, they know what they're doing, right? Right?


This is my experience as well. For an activist website that I maintain, I just use a PHP script to install everything and to ensure that everything has proper permissions. The site uses Wordpress, and patching config files in PHP is simpler than in bash. I am glad that I have not bothered with Ansible, as I would most likely have ended up with YAML programming, which is way worse than bash.


Perl was made exactly for that.


That would require knowing another language to maintain the script, while Wordpress already requires some PHP knowledge. PHP is very OK for that. The only significant gripe is that even with PHP 7 the language does not have a utility to start an external program directly without going through the shell and needing to escape arguments.


I often feel a lot of things became more complex for the sake of it. You'd think by now it'd be just a press of a button and your software is deployed and the clients can start using it, but no, that'd be too easy. Instead, even getting a simple WordPress website up to speed requires an endless list of plugins, some undocumented configurations, and CSS overrides in random places. And it just gets worse and worse.

When you need to spend a few hours just to get the development environment ready before the first line of code is written, you know things aren't going in the right direction.


On the other hand, the one-button experience is as close as it has ever been. I just pushed up a website on Amplify in a few clicks, and the only reason it isn’t fewer is that some people want options.


> simply npm your webpack via grunt with vue babel or bower to react asdfjkl;lkdhgxdlciuhw

Nice summary. While I understand that yarn was a good thing to integrate frontend dependencies management just like we do it for the backend, Webpack is still a mystery to me. Just trying to add a simple JS library to make it work is a headache sometimes (DataTable with Rails for example).


I fought with webpack for years, then ViteJS came out.

It works out of the box, is uber fast, supports Vue + React, and does everything webpack did for me.

I stopped thinking about my tooling and could go back to spend those CPU brain cycle on solving my clients problems.

Peace of mind.

Thanks Evan You.


ViteJS is nothing short of a revelation. I recently migrated all my webapps from webpack 5 to Vite, and it took considerably less time and effort than migrating from webpack 4 to webpack 5 did; just removing Babel reduced the dependency footprint by orders of magnitude. I just had to change a few filename extensions, and even though my apps all use a custom React-like, JSX, CSS modules, PostCSS, Tailwind CSS, d3.js, etc., it just worked. I'm still kind of in shock over it. I'm always the one saying "webpack isn't that bad, stop being lazy", but no longer having so many config files and a 1k-line package.json polluting project root is a thing of beauty.


I encountered the same trauma after building a react site and told myself surely there must be something more simple.

Check out parcel! Zero configuration required, just add a couple of commands to a package.json and you’re off.


Phrases like "the content is what is important" come to mind.

The whole project of graphic design has the same kind of abyssal qualities as other arts with a technical element - you can always go deeper, more specific. And in a competitive market it's really tempting to sell yourself on one-upping the techniques of others. With a computer in the mix, you can just add specification without end and it will soak everything up like a sponge.

But each layer of that you add gets a little farther away from "creative medium" and a little more towards "bells and whistles production". The essence of it comes from the content, and this is true of everything on the Web too, despite all the interference on and around the platform. So it's more like a case of modern development being "you have to describe the medium you want to work within" because the base layers are this morass of vocabulary that isn't conceptually coherent.


> "the content is what is important"

The only problem there is that users have come to expect certain UI elements on any given web page and those things are far beyond "content". For example, let's say you want to allow comments on articles. Now you have opened an enormous can of worms by requiring logins, storing of passwords (hashes! with salts! using modern algorithms!), permissions management, password reset mechanisms, collection of emails (for password reset), and personal data.

You can handle all that the old fashioned way and implement it yourself or you can look into the great wild of the Internet to see what solutions already exist. That's where the rabbit hole begins!

Now you've got a back-end OAuth infrastructure supporting your website. You're using a login module that you were able to "easily" install via npm. You used an oauth2 module in your back end and everything seems to work fine except now your JS code is getting a little crazy with all the "time saving" npm stuff you're using so you start to look at "bundlers" like WebPack...

Welcome to hell, my friend. This is modern "minimal" web development.

Oh but perhaps you could use a static site generator instead! Surely someone has created the perfect Markdown/ReStructuredText tool for generating your perfect web page that has all the features you need!

There happens to be one that's close. It does everything you need except that one little thing. So you reach into the npm bucket... Then you end up having to deal with bundlers again.

It never ends!


>users have come to expect certain UI elements on any given web page and those things are far beyond "content".

Usually I visit pages to see if the content is worth my time. If the 'UI elements' somehow contribute to that content, great. That's rare.

Most pages I hit are hiding poor content in swathes of katchi-vatchi. Do I really care if the headline slides down that big photo? Am I really going to scan those glaring sidebars?

Once again I sigh and mouse-up to Firefox's 'Reader view' to cut through all that bandwidth-wasting crap.


Right. There's an element cleaving both ways of the developer being obligated to do complex and disempowering things for users for various reasons, and then users rejecting their disempowerment and trying to work around it.

And I think if there's a future here it belongs to targeted protocols that decouple the use case from the UI, and filters like Reader view that reformat content to the medium the user wants to work in.


Previous discussion from 2018: https://news.ycombinator.com/item?id=16346039


I definitely understand the frustration towards the growing ecosystem of web app technologies, but I think people really disregard the necessity of the complexity.

There are certain things that basic HTML/CSS simply cannot do, or can do only in a very hacky way. For basic websites, you can absolutely get away with simpler templating, but as soon as you enter the territory of clean-looking UI components that are both visually appealing and functional, you are basically required to implement the complexity somewhere; the choice of framework then just determines how you want to structure that complexity. When you try to make a performant web application with a lot of interconnected, moving parts, the reason for a lot of the "bloat" becomes very apparent.


At what level does “clean looking UI” overtake unnecessary complexity?


Sometimes clean looking UI requires complexity.

If I need to let a user sort a list of items, is it more UI/UX friendly to make them press a button multiple times until an item is in the right place, or is it better to let them click and drag the item to the right part of the list?

The latter requires a lot more work but makes the experience a lot smoother.


You have one other option, the most difficult of all, to eliminate the use case.

Have you looked at a modern airplane? Have you followed from year to year the evolution of its lines? Have you ever thought, not only about the airplane but about whatever man builds, that all of man's industrial efforts, all his computations and calculations, all the nights spent over working draughts and blueprints, invariably culminate in the production of a thing whose sole and guiding principle is the ultimate principle of simplicity? It is as if there were a natural law which ordained that to achieve this end, to refine the curve of a piece of furniture, or a ship's keel, or the fuselage of an airplane, until gradually it partakes of the elementary purity of the curve of a human breast or shoulder, there must be the experimentation of several generations of craftsmen. In anything at all, perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away, when a body has been stripped down to its nakedness. -Antoine de Saint Exupéry


I highly encourage you to look at a picture of the inside of an airplane cockpit.


I have gone back to plain CSS files, dropping Sass and other preprocessing techniques, just because the past decade of managing build chains and dependencies to marginally improve writing styles that change minimally between redesigns has soured me on more complex build processes.

By the time you actually do a redesign that would take advantage of the fact that you use mixins, CSS variables, etc., the whole design language changes, and meanwhile you'll have to change preprocessors anyway.
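For what it's worth, the two preprocessor features mentioned now have native equivalents that need no build step at all. A minimal sketch, with made-up class and variable names:

```css
:root {
  --accent: #0b6;   /* was a Sass $variable */
  --gutter: 1rem;
}

.button {
  background: var(--accent);
  padding: calc(var(--gutter) / 2) var(--gutter);  /* calc() covers simple mixin math */
}
```

Custom properties are also live at runtime, so a redesign can retheme by overriding `:root` instead of recompiling.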


“I wonder if I have twenty years of experience making websites, or if it is really five years of experience, repeated four times.”

This sums up how I feel about my entire career as eloquently as possible.


The source for the page is as clear as I expected, which made me happy. I feel the same about programming for applications on PCs... things were simple, then they got complex with the arrival of Windows, then simple again with VB6 and Delphi, and now they're a mess again. (Not to mention mobile.)


Customer needs for most websites basically haven't changed in two decades.

Outside of a few complex SPAs doing interesting things, structurally we're building the same things we were building back then (blogs, brochure sites etc) — it's just that the way we're building them has become (optionally) much more complex.


20 years ago one could assume 1024x768 and it would mostly work. Then smartphones came with small vertical screens and touch input. Then tablets made it necessary to support screen rotation. Then retina displays required DPI-aware images. Then came dark/light mode and general color management...
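Each of those waves eventually got a declarative hook in CSS, which is part of where the layering comes from. A rough sketch (selectors and breakpoint values are arbitrary examples, not from any real site):

```css
/* Small vertical touch screens: collapse to one column */
@media (max-width: 600px) {
  .layout { display: block; }
}

/* Tablets: react to rotation */
@media (orientation: landscape) {
  .layout { flex-direction: row; }
}

/* Retina displays: swap in a 2x asset */
@media (min-resolution: 2dppx) {
  .hero { background-image: url("hero@2x.png"); }
}

/* Dark mode */
@media (prefers-color-scheme: dark) {
  body { background: #111; color: #eee; }
}
```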


> 20 years ago one could assume 1024x768 and it would mostly work.

IE 5 and 6 and Netscape Navigator 4 would all beg to differ. Having made simple websites two decades apart, CSS is much less hassle nowadays.


Which is why I keep happily doing .NET and Java web development with SSR, as always; APIs now return REST, GraphQL, or gRPC instead of XML-RPC/SOAP, or whatever makes the FE team happy.

I know enough of web FE development to jump in and fix a couple of bugs when needed, and on my own consulting gigs as a side job, I only do native desktop development as a kind of therapy.


In my area it seems everybody wants some kind of web app or VPN. In the good old days things like file sharing and inventory management could be done on the LAN, but with COVID everybody is scattered all over the place. My clients have this expectation that they'll be able to use any random smartphone to scan a QR code in the Philippines and have that magically publish inventory updates to accountants in Texas. Everybody wants to bring their own phone or computer. Everyone wants all their data to be accessible through the web. It's chaos, I tell you, and a security nightmare to boot.

Personally I think of the web as a highway: great place for a billboard, not such a great place to be storing or mutating sensitive information.


It's good to remind ourselves that all of this is optional.

Most websites could be a simple WordPress install. Many brochure websites should be static. There are benefits to straying from that, but they come at a cost.


> setting up the system and maintaining it seemed to be more effort for an experienced person on a small project than doing the work without it.

I love articles like this that put into words what I am feeling but am not articulate enough to put into my own words.


I wonder how much of the problem here is solo developers and big tech developers exchanging advice in the same places.

As a solo dev I relate to this frustration with over-engineering but I can imagine its benefits for big teams.


Everything this author complains about has a purpose.

They don't have to use compiled, minified, and obfuscated CSS, vectorized graphics, and user-agent-detected custom fonts,

but if they don't, then they are unoptimized for other realities.

Websites in the past used system fonts; you can still do that.

Websites in the past used low-quality raster images, or high-quality raster images. Those websites don't look good at higher resolutions, and the conditional logic to fix them is messy.

You can still make sites like that.
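For the raster-image case specifically, that messy conditional logic has since been replaced by declarative hints; a sketch with placeholder file names:

```html
<!-- The browser picks the best candidate for the current viewport
     width and pixel density; no UA sniffing or scripting needed. -->
<img src="photo-800.jpg"
     srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Example photo">
```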


“I had fifteen years of experience designing for web clients, she had one year, and yet somehow, we were in the same situation: we enjoyed the work, but were utterly confused and overwhelmed by the rapidly increasing complexity of it all. What the hell happened? (That’s a rhetorical question, of course.)”

Exactly how I feel these days.


There used to be a great system called Rancher where you could self-host quite complex applications across multiple servers. It was just perfect and very easy to use. Unfortunately they decided to embrace Kubernetes and it became too complex and no longer useful. I wish they had continued to develop the old version.


Potentially controversial view: it really doesn't matter how you do it and it probably never did. Dya think your audience knows if you're using tables or grid or flexbox or php or React or Flash or Perl or a huge image that looks like a web layout? Nope. If it works in a reasonable way, they're probably happy.


HTML without all the warts and boilerplate. No cascading style-sheet. No ribbon. No hamburger menu. No drop downs. No frames. No cruddy forms, no JavaScript, No modes, No show codes. Just live layout and editing with plain, bold, italics, links, underline, strikeout, and text formatting that I used to see built into pretty much every Application on the Mac. The "browser" needs to be a full fledged editor. It needs to provide sensible defaults and guides for layout of text, charts, graphics, animation, and interaction. We had all this before the web. The good parts of retro we want. The bad parts of the web, we can do without.


> For instance, last week I was reading a post about the benefits of not using stylesheets and instead having inline styles for everything.

Back in the day, this probably meant one of a handful of things:

* manually writing all the HTML and putting the inline styles there

* some kind of PHP thingy where you would be essentially rolling your own CSS variables

* some kind of Perl thingy where a verySmart dev is trying to maintain an entire CMS using only regexes

Today, the user could be "having inline styles" in one of 10,000 frameworks-- Vue thingies, React thingies, or perhaps a build flag in a Javascript thingy that spits out a CSS thingy...


As someone who only does a frontend/full-stack project once a year or so, I do feel like some of this tooling mess is getting better.

On previous projects I remember the frustration of futzing with Webpack and Redux. Now it seems like React has matured and simplified nicely, and Next.js is a beautiful framework with a great balance of flexibility and power, with some reliable guardrails to work within to bootstrap a basic React project.


Coupled with the expectation that those same things should take less time than they used to and be 100% predictable.


Without clicking the link and reading comments – let me guess: it's about JS/HTML/CSS stack, right?


I think the example of table/grid layout is a weak one. Table layouts were not bad because of what they produced, but because HTML tables are semantically different and shouldn't be used for layouts. But they still solved the same problem that grid layouts solve.


Building a non-trivial website in React with halfway decent design practices is 100000x easier than doing so in raw HTML/CSS/jQuery (for the sake of argument: the vanilla API back in the day was impossible to use).


> Things have gotten messy, haven’t they?

I have to give him credit though, his web page has zero javascript.


I'm wondering what he could have learned in the same time it took him to write this article. I get it, it's harder to navigate the vast jungle of tools and technologies. I also remember 2001, when my only reference for HTML was the help files of HomeSite 4.0.

We have it much easier today.


Slow is smooth. Smooth is fast!


TLDR: there is not one right way to do things and IQ follows a normal distribution.


I'm going to get downvoted into the ground for this, but I think it has to be said:

Front-end infrastructure and tools design unfortunately does not attract the cream of the CS graduate crop (and even less so Front-End design work).

For some reason, cream of the crop CS graduates usually veer towards systems, compiler, DB, back-end infra, language design, back-end, ML, etc ...

But not front-end.

The net result: the front-end ecosystem looks like the first BASIC program written by a 10 year-old who just got his first computer: it's a mess, it's unprincipled, it's grown organically, it's driven by fashion and immediate business needs, it's an unmaintainable tarpit, and it's increasingly a nightmare to build with.

I'm just ranting, I don't know what the solution is, but I gotta say: the WEB in 2021 is in a very sorry state.

[EDIT]: and if I try to think of a root cause for this state of affairs, I believe a lot of the blame can be attributed to Javascript, a language which sort of gave the "slapped together quickly because we need something now, damned be the warts" tone for the entire ecosystem.


Eh, I think it has more to do with the fact that the front end is the most visible to the product people. You get a lot of poor technical decisions made to bend to the whims of the product design. Backend is usually free to do as they see fit.


Yep, I decided to focus on backend to avoid dealing with requests to move pixels around arbitrarily.


For me it's kind of the opposite, the current state of front-end is clearly the result of bored overqualified CS graduates in overstaffed companies. I keep my sanity by working with people with a design background when doing front-end work because they usually rightly call out the unneeded complexity.


Regardless of whether or not your observation about CS graduates is true, I don't think that has anything to do with it.

The difference is that with the backend, you have choice over what technologies to use. There's competition between languages and ecosystems, so a better more elegant one can replace a worse crufty one.

But with the frontend, there is no competition. You have to use HTML, CSS, and (roughly) JavaScript.

If, when the web had started, it had been based on some kind of primitive bytecode for rendering, that HTML+CSS+JavaScript compiled to, then we could have had language competition and better technologies today. But that's not what happened. Front-end is stuck with an old, crufty tech stack that simply can't ever be replaced. Back-end doesn't suffer from that.


> If, when the web had started, it had been based on some kind of primitive bytecode for rendering

Which is WASM! Well, sorta.


What's wrong with HTML+CSS as a rendering primitive? It's quite successful in decoupling semantics from presentation, which is a requirement for a platform that's as widespread and universal as the modern web. JS is a bit crufty, but you don't have to deal with it. You could write everything in some other source language and compile down to JS/WASM. (Even low-level languages like C/C++/D/Rust can support this via Emscripten.)


The article is almost entirely about what's wrong with HTML+CSS -- the frustration over whether to use tables, floats, flexbox, grid, or whatever will come next (as one example of many).

HTML+CSS is a soup of overlapping technologies that has accumulated into this tangled ball of mismatched paradigms from nearly 30 years of "generational" improvements.

I grew up with it from the beginning so I understand the reasoning behind it all. But to someone wanting to learn it from scratch, trying to decipher tables vs. floats vs. flexboxes vs. grids must seem like utter madness -- a layout language written by a truly insane person.


Frontend has the problem of being stuck with JavaScript and DOM, even in the modern iterations it's a patchwork structure built on top of a cesspool - no wonder everything smells like shit.

Agreed about the tooling, npm and node were basically hacks everyone kept building on - there is zero architecture behind most of it.


> I'm going to get downvoted into the the ground for this, but I think it has to be said:

Ditto! All in good fun. 0:)

> it's a mess, it's unprincipled, it's grown organically, it's driven by fashion and immediate business needs, it's an unmaintainable tarpit, and it's increasingly a nightmare to build with

Can't tell if you're talking about front-end or back-end here; I've certainly seen both.

> For some reason, cream of the crop CS graduates usually veer towards [back-end]

Setting aside the implication that getting good grades in computer science is the same as being good at creating business value with software, I see two reasons:

1. academic inertia: old professors teach what they know (e.g. lisp) regardless of how useful it is in the industry. I bet a rockstar professor who loved front-end/UX would find the cream of their crop biased toward front-end work.

2. this very bias you're perpetuating: where a bright student looks into the world, sees front-end work derided, and so steers clear of it (and maybe even starts parroting this bias of those they respect, as youngsters so often do).

I don't know what the solution is, but it's definitely not whining. 0:)


I see it as the complete opposite. Because of the unique challenges that face front end development it has attracted some extremely talented people.

I wrote my thoughts about this in a recent article: https://erock.io/2021/03/27/my-love-letter-to-front-end-web-...


>For some reason, cream of the crop CS graduates usually veer towards systems, compiler, DB, back-end infra, language design, back-end, ML, etc ...

>But not front-end.

I'm going to go out on a limb and guess that it's because non-mobile front-end work sucks. Few people who have options would have it as their top pick.


>the front-end ecosystem looks like the first BASIC program written by a 10 year-old who just got his first computer: [...] it's driven by fashion and immediate business needs [...]

I also miss the days when 10 year-olds wrote BASIC programs and had immediate business needs.


Most of the web runs on PHP/jQuery, and that's maintainable.


Much of the front-end complexity is driven by product people and designers who want to jam as much interactivity and visuals into the UI as possible. Also, complicated front-end layouts are necessary to make room for advertisements, the main source of revenue on the internet!


I dunno, this is the kind of opinion that seems so trivially easy to falsify that I have to wonder about the real rationale behind it.

If you take a look at the labor demand for skilled front-end engineering (https://www.levels.fyi/), it's hard to come to any other conclusion than that major companies pay a lot of money for top talent. So if major companies pay a lot of money for that talent, either they're making profit from a very tricky coordination of skilled labor, or you're pulling an argument out of an opinion that is at odds with the reality of the industry.

Why do that? It could be that you're not familiar with how skilled frontend practitioners operate in large companies. Maybe that's why you're extrapolating (incorrectly in my opinion) that a) the web frontend ecosystem is trash and b) that is evidenced by cream of the crop CS graduates (???) choosing backend over frontend. My question is, what does that have to do with the reality of frontend today?

Maybe it was once useful to consider why CS graduates gravitate toward one end or the other, but that division seems a little long in the tooth now. Good CS graduates these days have full-stack chops, and don't have too much trouble crossing devops, frontend, backend -- anything necessary to get the job done. So with that said, it's hard to consider your final conclusion anything other than a rant about your own difficulties and preconceptions about frontend web development:

> The net result: the front-end ecosystem looks like the first BASIC program written by a 10 year-old who just got his first computer: it's a mess, it's unprincipled, it's grown organically, it's driven by fashion and immediate business needs, it's an unmaintainable tarpit, and it's increasingly a nightmare to build with.

It's true that there's a lot of the front-end ecosystem that is super messy. But there's a lot of it which works a lot better in comparison to what was available a decade ago. Moreover, it's intrinsically gotten more complicated as mobile compute has gotten more powerful. Mobile has not just become a thing but achieved critical mass on par with desktop. There are enormously lucrative challenges involved with front-end these days, and who knows how much headroom is left in the industry? The modern phone's capabilities today are so far beyond what was possible even four years ago that it's hard to consider these kinds of rants anything beyond a lack of imagination.

How is it that the largest megacap companies of our time are using better frontend experiences to accomplish increasingly more revenue-efficient activities than they ever have before? How is it that newly fledged startups are using the leverage of good frontend experiences to grow to $1B faster than they ever have before? Is it possible that UX and psychology account for far more of technology's value than you may be considering? Just a thought.


No, no it is not.

It's somewhat amusing to see people complain about the supposedly increasing complexity of web development. Sorry-not-sorry, it's not that bad or even that complex. In fact, things are far easier and less complex than they were just a few years ago, and improvements continue at a rapid pace (modern browser module support, Vite, esbuild). I don't even need to use Babel anymore, whereas just a few years ago it would have been required for cross-browser support.

Developers as a cohort love to feel themselves and ego trip about how smart they are until they hit something they don't understand immediately and suddenly it's too complex and they have to write 5k words bemoaning webpack configs as the worst thing ever.

I think the problem is actually the opposite: things are way too easy now. Like anything in life, effectively simple web development requires self-discipline, and simplifying abstractions like React make it too easy to just throw shit at the wall and see if it resembles a working web app. Then I end up having to maintain a lot of seemingly working apps that are an absolute mess of tightly coupled spaghetti code with no clear architectural design whatsoever, but product thinks they work because the side effects show up at the right time.


If using Babel for cross-browser support is your starting point, then you likely started way later than the article author.

Go back a few more years, to 2005 or even earlier. If you wanted a website, you installed Apache on a server (or used a shared hosting package), FTPed your files to it, and you were done. That's it. No pipelines, no builds, no package managers. Just files in a folder that you copied to the server.

If you wanted to do cross-browser JS, you downloaded the latest version of jquery.min.js, FTPed it to the server, and referenced it in a script tag.

I wrote my first websites in notepad.exe and used something like FileZilla to upload.

That is what I call simple web development.

I agree with the author that now things are just way more complex than they should be.

edit: looks like jQuery was first released in 2006, which is 2 years after Firefox was released, and 2 years before Chrome. So take my `2005` line as a rough estimate.


I don't think you're smart enough to make any inference about when I started.

But if you insist on this crusty bonafides peen measuring contest I got started serving php via cgi on apache deployed with scp.


I’m not insisting on anything, sorry if I might have offended you, that was definitely not the intent.

I assumed that you might have started later than the author because you compared the current state to just a few years ago, while the article mainly talks about even further back in history.

The tools I mentioned were not meant for bragging. I just wanted to indicate that back then it was all way simpler, which you know as you apparently started in a similar timeframe to myself.

The main point I wanted to make was that back then things were so much simpler.


I know I was being sarcastic because hn discussions around the history of JS always devolve into this silly flamebait and no one should take them seriously.


By what standard are you saying things are easier? Certainly not less complexity, abstraction, volume of code, number of moving pieces, rate of change, or configuration.

If you don't understand how the foundational technologies work and don't care to learn then things may indeed seem easier. If you have 15+ years of experience and seek to understand today's technology at the same level as when you first learned, things are definitely harder.


It's easier if you want to achieve the same things. Our expectations have just increased even faster than things became easier.


IE7



