Hacker News

An entire OS, which can still perform most of the tasks we ask of computers today, runs fast in a virtual machine on a device designed with a power budget in mind. Yet many apps that do basically just one of those tasks, and run natively, run slower and take up more disk space and memory. That's progress.



Nominally, apps do one of those tasks, but they also have to constantly mine you for data, so that takes a lot of overhead.


Wirth's law: Software gets slower more rapidly than hardware becomes faster.[1]

[1] https://en.wikipedia.org/wiki/Wirth%27s_law


I've always felt like this effect is just perception. LibreOffice Writer starts in a couple of seconds on my current laptop, for example. I remember when Word took a few multiples of that in the past.

And browsers now feel slow if they don't respond perfectly to touch input on my tablet. We used to tolerate much slower response times.


A lot of this is down to UX, or rather expectation management. If I click on something and nothing at all happens for 0.8 seconds until the UI updates with the task done, the app feels incredibly slow and laggy. If I click on something, a progress indicator opens immediately and updates a real progress bar every 0.2 seconds until the task is done 10 seconds later, the app feels fast and snappy.

The second app took more than ten times longer, but it never left me doubting if it even noticed my click and how long I have to wait. That makes a world of difference.

90s era software was written in the expectation that the computer is slow, and having progress indicators on screen or in the status bar for a lot of tasks was common (and if that was too much work, at least the cursor was updated instantly to show that work is being done). Today most software is written in the expectation that everything works instantly, and if it doesn't, the user experience falls apart quickly.


One addition to this (excellent) post: it was much more common for applications to do all the work in a single thread, so it often became important to chunk up a big workload so that the progress bar could actually render. In doing so, this also improved application responsiveness by allocating time to drain the message pump.
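The chunking idea above can be sketched roughly like this. This is a minimal illustrative example, not any real framework's API; the function names (`run_chunked`, `process`, `update_ui`) are made up for the sketch, and the comment about the message pump stands in for whatever the platform's actual event-loop call would be.

```python
def run_chunked(items, process, update_ui, chunk_size=100):
    """Process items in fixed-size chunks, calling update_ui between
    chunks so a single-threaded UI gets a chance to repaint."""
    done = 0
    total = len(items)
    for start in range(0, total, chunk_size):
        for item in items[start:start + chunk_size]:
            process(item)
            done += 1
        # In a real single-threaded app, this is where you would drain
        # the message pump (e.g. PeekMessage/DispatchMessage on Win32)
        # and redraw the progress bar.
        update_ui(done, total)

# Record the progress updates a UI would have received.
progress = []
run_chunked(list(range(250)),
            process=lambda x: None,
            update_ui=lambda d, t: progress.append((d, t)))
print(progress)  # [(100, 250), (200, 250), (250, 250)]
```

The point is that the UI update is interleaved with the work on the same thread, which is why getting the chunk size right mattered: too small and you waste time repainting, too large and the app feels frozen.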

While now the UI is more regularly decoupled from the application, the awareness that updating the user matters seems to have somewhat fallen by the wayside, and so a big ol' spinner, or a largely fictitious progress bar that isn't actually tethered to any measure of work, shows up and you get situations like "well, the progress bar is at 130%, but I sure don't feel like I'm at 130%...".

This isn't the tools, it's a poor use of them, and it lends a lot of ammunition to folks who want to back-in-my-day while ignoring that a lot of other parts of the stuff we had back in our day kinda...sucked. Some of it (not everything!) just happened to be good at this one thing!


Apples to oranges. Word is still slow.


Others have already invoked Wirth's law.

Indeed, even something like MS-DOS/Windows 3.1 or Amiga 500 would already be good enough for many office tasks that most people do.


Wirth's law in action.



