This post again with its ridiculous ranting examples.
"This text searching program that searches text against text is way faster and simpler than a fully rendered, interactive map of the entire world with accurate roads and precision at the foot / meter level."
No. Shit, really? Get out of town.
Yes, some very popular apps have bad UX. But some apps have incredible UX. Cherry picking the bad ones while ignoring the good ones to prove a point is boring and not reflective of the actual industry.
These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
Remember autoexec.bat files? Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability? Remember when building a computer required work and research and took hours? I do, and it wasn't fun then, it was a detriment to doing what you wanted. It wasn't a rite of passage, it was a huge pain in the ass.
So yeah, things are slower now, and I wanna go fast. But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either. I don't need to teach her arcane commands to get basic functionality out of her machine. Plug and play and usability happened, and while things feel slower, computers are now open to a much wider audience and are much more usable now.
These posts always have a slight stench of elitism disguised as disappointment.
Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented for giving turn-by-turn directions for people who know where they are and where they are going; it gives almost no affordances for exploration and cross-referencing.
> Remember when building a computer required work and research and took hours?
As someone who builds their own PC every couple years: it still does. It's actually worse now, due to the number of products on the market and the price segmentation involved. Two PCs ago, I didn't have to use two parts compatibility tools, several benchmarking sites and a couple of friends, and I didn't have to write CSS hacks for electronics stores, just to be able to assemble a cost-effective PC.
> But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either.
You don't? Printer drivers are only slightly less garbage than they were, but now there are also fewer knobs to turn if things go wrong. When my Windows 10 doesn't want to talk to a printer or a Bluetooth headset, all I get to see is a stuck progress bar.
Bottom line: I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades (arguably for business reasons: the easier it is for people to make sense of information, the harder it is for your sales tactics to work).
> These posts always have a slight stench of elitism disguised as disappointment.
That I don't get. Is it "elitist" now to point out that the (tech) "elite" can actually handle all this bullshit, but it's the regular Joes and Janes that get the short end of the stick?
> It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented for giving turn-by-turn directions for people who know where they are and where they are going;
As it turns out, that is probably the most popular use case for maps in the world.
Note also that for most smartphone users of Google Maps the use-case is actually much broader than that. The UI flow also totally accounts for users who only know where they are going—thanks to GPS and Google Maps knowing where you are often isn't necessary.
I'm confused by the complaint that the "Maps" app only caters to the 90th-percentile use case for maps, but doesn't cover the other use-cases well.
> I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades
I just find this not the case at all. For expert-users the tools that existed decades ago are still there and still usable. Or you can craft your own!
For non-expert users the information in the world is orders of magnitude more accessible than it used to be.
> As it turns out, that is probably the most popular use case for maps in the world.
There's a very subtle point the Twitter thread was making here. This use case may be most popular not because it's what the people want, but because it's all that they can easily do. The tools you use shape how you work, and what you can work on.
FWIW, I learned to pay attention when the machine doesn't help me do what I want (it's a good source of ideas for side projects), so I've noticed that I do want a map that works like a map - something I can explore and annotate. I do sometimes resort to screenshotting GMaps (or, in the past, photographing paper maps) just to have a map on my phone. I've seen non-tech people do that as well. So I can be confident that it's not just me and the Twitter thread's author who want this.
> For expert-users the tools that existed decades ago are still there and still usable. Or you can craft your own!
The Twitter thread's point (and mine as well) is that expert users can and do work around this degradation. It's a frustrating chore, but it's not all that difficult if you can code a bit and have some spare time. It's the experience for the non-expert users that has degraded in a way they can't fix for themselves.
> For non-expert users the information in the world is orders of magnitude more accessible than it used to be.
The way it's accessible, it's almost as if it wasn't. Sure, you can easily Google random trivia. But good luck trying to compare things. That's always a pain, and usually involves hoping that someone else made a dedicated tool for similar comparisons on the topic you're interested in, and that the information hardcoded in that tool is current and accurate. Notably, the tools you use for searching have no support for comparing.
> so I've noticed that I do want a map that works like a map - something I can explore and annotate.
I don't doubt that there are use cases for a map that works this way. Even if Google Maps covers 80-90% of the use-cases for mapping, mapping is an absolutely massive domain. 10-20% of use-cases still represents a huge volume.
But it doesn't have to be Google Maps. It actually seems worse to me for one "maps" app to try to handle all possible use-cases for a map.
Why isn't there a separate different tool that handles the use-case you describe?
I guess, going back to the original thesis: what would the "1983" replication of what Google Maps does look like, but faster? And what would the "1983" version of the mapping behavior you want look like?
In the thread they say:
> in 1998 if you were planning a trip you might have gotten out a paper road map and put marks on it for interesting locations along the way
I'd argue that this use-case still exists. Paper road maps haven't gone away, so this is still an option. People largely don't use it and prefer Google Maps or other digital mapping tools for most of their problems. Why? If you gave me both the 1998 tools and the 2020 tools, for 95% of the problems I'm going to use the digital tools, because they let me solve my problems faster and more easily. I know this because I have easy access to paper maps and I never touch them. Because they're largely worse at the job.
> There's a very subtle point the Twitter thread was making here. This use case may be most popular not because it's what the people want, but because it's all that they can easily do. The tools you use shape how you work, and what you can work on.
Ultimately, my point above is my response to that. None of the old tools are gone. Paper maps are still available. And yet they have been largely abandoned by the large majority of the population. I agree that there are limitations to our current digital tools, and I hope in 2030 we have tools that do what the article describes. But the 1983 version of the tools are worse for solving problems than the current tools, for most people.
pretty much all games in the early 80's had [so called] pixel perfect scrolling. Each frame showed exactly what was required.
Today it is entirely acceptable for a map to be a jerky stuttering pile of crap. The same goes for the infinite scroll implementations. It's preposterous to start loading things after they are needed.
There is a good analogy with making things in the physical world. The professional doesn't start a job before he has everything he needs to do it; the amateur obtains what he needs after he needs it.
Games have huge advantages of constraint of application that mapping applications don't. You can get pixel perfect scrolling when you constrain the max rate the user can pass through the dataset, you deny them random access into the dataset, your dataset isn't trying to represent a space the volume of planet Earth, etc.
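To make that concrete, here is a minimal sketch in Python (made-up tile size and speed cap, not any real map engine's numbers) of why a bounded pan speed is such a gift: you always know which tiles the view can reach soon, so you can fetch them before they're needed.

    TILE_SIZE = 256          # pixels per tile edge (assumption)
    MAX_PAN_SPEED = 600      # pixels per second the UI allows (assumed cap)
    FRAME_TIME = 1 / 60      # seconds per frame

    def tiles_for_rect(x, y, width, height):
        """Tile coordinates covering a pixel-space rectangle."""
        x0, y0 = int(x) // TILE_SIZE, int(y) // TILE_SIZE
        x1, y1 = int(x + width) // TILE_SIZE, int(y + height) // TILE_SIZE
        return {(tx, ty) for tx in range(x0, x1 + 1) for ty in range(y0, y1 + 1)}

    def tiles_to_prefetch(x, y, width, height, lookahead_frames=30):
        """Every tile the viewport could touch within the lookahead window,
        given the speed cap; fetch these ahead of time."""
        reach = MAX_PAN_SPEED * FRAME_TIME * lookahead_frames
        return tiles_for_rect(x - reach, y - reach,
                              width + 2 * reach, height + 2 * reach)

    # Without the cap (jump-to-search-result, arbitrary zoom levels), the
    # reachable set is effectively the whole dataset, which is why a
    # planet-scale map can't just preload "everything it needs" the way a
    # fixed, local 1980s side-scroller could.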
There's a huge gulf between the use cases you're comparing here, and I don't believe for one second that loading the Google Maps dataset into Commander Keen's engine would make for a better experience.
(Also, not to be overly pedantic, but "The professional doesn't start a job before he has everything he needs to do it" pretty much classifies all building construction as unprofessional. The professional doesn't magically have everything on-hand, especially bulky or expensive resources; they have a plan for acquiring them at reasonable rates of input and mitigation strategies if that plan can't be followed)
I'll ignore the pedantic part since it was just an analogy; if it doesn't work for you, there is little to talk about.
> Games have huge advantages of ....
I have thoughts like that, but I consider them "making up excuses". You don't have to see it that way, but I can't see someone fixing a problem by making up excuses for it to exist. For me it is just like how you can always come up with an excuse not to do something.
8 gigabytes of memory / 64 kilobytes of memory = 125 000 times as much memory.
14 gigahertz (4 cores x 3.5 GHz) / 1.023 MHz = 13 685 times as much processor power.
4108 gigahertz (2560 CUDA cores × 1605 MHz) / 2 MHz = 2 054 000 times as much video power.
Can I just call 2 MHz memory bandwidth 16 Mbit/s?
If so, 500 Mbit / 16 Mbit = 31.25 fold the bandwidth
We are not rendering many layers of colorful animated game content. A map is just a bunch of boring lines. The modern screen however is a lot bigger. I see a glimmer of hope for an excuse!
320x200 px = 64000 px
1920x1080 px = 2073600 px
2073600 / 64000 = 32.4 times the screen size
meh?
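If anyone wants to check the arithmetic, the whole comparison fits in a few lines of Python (using the rough figures above; they're napkin numbers, not measurements):

    ram    = 8e9 / 64e3                   # 8 GB vs 64 KB           -> 125,000x
    cpu    = (4 * 3.5e9) / 1.023e6        # 14 GHz vs 1.023 MHz     -> ~13,685x
    gpu    = (2560 * 1.605e9) / 2e6       # ~4,109 GHz vs 2 MHz     -> ~2,054,000x
    net    = 500e6 / (2e6 * 8)            # 500 Mbit/s vs 16 Mbit/s -> 31.25x
    pixels = (1920 * 1080) / (320 * 200)  # screen area             -> ~32.4x
    print(ram, cpu, gpu, net, pixels)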
We must applaud everyone involved in making all this hardware progress. It truly blows the mind and defies belief. No one could have imagined this.
Then came weee the software people and... and.....
I'm cringing too hard to continue writing this post.
The numbers don't lie, we suck. Let's leave it at that.
I'm still happy with the configuration we have where my map is a little slower than maybe I'd like it to be (though honestly, I just loaded maps.google.com and moused around randomly and... it's fine? Certainly not so slow I'm bothered by it) but the mapping app also can't crash my computer due to the three layers of abstraction it's running on top of. Because that would suck.
If you're curious where the time goes, btw... Most of the visible delay in Google Maps can be seen by popping the browser inspector and watching the network tab. Maps fetches many thin slices of data (over 1,000 in my test), which is a sub-optimal way to do networking that adds a ton of overhead. So if they wanted to improve maps significantly, switching out for one of the other protocols Google has that allows batching over a single long-lived connection and changing the client and server logic to batch more intelligently could do it. I doubt they will because most users are fine with the sub-three-second load times (and engineering time not spent on solving a problem most users don't care about is time spent on solving problems users do care about). You're seeking perfection in a realm where users don't care and claiming the engineers who don't pursue it "suck;" I'd say those engineers are just busy solving the right problem and you're more interested in the wrong problem. By all means, make your mapping application perfect, as long as you understand why the one put out by a company with a thousand irons in the fire didn't.
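A rough sketch of what "batch more intelligently" could look like, with hypothetical endpoint names (this is not Google's actual tile API); the point is just that a thousand tiny requests pay the per-request overhead a thousand times, while batching over one reused connection pays it a handful of times:

    import requests

    session = requests.Session()   # one reused connection instead of many

    def fetch_unbatched(base_url, tile_ids):
        # ~1,000 round trips, each paying its own headers, queuing and latency.
        return [session.get(f"{base_url}/tile/{t}").content for t in tile_ids]

    def fetch_batched(base_url, tile_ids, chunk=100):
        # A handful of round trips; the server returns many tiles per response.
        out = []
        for i in range(0, len(tile_ids), chunk):
            resp = session.post(f"{base_url}/tiles/batch",
                                json={"ids": tile_ids[i:i + chunk]})
            out.extend(resp.json()["tiles"])
        return out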
Also, I think the analogy was great, but you reached the wrong conclusion. ;) That is how large-scale engineering works. Scheduling becomes the dominant phenomenon in end-to-end performance. Games have huge advantages in constraining the scheduling problem. General-purpose apps do not. Hell, to see this in action in a game: Second Life's performance is crap because the whole world is hyper-malleable, so the game engine cannot predict or pre-schedule anything.
Since it is software, nothing is set in stone; everything can be changed, and we know how to do it really, really fast and really, really efficiently.
People did incredible things to improve almost everything.
To me this means all of the performance loss is there for no reason. All we need is for people to stop making excuses. I for one know how to get out of the way when better men are trying to get work done.
You are the engineer if not the artist, impress me! Impress the hardware people! Impress the people who know your field. Some attention to consumers wishes is good but Gustave Eiffel didn't build his tower because consumers wanted that from him.
Why would you even have tap water if the well is down the street? A horse and carriage is just fine; it is good enough for what people need, no? What if our doctors measured their effort in "good enoughs" and living up to consumer expectations only?
The hardware folk built a warp-capable starship and we are using it to do the shopping at the corner store because that was what mum wanted. Of course there is no need to even go to Mars. It's missing the point entirely, you see?
> pretty much all games in the early 80's had [so called] pixel perfect scrolling. Each frame showed exactly what was required.
> Today it is entirely acceptable for a map to be a jerky stuttering pile of crap. The same goes for the infinite scroll implementations. It's preposterous to start loading things after they are needed.
This doesn't make any sense to me. In which game from the early 80's could I view accurate map data for any region on the earth, and quickly scroll across the planet without any loading artifacts?
Of course you can manage pixel-perfect scrolling if all of your data is local and fits in memory. That's not anywhere close to the same domain as maps.
> ... I do want a map that works like a map - something I can explore and annotate.
You can do that with Google maps, if you're logged in. You can create a custom map, with multiple places marked, and with annotations. You get a URL for it. And looking at it later, you can zoom in as needed.
>FWIW, I learned to pay attention when the machine doesn't help me do what I want (it's a good source of ideas for side projects), so I've noticed that I do want a map that works like a map - something I can explore and annotate. I do sometimes resort to screenshotting GMaps (or, in the past, photographing paper maps) just to have a map on my phone.
How is a world where search is convenient and automated but comparison isn't, less accessible than a world where neither search nor comparison is convenient and automated?
Because it's not practical to write a Google Maps (or whatever) replacement most of the time? It's a ton of work. Sure, we can smooth over rough edges, but actually implementing lots of missing non-trivial things? Or making things faster that rely on some 3rd-party back end? Usually either not an option, or not an option without recreating the entire front end.
As it turns out, that is probably the most popular use case for maps in the world.
Your sentence starts out like you're stating a fact, but then peters out with "probably."
Do you have data on this?
I'd posit the opposite: That exploration is far more used in online maps.
Aside from Uber drivers, SV types, and wannabe road warrior squinters, nobody uses maps for their daily activities. People know where they're going and they go there without consulting technology. That's why we have traffic jams.
I think this isn't true at all. The vast majority of people I know use maps solely to find something like an ATM/gas station/coffee shop/etc. and then figure out how to get there or how long it would take to get there.
We don't have data, but the only people that do are Google and they have designed their UX around this use case. And it has become one of the most used tools on Earth. If we are going to play the 'your comment is bad because you don't have data' game, I think the onus is really on you to prove that the company with all the data is getting it wrong.
> Aside from Uber drivers, SV types, and wannabe road warrior squinters, nobody uses maps for their daily activities. People know where they're going and they go there without consulting technology. That's why we have traffic jams.
I might know where I'm going but I don't always know how to get there so I use Google Maps all the time. I don't use it for my daily commute but if I'm going to a friend's or something I'll usually use it.
When I sit in my friends cars we also use it all the time. Often we're going to a restaurant or some other location that we don't often go to.
For exploration my friends pretty much just use Yelp or Google Search. I sometimes but rarely use Google Maps for this because I find that the reviews are usually much lower quality and Google Maps is too slow (I have an old Pixel 2)
I think it depends where you are driving. I live in a large city and there are way more people using maps for their daily activities than you might think. In my parents' neighborhood in the suburbs, and in rural areas, I think you're probably right. If there are only a few ways to get somewhere, you probably aren't using maps.
> Your sentence starts out like you're stating a fact, but then peters out with "probably."
Fair enough. I'd argue that I put the word probably in the first half of the sentence, so I don't see it as "petering out", but fair enough.
I will agree with the critique that I'm making an assumption about mapping use cases, and don't have hard data. I'm happy to be corrected by any real data on the topic.
People use maps for daily activities all the time. Not to find where they're trying to get to, but for how to get somewhere more quickly - i.e. public transport directions, or whether it's quicker to walk than go by bus/train/tram.
> As it turns out, that is probably the most popular use case for maps in the world.
I don't think fully deciding an entire path was ever the major use of paper maps. But well, now that apps only allow for this usage, I am pretty sure it's the most popular one.
Personally, I very rarely use maps this way. Which makes Google's shit useless for me (worse than useless, actually, because it insists on opening by default, people have now decided they can send a Google marker instead of an address, and Google makes it impossible to retrieve the address from a marker).
Most people I know, faced with the option of turn by turn directions or nothing choose nothing nearly all the time.
Personally, I never use turn-by-turn even to walk around in unknown places, because it loses context. Instead, I stop at the screen that shows the route options; I don't press "Start" to activate the nav. Besides being more useful when walking around, this doesn't eat the ridiculous amounts of battery power that the turn-by-turn nav does for some reason.
That's the major use case for mobile GMaps, because it absolutely sucks for any other use case, and thus people can't really use it for any other purpose. Except maybe looking up timetables for a previously known public transport stop (only in certain countries; some have much superior ways to plan public transport travel); planning a trip between two stops when you're not standing at one of them right now probably requires a PhD.
Also, if your mapping app is good, people will use it as little as possible. That's certainly my desire as a tourist. So maybe perverse incentives are at play, if the wrong usage metrics are used to improve the app.
Anyway, I'm used to using paper maps from childhood, so the ergonomics of mobile phone apps are really irritating, because paper maps work so much better for me most of the time.
I think you’re measuring use-cases by volume (i.e. Monthly Active Users) instead of by mass (i.e. number of man-hours spent engaged with the app’s UI by those users.)
Certainly, a lot of people use Google Maps for turn-by-turn directions. This means that they interact with the actual UI and visible map tiles once, at the beginning of the trip; and then from there on are given timely voice directions every few minutes, which they can maybe contextualize by looking at the map on the screen. Even if you count the time they spend hearing those directions as time spent “interacting with the UI” of Google Maps, it adds up to fewer man-hours than you’d think.
Meanwhile, I believe that there are a much larger number of collective man-hours spent staring at the Google Maps UI—actually poking and prodding at it—by pedestrians navigating unfamiliar cities, or unfamiliar places in their city. Tourists, people with new jobs, people told to meet their friends for dinner somewhere; etc.
And the Google Maps UI (especially the map tiles themselves) is horrible for pedestrians. Half the time you can’t even figure out the name of the road/street you’re standing on; names of arterial roads (like main streets that happen to also be technically highways) only show up at low zoom levels, while names of small streets barely show up at the highest zoom level. And asking Maps to give you a pedestrian or public-transit route to a particular place doesn’t fix this, because GMaps just doesn’t understand what can or cannot be walked through. It thinks public parks are solid obstacles (no roads!) while happily routing you along maintenance paths for subway systems, rail lines, and even airfields. (One time it guided me to walk down the side of an above-grade freeway, outside the concrete side-barriers, squeezing between the barriers and a forest.) And, of course, it still assumes the “entrances” to an address are the car entrances—so, for example, it routes pedestrians to the back alleys behind apartment buildings (because that’s more often where the parking-garage entrance is) rather than the front, where the door is. I don’t live here, Google; I can’t even get into the garage!
The thing is, these are such distinct workflows that there’s no reason for Google Maps to be optimizing for one use-case over the other in the first place. It’s immediately apparent which one you’re attempting by your actions upon opening the app; so why not just offer one experience (and set of map tiles) for people attempting car navigation, and a different experience (and set of map tiles) for people attempting pedestrian wayfinding?
Or, someone could just come out with a wayfinding app for pedestrians that does its own map rendering. There’s already a Transit app with a UI (and map tiles) optimized for transit-takers; why not a Walkthere app with a UI (and map tiles) optimized for pedestrians? :)
> Or, someone could just come out with a wayfinding app for pedestrians that does its own map rendering. There’s already a Transit app with a UI (and map tiles) optimized for transit-takers; why not a Walkthere app with a UI (and map tiles) optimized for pedestrians? :)
This wouldn't really make me happy. It makes more sense to integrate the walking instructions into the Transit app and be good at giving directions for multimodal transport. I need to know if I should get off the bus here, and walk through the park, or wait till three stops later, which leaves me closer as the crow flies but further away overall. The car app doesn't need to work multimodally since it's not normal to drive somewhere, walk 10 minutes, then drive somewhere else.
Google maps is still the best general purpose multimodal transport app I've used, but it could be so much better. I'm in Austria right now and it doesn't know about the Austrian buses. There's an app (OEBB Scotty) from the Austrian rail operator which I assume everyone uses instead.
Honestly, the two use cases have an obvious intersection: a planning/wayfinding session in which I want Google to compute me a route that I want to then inspect, perhaps modify, and then save into my planning session.
The only thing I'd like is for maps not to give directions for the parts that are obvious to you because you've done them a million times, e.g. from your home, turn right to get to the freeway.
> Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
I know of an ERP system that somehow manages to take about 15 seconds to search an inventory of ~100k items. If you export all those items to CSV, with all their attributes (most of which are not searched), the resulting file is about 15 MB.
It is boggling how they managed to implement search this slowly (in a C++ application using MS SQL as the backend). 3.5 GHz computers, performing plain text string search at about 1 MB/s.
It is even more surprising that users feel this is not a completely unreasonable speed.
(They managed to pull this stunt off by completely not using the SQL database in the intended way, i.e. all tables are essentially (id, blob) tuples, where the blob is a zlib compressed piece of custom TLV encoded data. All data access goes through a bunch of stored procedures, which return data in accordance to a sort of "data extraction string". Search works by re-implementing inverted indices in tables of (word, offset, blob), where blob contains a zlib compressed list of matching IDs; again processed by stored procedures. The client then is wisely implemented using MS SQL's flavour of LIMIT queries which effectively cause a classic quadratic slowdown because the database engine literally has no way to fetch result rows n...m except by constructing the entire result set up to m.
Unsurprisingly the developers of this abomination claim to be competent. They also invented a funny data exchange format involving fixed field lengths and ASCII separators - some time in the 2010s.)
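For anyone who hasn't run into it before, here is a generic sketch of why that style of paging goes quadratic - a toy SQLite schema, nothing like the vendor's actual one:

    # Toy illustration of OFFSET paging (not the vendor's schema). Page k
    # forces the engine to build and throw away k*PAGE rows before it can
    # return PAGE rows, so reading the whole result set page by page
    # touches O(n^2) rows.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO items (name) VALUES (?)",
                     [(f"item {i}",) for i in range(100_000)])

    PAGE = 50

    def page_by_offset(page_no):
        return conn.execute(
            "SELECT id, name FROM items ORDER BY id LIMIT ? OFFSET ?",
            (PAGE, page_no * PAGE)).fetchall()

    def page_by_keyset(last_seen_id):
        # Keyset ("seek") pagination: resume from the last id actually seen,
        # so every page costs roughly the same no matter how deep you go.
        return conn.execute(
            "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT ?",
            (last_seen_id, PAGE)).fetchall()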
Tell me about it! A company I recently did maintenance for pays several thousand € each year for possibly the worst, most broken accounting software, despite not actually having an in-house accountant. The reason: one (1!) person in the company uses the software's (barely functioning) inventory feature and refuses to use anything else.
They're currently considering my offer to develop a custom, pixel-identical system and just replace it without telling her. I could probably do it in a week because she only ever interacts with like 4 out of the at least 100 views the app has, but I suspect she'll catch wind of this and stop it. I don't actually know what she does there besides inventory, but she seems to have more power than anyone else below C-level.
There was a time when I would have been interested in how such a system came to be, but now I think I have been around long enough to guess that someone was protecting their job, and then either got fired anyway or became technical-architect-for-life.
What makes Google Maps bad? My computer in 1983 didn't have a map application at all, how can Google Maps possibly be bad compared to that?
And if I had had a map application (I'm sure they existed) it would have taken several minutes to load from cassette tape. That's not faster than Google Maps, either perceptually or objectively.
The article explains all the ways that make Google Maps' UI bad. Your paper map or driver's atlas in 1983 offered better xref functionality. As they do in 2020, if you can still find them.
Sure, hardware progressed in the past 40 years. CPUs are faster, storage is larger, we have a global network. But it's all irrelevant to the point that Google Maps UI is optimized for being a point-by-point nav/ad delivery tool, not a map. That's an absolute statement, not relative to last century's technology.
I have several atlases and IMO they have considerable advantages over Google Maps. As does Google Maps over them. But none of that is relevant to the matter at hand: "Almost everything on computers is perceptually slower than in 1983".
Thus far in this thread nobody has offered any counterpoint to the speed argument.
The first PC we bought at home was a 133MHz Pentium.
My current box is a Ryzen 3700 at 16*3.6GHz.
It doesn't feel like I have that much more power at my disposal doing everyday things. Web browsers aren't 400 times faster today than Internet Explorer was in 1995.
They should be. Even if you account for all the extra stuff that's going on today things should be at least 50 times faster. Why aren't they?
RAM bandwidth and speed, network latency, and the display sound like the most important factors. If that 133MHz Pentium rendered a web page, it did so at 640×400 pixels, right? 16 colours? Or just in text? So it had to process about 4k (if text) or 128k (if graphics). Your current display involves a little more data.
RAM access takes about 10ns now, it took longer back then but not very much longer. Your sixteen cores can do an awful lot as long as they don't need to access RAM, and I doubt that you need sixteen cores to render a web page. The cores are fast, but their speed just removes them further from being a bottleneck, it doesn't really speed up display of text like this page.
And then there's the latency — ping times are a little closer to the speed of light, but haven't shrunk by anything close to a factor of 400.
It's also driven by a graphics card with more RAM than I had HDD space back in the day, and a dedicated processor of its own that's also a whole lot faster than 133MHz.
Every piece of hardware is better but software has bloated up to remove those speed gains, except when it comes to things like AAA games where they're still pushing the envelope. That's the only place you can actually tell you've got hot new hardware, because they're the only ones caring enough about performance.
The increase in video RAM requires more of the CPU and GPU. Downloading and displaying a JPEG at today's resolution requires more of everything, not just video RAM.
Anyway, if you come over to the server side you can see other code that performs very differently than it could have back in 1983. Sometimes unimaginably different — how would a service like tineye.com have been implemented a few decades ago?
The point I'm making is that my desktop PC sitting in my office now does have more of everything compared to my 133MHz PC from 1995. Not everything has scaled up at the same pace, sure, but literally every piece of hardware is better now.
People talk about the difference in resolution and color depth? 640x480x16 isn't that much less than 1920x1080x32. My current resolution has 13 times more data than my 1995 one, and my HW can handle refreshing it 120 times per second, filling it with millions of beautiful anti-aliased polygons all interacting with each other with simulated physics and dozens of shaders applied, while calculating AI behaviour, path finding and thousands of RNG rolls, streaming data to and from disk, and syncing everything over a network which is still limited by the speed of light. As long as I play Path of Exile, that is.
Opening desktop software is perceptually the same as in the 90s. From launching to usable state is about the same amount of time, and it's not doing so much more that it can explain why current software takes so long.
If I can play Path of Exile at 120fps it's obviously not an issue of HW scaling or not being able to achieve performance.
Who knows? My hunch is there are two main factors influencing this. The first is that constraints breed creativity. If you know you only have 133MHz on a single CPU, you squeeze as much as possible out of every cycle; on modern CPUs, what's a few thousand cycles between friends?
The second is SDK/framework/etc. bloat, which is probably influenced by the first. With excess cycles you don't care if your tools start to bloat.
I think it's primarily an issue of attitude. If you want to write fast software you'll do it, regardless of the circumstances. It all starts with wanting it.
I worked on a framework in the nineties and did such things as render letters to pixels. Here are some of the optimisations we did then, compared to now:
We used much lower output resolution.
We used integer math instead of floating point, reducing legibility. How paragraphs were wrapped depended on whether we rendered it on this monitor or that, or printed it.
We used prescaled fonts instead of freely scalable fonts for the most important sizes, and font formats that were designed for quick scaling rather than high-quality results. When users bought a new, better monitor they could get worse text appearance, because no longer was there a hand-optimised prescaled font for their most-used font size.
We used fonts with small repertoires. No emoji, often not even € or —, and many users had to make up their minds whether they wanted the ability to type ö or ø long before they started writing.
Those optimisations (and the others — that list is far from complete) cost a lot of time for the people who spent time writing code or manually scaling fonts, and led to worse results for the users.
I think you're the kind of person who wouldn't dream of actually using anything other than antialiased text with freely scalable fonts and subpixel interletter spacing. You just complain that today's frameworks don't provide the old fast code that you wouldn't use, and think developers are somehow to blame for not wanting to write that code.
Perfectly well? Really? Scrolling around the map or zooming causes you no kind of rendering delays or artefacts? It feels consistently snappy no matter what you do?
Apparently Microsoft Autoroute was first released in 1988, covered several dozen countries, and could be obtained by ordering it. Thus using it for the first time would involve a delay of at least a day in order to order, receive and install the program. After that, starting it should be quick, but I can't tell whether it required inserting the CD into the drive. Even if the application is already installed on the PC and not copy-protected, looking something up doesn't sound obviously faster than opening a web page, typing the name of the location into the search box, and waiting for the result.
And you had to wait for new CDs to arrive by mail whenever roads changed. And I'm not talking "2 day delivery Amazon with UPS updates" here, I'm talking you send an envelope + check into the mail and maybe a month from now you get a CD back.
It didn't actually work that well. The internet is the only real way you can get a Google-maps like autoroute feature with reasonable update times. Constantly buying new CDs and DVDs on a subscription basis is a no-go for sure. I don't think anyone's 56kb modem was fast enough to update the map data.
Even if you bought the CDs for the updated map data, it was only updated every year IIRC. So there were plenty of roads that simply were wrong. It's been a long time since I used it, but Google Maps is better at the actual core feature: having up-to-date maps and up-to-date route information.
Hint: Microsoft wasn't sending around Google cars to build up its database of maps in the 90s. Nor were there public satellite images released by the government to serve as a starting point for map data. (Satellite imagery was pure spycraft. We knew governments could do it, but normal people did NOT have access to that data yet). The maps were simply not as accurate as what we have today, not by a long shot.
--------
Has anyone here criticizing new software actually lived through the 90s? Like, there's a reason that typical people didn't use Microsoft Autoroute and other map programs. Not only was it super expensive, it required some computer know-how that wasn't really common yet. And even when you got everything lined up just right, it still had warts.
The only things from the 90s that were unambiguously better than today's stuff were, like... Microsoft Encarta, Chip's Challenge and Space Cadet Pinball. Almost everything else is better today.
With browsers the network latency caps the speed ultimately, no matter how fast a CPU you have. Also HDD/SSDs are very slow compared to the CPU caches. Granted PCs of the previous era also had the same limitation but their processors were not fast enough to run a browser 400 times faster if only the HDD wasn't there.
But other simpler programs should be very much faster. That they perceptually aren't is because (IMO) code size and data size has increased almost exponentially while CPU caches and main memory speeds haven't kept up.
Main memory speeds haven't increased like CPU speeds have, but they're nowhere close to where they were in 1995. You can get CPUs today with larger caches than I had RAM back then, as well.
I know that CPU speed isn't everything and so a 400x speedup is not reasonable to expect. That's why I hedged and said 50x.
Every part of my computer is a lot faster than it was back then and I can barely tell unless I'm using software originating from that era because they wrote their code to run on constrained hardware which means it's flying right now.
It's like we've all gone from driving 80 MPH in an old beater Datsun to driving 30 MPH in a Porsche and just shrug because this is what driving is like now.
There was the BBC Domesday Project that let you scroll around annotated maps on a BBC Micro. They loaded in near-realtime from laserdisc. It was fantastically expensive, and also so tightly tied to its technology that it was impossible to emulate for years.
I believe around 2010 the BBC managed to get it on the web, but this seems to have died again.
> Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
What are you talking about? How in the world is a DOS that can only run a single app at a time better than a system that can run dozens of apps at once? How is non-multitasking a better experience? I remember DOS pretty well. I remember trying to configure Expanded Memory vs Extended Memory. Having to wait for dial-up to literally dial up the target machine.
Edit: I didn't realize the poster was talking about Point-of-Sale devices. So the above rant is aimed incorrectly.
> It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented for giving turn-by-turn directions for people who know where they are and where they are going;
That's exactly what it's made for.
> As someone who builds their own PC every couple years: it still does. It's actually worse now,
No way. Things just work nowadays. Operating systems have generic drivers that work well. It's so much easier now to build a machine than it was years ago. I remember taking days to get something up and running, but now it's minutes. Maybe an hour?
I really hate these "good old days" posts, no matter what the subject. The days of the past weren't better, there were just fewer choices.
> What are you talking about? How in the world is a DOS that can only run a single app at a time better than a system that can run dozens of apps at once? How is non-multitasking a better experience?
The person you were replying to was specifically talking about POS (Point of Sale) systems.
Retail workers use these systems to do repetitive tasks as quickly as possible. Legacy systems tend to be a lot faster (and more keyboard accessible) than modern web-based systems.
It is not uncommon for retail workers to have a standard Windows workstation these days so they can reference things on the company website, but then also shell into a proper POS system.
In no way shape or form does DOS-era UX beat current-era UX (there may be specific programs that do so, but writ large this is incorrect). Getting away from command lines is one of the core reasons that computing exploded. Getting farther away from abstraction with touch is another reason that computing is exploding further.
Command-line systems simply do not map well to a lot of users' mental models. In particular, they suffer from very poor discoverability.
It is true that if you are trained up on a command-line system you can often do specific actions and commands quickly but without training and documentation, it is often quite hard to know what to do and why. Command-line systems also provide feedback that is hard for many people to understand.
Yes, it is true that many current-era systems violate these guidelines as well. This is because UX has become too visual design focused in recent years, but that's a symptom of execs not understanding the true value of design.
> Command-line systems simply do not map well to a lot of users' mental models. In particular, they suffer from very poor discoverability.
I don't want to push too hard into one extreme here, but I believe modern design makes a mistake of being in the other extreme. Namely, it assumes as an axiom that software needs to be fully understandable and discoverable by a random person from the street in 5 minutes. But that only makes sense for the most trivial, toy-like software. "Click to show a small selection of items, click to add it to order, click to pay for order" thing. Anything you want to use to actually produce things requires some mental effort to understand; the more powerful a tool is, the more learning is needed - conversely, the less learning is needed, the less powerful the tool.
It's not like random people can't learn things. Two big examples: videogames and workplace software. If you look at videogames, particularly pretty niche ones (like roguelikes), you'll see people happily learning to use highly optimized and non-discoverable UIs. Some of the stuff you do in 4x or roguelike games rivals the stuff you'd do in an ERP system, except the game UI tends to be much faster and more pleasant to use - because it's optimized for maximum efficiency. As for workplace software, people learn all kinds of garbage UIs because they have no choice but to do so. This is not an argument for garbage UIs, but it's an argument for focusing less on dumbing UIs down, and more on making them efficient (after all, for software used regularly at a job, an inefficient UI is literally wasting people's lives and companies' money at the same time).
> While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
These aren't orthogonal in any way. A significant chunk of modern performance hit relative to older Von Neumann architectures is the cross-validation, security models, encapsulation, abstraction, and standardization that make BSODs an extremely unexpected event indicative of your hardware physically malfunctioning that they have become (as opposed to "Oops, Adobe reverse-engineered a standard API call and botched forcing bytes directly into it when bypassing the call library's sanity-checking; time to hard-lock the whole machine because that's the only way we can deal with common errors in this shared-memory, shared-resource computing architecture!").
The time that matters isn't how long it takes to restart the app; it's how many hours of changes just got eaten because the app crashed and the data was either resident in memory only or the crash corrupted the save file (the latter scenario, again, being more common in the past where correctly shunting the right bytes to disk was a dance done between "application" code and "OS toolkit" code, not the responsibility of an isolated kernel that has safeguards against interference in mission-critical tasks).
OTOH, lower runtime performance of modern apps eats into how much people can produce with them - both directly and indirectly, by slowing the feedback loop ever so slightly.
While there are a couple of extra layers of abstraction on our systems that make them safer and more stable, hardware has accelerated far more than enough to compensate. Today's software does not need to be as slow as it is.
In general, people will trade a high-variance cost for a fixed, predictable one, so even if the slower tools are nibbling at our productivity, it's preferable to moving fast and breaking things.
I'm not claiming there's no room for optimization, but 90% of the things that make optimization challenging make the system reliable.
> a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
As someone who helped with the transition from crappy text-only DOS interfaces on POSes to graphical interfaces on POSes, I have to disagree. Learning those interfaces was terrible. I don't know where the author gets the idea that even a beginner had no trouble learning them. I worked at NCR during the switchover from the old system to the new one that used graphical representations of things and the new system was way easier to use and for beginners to learn. And that's not just something we felt, we measured it. (Interestingly, it was still DOS-based in its first incarnation, but it was entirely graphical.)
For software that's used professionally, it does not matter how easy it is for a beginner to learn. That's an up-front investment that needs to be made once. What matters is the ongoing efficiency and ergonomics of use.
You can never really make a map for "exploring" because what people want to explore is domain specific. Do you want to explore mountains? Do you want to explore museums, do you want to explore beaches?
There is too much stuff in the world to present on a map without absolutely overwhelming people, in the same way that the internet is too vast to "explore". You can't explore the internet from Google search; you've got to have some vague starting point as a minimum.
> There is too much stuff in the world to present on a map without absolutely overwhelming people
Nevertheless, I think that Google Maps et al. typically show far too little on the screen. My Dad tells me that he still prefers paper maps because he doesn't have to fiddle around with the zoom level to get things to show up. While I'm sure that Google has more data on most places than it could fit in a small window, when I look at my location in Google Maps, it looks very barren: in fact, it seems to prioritize showing me little 3d models of buildings over things that I care about, like street and place names. Paper maps are typically much denser, but I don't think that people 30 years ago were constantly "overwhelmed" by them.
In a world where you can plot a pin of yourself on the map via GPS, the need for street names, building names, etc. at higher zoom levels just doesn't matter, because you don't need them to figure out where you actually are; with paper maps you did.
If adding that information to the map serves no need anymore but clutters it up, why do it?
Google Maps' refusal to show the names of major(!) streets is infuriating. I don't need that information to determine where I am, I need it to figure out what street signs I need to be looking for so that I can get to where I am going. And the really infuriating thing is that it shows one random useless street name somewhere unless I zoom in so far that the street I want is the only thing on the screen.
I fired Google Maps and now use Apple Maps unless I'm looking for a business by name that I think is not mainstream enough to be on Apple Maps.
> offers orders of magnitude better UX than current-era browser
And they also did orders of magnitude less. If you've ever done UX, you would know that it gets exponentially harder as you add more features. People love to complain, but not a single person would realistically give up on all the new features they've got for the slightly better UX in certain edge cases.
Sure. Most popular on-line stores seem to adopt the "material design" pattern, in which an item on a list looks like this:
+---------+  ITEM NAME                      [-20% OFF] [RECOMMENDED]
| A photo |
| of the  |  Some minimal description.      PRICE (FAKE SALE)
| item.   |  Sometimes something extra.     ACTUAL PRICE
|         |
+---------+  <some> <store-specific> <icons>           [ADD TO CART]
I started to write userstyles that turn them all into:
ITEM NAME Some minimal description ACTUAL PRICE [ADD TO CART]
Sometimes something extra (with a much smaller font).
The userstyles get rid of the photos and the bullshit salesy disinformation, and reduce margins, padding and font sizes. This reduces the size of the line item 5-6x while preserving all the important information; it means I can now fit 20-30 line items on my screen, where originally I was able to fit 4-5. With refined enough search criteria, I can fit all results on the screen and compare them without scrolling.
If it truly was worse at being a map, people would still use physical maps, at least in some scenarios. I have never met a person who still uses a physical map for anything more than a wall decoration.
> That I don't get. Is it "elitist" now to point out that the (tech) "elite" can actually handle all this bullshit
It's easy to critique and complain, and doing so puts someone in a position of superiority. Saying "all the effort and knowledge out there is bullshit because my 1983 terminal searched text faster" in a way says that the person writing this knows better, and thus is elite. OP says it's disguised as disappointment, which I agree with, because the author describes (and disguises) the situation from a frustration point of view.
But I also think that elitism could be swapped for snobbery in this case.
>But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either.
You still have to waste a lot of time doing this crap. Last time I set up a printer for my mom, it was a PITA: I had to download drivers, install some giant pile of crap software, go through a bunch of screens of garbage, etc. Needless to say, her computer runs Windows 10.
By contrast, when I wanted to print to that same printer with my own laptop (running Linux Mint), it was simple: there was absolutely no extra software to install, and all I had to do was go to my Settings->Printers, search for printers on the network (took a few seconds), notice that it found the (WiFi) printer, verify it was an HP and whatever model, and then it was done.
Things could be much faster and simpler than they are now. They aren't, because it's not profitable for various actors for it to be, and because most users accept this. Most users are happy to have gigabytes of malware and spyware on their systems, wasting CPU cycles reporting their behavior back to Microsoft or their printer vendor or whoever, and that's a big reason things are so much slower than in 1983.
So if I'm being "elitist" to point out that spyware is a bad thing, then I'll happily accept that moniker.
Personally I would recommend Brother every time for a printer. They're not fancy but they are well built and don't require you to install a software suite. My roommate and I are still using the same laser printer from college. It's not wireless, but if we really wanted to we could attach a Pi and set up a printer pool; for now a long cable serves us just fine for the few times we need to print.
+1 on Brother printers. I'm on my second Brother laser printer and It Just Works. I also have a Canon Ink Jet that is mostly harmless. My sister's HP printer, however...
A tangent but still - what does make a printer fancy? Brother has models with every bell and whistle everything else has as well - color LCDs, wifi, airprint. What else is there that would make it fancier?
Some systems require a big software suite or make it very hard to download basic drivers, so you end up with a "fancy" bunch of software - yes, it lets you do a few more things, but not a ton. You can have stuff where you pay by the page now - with full telematics back to the printer owner. The one issue - you stop paying and the printer stops working. It's pretty cool - but the overhead (everything has to be working, including the internet, credit card not changed, etc.) means it's more brittle.
Thanks, I'll be in the market soon and this is exactly the sort of info I was looking for. Since you mention "the few times we need to print", is it safe to assume that their kit is fine with very very infrequent use? Inkjets really didn't like this, and I don't know much about lasers.
Inkjets use liquid ink, and if the ink dries on the spray nozzle, it's dead. This process takes about a month. If you're lucky, the nozzle is part of the cartridge and you need to spend $100 (or more) on new cartridges. Otherwise you need to buy a new printer. Some printers have a mode where they'll spray a little bit though the nozzle if you haven't used the printer in ~2 weeks, but they need to be plugged in.
Laser printers use dry ink that never... gets more dry. I pulled my Brother out of storage after 2+ years and it worked great.
Toner is also considerably cheaper than inkjet ink, and lasts significantly longer. I haven't bought new toner in 8 years.
Personally, I use a black and white laser printer, and if I really, really need to print in color I'll do it at work. (happens basically never) I recognize not everybody has this luxury, and some people have far more need to print in color than I do.
If you need color printing and your volume is high enough to keep the nozzles in good shape, you're probably still better off with a color laser printer because the toner is so much cheaper. If you don't print in color that much, it's a terrible, terrible idea to buy an inkjet printer.
Sorry, but a bit of a pedantic note here: laser printers do not use dry ink. They don't use ink at all; they use "toner". Toner is really nothing more than microscopic particles of colored plastic. The printer uses electrostatic attraction to put the particles on a sheet of paper in a pattern, and then a "fuser" (a small heater) to melt the plastic so that it binds to the paper, without catching the paper on fire. So toner never goes bad because it's really nothing more than dust.
As for your B&W laser, it used to be that color lasers were horribly expensive so only companies had them. These days, color lasers have gotten pretty cheap, and aren't that much more than the B&W lasers. My Brother was about $200 IIRC. Of course, you can get a small B&W for under $100 now, but still, $200-300 is not budget-breaker for anyone in the IT industry. So even if you don't really need color that much, if you're in the market for a printer, I'd advise just spending the extra money and getting the color model, unless you really want your printer to be small (the color models are usually a lot larger, because of the separate toner cartridges).
I would never advise using an inkjet unless you really need to. They're a terrible deal financially; the only thing they're better at is costing less initially, but the consumables are very expensive and don't last long. They do make sense for some high-end high-volume applications, but those use more industrial-sized printers with continuous-flow ink, not small consumer printers with overpriced ink carts. Honestly, consumer inkjets are probably the biggest scam in all of computing history.
Last printer I had to configure on my Linux machine required finding and downloading a hidden mystery bash script from the manufacturer's website and running it with root permissions. Not exactly "plug and play" or "safe".
Then you got the wrong printer. Some manufacturers do a good job of supporting Linux (HP is probably the best actually; their drivers all ship standard in most distros), others don't bother.
I guess you don't like Apple evangelists either: you can't just buy some random hardware and expect it to work on your MacBook. It only works that way on Windows because Windows has ~90% of the desktop market, so of course every hardware maker is going to make sure it works on Windows first, and anything else is a distant maybe.
Would you buy an auto part without making sure first that it'll work on your year/make/model of car?
Sure, it was "plug and play", but like every printer, it took some work: you have to download and install a driver package and software suite, because this stuff isn't included in the Windows OS. This simply isn't the case in Linux: for many printers (particularly HPs), the drivers are already there.
Did you run Windows Update? Windows used to ship all drivers with the OS, but since 2018 they don't by default and instead match your hardware to drivers in the cloud. You still might not have had to run the software suite; it might have been able to find and fetch your drivers from the cloud.
I'm pretty sure that M$ has a bigger driver database than Linux.
I have an HP LaserJet 1018 and it is a pain to set up. Currently, when I need to print, I just turn on the printer and then upload the firmware manually using cat to make it work.
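For anyone stuck with the same model, here's a minimal sketch of that manual upload. The firmware and device paths are assumptions (typical foo2zjs-style locations) and will differ per system; it needs write access to the printer node.

    # Minimal sketch of the manual firmware upload described above.
    # Both paths are assumptions - adjust for your distro and printer.
    FIRMWARE = "/usr/share/foo2zjs/firmware/sihp1018.dl"  # assumed firmware location
    PRINTER = "/dev/usb/lp0"                              # assumed usblp device node

    with open(FIRMWARE, "rb") as src, open(PRINTER, "wb") as dst:
        dst.write(src.read())  # same effect as `cat sihp1018.dl > /dev/usb/lp0`

On these models the firmware apparently only lives in the printer's RAM, so it has to be re-sent after every power cycle - presumably why the parent does this each time the printer is turned on.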
>Last time I set up a printer for my mom, it was a PITA: I had to download drivers, install some giant pile of crap software, go through a bunch of screens of garbage, etc. Needless to say, her computer runs Windows 10.
Weird. Printer drivers usually auto-install for me on Windows 10.
> So if I'm being "elitist" to point out that spyware is a bad thing, then I'll happily accept that moniker.
The article isn't talking about spyware. It's a chuntering rant about "kids these days" with no analysis of why things are the way they are. Spyware doesn't fit, because it's doing exactly what it's meant to do, and the question of whether it pisses you off is secondary at best, so long as you're convinced there's no alternative to the kind of software that bundles spyware as part of the deal.
Today basically every piece of external hardware is compatible with pretty much every computer. Better than that, they are plug and play for the most part. Thanks for reminding me to properly appreciate this.
I'm not convinced by your argument, because the examples you cite of things having improved since those times have hardly anything to do with a text interface feeling sluggish. There's no contradiction here; you could have the best of both worlds (and if you eschew web technologies, you generally do get that nowadays).
Are these modern Electron apps chugging along with multiple GB of RAM and perceptible user input latency because they're auto-configuring my printer in the background? Can twitter.com justify making me wait several seconds while it loads literally megabytes of assets to display a ~100-byte text message because it's auto-configuring my graphics card IRQ for me?
No, these apps are bloated because they're easier to program that way. Justifying that with user convenience is a dishonest cop-out. If you rewrote these apps with native, moderately optimized code, there's nothing that would make them less user-friendly. On the other hand they'd use an order of magnitude less RAM and would probably be a few orders of magnitude faster.
It's fine to say that developer time is precious and it's fine to do it that way but justifying it citing printer drivers of all things is just absurd.
> Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability? Remember when building a computer required work and research and took hours?
There are still examples of all of these, but they're no longer the default experience. I have put PCs together from parts since about the 386 era, and there's absolutely no question that it's far smoother now than it ever was before.
Things were getting pretty good for a bit with HDMI, but the USB-C, DisplayPort, HDMI mess is a real shitshow. Sure, I don't have to set IRQs on my ISA attached serial cards, but there are still plenty of things that are really unstable/broken.
Don't forget that there is a sub-shitshow all of its own inside USB-C cabling, as well as DisplayPort-enabled USB-C, docks that do some modes but not others, monitors that run 30 Hz on HDMI but 59 Hz on DisplayPort....
Just yesterday I couldn't connect my laptop to a new classroom projector with a USB-C input. When I selected "Duplicate Display" it caused my whole system to hang unresponsive. Works fine in other rooms with HDMI to USB-C. I have no idea what is wrong.
My new iPhone XS doesn't connect to my car's bluetooth, while the 4 year old phone I replaced does. My wife's iPhone 8 does. I'm stuck using an audio cable like it's 2003 again.
A new coworker couldn't connect to the internet when he had an Apple Thunderbolt display connected to his ~2014-2015 MB Pro via Thunderbolt. Mystifying. Even more mystifying was corporate IT going "oh yeah just don't use it".
That's probably because he had his network connection priorities ordered so that the TB networking had priority and there wasn't a valid route on the connected network to the TB display. That's very likely a fixable problem, as I have encountered similar and resolved it.
You forgot to mention that computers have become at least ten-thousand times faster in the meantime. There really isn't a single objective reason why we should wait for anything at all for longer than a few milliseconds on a modern computing device to happen. I actually think that this is just a sign that things are only optimized until they are "fast enough", and this "fast enough" threshold is a human attribute that doesn't change much over the decades.
PS: the usability problems you mention have been specifically problems of the PC platform in the mid- to late-90s (e.g. the hoops one had to jump through when starting a game on a PC was always extremely bizarre from the perspective of an Amiga or AtariST user, or even C64 user).
I (unlike most, I think) still today don't keep a lot of apps open. This is less about performance, and more some irrational OCD sense of comfort.
And it doesn't improve performance.
I actually think the one thing that has improved performance-wise is concurrency. Our computers can now do many more things at once than before. But all of those things are slower, regardless of whether you're doing 50 or just 1.
> Remember autoexec.bat files? Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability?
Maybe my memory is bad but I seem to have the same issues today, just in different shapes.
The operating system chooses to update itself in the middle of me typing a document, and then 15 minutes later, while I'm in a conference meeting, the app decides that a new version is immediately required. Then I open up a prompt and the text is barely readable, because the screen DPI is somehow out of sync with something else and now the anti-aliasing (?) renders stuff completely broken. According to Google there are 31 ways to solve this, but none of them works on my PC. Then all the Visual Studio add-ins decide to use a bit too much memory, so in the middle of the debug process the debugger goes on a coffee break and then just disappears. In the afternoon I need to update 32 NPM dependencies due to vulnerabilities, and 3 have simply disappeared. Weird.
> Cherry picking the bad ones while ignoring the good ones to prove a point is boring and not reflective of the actual industry.
Can you give us examples of the "good ones" for the "bad" examples he cited?
> These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
No, he is saying that certain paradigms made computers fast in the past, and instead of adopting them and progressively building on them, we have totally abandoned them. He is not advocating embracing the past, warts and all, but only what was good about it. He is not asking us to ditch Windows or Macs or the web and go back to the DOS / Unix era.
The original Kinetix 3D Studio MAX was NOT as slow as the current Autodesk products. It had a loading time, but I can live with that.
With the speed of the M.2 SSDs we have today and everything else I really wonder why it got like this.
Maybe it's the transition to protected mode that did all this? Now everything has to be sanitized and copied between address spaces. But then again, Win95 was also protected mode. I don't know... :)
>But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either. I don't need to teach her arcane commands to get basic functionality out of her machine. Plug and play and usability happened
The slowness is orthogonal to all those things that have improved. We can put the fast back just fine while keeping those things.
I don't think you can, not really. Practically all the slowdown you see comes from increasing levels of abstractions, and basically all good abstractions have a performance cost.
Abstractions in general buy you faster development and more reliable/stable software. UI abstractions buy you intuitive and easy to use UX. All of these benefits define the modern era of software that has spawned countless valuable software programs, and it's naive to think you can skip the bill.
I would actually say that the Microsoft Office suite was really good (up until recently, when they started trying to funnel you into their cloud storage). But even with that blemish, the UX is still pretty good. Hitting the Alt key lets you access any menu with just a few keystrokes, and the key hints show up on screen so they don't have to be memorized. I can also work a hell of a lot of magic in Excel without ever touching my mouse, and without needing to memorize every keyboard shortcut. I wish every app and desktop in every OS followed a similar standard for keyboard shortcuts.
Macros/VBA is still awful, but overall, despite the horrifying computational abuse I put office suite through, it's actually very stable!
And, it's fast enough once it loads. It's still pretty slow to load though.
Pretend all you like that the modern web made computers worse - then try Figma in the browser. It's glorious. Granted, it's not written in React, but it really shows what the web can be. It's BETTER than any desktop equivalent I've tried. It requires zero installation. Plugins are added instantly. It has native collaboration. It's scary fast. I'm not at all nostalgic about the old days.
It's okay. The functionality is great, but that's pretty unrelated to it being a webapp. I often run into slowdowns when using it on my medium spec'ed laptop. It sometimes gets pretty bad. It consistently feels slower than using native desktop apps. The zero installation is definitely a perk, though a minor one. Plugins adding instantly and native collaboration aren't functions of it being a web app.
The trade seems to be a still quite noticeable performance hit vs no installation. It's probably the single best serious web app I've ever used, and I'd still trade it for a high quality desktop app if I could.
In my opinion, people have had their expectations dragged way down by shitty web apps, so Figma feels good. It doesn't beat what tools like Photoshop used to feel like before their UI got rewritten in HTML and whatever fuckery they're doing now.
Yes, it is a rant. Yes, the examples are terrible and there are plenty of counter examples. Yet there is also some truth to what is being said.
There is clearly going to be added complexity to address what we expect of modern computers. It is going to have a negative performance impact. If it isn't carefully planned, it is going to have a negative impact on UX.
Yet there is also a reasonable expectation for things to improve. The extra overhead in hardware and software should, in most cases, be more than compensated for by the increase in hardware performance. The UX should improve as we learn from the past and adapt for the future.
In many cases, we do see tremendous improvements. When a limitation is treated as a problem that needs to be addressed, we have seen vast improvements in hardware and software performance. Arguably, the same applies to UX. (Arguably, because measuring UX is much more opinionated than quantitative.)
Yet so much of what we get is driven by other motivations. If inadequate consideration is given to performance or UX, then the experience will be degraded.
I don't know if autoexec.bat was the most annoying thing from the 90s. (Although it was certainly annoying...)
My example of choice would be ISA, specifically configuring IRQs.
That's why UARTs back in the day were faster than USB: your CPU would INSTANTLY take a hardware interrupt as soon as that ISA voltage changed. Today, USB 2.0 is poll-only, so the CPU effectively only asks the device once per millisecond whether "any data has arrived". (I think USB 3.0 has some bidirectional signaling, but I bet most mice are still USB 2.0.)
--------
For the most part, today's systems have far worse latency than the systems of the 80s. But two points:
1. Turns out that absolutely minimal latency wasn't very useful in all circumstances. The mouse updated far quicker in the 80s through the serial port than it does today... but the difference between a microsecond delay from an 80s-style serial port and a modern USB millisecond update routine (traversing an incredibly complicated, multilayered protocol) is still imperceptible (rough numbers in the sketch after this list).
2. The benefits of the USB stack cannot be overstated. Back in ISA days, you'd have to move jumper pins to configure IRQs. Put two different pieces of hardware on IRQ 5 and your computer WILL NOT BOOT. Today, USB auto-negotiates all those details, so the user only has to plug the damn thing in, and everything works magically. No jumper pins, no IRQ charts, no nothing. It just works. Heck, you don't even have to turn off your computer to add hardware anymore, thanks to the magic of modern USB.
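To put rough numbers on point 1 - a minimal sketch, where the figures are ballpark assumptions rather than measurements:

    # Ballpark latency comparison for the interrupt-vs-polling discussion above.
    # All figures are illustrative assumptions, not measurements.
    delays_ms = {
        "ISA/UART hardware interrupt": 0.005,    # on the order of microseconds
        "USB 2.0 poll wait (worst case)": 1.0,   # full-speed interrupt endpoint, 1 ms interval
    }
    perception_threshold_ms = 10.0               # rough figure for what a human can notice

    for name, delay_ms in delays_ms.items():
        verdict = "imperceptible" if delay_ms < perception_threshold_ms else "perceptible"
        print(f"{name:32} {delay_ms:8.3f} ms -> {verdict}")

Both delays sit well under anything a person can feel, which is the whole point: USB added latency, but not in a range anyone notices while mousing around.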
-------
Adding hardware, like soundcards, new Serial Ports (to support mice/gamepads), parallel ports (printers), etc. etc. before plug-and-play was a nightmare. PCI and USB made life exponentially easier and opened up the world of computers to far more people.
Every now and then when I visit my parents... I find my dad's old drawer of serial-port gender-changers, null modems, baud-rate charts, 7n1 / 8n1 configuration details and say... "thank goodness computers aren't like that anymore".
Pick a web app that can render without JS. Then remove all the JS and it will render 5x faster.
Some aspects of computers got better, sure. But the web is in a super shitty state. I mean, we're here posting on a website with bare-minimum design and features. Why is that? Twitter and Reddit are absolutely horrible in terms of usability and performance. Knowing that they could do better makes them even worse. It is an attack on humanity. You can try to find excuses in unrelated things, as noted, but that won't change anything.
I do wish shortcuts on web apps would be more universally implemented and more universally discoverable. I wish browsers had a common hotkey that would show available shortcuts, for example.
The biggest problem with his argument is that, done correctly, a mouse augments the keyboard and does not replace it. If you've ever seen a good print production artist, that person can absolutely FLY around a very visual program like InDesign... without leaving the keyboard.
I second this. Once you learn the keyboard shortcuts, and build the habit of learning shortcuts in general, you're no longer encumbered by small visual tasks like aiming at boxes in menus and submenus. These interactions micro-dent the flow. When I use the keyboard to navigate, I'm not even consciously thinking about which shortcuts to press; it happens naturally. I've also noticed that I don't consciously know the shortcuts and have to think a bit before telling someone which shortcut I use - my fingers remember them.
All this said, the software has to have a good shortcut flow in mind. Not all software offers this benefit.
Web apps need to stop implementing some shortcuts, if not most. It annoys me to no end when a site hijacks Ctrl+F and presents its own crappy search rather than just letting me search the page with Firefox's built-in search, which actually works.
Really? Because two co-workers have told me they spent weekends trying to configure a printer and downgrading a graphics driver to get their graphics working again. Yes, they are engineers (mechanical, not software), but I have had my share of issues like this, and I started programming in 1978 on a Commodore PET. I do prefer lightweight interfaces, and my wife can't get my Maps app working while I am driving because she has a Samsung and I have a BlackBerry (KEYone). Things look prettier, and there are a lot of advertisements, but I relate to a lot of these "cherry picked" rants.
About printers: it depends on the manufacturer. Some printers are easy to set up. At the places I used to work, whenever I went into the printer setup and looked on the network, the printer would show up right away.
They say the fastest code is the code not written. I say the best printing is not needing it. I don't have a printer, since I only need to print things every once in a while (a few times a year). Whenever I need to print, I grab my USB stick, put what I need on it, and drive off to FedEx or UPS to print it out. Point and click, very simple. The cost of a printer has never been justified for me, since printing runs me a couple of cents per year.
Agreed. Comparing UI responsiveness to yesteryear can be an interesting case study - examining precisely why things feel slower, to see whether they can be made better - but more often than not the answer is, "because we traded a bit of efficiency for an enormous wealth of usability, and iterability, and compatibility, that people never could have dreamed of back then".
Rants like this post are willfully ignorant of the benefits that a technological shift can have for millions of actual humans, simply because it offends the author's personal sensibilities.
The endless configuration pain you mention pretty perfectly describes anytime I try to dip into a modern framework and find myself spending hours and hours updating ruby or python or some random node module until all the dependencies finally align and I can write a single productive line of code.
Or when I spend 10 minutes fighting various websites just to get a dumb TV streaming site to login with the app it’s connected to.
I have to agree with the general premise: despite the phenomenal power of computers today, they generally feel just as irritating as they always have.
For a better comparison, I ran Windows XP in VirtualBox on a Macbook Air last year to interface with some old hardware.
I was surprised that it ran in something like 256 MB of RAM. And everything was super fast and responsive out of the box.
I consider OS X less janky than Windows or Linux these days, but Windows XP blew it out of the water.
Try it -- it's eye opening. The UI does basically 100% of what you can do on OS X, yet it's considerably more responsive on the same hardware, with the handicap of being virtualized, and having fewer resources.
> to a much wider audience and are much more usable now.
Doctors hate their lives because of computers: https://www.newyorker.com/magazine/2018/11/12/why-doctors-ha... - but maybe you'll think the article is too old to be true, or maybe you'll quibble about the headline. Why not believe something true instead?
I was about to say the same thing. I remember having to recover some files off a diskette, and oh boy, was it frustrating. That feeling when you start copying a megabyte-sized file and just sit there staring at the screen for five minutes - we've forgotten how bad it used to be.
> These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
More to the point, I grew up with early Windows XP computers and early 90s Internet, and I don't remember the speed. Maybe I'm just too young and things were magically faster in the 80s? Maybe I was born just slightly before everything took a giant nosedive and became crap?
There are lots of things that annoy me about modern computers, but none of them take upwards of 2-3 minutes to boot any more. I remember loading screens in front of pretty much every app on my computer. A lot of my modern apps I use don't even have loading screens at all. I remember clicking buttons on apps and just kind of waiting, while the entire computer froze, for an operation to complete. Sometimes I'd start to do something complicated and just walk away and grab a snack because I literally couldn't use my computer while it ran.
There were entire joke websites like Zombocom set up to riff on how bad the loading screens were on the web back then. I would wait literally 10-15 minutes for Java apps like Runescape to load on a dial-up connection, despite the fact that the actual game itself played fine over that connection, and the delay was just due to dropping a giant binary that took no intelligent advantage of caching or asset splitting.
I can't imagine waiting 10-15 minutes for anything today.
I got a low-key allowance out of going to other people's houses and defragging their computers. Do you remember when Windows would just get slower over time because there was an arcane instruction you had to run every year or so to tell it to maintain itself?
> On the library computer in 1998 I could retry searches over and over and over until I found what I was looking for because it was quick
> Now I have to wait for a huge page to load, wait while the page elements shift all over, GOD FORBID i click on anything while its loading
What library were you going to in 1998? I also did library searches, and they were insanely slow, and prone to the exact same "don't click while it loads" behavior that the author is decrying here. And that's if I was lucky, sometimes the entire search engine would just be a random Java app that completely froze while it was loading results. And forget about giving me the ability to run multiple searches in parallel across multiple tabs. Whatever #!$@X cookie setup or native app they were wired into could never handle that.
The modern database search interfaces I have today are amazing in comparison. I have annoyances, but you couldn't pay me to go back in time. A lot of those interfaces were actively garbage.
Again, maybe I'm just too young and everything took a nosedive before I was born. But even if that's the case, it seems to me that interfaces are rapidly improving from that nosedive, not continuing to slide downwards. The computing world I grew up in was slow.
>I also did library searches, and they were insanely slow, and prone to the exact same "don't click while it loads"
Not the person you asked, but around 2002 the computers at my university library were running a DOS-like application: the available commands were printed at the top of the screen, and you could type in your search and hit Enter. After a few years it was replaced with a Win98-like GUI; you had to open the app if someone else had closed it, then use the mouse to find the right dropdowns, select the options, input the search term, and click Search. Before, you would type something like "a=John Smith", hit Enter, and it would show all the books by that author.
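For what it's worth, a toy sketch of that prefix-command style of search - the catalog and the single-letter field prefixes are invented for illustration, but it shows why that kind of interface responded instantly:

    # Toy sketch of an "a=John Smith"-style catalog search, as described above.
    # The records and field prefixes are made up for illustration.
    CATALOG = [
        {"a": "John Smith", "t": "Practical Widgets"},
        {"a": "Jane Doe",   "t": "Widget Theory"},
        {"a": "John Smith", "t": "Advanced Widgets"},
    ]

    def search(query: str):
        """Parse a 'field=value' query and return the matching records."""
        field, _, value = query.partition("=")
        value = value.strip().lower()
        return [rec for rec in CATALOG if value in rec.get(field.strip(), "").lower()]

    for book in search("a=John Smith"):
        print(book["t"])  # a plain linear scan over local text - no network, no reflow

The whole interaction is one typed line and a local text match, which is why it felt instantaneous compared to clicking through dropdowns.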
The problem with us developers is that most of the time we are not heavy users of the applications we create; we make simple test projects and simple tests to check the application, but our users might use it many hours a day, and all the little issues add up.
> More to the point, I grew up with early Windows XP computers and early 90s Internet
He is talking about running local software before mainstream internet was a thing.
That is, locally installed software without a single line of network code in them.
MS Word started faster on my dad's Motorola-based Mac with a 7 MHz CPU and super-slow spinning-rust drives than it does on my current PC with a 4 GHz CPU and more L3 cache than the old Mac had RAM altogether.
Even from a premise that the introduction of networking and graphical GUIs was a huge mistake that caused software quality to plummet dramatically, a lot of modern software today is still faster and better designed than the software I had as a kid.
I can maybe accept that I got born at the wrong time and everything before that point was magical and wonderful. I never used early DOS word processors so I can't speak to whether the many features we have today are a worthwhile tradeoff for the startup speed. I'll have to take your word for it.
But if people want to say that we're actively making stuff worse today, they're skipping over a massive chunk of history. If you look at the overall trend of how computers have changed since the 90's, I think literally the worst thing you could say is that we are still recovering from a software quality crash. People either forget or dismiss that a lot of early 90s software was really, really bad -- both in terms of performance and UX.
From the original post:
> amber-screen library computer in 1998: type in two words and hit F3. search results appear instantly. now: type in two words, wait for an AJAX popup. get a throbber for five seconds. oops you pressed a key, your results are erased
I'm still calling bull on this, because I also used 1998 Library computers and a lot of them were garbage. And I've used modern Library search engines, and while they're not great, a good many of them have substantially improved since then. This is a rose-colored view of history based off of individual/anecdotal experiences.
I'm not wildly optimistic about everything in the software ecosystem today. I do wish some things were simpler, I do see systemic problems. But holy crap, HN is so universally cynical about computing right now, and I feel like there's a real loss of perspective. There are tons of reasons to be at least somewhat optimistic about the direction of computing.
If you had used a Mac or an Amiga, you wouldn't have those bad memories. Instead, you would fondly remember an era when software was pristine.
There also was a time when people just started to use C++ and software was much buggier.
I wouldn't even say software today feels much slower across the board, but it's definitely far more wasteful, given the resources.
You can see a lot of software that essentially does the same things as it did 20 years ago, but (relatively) much slower. Try using some older versions of e.g. Adobe software, you'll see how snappy it feels.
"This text searching program that searches text against text is way faster and simpler than a fully rendered, interactive map of the entire world with accurate roads and precision at the foot / meter level."
No. Shit, really? Get out of town.
Yes, some very popular apps have bad UX. But some apps have incredible UX. Cherry picking the bad ones while ignoring the good ones to prove a point is boring and not reflective of the actual industry.
These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
Remember autoexec.bat files? Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability? Remember when building a computer required work and research and took hours? I do, and it wasn't fun then, it was a detriment to doing what you wanted. It wasn't a rite of passage, it was a huge pain in the ass.
So yeah, things are slower now, and I wanna go fast. But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either. I don't need to teach her arcane commands to get basic functionality out of her machine. Plug and play and usability happened, and while things feel slower, computers are now open to a much wider audience and are much more usable now.
These posts always have a slight stench of elitism disguised as disappointment.