My first business was buying computers at garage sales and flea markets, fixing them, and reselling them. I've owned most of the personal computer gear in those ads, and a lot of interesting stuff not covered (like the original Lisa, the Coleco Adam with cassette storage that was as fast as most floppy disk drives of the time, the huge TI-99/4A expansion chassis with 8 slots).
Anyone who found this interesting should plan a trip to the Computer History Museum in Mountain View. I can spend all day just wandering through the collections, and listening to the docents (who are some of the same guys who built the technology we take for granted today) telling stories. I learn something new every time I visit, and I've been a nerd involved deeply in this stuff for ~30 years.
It's funny how many of these ads are aimed at "normal people", while "normal people" today would be considered completely incapable of operating machines as (relatively) user-unfriendly as some of these.
Or maybe normal people today simply have expectations in keeping with modern technology? I'm sure modern Excel jockeys are fully capable of slogging through one of those assembly language manuals for dozens of hours. I think they could also calculate radicals to five significant figures using pen-and-paper algorithms, 19th-century style. Or till their own fields using oxen, 18th-century style. That these are all "unthinkable" today doesn't mean humans have suddenly become stupider and weaker (the opposite is true); it means they're immensely labor-consuming tasks which no longer serve a purpose, and to perform them would be an astonishing waste of effort.
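For the curious, the pen-and-paper radical method alluded to above is the classic long-division-style, digit-by-digit square root. This Python version is just an illustration of that 19th-century algorithm (my own sketch, not anything from the ads), returning the first few significant digits:

```python
def digit_sqrt(n, sig_figs=5):
    """Digit-by-digit (long-division style) square root of a positive
    integer n, returning the first sig_figs significant digits as an int."""
    # Classic first step: split the digits into pairs from the right.
    s = str(n)
    if len(s) % 2:
        s = "0" + s
    pairs = [int(s[i:i + 2]) for i in range(0, len(s), 2)]
    p = 0          # root digits found so far, as an integer
    r = 0          # running remainder
    found = 0
    i = 0
    while found < sig_figs:
        # Bring down the next pair; past the decimal point, bring down 00.
        r = r * 100 + (pairs[i] if i < len(pairs) else 0)
        i += 1
        # The pen-and-paper trial step: largest x with x*(20p + x) <= r.
        x = 9
        while x * (20 * p + x) > r:
            x -= 1
        r -= x * (20 * p + x)
        p = p * 10 + x
        found += 1
    return p

print(digit_sqrt(2))    # first five significant digits of sqrt(2): 14142
print(digit_sqrt(10))   # sqrt(10) = 3.16227..., so 31622
```

Tedious by hand, mechanical in code, which is rather the commenter's point.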
We're not talking about skills that have fallen out of use over generations and therefore stopped being passed down. The people who used these machines are generally still alive, and the younger ones are still part of the workforce.
Also, any parts about people being "weaker" or "stupider" are your words, not mine.
I like that at the time, the mall photo shown inside the monitor would have been understood as purely evocative/figurative – whereas in another few decades, people looking back might wonder: "was that a real screen and was this the first rendered virtual mall environment?"
Take an iPad, add a real touch-type keyboard with mechanical switches, make the screen B&W for longer battery life, include user-replaceable AA batteries, and you have the perfect TRS-80 portable.
I remember a string of ads for minicomputer Fortran compilers, placed in Computerworld circa 1970. One manufacturer (might have been Data General) proudly stated that their compiler was "a pig", featuring long compile times but turning out compact object code. Another manufacturer posted its own ad: "Our tiger eats pigs".
A glass teletype ad was captioned "Tough TTY". Hee! Couldn't get away with that these days.
I still have a ZX-81, as well as an Epson HX-20 which is sadly not shown in these ads. It's amazing how far this technology has come in such a short time - and I love the prediction in the CompuServe ad about doing all shopping online by 2000.
I chuckled at the inflation of the VGA's color palette in the Tandy 5000 MC ad. The VGA could display only 16 or 256 simultaneous colors, but any palette entry could be changed to any 18-bit (6 bits per channel) RGB triplet, i.e. chosen from 2^18 = 262,144 (256Ki) possible colors.
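The palette arithmetic behind that marketing claim is simple enough to check in a couple of lines (a trivial sketch, just restating the figures above):

```python
# VGA DAC palette math: each palette entry is an 18-bit RGB triplet.
bits_per_channel = 6
channels = 3
possible_colors = 2 ** (bits_per_channel * channels)
print(possible_colors)  # 262144 selectable colors (256Ki)...

# ...but only this many shown at once, depending on the video mode:
simultaneous = {"16-color planar modes": 16, "mode 13h": 256}
print(simultaneous)
```

Ad copy naturally quoted the 262,144 figure rather than the 16 or 256 on screen.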
Should I be worried about how many of these I actually own?
By the way, I think the one in the Royal McBee advert is the predecessor of the machine immortalized by the legendary programmer's programmer in the pseudo-poem "The Story of Mel".
The thing that amazed me: "1K of memory expandable to 16K". I have yet to see that 16x expansion ratio in a pre-built computer. The best I've seen was back when I had 128MB expandable to 1GB.
Although then again, a computer today comes with a year or more of warranty, while this had only a 90-day warranty, which would personally scare the shit out of me on an electronic product today; that reads as "dead in 6 months".
It's interesting how software got better over time on the same hardware back then.
My first was a 48KB Sinclair Spectrum which I used for at least 5 years, and the games coming out by the end of that time were much more sophisticated than the early ones, e.g. filled 3D polygon graphics over blocky 2D sprites. The programmers just had to squeeze more and more out of the same hardware.
These days, a similar jump in sophistication just doesn't happen without new hardware.
To be fair, you do see the same phenomenon on game consoles. The hardware stays the same, but newer games are always pushing the limits just a little bit further.
I actually think hardware stability, predictability and closeness is central for this to work.
When a platform is new, nobody really knows what it can do and how it can best be exploited. On the Commodore 64, for example, you had a limit of 8 hardware sprites, each one either 24x21 pixels single-coloured or 12x21 pixels tri-coloured. This didn't really allow you to do much fancy games or graphics.
But once people discovered that you could reuse those same sprites multiple times during a single screen refresh, by timing your sprite programming to whatever scanline the machine was outputting, the genie was out of the bottle: the Commodore 64 and its games would never be the same again.
With changing hardware or a hardware abstraction layer, this would probably never have been possible to do reliably, or at all. I'd guess that hardware stability, predictability and closeness to the metal are prerequisites for developers discovering, over years of games, just how far a machine can be stretched.
For phones and a million other different mobile appliances I think this will be much harder, although not impossible. I doubt we'll see the same sort of mad capability leaps over time on these devices. But hey, you never know ;)
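The multiplexing trick described above can be modelled on a modern machine (a hypothetical host-side sketch, not real VIC-II code): sort the logical sprites by Y position and hand out the 8 hardware slots round-robin; a slot can be reused for a lower sprite once the raster beam has passed the sprite it last drew.

```python
# Host-side model of C64 sprite multiplexing (illustrative only).
SPRITE_HEIGHT = 21   # C64 hardware sprites are 21 scanlines tall
HW_SLOTS = 8         # the VIC-II offers 8 hardware sprites

def schedule(sprite_ys):
    """Map each logical sprite (given by Y position) to a hardware slot.
    Returns (assignments, ok): ok is False if two sprites sharing a slot
    would overlap vertically, i.e. the trick breaks down."""
    ys = sorted(sprite_ys)
    slot_free_at = [0] * HW_SLOTS   # scanline after which each slot is reusable
    assignments, ok = [], True
    for i, y in enumerate(ys):
        slot = i % HW_SLOTS         # round-robin reuse, top to bottom
        if y < slot_free_at[slot]:  # beam hasn't cleared this slot yet
            ok = False
        slot_free_at[slot] = y + SPRITE_HEIGHT
        assignments.append((y, slot))
    return assignments, ok

# 16 sprites spread down the screen: every slot gets reused cleanly.
print(schedule([row * 30 for row in range(16)])[1])   # True
# 16 sprites on the same scanline: only 8 slots exist, so it fails.
print(schedule([100] * 16)[1])                        # False
```

On the real machine the "reassignment" happens in a raster interrupt that rewrites the sprite position registers mid-frame, which is exactly why an abstraction layer hiding the beam position would kill the technique.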
for xmas 1989 i got a C64. my uncle, big show off, got my cousins (two girls) an amiga. why? cause it cost a bit more. turns out amiga sucked and my C64 was pretty much awesome until i got a 286 a bit later. i even remember begging for 4MB more RAM so i can play rise of the triad ... anyway, thinking back it kinda feels like the C64 was a more open platform and everyone else was pretty much fucked.